What is the Google Panda Update? How has it affected the SERPs? And what tactics can be adopted to minimise its effects?
The recent Panda, or ‘Farmer’, update to the Google search engine algorithm has had some interesting and unexpected results in the Search Engine Results Pages (SERPs). Working in the Search team at the Manchester digital media agency Fast Web Media, it is vital for me to understand the purpose, initial impact and long-term implications of the Google Panda update, and to shape SEO strategy for our clients accordingly. Considering all that I’ve learnt on the SSMM course at Salford, I thought I’d tackle this Panda head-on.
The Panda first reared its head in the US in February 2011, whilst the rest of the world watched and waited for Panda paw prints to appear in the rankings. Sites in the UK that received significant traffic from the US quickly found out whether the Panda update would affect them once it was rolled out globally in March.
The Google Panda update had one principal aim:
Reduce SERP rankings for Low Quality Sites—i.e. sites with low value to users, generally containing unoriginal or shallow content.
The intended targets in the Panda’s sights? Low-quality domains which had little user trust and contributed trivial levels of information or services – such as affiliate sites containing a high volume of content scraped from legitimate sites. As expected, such sites lost significant visibility in the US Google SERPs at the end of February. But what else was hit? The update would apparently affect the results of some 12% of search queries. An independent SEO software firm, Sistrix, collated a lot of data and published a list of some of the winners and losers from the update (although a recent article from State of Search questions these figures and their severity).
The main losers appear to be:
You can see a more recent list of those affected on Pete Young’s blog (of SSMM fame!). At first glance, what do these sites have in common? Shallow content? Poor structure? Prolific use of ads? Poor content and aggressive ad placement generally result in poor user engagement – you are unlikely to stay long on a site that is full of inane drivel and harangues you with pop-ups.
Google’s algorithm has previously proven capable of identifying nonsensical spam (e.g. keyword stuffing), but Panda’s mission is to identify shallow-content, low-quality sites. A supposed by-product of the Panda update was that it would help Google identify high-quality sites and reward them in the SERPs accordingly. Sites such as those belonging to established brands, with their own original content and high-value user experiences, would win over the heavily optimised affiliate sites that allowed for no quality user engagement. This very interesting interview by Wired with Google engineers Matt Cutts and Amit Singhal in March 2011 outlines the Google thought process behind the update.
Some in the Search industry feel that the Panda update was a long time coming, and that “wise” SEO practices will have protected against algorithmic changes targeting low-quality content. Even so, this ambitious update has caused quite a lot of collateral damage. It has hit legitimate sites with a lot of user-generated content, such as Review Centre (see their concerned reaction to the Panda update in a blog post on the Review Centre website).
Mahalo, an information-sharing site with a large and active community, suffered heavily from the Panda update, and 10% of Mahalo staff were apparently fired the day after the new algorithm took effect. Mahalo is widely viewed as a content farm and is exactly the kind of site Panda should be targeting. This interesting article about Mahalo by SEOBook discusses it in more detail: SEOBook: Black Hat SEO Case Study. Nevertheless, traffic being heavily cut by these changes is a grave issue for many sites and businesses, big and small. And more recently, questions have been raised about the possibility that the algorithm changes competitively target certain Microsoft-owned sites.
That said, there are many ecommerce affiliates still holding strong positions even though all their product descriptions are duplicated. I know at Fast Web Media, we can still see two or three voucher sites ranking within the top 10 for brand-specific keyword searches for a particular client. Google has removed the ability for webmasters to request reconsideration after suffering from the effects of algorithm changes, but you can still tell Google if you think your site has been unfairly dismissed.
What can we learn from sites like Mahalo, which hold some genuine value, being penalised by the algorithm? Mahalo’s content base is vast, and its topics so broad that the site is suspiciously vague in its purpose. It’s certainly no Hitchhiker’s Guide to the Galaxy. And what of voucher sites? They often contain many broken links, timed-out deals, and so on. Are these the kinds of sites Google wants to rid the SERPs of? It is interesting to note what happened to the price comparison site beatthatquote.com, which was also negatively affected by the Panda update – Google bought that site last month for £37 million. Why would Google punish its own acquisition, other than to appear fair in the execution of its algorithm? Is price comparison that valuable to Google as a business, or is it a knowledge/data-gathering exercise? It is likely that Google is investing in comparison websites as a way of gathering information about how people interact with and use such sites. Under the current Panda update, the differences in how content is produced, structured and shared across such sites are too subtle for the algorithm to distinguish them from the genuinely low-quality sites. This first-generation Panda, although quite unruly, may evolve into something a bit more personable and sophisticated at recognising quality content in successive incarnations.
So, what do you do if you have been backhanded (or “backpawed”) by the Panda? Combating Panda at a basic level boils down to examining the structure and content of your site and eliminating duplicate or shoddy content. You can start by looking at the impact on traffic and user behaviour across the different pages of your site using Google Analytics, and go from there. SEO Mark Nunney clearly outlines more detailed steps to analyse any potential impact, and steps to rectify a SERP slashing, in his Panda-mauling survival guide. In summary, the main things to look out for are:
These are all classic SEO issues which should be addressed when implementing best practice, and they have been covered extensively on the SSMM course at Salford. And although we can outline the metrics that quantify a quality user experience (high traffic, high returning traffic, low bounce rate, a long time spent on site, etc.), just how does Google begin to identify “quality content” algorithmically? How can the web crawlers scan the content on sites and obtain a substantive and accurate impression of the semantic value of a page? The easiest signals that content is quality are how much the site is shared – linked to, tweeted, social bookmarked, etc. – in other words, the capital of social engagement!
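Auditing your own site for duplicate content – the first thing on that checklist – doesn’t need anything exotic. As a minimal sketch (the page texts, URLs and threshold below are entirely made up for illustration, and this says nothing about how Google itself detects duplication), you can shingle each page’s text into overlapping word sequences and flag pairs of pages whose shingle sets overlap heavily:

```python
# Sketch: flag near-duplicate pages on your own site using word
# shingles and Jaccard similarity. Pages, URLs and the 0.8 threshold
# are illustrative assumptions, not how Google measures duplication.

def shingles(text, k=5):
    """Return the set of k-word shingles for a piece of page text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(pages, threshold=0.8):
    """Yield pairs of page URLs whose content overlaps heavily."""
    sets = {url: shingles(text) for url, text in pages.items()}
    urls = sorted(sets)
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if jaccard(sets[u], sets[v]) >= threshold:
                yield u, v

# Hypothetical site pages: two near-identical product blurbs, one original.
pages = {
    "/widget-a": "the best blue widget with free delivery and a two year guarantee included",
    "/widget-b": "the best blue widget with free delivery and a two year guarantee included today",
    "/about":    "we are a small manchester agency that writes about search and social media",
}
for u, v in near_duplicates(pages):
    print(u, "overlaps", v)
```

Pages flagged this way are candidates for rewriting, consolidating or no-indexing before the Panda sniffs them out.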
The issue is that “quality of content” is a highly subjective matter. How does one define “low-quality content”? The Wired interview with Cutts and Singhal mentions that they compared the Panda results using the Chrome Site Blocker (which allows users to specify sites they want blocked from their search results) as a case study for what qualified as “low-quality content”. The intuition of the algorithm can only be so sophisticated.
“The Panda eats shoots and leaves; it doesn’t go on Masterchef!”
Google tries to collect enough information and data on user behaviour to create and apply an objective algorithm to a subjective matter.
Keeping this in mind, this is where I wonder if the Panda update is a pre-emptive move before rolling out Google +1…
Google has also been trying to jump on the social bandwagon of late, without much success. Products such as Google Buzz, a social messaging and information-sharing service, and Google Knol, similar to Quora, have failed to crack the social media market with any noticeable effect. Back in 2009, Google introduced Google Social Search, and it has updated and improved the service constantly since then. Matt Cutts not long ago revealed that Google would start taking the social impact of URLs into account in the algorithm – i.e. the more a URL is tweeted and shared on Facebook, the more gravitas that link will be given in the eyes of Google. As a result, SEO now involves more than just on-site optimisation and PPC. Social media is now the dominant force in the way internet users share and consume content, and it is playing an increasingly significant role in determining where your site appears in the SERPs.
This latest update is a significant shift in the way social activity affects a site’s position in the SERPs. Whether users are posting videos to YouTube, publishing photographs on Flickr, writing content on their blogs or just talking to their friends on Facebook and Twitter, their activity now affects a site’s authority and how it is viewed by Google.
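Google has never published how such social signals are weighted, so purely as a toy illustration of the idea – blending link authority with share counts into a single signal, with every weight, field and number below invented for the example – it might look conceptually like this:

```python
import math

# Toy illustration only: Google does not disclose how social signals
# are weighted. This merely shows the *idea* of blending link counts
# with tweet and share counts into one score. All weights and data
# here are invented assumptions for the sketch.

def social_score(inbound_links, tweets, fb_shares,
                 w_links=1.0, w_tweets=0.5, w_shares=0.5):
    """Blend link and share counts; log-scale so huge counts don't dominate."""
    return (w_links * math.log1p(inbound_links)
            + w_tweets * math.log1p(tweets)
            + w_shares * math.log1p(fb_shares))

# Two hypothetical URLs with identical link counts but very different
# social traction: the well-shared original outscores the barely-shared copy.
urls = {
    "example.com/guide":  (500, 120, 80),
    "example.com/scrape": (500,   2,  1),
}
for url, (links, tw, fb) in urls.items():
    print(f"{url}: {social_score(links, tw, fb):.2f}")
```

The point of the sketch is simply that two pages with identical link profiles can now be separated by how much people actually share them – which is exactly the shift Cutts was hinting at.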
The so-called “Google +1” is being trialled in America, and at the moment you can beta test it on your own account in the UK. It is a way of competing with social networks such as Facebook, whilst also being able to glean from user behaviour which results are more relevant, quickly and effectively. What is Google +1? Google will allow you to click a +1 button next to a link as a seal of approval, and other users in your social network groups will be able to see that you’ve “+1’d” a link.
You can read more about Google +1, and the speculation on its uses, from TechCrunch, but the reason I’ve included it in this post is that the Panda update precedes Google +1. Being currently largely closed off from the social media world, Google lacks the ability to analyse user behaviour on a highly social level. This is where Google +1 could act as a key to unlocking some of that data potential whilst apparently bettering the user experience of the search engine.
Allow me to elaborate: Panda has apparently hit the tech blogging community quite hard. Many of these sites are genuine hubs of collective interest. But as Patrick Altoft points out in his blogstorm post, how many times do you need to hear about the same gadget review? For such forums and blogging communities, the significant drop in traffic could drastically reduce their sites’ viability. Thinking long term, I wonder if updates that negatively affect the visibility of such communities may further catalyse the way people interact online – less through review sites and forums, and more through social media.
Much as when someone dictates what you can and can’t do at a party, you’ll probably just sit sulking in a corner or end up not going at all. The Panda update is more evidence of a paradigm shift in the way that content is structured, angled more towards enabling social online. With this in mind, I wonder whether the Panda update may be a pre-emptive strike that encourages websites to structure themselves favourably, ready for Google +1.
Google wants to be more than just a search engine – its forays into all sorts of projects, most particularly social ones such as “Buzz” and “Knol”, are testament to that. Google talks about wanting to produce the best user experience possible. Why? So users continue using its services. Yet I am curious about the long-term impact of such algorithm changes on social communities, such as legitimate tech forums, which have been hit. Many of the Panda victims appear to make sense, and with any algorithm change there are winners and losers. But why remove the visibility of sites that allow and foster genuine community engagement? At the end of the day, the algorithm is a scientific formula being applied to millions and millions of sites. It is inevitable that some genuine sites, in particular ones which do have a lot of the same content (even if it is user-generated), will be hit by the update.
Under the new Panda regime, what do you do if you search for something and forums and review sites don’t show up in the top 10? You search again; you use other sites. Users navigate the SERPs more, giving Google more user behaviour data. Users may also be more inclined to use Google reviews, thus helping to promote Universal Search, and so on. By hitting the review sites, I wonder if Google isn’t simply promoting its own services and, in turn, getting more information out of its customers. Is Panda preparing us to be more social (along with the advent of +1) by clearing the SERPs of sites that had poor user engagement?
As we all well know, you cannot force online communities to be social – social sharing and communities manifest themselves in ways that external forces can try to influence, but it is often an internal catalyst which drives them and helps them take form. You can create a social space, but you cannot really dictate the way it is used – trying to do so often spells disaster. But social is well and truly here in the SERPs, and it will be interesting to see how the SEO community shifts and adapts its strategy in the coming months post-Panda.