Google Penguin Update Explained

It has been more than a year since Google launched the infamous Panda algorithmic update on February 23, 2011. The goal of this algorithm is to reduce the rankings of low quality sites, content mills, and sites that don't generate enough value or user interest. Prior to Panda, web search was largely dominated by content factories that would churn out content just for the sake of gaining higher ranks in Google search results.

Google had to do something about this, because when a large number of low quality sites show up on the first page, it creates a very bad user experience. Hence, Panda was launched to target content farms and spam sites that produce or spin articles in bulk. Panda is an ongoing update that runs at regular intervals, ensuring that spam sites are kept at bay and worthy content gets the rank it deserves.

The results were fabulous as well as disastrous, depending on the perspective. On one hand, thousands of spam sites literally disappeared from Google search; on the other hand, a lot of legitimate sites were wrongly hit by this algorithmic change. Some popular examples are CultofMac, PocketLint, and Digital Inspiration.

Worse, a lot of webmasters reported that after the launch of Panda, original content was outranked by content farms.

Now you may wonder: “Why were good sites affected by this change? Google’s so-called algorithmic update is targeted at content farms, but my site is not a content farm at all. My site contains purely hand-written content, yet when this algorithmic change was launched, I lost 70% of my traffic in one shot.”

The short answer is that an algorithm is an algorithm, and it cannot make any exceptions whatsoever. Low quality content on one part of a site can affect the rankings of other pages, so regardless of how small or big your site might be, it is high time you get rid of the junk pages and focus on creating valuable content.

What Is Google Penguin Update All About?

After Panda, Google launched another algorithmic update to fight webspam. This algorithmic change, better known as the “Penguin update”, was launched on April 24, 2012 and affects about 3% of all search queries.

Panda was launched to address the issue of low quality content, but Penguin has a slightly different goal: to find sites that violate Google’s quality guidelines and use unfair means to gain higher ranks in Google search results. After all, millions of site owners use all sorts of black hat techniques to gain rankings, and Penguin was launched to address exactly those issues.

Compared to Panda, Penguin is broader in nature. What this means is that even if you have high quality content on your site, if you have used black hat techniques to gain rankings, your site’s traffic will take a huge dip once the site gets flagged by Penguin’s filter.

In some extreme cases, it will be difficult to recover from Penguin, because once you’ve done a lot of low quality link building, the effect is nearly impossible to reverse. That’s not the case with Panda: it is possible to escape Panda’s grip, provided you remove or improve all the low quality content on your site.

To know whether your site was affected by Google’s Penguin update, check your site analytics program (Google Analytics, Sitemeter, etc.). Do you see a big dive in organic traffic from Google on April 24, 2012?

If "Yes" is the answer, I am afraid, your site has been hit by Penguin.

Is Your Site Vulnerable To The Penguin Effect?

If you have used any of these techniques to inflate your site’s rankings or to gain greater visibility in Google search results, your site is vulnerable to the Penguin effect:

Keyword stuffing: Do you stuff your site’s content with keywords? Do you use software to automatically sprinkle all the important keywords randomly through a blog post? This is nothing but keyword stuffing, and it creates a horrible user experience. There is no way to undo this except to delete or rewrite those keyword-stuffed pages.

Google has shared an example showing how a webmaster stuffed the source code of his page with thousands of keywords.

If you’ve done this on a large number of pages in a domain, the entire domain will be affected by the Penguin change.
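
If you are unsure whether a page crosses the line, a rough density check can help you triage. Here is a minimal Python sketch; the 5% threshold is an illustrative rule of thumb, not a number Google has published.

# A minimal keyword-density check over a page's visible text.
# The 5% threshold below is an assumption for illustration only.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

page_text = "cheap loans online, get cheap loans today, cheap loans for everyone"
density = keyword_density(page_text, "loans")
print(f"Density of 'loans': {density:.0%}")
if density > 0.05:
    print("Suspiciously high -- this reads like keyword stuffing.")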

Duplicate content or spun content: Adding plagiarized or duplicate text to a significant number of pages increases the chances of being affected by an algorithmic update. Although Penguin is not specifically meant to address the “duplicate content” issue, it is very likely that sooner or later your site will fall into that pit.
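
One common way to spot copied or spun text is to compare pages with a similarity measure. Here is a minimal Python sketch using word shingles and Jaccard similarity; the 0.5 threshold and the sample sentences are assumptions for illustration.

# Near-duplicate detection via word shingles and Jaccard similarity.
# The 0.5 threshold is an illustrative assumption, not a known Google value.
def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original = ("the penguin update targets sites that use black hat "
            "techniques to gain higher ranks in google search results")
spun = ("the penguin update targets websites that use black hat "
        "techniques to gain higher ranks in google search results")
score = jaccard(original, spun)
print(f"Shingle similarity: {score:.2f}")
if score > 0.5:
    print("These pages look like near-duplicates (copied or spun text).")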

Link exchanges: This is where it gets critical.

Have you done a lot of low quality link building and bought hundreds of links from cheap link directories, blogs, and forums? Purchasing links from random sources is strictly a black hat technique, and according to Google’s webmaster quality guidelines, it is not the attribute of a high quality website. The worst part is when you have tried to inflate the PageRank of a page through anchor text spamming. (I highly recommend reading this piece by Danny Goodwin - Anchor text spamming and link relevancy.)
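
One symptom of anchor text spamming is easy to measure: the distribution of anchor text across your backlinks. A natural profile is dominated by brand and URL anchors, not a single money keyword. Here is a minimal Python sketch over invented sample data; in practice the list would come from a backlink export, and the 50% cutoff is only illustrative.

# Tally the anchor-text distribution of a backlink profile to spot
# over-optimized, exact-match anchors. All data below is invented.
from collections import Counter

backlinks = [  # (linking page, anchor text)
    ("http://cheap-directory.example/links", "best payday loans"),
    ("http://random-blog.example/post-1", "best payday loans"),
    ("http://link-farm.example/page-9", "best payday loans"),
    ("http://news-site.example/story", "example.com"),
]

anchors = Counter(anchor for _, anchor in backlinks)
total = sum(anchors.values())
for anchor, count in anchors.most_common():
    share = count / total
    marker = "  <-- over-optimized exact-match anchor" if share > 0.5 else ""
    print(f"{share:5.0%}  {anchor}{marker}")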

Unnecessary links to irrelevant sources and unnatural links from irrelevant sources: Time and again, Google engineer Matt Cutts has stressed that the best links are those that are born organically.

You have to create a wonderful resource first, and if it is really useful, links will flow in on their own, without you having to knock on doors and offer money. However, if you try to build a diverse backlink portfolio by buying links from irrelevant sources, it is not going to help in the bigger picture.

Google illustrates this case with an example where the publisher has linked to irrelevant sites that do not match the subject of the blog post. Moreover, the page has no clear focus and talks about random topics in different paragraphs.

The reason is simple. Search engines can precisely detect whether the source website and the target website are contextually related, and whether the page in question is really worthy of 206 backlinks (just an example). The moment you cross the line and try to manipulate things mechanically, the algorithms will find you, and the result is a domain that ranks poorly.
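
To get an intuition for how such a relatedness check might work, here is a minimal Python sketch that scores two pages with bag-of-words cosine similarity. Real search engines use far richer signals; the sample texts are invented and this only illustrates the idea.

# Bag-of-words cosine similarity as a toy model of topical relatedness.
# This is a simplification for illustration, not how Google actually does it.
import math
import re
from collections import Counter

def bag_of_words(text):
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    va, vb = bag_of_words(a), bag_of_words(b)
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

source = "a review of the best espresso machines for home baristas"
related = "how to pull the perfect espresso shot at home"
unrelated = "payday loans with instant approval and no credit check"
print(f"Related link:   {cosine(source, related):.2f}")    # clearly above zero
print(f"Unrelated link: {cosine(source, unrelated):.2f}")  # near zero: unnatural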

Over-optimized metadata: Do you write titles and meta descriptions with search engines in mind? If so, I would advise you to change that practice.

You should write for the user and not for search engines, because whatever helps the user and entices them to click a link on the search results page should also help the search engines. If your page’s title tag, header tags, URL, and meta description are stuffed with keywords and only keywords, there is a chance that your site will be affected by the Penguin update. The title and meta description of a page should read well and should not appear over-optimized with repetitive words and phrases.
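
As a final sanity check, you can audit your titles and descriptions for the most obvious symptoms. Here is a minimal Python sketch; the length limits (60 and 160 characters) and the repetition rule (a word appearing three or more times) are common rules of thumb, not published Google thresholds.

# Flag over-optimized titles and meta descriptions: excessive length and
# heavy word repetition. All limits below are rule-of-thumb assumptions.
import re
from collections import Counter

def audit(title, description):
    problems = []
    if len(title) > 60:
        problems.append("title longer than ~60 characters")
    if len(description) > 160:
        problems.append("description longer than ~160 characters")
    for field, text in (("title", title), ("description", description)):
        words = re.findall(r"[a-z']+", text.lower())
        repeats = [w for w, c in Counter(words).items() if c >= 3 and len(w) > 3]
        if repeats:
            problems.append(f"{field} repeats: {', '.join(repeats)}")
    return problems or ["looks fine"]

print(audit(
    "Cheap Flights - Cheap Flights Online - Book Cheap Flights",
    "Find cheap flights. Cheap flights deals. Best cheap flights online.",
))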

###

We will discuss some ways to recover from Google’s Penguin slap in a follow-up post. Stay tuned! In the meantime, were any of the sites you own or manage affected, either positively or negatively, by Penguin?

Special thanks to Amit Banerjee and Ampercent on this post.