What’s new about Google Penguin and how to handle it
Google has officially announced that Penguin is now part of its core algorithm. This means webmasters no longer need to wait years for a Google Penguin update to recover a website's rankings; everything now happens in real time.
In the past, Google ran a massive scheduled refresh for all the websites affected by Penguin. It did this with every new Penguin update, and each refresh had a major impact on the SERPs. There have been several confirmed Google Penguin updates since 2012:
- Penguin 1 on April 24, 2012 (impacting ~3.1% of queries)
- Penguin 1.1 on May 25, 2012 (impacting less than 0.1%)
- Penguin 1.2 on October 15, 2012 (impacting ~0.3% of queries)
- Penguin 2.0 on May 22, 2013 (impacting 2.3% of queries)
- Penguin 2.1 on October 4, 2013 (impacting ~1% of queries)
- Penguin 3.0 on October 17, 2014 (impacting around 1% of queries)
- Penguin 4.0 & Real Time on September 23, 2016 (Google did not say what percentage of queries it impacted, perhaps because the update now runs continuously and that percentage would be constantly changing.)
Read on as we answer some important questions about the change.
Since Google Penguin is no longer separate from the Google core algorithm, we will not see any other Google Penguin updates. There will be no more dramatic changes in the SERPs, as the new Penguin will run in real time.
Based on feedback from our SEO community, most SEOs consider the real-time Penguin a good thing: there will be no more massive shifts with each update, and no more long delays before Google notices the SEO improvements you have made.
How did past Google Penguin updates work?
Historically, the Google Penguin filter affected websites deemed spammy, in particular, those websites that bought links or obtained them through link networks designed primarily to boost Google rankings.
After triggering a Google Penguin filter, the rankings of the entire website dropped dramatically, to the point where the site was nowhere to be found in the SERPs. This was stressful for webmasters, not only because they needed to find and clean up all their bad backlinks, but also because they had to wait for the next Penguin update to recover.
How does the Real Time Google Penguin work?
Your website can trigger a Google Penguin filter any day if it has unnatural links. That's why you need to keep an eye on your backlink profile and run a Link Detox report at least once a week, if not more often. If you're penalized by Google Penguin now and take the time to improve your website, Google will take that work into consideration much faster; there is no need to wait for a Penguin refresh.
The work you put into analyzing your bad links and cleaning up your backlink profile will not go unnoticed by Google. According to the announcement on the Official Google Webmaster Central Blog, Google takes your links into account shortly after it re-crawls and re-indexes the pages they appear on.
Should you wait for Google to re-crawl your links?
The answer is no. Playing the waiting game with Google is always frustrating, whether you have an algorithmic or a manual penalty.
You can make Google take notice of your disavow file right after you upload it to Google Search Console. Link Detox Boost speeds up link re-crawling and your recovery from a Google penalty.
Your website may have a lot of spammy links (such as forum profile links on some long-abandoned forum or link directory). Link Detox helps you find these links so you can disavow them, but there is a good chance they have a very low crawl rate, which means you could wait a very long time before seeing any changes in your search engine rankings.
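For reference, the disavow file you upload in Search Console is a plain-text file with one entry per line: either a full URL to disavow a single link, or a `domain:` prefix to disavow every link from that domain. Lines starting with `#` are comments. The domains below are placeholders, not real examples of spam:

```text
# Links from an abandoned forum profile
http://old-forum.example/profile/12345

# Disavow every link from a spammy link directory
domain:spammy-directory.example
```

Keep one master copy of this file, since each upload replaces the previous one for that property.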
Google Penguin is more granular now. What exactly does that mean?
In the official announcement of the Google Penguin 4.0 roll out, Google pointed out what’s different about this Penguin update. Apart from the real-time aspect, Google Penguin is also more granular.
“Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting the ranking of the whole site.”
Many webmasters ask what “granular” actually means here. It means that Google will not necessarily penalize a website as a whole; it can also penalize only parts of it: a domain, a sub-directory, a group of keywords, or just a single page. Anything that feeds into the organic rankings can now be affected by the Penguin algorithm at this finer level.
How to diagnose websites that have partial Penguin penalties?
Link Detox (DTOX) can help detect unnatural links at the subfolder and subpage level. You will need to check certain parts of your website more often than others and manage the risk of your links, even on a daily basis.
Here are some examples of where you can use subfolder analysis to avoid triggering a Google Penguin filter:
- Monitor subfolders that contain User Generated Content (UGC), as they can contain spammy comments or posts.
- E-commerce: a typical e-commerce website has a separate subfolder for each product category, so you can assess each category's DTOXRISK individually.
- Blog vs. product: if your product website has a blog subfolder, make a separate risk assessment for the blog, since links to the blog are likely to come from different sources than links to general product pages.
- Homepage: many websites have most of their links pointing to the homepage. This makes a separate analysis worthwhile, especially when your deep-link profile is just as large.
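To illustrate the idea behind subfolder-level analysis, here is a minimal sketch that groups backlink risk scores by the first path segment of the target URL and averages them per subfolder. The URLs and scores are made-up sample data, and the numbers merely stand in for whatever metric your link-audit tool reports (e.g. a DTOXRISK-style value); this is not the tool's actual implementation:

```python
from collections import defaultdict
from urllib.parse import urlparse

def risk_by_subfolder(backlinks):
    """Average the risk score of backlinks per target subfolder.

    `backlinks` is a list of (target_url, risk_score) tuples.
    """
    buckets = defaultdict(list)
    for target_url, risk in backlinks:
        segments = [s for s in urlparse(target_url).path.split("/") if s]
        # Bucket by first path segment; bare domain links go to "/"
        subfolder = "/" + segments[0] + "/" if segments else "/"
        buckets[subfolder].append(risk)
    return {folder: sum(r) / len(r) for folder, r in buckets.items()}

links = [
    ("https://example.com/blog/post-1", 120),
    ("https://example.com/blog/post-2", 900),
    ("https://example.com/shop/widgets", 80),
    ("https://example.com/", 50),
]
print(risk_by_subfolder(links))
# → {'/blog/': 510.0, '/shop/': 80.0, '/': 50.0}
```

A report like this makes it obvious when one section, such as a UGC-heavy blog, carries far more risk than the rest of the site.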
Doing competitive research with Competitive Link Detox (CDTOX) for every main topic a site covers is also very important if you want to understand the different risk-level standards in your niche.
Look at the bright side of the Real Time Google Penguin
If your website triggers a Google Penguin filter, you can now recover your rankings much faster. The important thing is to keep an eye on your backlink profile: disavow the bad links, get Google to re-crawl them quickly using Link Detox Boost, build quality links, and monitor your backlinks at all times.
More SEO Q&A about Penguin 4.0
Q: Are spammers still getting around it with private link networks?
Spammers, and SEOs in general (remember that SEO is about pushing rankings and influencing Google's results; Google tends to call such people spammers, not to be confused with email spammers), have always experimented with “private link networks” and will continue to do so.
Their success will depend on how well those networks are set up. One popular off-label use of Link Detox among aggressive SEOs is actually to “proof” their networks and reduce the number of factors typically found in unnatural link networks.
Q: How will the real spammy sites be affected (or benefited) by the little “Pengy 4.0“?
What counts as “real spammy” spans a range from “not spammy in Google's eyes, but in users' eyes” to “very aggressive link spam”. Many SEOs will benefit from the faster responses and higher granularity, both of which allow SEO tests to work better.
Q: When do you expect Google Penguin to roll out completely?
The roll-out will never be “complete.” Google crawls the web continuously, so the results will keep changing on an ongoing basis. The question probably refers to “when will I see all the effects for my website?” That happens once all your links, and all the links pointing to them, and so on, have been crawled across the web. I would give it a couple of weeks; until then, we will still see fluctuations.
Q: How often do we need to disavow?
Penguin 4.0 now detects spam signals in real time. This makes Link Risk Management, the practice of proactive link audits and disavows, even more important for your business. We have always recommended doing it at least on a weekly basis; just keep in mind that Google crawls the web daily.
Q: Everybody talks about the "real time" factor of the new Penguin. But what about the added granularity? Will toxic links be less contagious?
We assume that whole-site penalties are still possible for massive spamming. After all, “whole domain” is also a level of granularity, just like page level or directory level. Luckily, Link Detox has supported all of these granularities since 2014.
Q: What SEO metrics should you consider for evaluating unnatural links?
Apart from looking at the LRT Power*Trust, you will also need to look at the DTOXRISK on a per-link and per-directory basis.