
The Power of Link Risk Management in the Real Time Penguin Era


This case study was created using an LRT Superhero account.

Some of the use cases explained in this case study are not available in lower plans.

The LRT Superhero Plan (and higher) includes all our 25 link data sources and allows you to perform link risk management, competitive research, professional SEO and backlink analysis for your own or your competitors' sites. You get to see the full picture of your website's backlink profile, and this can make all the difference for your SEO success.

Competitive Link Risk Management in the Italian ads market

Our LRT Certified Professional, Stefano Robbi, provided us with a great case study that already shows some effects of the Real Time Penguin in the Italian ads market.

He gives the example of an anonymous client, who managed to outperform his competitors by auditing and then reducing his link risk.

It is a very interesting story, as the primary goal was to protect his client's site against Google Penguin. This is yet more proof that a strong backlink profile is key to success in today's SEO. We are very excited to get some real-life footage of Penguin 4.0 penalties and very proud that Stefano decided to share his findings with us.

We are also very glad that our tools helped Stefano boost his client's rankings so well while reducing his risk of being penalized.

We look forward to your feedback, and please help spread the word on this!

- Enjoy & Learn!

Christoph C. Cemper
and the team of
LinkResearchTools (LRT)

Bonus: Download the full version of this case study as PDF for easy print or offline reading!

Introduction

The algorithmic update known as Google Real Time Penguin began rolling out on September 23rd, 2016. In those days, we heard the first reports of recovery from websites that had been penalized by the previous update (Penguin 3.0) in October 2014, while other websites struggled with sudden ranking drops.

In this time of general uncertainty about how to proceed, savvy web marketers are investing financial and human resources in determining the risk profile of their websites. Although years have passed since the previous official Penguin update (October 2014), the memory of those negative trends lingers with the penalized websites, and the idea that the same could happen to their own business feels increasingly tangible to entrepreneurs.

Some websites lost 30% of their traffic, others 45% and the most affected ones lost more than three-quarters of their total organic visits. Penalties of such magnitude inevitably have an immediate impact on the business, resulting in lost sales, lower profits and often layoffs.

Facing the fear of an imminent threat, every successful manager needs to make decisions that will save his or her company, but must at the same time weigh all the alternatives at hand to see whether the threat can be turned into an opportunity for professional or personal growth.

Brief excursus: the client

In March 2016, we were contacted by an Italian client who operates in the online classified ads market (job listings, real estate ads, etc.). The company, whose name we can't reveal because we signed a Non-Disclosure Agreement, was at the time one of the top 10 players in its sector, with earnings and revenues growing year after year but with a market share well below the top three competitors.

The Italian ads market, besides being extremely competitive both online and offline, is also an environment where spammy links and low-quality content are common.

Nonetheless, the website in question was going through a relatively positive period, with moderate but steady monthly growth in organic traffic. The company's director contacted us to express his concern: he had noticed that three of his competitors, the ones with the best growth trends over the last year, had just been penalized by Google. His fear was being next on Mountain View's blacklist.

Competitive environment analysis

Performing a link audit means analyzing a website's inbound links in detail, understanding their overall risk profile, and implementing the steps necessary to eliminate the identified threats.

Contrary to common belief, a professional link audit should cover not only the client's website but also the major competitors in the field. Identifying links only requires focusing on a single website; but to carefully judge which of those links are risky or dangerous, we must know what the typical link profile of a top player looks like.

In a hyper-competitive market such as car insurance, the baseline risk of a major player's link profile is much higher than in less contested sectors such as agricultural products. For this reason, our analysis started with an initial evaluation of the market our client belongs to. For this purpose, we used three tools included in the LRT Superhero subscription:

Bulk URL Analyzer (Juice Tool)

The Bulk URL Analyzer is a great starting point for understanding the main competitive dynamics between the various players in a market. With this tool, we can quickly get an initial overview of the competitiveness of the sector our client operates in.

Unlike other tools that we will analyze later in this case study, the Bulk URL Analyzer can simultaneously analyze tens or even hundreds of different domains.

In our case, we used this tool not only to analyze the major competitors in the market but also to locate them. The program has a built-in "Find Competing Pages" feature: simply enter some keywords of interest, and the program automatically returns the top 20, 30, or 50 Google results as domains or URLs.

In just a few seconds we can get a list of the major players in our industry, and we have all the input data necessary to perform the analysis.
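
The tool handles this step for you, but the underlying idea is easy to picture. Below is a minimal illustrative sketch, not LRT's implementation: collect the top-ranking URLs for a few keywords, reduce them to domains, and rank the domains by how often they appear. Here, fetch_top_results is a hypothetical placeholder for whatever SERP data source you have.

```python
# Illustrative sketch of the "Find Competing Pages" idea: the domains
# that rank for several of your keywords are your closest competitors.
from collections import Counter
from urllib.parse import urlparse

def fetch_top_results(keyword: str, n: int = 20) -> list[str]:
    """Hypothetical helper: return the top-n ranking URLs for a keyword."""
    raise NotImplementedError("plug in your own SERP data source here")

def find_competitors(keywords: list[str], n: int = 20) -> list[tuple[str, int]]:
    hits = Counter()
    for kw in keywords:
        for url in fetch_top_results(kw, n):
            hits[urlparse(url).netloc] += 1  # count domains, not single URLs
    return hits.most_common()  # domains ranking for the most keywords first
```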

[Image: the "Find Competing Pages" feature]

[Image: Bulk URL Analyzer setup]

Before proceeding, however, you must tell the program how you prefer the analysis to be performed: "sample analysis" or "full analysis." This point needs some clarification: the "sample analysis" returns results very quickly (in only a few minutes), but the program will only return the data it already holds in memory for the websites we want to analyze.

Almost all SEO tools on the market today work this way, and the problem is that such data was measured in the past and may differ drastically from current values. The real added value of the LinkResearchTools suite is its unique ability to perform a full re-crawl and verify everything in real time before returning results to the user. With the "Full Analysis," the program performs a real-time check of all the off-page metrics analyzed and returns an accurate, up-to-date result with each query.

That is why the advice is always to select the "Full Analysis" whenever you need accurate data to make critical, far-reaching decisions. In the end, the wait for updated results will be rewarded!
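
To make the "sample vs. full" distinction concrete, here is a minimal sketch of what re-verification means in principle (an illustration, not LRT's crawler): instead of trusting a cached backlink record, re-fetch the referring page and check whether the link is actually still there.

```python
# Re-verify a cached backlink record by fetching the referring page live
# and checking the link is still present (the essence of a full re-crawl).
import requests
from bs4 import BeautifulSoup

def link_still_live(referring_page: str, target_domain: str) -> bool:
    try:
        resp = requests.get(referring_page, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        return False  # page unreachable or removed
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(target_domain in (a.get("href") or "")
               for a in soup.find_all("a"))

# Example: keep only cached records that are confirmed live.
# live = [u for u in cached_backlinks if link_still_live(u, "client.example")]
```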

[Image: report mode selection ("sample analysis" vs. "full analysis")]

After selecting the "Full Analysis" and entering the top 20 sites ranking on Google.it for the main keyword of our client's sector, we got the results in less than 20 minutes.

For each domain, the Bulk URL Analyzer can return up to 20 different metrics, ranging from domain age to domain Power and Trust, not to mention traffic indicators and data on connected social media profiles. Here too, LRT does an excellent job in partnership with the APIs of the leading SEO tools on the market today (Moz, SISTRIX, SEMrush, ...): a single query returns all of these metrics for each of the sites analyzed.

Below is an excerpt of the result obtained with only six metrics, for easy reading.

[Image: Bulk URL Analyzer backlink results (excerpt)]

Thanks to this tool, even at this very early stage of the analysis, we have a general overview of the market sector in which we are competing.

By analyzing all the metrics provided by the Bulk URL Analyzer we could make many hypotheses, but for now let's highlight just two significant points:

  • The Power*Trust score of the 20 domains analyzed varies widely. While four competitors have a Power*Trust equal to or greater than 25, most fall between 9 and 15 (a quick sketch of this metric follows the list).
  • The number of keywords for which the various competitors rank on Google is also very heterogeneous. The most established players rank at the top of the organic results for tens of thousands of keywords, while the smaller ones focus on niche keywords.
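
Since LRT Power*Trust is simply the product of the Power and Trust scores (as discussed again further below), the spread described in the first bullet is easy to tabulate yourself. A toy example with made-up numbers, not the client's real data:

```python
# LRT Power*Trust is the product of a domain's Power and Trust scores.
# Illustrative numbers only:
domains = {
    "leader-1.example": (6, 5),   # (Power, Trust) -> 30
    "leader-2.example": (5, 5),   # -> 25
    "mid-pack.example": (4, 3),   # -> 12
    "niche.example":    (3, 3),   # -> 9
}
power_trust = {d: p * t for d, (p, t) in domains.items()}
for d, pt in sorted(power_trust.items(), key=lambda kv: -kv[1]):
    print(f"{d:20s} Power*Trust = {pt}")
```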

What does all this mean? It means that the market we are analyzing features both large websites and smaller competitors at the same time, and both manage to appear in Google's top 20 organic results. The differences in overall link profile are evident from the graph above, but the defensive niche strategy implemented by the smaller competitors allows them to compete effectively against players with far more resources.

At this point, having obtained a first general overview of the market, we add more detail to our analysis and focus on a smaller number of competitors, to look more closely at their competitive strategies.

Quick Domain Compare (QDC)

With this tool, LRT lets us perform a more specific analysis, comparing our client's website with four other competitors chosen according to the purpose of the analysis. Having already run the Bulk URL Analyzer report, we are now in a position to consciously decide which players to compare against more closely.

The results the QDC provides obviously depend on the websites you decide to compare your domain with. Simply comparing your website with the top four players in the sector could give you a limited perspective of the market. In our analysis, we compared our client's website both with the four competitors ranking just ahead of it and with those ranking just behind it.

Below are the first indications we can draw from this initial analysis:

[Image: QDC comparison with the top players]

Comparing our website with the top players makes evident what the initial analysis suggested: the competitive gap between the top three market leaders and the rest of the competitors is substantial. The lowest LRT Power*Trust among the top three is 16, while from the 4th competitor onward, the highest value is 6. In terms of the absolute number of links per domain, the lowest value among the top three is 114,822, while from the 4th onward, the highest value is 38,157.

While the top three players are playing a match of their own, we still don't know how close the immediate pursuers are. That's why we repeated the same QDC report with the four direct competitors that precede our client in the organic rankings.

[Image: QDC comparison with the competitors ranking just ahead of the client]

The situation here is the diametric opposite. From the 5th to the 9th Google result for the sector's main keyword, almost all the websites (including the client's) have the same LRT Power*Trust value. This does not mean that the five websites analyzed have the same Power and Trust ranks, but that the product of those two quantities is very similar.

As other previously analyzed case studies have shown, Penguin's algorithmic penalties have often hit websites whose Power values exceeded their Trust values, which usually means a large number of links from low-authority domains.

It's interesting to notice that jobnetwork.it is the only competitor showing a negative Link Velocity Trend (LVT), and at the same time the only one whose Power is greater than the Trust of its received links. This is possibly an early sign that the website has begun to change its link acquisition methodology, shifting its focus from quantity to quality.
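
LRT does not publish the exact formula behind the Link Velocity Trend, but a simple percent-change reading captures the idea well enough to interpret the -25% and +100% figures that appear below. A sketch under that assumption:

```python
# A simple month-over-month reading of link velocity: percent change in
# newly acquired links between two periods. Assumption: LRT's exact LVT
# formula is not public; this is only an illustration of the concept.
def link_velocity_trend(new_links_prev: int, new_links_curr: int) -> float:
    if new_links_prev == 0:
        return float("inf") if new_links_curr else 0.0
    return (new_links_curr - new_links_prev) / new_links_prev * 100.0

print(link_velocity_trend(200, 150))  # -25.0  -> acquisition slowing down
print(link_velocity_trend(50, 100))   # 100.0  -> aggressive acquisition
```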

Before proceeding with a more detailed analysis, we still have to compare the website of our client with the competitors that follow right behind him in the organic results of Google.

[Image: QDC comparison with the competitors ranking just behind the client]

Here too, it's interesting to notice that the LRT Power*Trust value is identical for three of the five websites analyzed. By relating the two websites with different values to their link acquisition velocity trends, we can draw another conclusion.

The one with the highest LRT Power*Trust has a decreasing Link Velocity Trend (-25%); the one with the lowest LRT Power*Trust is doing everything possible to gain new links quickly (+100%) and catch up with its competitors' LRT Power*Trust. At first sight, these two companies therefore seem to be converging toward the average value of the other players.

After these initial analyses with the Quick Domain Compare, we now have a clearer understanding of the market our client's website belongs to. In particular, we can deduce that:

  • the top three players enjoy a very large competitive gap in terms of link juice compared to all their pursuers;
  • from the fourth to the fifteenth organic result, the LRT Power*Trust values of the various websites are very similar;
  • within the first fifteen results, there are websites with far more incoming links than average; some of them are now slowing their pace of acquiring new links in favor of higher-quality ones.

Based on these initial evaluations, and knowing that our client's website ranked 9th on Google for the industry's main keyword, we realized there was room for maneuver to move up the rankings, positioning it just below the top three results.

To achieve this, we needed to carry out another, much more in-depth analysis to highlight the differences between our client's website and the top three competitors. Once identified, the next step would be to design a strategy to gradually reduce them.

Competitive Landscape Analyzer (CLA)

[Image: Competitive Landscape Analyzer setup]

Once you click the tool's name inside the LRT dashboard, a window appears immediately, asking for the URL of the website you want to analyze and the URLs of the websites you want to compare it with.

If we do not know the names of the main competitors, we can use the Bulk URL Analyzer's built-in "Find Competing Pages" feature we saw before.

It is important to point out that this tool can return results both at the domain level and, more specifically, at the level of individual pages. If the work covers the entire website, it is almost always advisable to select "Domains" from the last drop-down menu.

Once the ten websites for the competitive analysis have been identified, you must select which advanced metrics you want the tool to include in its report.

With the LRT Superhero account, you can choose up to 10 advanced metrics per analysis, and up to 15 with the Enterprise account. Depending on your account, simply repeat the same analysis two or three times, selecting different metrics each time, to get details for every metric you need.

Below we show some of the main metrics available in the Competitive Landscape Analyzer:

[Image: available CLA metrics]

After a few hours of analysis, the overall results came out. Below is the main chart:

[Chart: anchor text keyword classification by competitor]

Here's the first problem. The analysis shows a clear disproportion of inbound links with money keywords as anchor text: a full 26%, versus 7% for the top three websites. The same disparity appears, inverted, in the brand anchor text values: 54% for our client's website, against 88% for the top players.

Links with money keywords as anchor text are the most serious threat to the health of a website, and in the case studied, the clear disproportion compared to the average values of the best-positioned competitors should set off an alarm.
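
The anchor-text split shown in the chart is straightforward to reproduce on your own exported link list. A minimal sketch, assuming you maintain your own brand and money keyword lists (the terms below are placeholders, not the client's data):

```python
# Classify anchor texts as brand / money / other and compute the split.
# BRAND_TERMS and MONEY_KEYWORDS are placeholders: supply your own lists.
from collections import Counter

BRAND_TERMS = {"acme"}                                      # hypothetical brand
MONEY_KEYWORDS = {"annunci lavoro", "annunci immobiliari"}  # hypothetical

def classify_anchor(anchor: str) -> str:
    a = anchor.lower().strip()
    if any(b in a for b in BRAND_TERMS):
        return "brand"
    if any(k in a for k in MONEY_KEYWORDS):
        return "money"
    return "other"

def anchor_distribution(anchors: list[str]) -> dict[str, float]:
    counts = Counter(classify_anchor(a) for a in anchors)
    total = sum(counts.values()) or 1
    return {cls: round(n / total * 100, 1) for cls, n in counts.items()}

print(anchor_distribution(["acme", "annunci lavoro", "click here", "acme jobs"]))
# -> {'brand': 50.0, 'money': 25.0, 'other': 25.0}
```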

[Chart: homepage vs. deep link distribution]

Another problem emerges quite clearly from the second chart, which compares inbound links pointing to the website's homepage with inbound links pointing directly to internal pages.

As you can see, our client's website has an excessive share of links pointing to its homepage compared to the top three, while it is quite in line with the average of the top ten organic results. All of this is consistent with the website's current position on Google (8th), but it provides important guidance on how the company should approach its future link earning campaigns. A couple more charts, and then we will return to this point.

Now we ask ourselves: what kind of inbound links have our various competitors acquired?

[Chart: follow vs. NoFollow link status]

Here too, the average metrics of the first ten websites are aligned with those of our client's website, but the top three differ: they have a greater share of NoFollow links than their direct pursuers. The percentage difference is smaller than in the previous charts, but the proportional relationship between a website's organic rankings and its percentage of NoFollow links is still indicative in this market.

However, we should not conclude that simply increasing the share of NoFollow inbound links will boost organic rankings; rather, we take note of the current situation for the recommendations that follow.

Before proceeding any further, it is important to consider at least two other elements. The first is the type of inbound links of the various websites we analyzed, summarized effectively in the following chart.

[Chart: link type (text vs. image)]

As can be seen, 94% of our client's total inbound links are text links, and only 3% come from images. Normally such values would be out of proportion, but in the market in which the website operates, the differences from the top players are not large enough to justify fears of a penalty.

The only improvement worth highlighting here is the opportunity to strengthen the overall link profile by acquiring links in the form of images rather than text, perhaps through infographics or similar techniques. Since our client's website gets only 3% of its links from images, increasing that share would help make the profile look more natural to the search engines' algorithms.

The last important aspect that emerged from the Competitive Landscape Analyzer concerns the distribution of links in relation to the LRT Power*Trust metric which, as we have already explained, quantifies the value of a link. Simply put, the higher a link's LRT Power*Trust, the greater that link's weight in determining a website's organic ranking.

[Chart: link distribution by LRT Power*Trust]

The distribution in this case is uneven. Our client has many (probably too many) incoming links from websites with an LRT Power*Trust of zero, and very few links from authoritative websites with LRT Power*Trust > 5.

At the end of this extensive analysis with the Competitive Landscape Analyzer, we can now summarize the most important considerations about our client's link profile.

  1. An excessive share of inbound links with money keywords as anchor text (about 20 percentage points more than the top 3 players).
  2. A skewed distribution of inbound links across the website's pages: the share of links pointing to the homepage rather than to internal pages exceeds the top 3 players' benchmark by 14 percentage points.
  3. The links obtained are reasonably divided between NoFollow and DoFollow, but the result could be optimized by increasing the share of NoFollows by about 5%. The same reasoning applies to link types: increasing the share of image links would make the overall link profile more uniform.
  4. The inbound links have too low an LRT Power*Trust: 85% of them score 0, and none exceeds the value of 8.

Link Audit

The theory behind a successful Link Audit

All the considerations just formulated are crucial, as they let us understand how the website's link profile should be optimized to improve its performance in the organic search results.

We will now start the actual link audit work: for each inbound link, we must determine whether and why it should be removed. Thanks to the combined work with the Quick Domain Compare (QDC) and the Competitive Landscape Analyzer (CLA), we already know before starting which kinds of potentially harmful links to target. At the same time, we can act to smooth out the differences that emerged in the four points listed above.

4 steps to perform a Link Audit

  1. First, it's necessary to find the largest possible number of inbound links. To do so, we can use various tools such as Google Search Console (Google Webmaster Tools), Ahrefs, Searchmetrics, Alexa and many others. Each tool has its own peculiarities and is able to identify specific types of links. By combining the results of the individual analyses, you can reconstruct a nearly complete link profile, very close to the real one. LinkResearchTools can combine all this data via API integration and serves as the platform for making decisions on the aggregated, re-crawled results.
  2. Once identified, the links are processed and analyzed to determine their relative risk. This coefficient, as we have repeatedly seen above, is strongly influenced by the characteristics of the market in which the website operates. That is why we thoroughly analyzed the competitors before starting the link audit proper.
  3. Once the risk factor has been established, we must act on two fronts in parallel. On the one hand, webmasters must be asked to remove the negative links; on the other, a Disavow File must be created, asking Google to ignore the negative links that remain.
  4. Finally, speed up the process by which Google takes note of the removals and of the Disavow File.

The practice: how we did the Link Audit

After using all the tools at our disposal to download the list of our client's inbound links, we compiled a single list in a text file. It is normal for the same links to appear multiple times in this list, simply because they are detected by more than one tool. Before proceeding further, we import the raw list into any data processing program and remove the duplicates before assessing the risk profile of the identified links.

With Excel, for example, you simply go to the "Data" menu and select "Remove Duplicates."
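
The same de-duplication step can be sketched with pandas instead of Excel; the filenames below are hypothetical export files from the various tools.

```python
# Merge backlink exports from several tools and drop duplicate URLs,
# the pandas equivalent of Excel's "Data" -> "Remove Duplicates".
import pandas as pd

exports = ["gsc_links.csv", "ahrefs_links.csv", "other_links.csv"]  # hypothetical
frames = [pd.read_csv(path, names=["url"], header=None) for path in exports]
all_links = pd.concat(frames, ignore_index=True)
unique_links = all_links.drop_duplicates(subset="url")
unique_links.to_csv("unique_backlinks.csv", index=False)
print(f"{len(all_links)} rows in, {len(unique_links)} unique links out")
```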

[Image: removing duplicate links in Excel]

At this point, we have a list of unique domains and pages linking to our client's website. To determine the risk coefficient, we use the Link Detox tool.

Why Link Detox?

  1. Unlike any other tool, Link Detox weighs the threats posed by inbound links in real time, and this is crucial for us. Real time means that, when activated, it does not serve data stored in the past or held in a cache; instead, it starts a fresh scan of each link every time, to check whether the link is still active or has been removed. To get maximum effectiveness from a link audit, real-time analysis cannot be ignored.
  2. It can also report links that no longer exist or that webmasters removed in the past. As we will see, it is important to consider links that no longer exist in order to notify Google of the removals; otherwise, it would not find out about them quickly enough.
  3. It lets the user choose whether or not to treat NoFollow links as a potential threat in the analysis. There's an endless debate between those who argue that a NoFollow link can't constitute a danger and those who disagree; there's even research from 2014 on the risk of NoFollows. Christoph Cemper, the founder of LRT, has already expressed his opinion on the matter, but leaves users to decide how to treat NoFollows in their own analyses.

I will not dwell further on the NoFollow/DoFollow debate, but let me offer one consideration. If you believe NoFollow links pose no problem, you could simply ignore all of them in any link audit. However, if you were wrong, or if Google were to change its mind in the future, the remaining NoFollow links could pose a serious threat to the website you audited.

In the opposite case, suppose you treat NoFollow links as a problem, evaluating and weighting them just as you would any DoFollow link. If you were wrong and Google did not consider them, what would happen? Most likely, the website would face no problem: it would not lose rankings or be exposed to risks; the only disadvantage is that you would have done extra work evaluating all the NoFollow links without any real benefit.

[Image: the NoFollow evaluation option]

Consider also that with NoFollow evaluation enabled, the risk profile the tools calculate is almost always higher than without it. For this reason, if you can reduce the overall risk profile to the average market value with NoFollow evaluation active, the website will probably be safe from algorithmic penalties (regardless of whether Google actually assigns risk to NoFollow links).

That's why, in almost all cases, I recommend enabling NoFollow evaluation in a link audit. Are there exceptions? Yes, there are, but they would require a separate article! Now let's get back to our link audit.
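
The asymmetry argued above can be made concrete: score the profile twice, once counting NoFollow links and once ignoring them. If even the stricter score is at or below your market's average, you are covered either way. A toy illustration; the records and the averaging are our own, not DTOX's internal formula:

```python
# Score a link profile with and without NoFollow links counted.
# Per-link risk numbers are illustrative, not DTOX internals.
links = [
    {"url": "https://spam-dir.example/p1", "risk": 1800, "nofollow": True},
    {"url": "https://forum.example/t42",   "risk": 900,  "nofollow": True},
    {"url": "https://news.example/story",  "risk": 120,  "nofollow": False},
]

def profile_risk(links, count_nofollow: bool) -> float:
    scored = [l for l in links if count_nofollow or not l["nofollow"]]
    return sum(l["risk"] for l in scored) / len(scored) if scored else 0.0

strict  = profile_risk(links, count_nofollow=True)   # 940.0
lenient = profile_risk(links, count_nofollow=False)  # 120.0
print(strict, lenient)  # if the strict score is already safe, so is the lenient one
```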

After generating the complete list of the website's backlinks and filtering out duplicates with Excel or any other tool, we load it into the Link Detox (DTOX) analysis program using the upload functionality.

[Image: uploading the backlink list to Link Detox]

If a Disavow File was already created for the same website in the past, we can add it with the second button shown in the screenshot above and ask DTOX to treat those disavowed links as removed.

The tool then starts scanning all the uploaded links in real time, adding the links DTOX automatically finds from 24 different data sources and from any APIs you have connected (e.g. Ahrefs, SISTRIX). After some time, anywhere from 15 minutes to several hours depending on the number of backlinks loaded, the program returns the website's total risk value, with a chart to support it.

Here is the value obtained for our client's website.

[Image: DTOXRISK gauge showing very high risk]

As you can easily see, the higher the number and the more of the semicircle the red bar covers, the more the website is at risk of penalties. In our case, the red bar completely covers the semicircle, which means the penalization risk for our client is at the highest level.

Do you remember the considerations we made earlier regarding NoFollow links? Here is the calculation of the website's risk score without taking NoFollow links into account:

[Image: DTOXRISK gauge without NoFollow links]

As mentioned before, a website's overall risk tends to decrease when NoFollow inbound links are ignored. The risk here is indeed lower, but still at a dangerous level. It is therefore necessary to intervene and take action as soon as possible.

During our initial analysis, we assumed that spam was very common in the Italian ads market; now that we are digging more deeply into its dynamics, we can confirm it beyond doubt.

Does that mean that all the links pointing to the website in question are toxic? Of course not. For the website analyzed, DTOX recommended removing 50% of the links and carefully evaluating at least another 16% of the inbound links; the remaining 34% were classified as low risk.

[Chart: link risk classification breakdown]

Link Detox is an amazing tool that, thanks to its sophisticated algorithms, can identify link patterns representing potential threats to a website. In a link audit, however, no tool can completely replace the judgment of a professional SEO; every tool aims to give the SEO as much data and supporting analysis as possible, but a professional SEO must always make the final decision on whether to remove a given link.

At this point, we realized that the risk associated with the website was very high and that we had to act not only quickly, to eliminate the threats as soon as possible, but also decisively. Bringing the link profile back to an acceptable risk level (<700) is a long and delicate job: on the one hand, we should eliminate all the negative links; on the other, we must be aware that a website stripped of its links will lose almost all of its organic rankings.

The best solution is always a balance between the two: realigning our client's link profile with the market's top competitors while lowering the overall risk. For this purpose, human intervention is necessary, analyzing each link manually to decide whether or not to keep it.

DTOX helps here too, thanks to its "Link Detox Screener," which lets the user review every single inbound link of the website under analysis both quickly and thoroughly.

The Link Detox Screener not only provides a preview of the page where the link was posted but also several metrics that are decisive in judging whether to keep or remove the link (these fields are sketched as a record type after the list):

  • anchor text: the words from which the link originates
  • DTOXRISK: the link's overall risk measurement
  • sitewide links: how many links from that domain point to the analyzed website
  • rules triggered: which DTOX rules were triggered to classify the link as dangerous.
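
If you keep your own notes alongside the Screener, those fields map naturally onto a small record type. A sketch with illustrative values; the field names are our own, not an LRT export format:

```python
# A small record type mirroring the per-link fields the Screener shows,
# plus a slot for the reviewer's manual verdict. Field names are our own.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AuditedLink:
    url: str
    anchor_text: str                 # the words the link originates from
    dtox_risk: int                   # the link's overall DTOXRISK score
    sitewide_links: int              # links from that domain to our site
    rules_triggered: list = field(default_factory=list)  # e.g. ["TOX3"]
    keep: Optional[bool] = None      # manual verdict, filled in on review

link = AuditedLink("https://low-dir.example/page", "annunci lavoro",
                   2225, 340, ["TOX3", "SUSP4", "SUSP1"])
link.keep = False  # confirmed artificial after manual review
```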

[Image: Link Detox Screener link details]

In the specific case of this link, DTOX tells us that the risk factor is high (2225) because the TOX3, SUSP4 and SUSP1 rules were triggered. Let's briefly see what this means:

  • SUSP4: "The SUSP4 means that the homepage of the website the link comes from does not rank for the title of the page, which usually indicates that the page or domain has been penalized". In this case, the homepage title of the linking website is quite specific, and the site's DA is low. All this suggests that the linking website has been subject to Google penalties. Let's check with Searchmetrics:

[Chart: Searchmetrics visibility of the linking domain]

The graph is quite clear! DTOX was right: the linking domain most likely suffered a significant drop in organic visibility over the previous year.

  • SUSP1: "The SUSP1 rule means that the link is coming from a page on a very weak domain that has no external links. This is often the case for links from forums or when the link is coming from some special automated spamming activity or listing in a link directory." This rule essentially tells us that the linking website could be a directory, or at least a website with very low DA. In this case too, DTOX is correct: the linking website is indeed a directory.
  • TOX3: "The TOX3 rule means that the Link Detox Genesis algorithm classified this link as very unnatural [...]. If you agree with our opinion and the link risk estimation expressed in our calculated DTOXRISK score, you should disavow these links as quickly as possible, and then try to get them removed". This is DTOX's overall evaluation: at the end of all its mathematical analysis, it judged the link in question to be highly unnatural. Thanks to the DTOX Screener, we manually confirmed that the link was artificial (coming from a very low-profile directory), supporting DTOX's verdict.

At this point, we can confirm that the link in question is dangerous and must be included in the list of links to be removed. The same process must be repeated for every incoming link to our client's website, to form an overall judgment on each one.

It's a long process that often takes several days of work. It's necessary to evaluate not only the intrinsic characteristics of the individual links but also to take into account all the market and competitor assessments developed in the first sections of this analysis, mainly thanks to the Competitive Landscape Analyzer (CLA).

When evaluating thousands upon thousands of inbound links, the judgment will not always be as simple and intuitive as in the case just analyzed. In many cases, it is necessary to weigh how removing a link would affect the website's overall risk profile and whether it might also hurt total organic traffic.

We have mentioned earlier that the safest option would be to remove a large number of links from the website of our client, but the inevitable consequence would be a considerable loss in ranking. In a competitive industry, a website with no links can hardly withstand the competition.

In our case, thanks to the CLA, we knew that during the audit we should mainly remove (see the filtering sketch after this list):

  • links with money keywords as anchor text (reducing their overall share by around 15-20% of total links);
  • dangerous links pointing to the website's homepage (reducing their share by at least 10% of the total);
  • incoming links with very low LRT Power*Trust (less than 2).
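
The three criteria above translate into a simple first-pass filter over the audited link records. A sketch: the records are plain dicts following the AuditedLink idea from earlier, and the DTOXRISK cutoff of 500 for homepage links is our own illustrative threshold, not a rule from the study.

```python
# First-pass removal shortlist from the three CLA-driven criteria above.
# The dtox_risk > 500 cutoff is an illustrative assumption.
def removal_candidates(links):
    shortlist = []
    for l in links:
        if l["anchor_class"] == "money":
            shortlist.append((l["url"], "money-keyword anchor text"))
        elif l["target"] == "homepage" and l["dtox_risk"] > 500:
            shortlist.append((l["url"], "risky link to the homepage"))
        elif l["power_trust"] < 2:
            shortlist.append((l["url"], "LRT Power*Trust below 2"))
    return shortlist

sample = [{"url": "https://dir.example/a", "anchor_class": "money",
           "target": "homepage", "dtox_risk": 900, "power_trust": 0}]
print(removal_candidates(sample))
# [('https://dir.example/a', 'money-keyword anchor text')]
```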

Consequently, we opted to keep, and NOT remove, some links that would be judged risky in isolation but did not constitute a serious threat within the overall profile. This was the case, for example, with links to internal pages that had high LRT Power*Trust and branded anchor text. In such cases, even though the linking website might be a directory, we decided to keep the link.

This is why it is always advisable to perform a manual review of every single link before deciding whether to keep or remove it. The information DTOX provides is critical for an in-depth analysis, but the decision, as well as the resulting responsibility, remains with the SEO.

At the end of the manual link analysis, we had a list of 746 links whose removal we would request. Before proceeding further, we verified the website's theoretical risk coefficient after removal of the identified threats. Here is the result, with NoFollow evaluation active:

[Image: DTOXRISK gauge after the planned removals]

The result is satisfactory. Despite not having removed all potentially dangerous links, the overall risk coefficient was lowered significantly and returned to a safer area.

At this point, we can conclude that the analysis of which links to remove was accomplished successfully, but the job was not finished yet.

Disavowing the bad links

We had identified the links to be removed; now we had to remove them! Many of you probably already know that you can create a so-called "Disavow" file, with which you can ask Google not to consider (for better or worse) certain inbound links to a particular website.

However, before making this request, it is good practice to manually contact the linking domains and ask them to remove the risky links. Google itself considers this approach a demonstration of commitment and determination to solve the problem. So, before compiling the Disavow File, we contacted all the websites for which we could find telephone or email contact details.

About two weeks after sending the last removal request email, roughly 110 of the 746 links had been removed. Many of the websites we contacted did not respond to our emails; others had no contact details to write to. In the end, about 15% of all identified threats were removed. That is clearly not enough to eradicate the risk, so we then created the infamous disavow file with the remaining 630 links.

Important: with the disavow file, you can ask Google to ignore not only a link from a particular web page but, alternatively, every link from a particular domain. In our case, the 630 links were grouped into a much smaller number of domains. Where we considered it risky not only to have received links from a given domain in the past but also to receive links from it in the future, we requested the exclusion of the entire domain.
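
Google's disavow file format is plain text: one entry per line, either a full URL or "domain:example.com" to cover an entire domain, with "#" for comments. A minimal sketch of assembling it from the leftover links; the URLs and domains below are placeholders, not the client's data.

```python
# Build a disavow file: full URLs for single links, "domain:" entries for
# domains we never want links from again; "#" lines are comments.
from urllib.parse import urlparse

def build_disavow(bad_urls: list[str], whole_domains: set[str]) -> str:
    lines = ["# Disavow file generated after manual link audit"]
    lines += [f"domain:{d}" for d in sorted(whole_domains)]
    lines += sorted(u for u in bad_urls
                    if urlparse(u).netloc not in whole_domains)
    return "\n".join(lines) + "\n"

bad_urls = ["https://spam-dir.example/links/p1.html",   # placeholders
            "https://old-forum.example/thread/99"]
whole_domains = {"spam-dir.example"}                     # block entirely
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write(build_disavow(bad_urls, whole_domains))
```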

At this point, it was time to upload the disavow file to Google Search Console (Webmaster Tools). Here is a screenshot of the upload:

[Image: uploading the disavow file]

Once you click "Finish," the request is sent to Google. But is our work finished?

Last but not least: speed up the disavow process

One last step is required. To understand why, we need to take a step back and look at the logic of how the Disavow works.

Uploading the file with the list of links to ignore to Google Search Console does not mean that Google puts it into practice immediately. It is a request the website owner makes to Google, which then queues it among its processes and handles it at its own pace.

Without going into too much detail, it is important to understand that the link audit becomes effective only when Google completes a full crawl of the analyzed website's inbound links. It is only through a fresh crawl that the search engine can see which links the various webmasters have removed, which are still present, and which are to be ignored because they are listed in the Disavow File.

If our work ended with the upload of the Disavow File to Search Console, we would have to wait a long time for Google to fully re-crawl our website's links.

Crawl frequency depends on many factors, including the authority of the domain the links sit on and its normal refresh rate. When (as often happens) the links to remove or ignore sit on low-DA, stale domains, it can take several weeks before Google revisits the page the link in question originates from.

It is therefore necessary to find a way to speed this process up, rather than waiting weeks or even months for Google to register all the threats to our client's overall link profile that we identified in the initial audit.

Fortunately, LRT has developed a tool for exactly this purpose: Link Detox Boost. With it, we can not only speed up the crawling of the pages or domains that interest us, but also see exactly the day and time Google came by to crawl them.

Important: Link Detox Boost must be used extensively, on all inbound links to the website being analyzed. It is not enough to boost only the links present in the disavow file; we must make sure Google sees at a glance which links have been removed, and we get that result only by letting the search engine re-crawl every incoming link.

After running Link Detox Boost, we see a screen like the following, in which the program reports whether the Google crawl was confirmed.

[Image: Link Detox Boost crawl confirmation]

With this process complete, our work was done; we just had to wait for confirmation of our assessments, monitoring the Google Analytics data and any software that measures a website's organic visibility in the search engine results.

Results and final considerations

In addition to Google Analytics, we used Sistrix.com as third-party software to monitor the performance of our audit work on the customer's website.

Sistrix produces a two-dimensional graph, with dates on the horizontal axis and a visibility index on the vertical axis. The more the plotted line trends upward, the more the website is growing; conversely, a falling line signals a negative trend.

As mentioned during our initial analysis, we started working on this website because the client strongly feared being penalized by Google, after two of his most direct competitors, who used more or less the same link earning strategies, suffered severe organic traffic losses.

Our goal was to make a full audit of the customer's link profile to avoid the risk of penalties in the future and to limit any loss of ranking due to the removal of inbound links.

How did it turn out in the end?

[Chart: SISTRIX visibility after the audit]

Before our intervention, the website had a normalized overall visibility of 0.183 and was at its historical peak. About two weeks after uploading the disavow file and running Link Detox Boost, visibility had already reached 0.438, a net gain of about 140%.

The link audit did its job. Performed before any penalty arrived, it not only dramatically reduced the risk of future penalties for links acquired in the past, but also cost the client no traffic and even produced immediate, unexpected growth.

We knew from the start that the website's market sector was one of the most aggressive in terms of spam and very low-profile links, as the graph above shows. Having almost completely cleaned the website of poor-quality links gave our client a competitive advantage over competitors who probably had not yet undertaken a full audit of their own link profiles.

In addition to the visibility measured by Sistrix, the website's rankings improved for a large number of keywords, and new organic visitors measured in Analytics almost doubled in less than three months.

[Chart: organic traffic growth in Google Analytics]

Larger websites, long at the top of the rankings, were overtaken for some fundamental keywords by a competitor that only a few months earlier had been far behind.

A periodic, in-depth link audit is today of fundamental importance for any website, especially after the arrival of Penguin 4.0 (or Real Time Penguin). The adage "better safe than sorry" applies perfectly to Link Risk Management: waiting to receive a penalty (algorithmic or manual) before commissioning a tailored audit is a mistake that can dramatically affect a business.

A preventive audit makes it possible, on the one hand, to all but eliminate the chance of receiving a penalty in the near term and, on the other, as this case study shows, to unlock a website's full potential, letting it achieve the highest organic rankings possible.

Even if a website hasn't been penalized, it is normal for its inbound link profile to be polluted by poor-quality links, often generated by users in forums or, worse, by spiders and automated programs residing on domains without any authority or relevance. A periodic audit lets us detect and remove these links, clearing away any drag on the growth of the website's rankings. This is why our audit managed not only to eliminate the penalty risk but also to double the organic visibility of our client's website in two to three weeks.

What happened next?

As previously mentioned, within just three months of the end of our link audit, the website had not only achieved better rankings for many heterogeneous keywords, including the most competitive ones in the market, but also tripled its overall organic visits.

Even six months after the end of the audit, organic traffic continued to improve without issues. The client was fully satisfied with the work, and we were delighted to have made a significant contribution to the business prospects of the company we worked with.

However, if you looked carefully at the Sistrix graph above, you may have already noticed a problem. Ten months after our first intervention, the website began losing some of the rankings it had gained. At first it seemed a normal fluctuation, but it was not.

The client called us nearly eleven months later to tell us that, for reasons he couldn't understand, organic traffic was declining. We analyzed the GSC and GA data and confirmed that there was indeed a problem. At top speed, we conducted a brand-new link audit from scratch, aimed at assessing the links the website had acquired since our previous audit (i.e., over the previous 11 months).

At the end of the audit, here is how the website's overall risk profile looked:

[Image: DTOXRISK gauge at the second audit]

In less than a year since the audit, the website had naturally acquired links from very low-quality websites, which could compromise the overall risk profile again.

The final value came to 1,113: lower than the previous audit's score of 2,124, but still not low enough to rule out any potential danger. In my experience, any website with a Link Detox score above 800 is seriously at risk.

At that point, we repeated the whole process described earlier. For simplicity, let's summarize it in three steps.

  1. Contact every webmaster whose details we could find and request the removal of their links
  2. Create a new disavow file for all the links we could not get removed
  3. Use Link Detox Boost to speed up Google's crawl of all links (not just those in the Disavow File)

The last operation was completed toward the end of May 2016, when the visibility reported by SISTRIX was steady at 0.33. Three weeks later, visibility was growing rapidly again, along with overall organic visits. The following month, the site reached a visibility peak it had never achieved before (0.52), along with record absolute organic visits. Once again, we can say: problem solved... at least for now!

[Chart: visibility recovery after the second audit]

Conclusion

A link audit is a complex operation that can be performed both before and after a website is penalized, but it is important to highlight that it is a weapon capable not only of eliminating potential risks but also of producing rapid organic and structural growth for a website.

A clean link profile with relevant inbound links is a source of strong competitive advantage, and that advantage grows with the amount of spam in the market a website operates in. Today, many resources go into acquiring links, but far fewer are dedicated to analyzing the problems that links already acquired, perhaps unknowingly, can create.

Until 2015, we advised our clients to perform a link audit at least twice a year to head off problems, but in the case described here, waiting six or eight months would have been excessive. This is why there is no single, reliable answer for every website to the question every client asks: "How often do I need a link audit to avoid penalty risks?"

With the advent of real-time Penguin, penalties arrive faster than ever, but with the right tools they can be removed just as quickly. In the most competitive markets, we now run a link audit at least every fifteen days, while in fairly static ones, every three months. There are no hard rules set in advance, because each site has its own peculiar dynamics.

In conclusion, the advice is to pay attention to your link profile and assess the risks it carries adequately. The assessment must be made periodically and consistently in order to prevent a penalty. If a penalty has already arrived, don't just send Google the Disavow File; use the right tools (i.e. Link Detox Boost) to trigger a fresh crawl of all your website's links. Only in this way will you get effective results much more quickly, as we have repeatedly highlighted in this case study.

 

Stefano created this research using our Superhero Plan, which allows you to perform professional SEO and backlink analysis for your own and your competitors' sites.

[Photo: Stefano Robbi]

A word from Christoph C. Cemper

Stefano Robbi is one of our newest LRT Certified Professionals and we are proud to welcome him to the club. 

With this case-study, Stefano delivered very useful information about today's market and I think every SEO can learn from his findings. He proved that, with the Real Time Penguin, it has become more important than ever before to do proper Link Risk Management to protect yourself against penalties. He also showed that understanding your market is one of the key components to be successful in SEO and to outperform your competitors.

Our goal is to provide our clients with quality service and knowledge. Our LRT Certified Professionals and Xperts are key to achieving this goal.

I look forward to his future work and recommend working with Stefano Robbi whenever you get the opportunity!

Stefano Robbi
With a passion for outstanding search marketing that incorporates results-driven quantitative data analysis, Stefano founded NetStrategy® - a leading Italian SEO agency - back in 2009. In addition to his wide range of search marketing skills, Stefano possesses specific and valuable strategic marketing knowledge gained through an M.Sc. in Marketing Management at Bocconi University and during previous work experience with Microsoft Italy.

