Prior to the recent arrival of Penguin 4.0, it had been nearly two years since Penguin was last updated. It was expected to roll out at the end of 2015, which then became early 2016. By the summer, some in the industry had given up on Google ever releasing Penguin 4.0. But why did it take so long?
I’d argue that criticism directed at Google is in many cases unjustified, as people often take too simplistic a view of the task at hand for the search engine.
Detecting and dealing with paid links is a lot harder than many people think, and there are likely good reasons why Google took longer than hoped to release the next iteration of Penguin.
Here are some of the challenges Google may have faced in pushing out the most recent Penguin update:
1. It has to be effective at detecting paid links
To run and deploy an effective Penguin update, Google has to be able to determine, algorithmically and at scale, which links violate its guidelines. It’s not clear to what extent Google is capable of this; there are plenty of case studies showing that links which violate the guidelines continue to work.
However, not all paid links are created equal.
Some paid links are obviously paid for. For instance, they may have certain types of markup around them, or they may be featured within an article clearly denoted as an advertorial.
On the other hand, some links may have no telltale signs on the page that they are paid for, so determining whether or not they are paid links comes through observing patterns.
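To make “observing patterns” concrete, here is a toy sketch, purely hypothetical, of how a classifier might aggregate a few pattern-based signals across a link profile. None of these signals, weights or thresholds are Google’s; its actual methods are not public.

```python
# Toy sketch only: a hypothetical illustration of pattern-based
# paid-link scoring. None of these signals or weights are Google's.
from dataclasses import dataclass

@dataclass
class Link:
    source_domain: str
    anchor_text: str
    in_sponsored_section: bool  # link sits inside a marked advertorial block

def paid_link_score(link: Link, anchor_counts: dict[str, int]) -> float:
    """Combine a few hypothetical signals into a 0..1 suspicion score."""
    score = 0.0
    if link.in_sponsored_section:
        score += 0.5  # explicit on-page markers are the easy case
    # Many unrelated domains repeating the same exact-match anchor text
    # is a classic footprint of orchestrated link building.
    if anchor_counts.get(link.anchor_text, 0) > 20:
        score += 0.3
    if link.anchor_text.lower() in {"buy cheap widgets", "best payday loans"}:
        score += 0.2  # overtly commercial exact-match anchors
    return min(score, 1.0)
```

The easy cases score highly on the first signal alone; sophisticated paid links, with varied anchors and no on-page markers, score low on all of them, which is exactly the problem.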
The reality is that advanced paid linking strategies will be challenging for Google to either devalue or penalize.
Penguin has historically targeted very low-quality web spam, as it is the easiest to identify and classify, but link schemes a level above this remain an opportunity for manipulators. Google has to have confidence in its detection capability before applying a filter, given the severity of the outcome for affected sites.
2. Google is still dependent on links for the best quality search results
Maybe, just maybe, Google is actually capable of detecting paid links but chooses not to devalue all of them.
Most people will be familiar with third-party tools that perform link analyses to assess which links are “toxic” and will potentially be harming search performance. Users know that sometimes these tools get it wrong, but generally they’re pretty good.
I think it is fair to assume that Google has a lot more resources available to do this, so in theory they should be better than third-party tools at detecting paid links.
Google has experimented with removing links from its index, with negative consequences for the quality of search results. It would be interesting to see the quality of results when it varies Penguin’s spammy-link threshold.
It’s possible that even though certain links are not compliant with webmaster guidelines, they still assist Google in its number one goal of returning users the best-quality search results. For the time being, such links might still be of use to Google.
3. Negative SEO remains a reality
Even if Google is sure that a link has been orchestrated, it is very difficult for the search engine to be sure whether it was placed by the webmaster or by someone else executing a negative SEO campaign.
If a penalty or visibility drop could be incurred from just a handful of paid links, then in theory it would be pretty straightforward to perform negative SEO on competitors. The barriers to doing this are quite low, and the footprint is minimal.
Google has tried to negate this problem with the introduction of the disavow tool, but it is not realistic to think all webmasters will know of this, let alone use the tool correctly. This is a challenge for Google in tackling paid links.
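For reference, a disavow file is just a plain-text list uploaded through Search Console: one URL or domain per line, with a “domain:” prefix disavowing an entire site and lines starting with “#” treated as comments. The domains below are placeholders:

```
# Paid links we requested to have removed, without success
http://spam.example.com/paid-links-page.html

# Disavow every link from an entire domain
domain:link-network.example.net
```

A badly constructed file can disavow links a site actually benefits from, which is part of why Google cannot rely on webmasters using the tool correctly.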
4. It provides a PR backlash and unwanted attention
When rolling out large algorithm updates, it’s inevitable that there will be false positives or severe punishments for small offenses. After any rollout, there will be a number of “adjustments” as Google measures the impact of the update and attempts to tweak it.
Even so, a large number of businesses will suffer as a result of these updates. Those who regularly join Google Webmaster Hangouts will be used to hearing business owners, almost in tears, discussing the devastating impact of a recent update and pleading for more information.
While the vast majority of Google users will most likely never be aware of or care about the fallout of algorithm updates, these situations do provide Google with some degree of negative PR. Any noise that points toward Google yielding too much power is unwanted attention.
On a related note, sometimes penalties are just not viable for Google. When someone walks down Main Street, they expect to see certain retailers, and it’s exactly the same with search results: users going to Google expect to see the top brands. The user doesn’t really care whether a brand is missing because of a penalty; they will see it as a reflection on the quality of Google rather than on the brand’s non-compliance with guidelines.
To be clear, that’s not to say that Google never penalizes big brands — JCPenney, Sprint, the BBC and plenty of other large brands have all received high-profile manual penalties in the past. But Google does have to consider the impact on the user experience when choosing how to weight different types of links. If users don’t see the websites they expect in search results, they may switch to another search engine.
This is how Google deals with the problem
The above four points highlight some of the challenges Google faces. Few things are more important to Google than its objective of returning the most useful results to users, so it has a massive interest in dealing with paid links.
Here are some ways Google could address the challenges it faces:
1. Prefer to devalue links and issue fewer penalties
Penalties act as a deterrent for violating guidelines, and they serve to improve the quality of search results by demoting results that were artificially boosted. A lot of the risk of “getting it wrong” can simply be mitigated through devaluing links algorithmically, rather than imposing manual penalties.
In the case of a negative SEO attack, the spammy links could simply not be counted, rather than triggering a penalty for the website. In theory, this is the purpose of a disavow file. Penalties could be saved for only the most egregious offenders.
The fact that Penguin now runs in real time as part of the core ranking algorithm suggests this is the direction Google is heading: favoring the algorithmic devaluation of spammy links (from which websites can now recover more quickly), with manual penalties applied only for serious offenses.
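As a purely illustrative sketch of the difference (the weights and multiplier below are invented, not Google’s): under devaluation, a spammy link contributes nothing to a page’s score, while under a penalty it actively subtracts.

```python
# Hypothetical sketch contrasting devaluation with penalization.
# The weights and the -2.0 multiplier are invented for illustration.

def link_value(weight: float, is_spammy: bool, mode: str = "devalue") -> float:
    if not is_spammy:
        return weight
    if mode == "devalue":
        return 0.0  # the spammy link is simply ignored
    return -2.0 * weight  # "penalize": the spammy link actively hurts

links = [(1.0, False), (0.8, True), (0.5, True)]
print(sum(link_value(w, s, "devalue") for w, s in links))   # 1.0
print(sum(link_value(w, s, "penalize") for w, s in links))  # -1.6
```

Under the devaluation model, a negative SEO attack adds links that are simply ignored, so the attack accomplishes nothing.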
2. Do a slow rollout combined with other updates
Slowly rolling out the Penguin 4.0 update gives Google two advantages. First, it softens the blow of the update: there is no single week in which high-profile brands suddenly drop in visibility, drawing attention to the update.
Second, it allows Google to test the impact of the update and adjust over time. If the update is too harsh, they can adjust the parameters. Penguin 4.0 may take several weeks to roll out.
To add to the confusion and make it more difficult to understand the impact of Penguin 4.0, it is probable Google will roll out some other updates at the same time.
If you cast your mind back two years to the introduction of Panda 4.1 and Penguin 3.0, the two were rolled out almost in conjunction, which made it more difficult to understand their respective impacts.
There was a lot of SERP fluctuation this September. It is possible part of this fluctuation can be attributed to Penguin 4.0 testing, but there is no certainty because of the number of other updates occurring at the same time (such as the local update dubbed “Possum”).
3. Encourage a culture of fear
Even if the risk of receiving a penalty is the same now as it was five years ago, the anxiety and fear of receiving one is much greater among brands. High-profile penalties have not only served their function of punishing the offending brand, but they also have provided a great deterrent to anyone else considering such a strategy.
The industry’s shift toward content marketing, and the fact that SEO has become less of a black box, have contributed to this, but the culture of fear has been a large driver of the reduction in paid link activity.
Final thoughts
Google is often criticized for not doing more to tackle paid links, but I think that criticism is unfair. Once you consider the challenges a search engine faces in doing so, it is easier to be forgiving.
Now that Google has incorporated Penguin into the core algorithm, webmasters may find it easier to recover from ranking issues caused by spammy or paid links, as they no longer have to wait for “the next update” (sometimes years away) for an algorithmic devaluation to be lifted.
However, the fact that Penguin now operates in real time will make it more difficult for webmasters to know whether a loss in rankings is due to spammy links or to something else, so webmasters will need to be vigilant about monitoring the health of their backlink profiles.
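As a minimal sketch of that kind of vigilance, assuming you periodically export referring domains from Search Console or a third-party link tool into plain-text files (the file names and one-domain-per-line format here are assumptions):

```python
# Minimal sketch: flag referring domains present in this week's export
# but absent from last week's. File names and format are assumptions;
# adapt to whatever your link tool actually exports.

def load_domains(path: str) -> set[str]:
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

previous = load_domains("backlinks_last_week.txt")
current = load_domains("backlinks_this_week.txt")

for domain in sorted(current - previous):
    print(f"New referring domain, review for spam: {domain}")
```

A sudden burst of new referring domains you did not build is the kind of signal worth investigating before it shows up in your rankings.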
I suspect that Google will continue to make tweaks and adjustments to Penguin after the rollout is complete, and I expect to see a continued shift from penalties to devaluing links over time.