In the game of Internet search, where updates to Google's algorithms are studied like tea leaves, every action has a consequence, and it's not always the one intended. For the first time in nearly two years, Google has announced an update to its Penguin algorithm. The experts who most closely monitor search engine trends have dubbed it Penguin 4.0, and it may have a significant impact on search. But how we got here in the first place is a story of unintended consequences. Here's the background, and what digital marketers need to know about the new update.

As we have written many times, including on the Google "Hummingbird" release and "Mobilegeddon," search engine optimization is a never-ending game of cat and mouse. The mouse (anyone with a website they want noticed) is always looking for shortcuts that push their site higher in Google search results. The cat (the search engine team at Google) wants to eliminate any shortcuts, work-arounds, or outright cheating. That game has gone on for years, because doing it the way Google wants, which means having fresh, original, regularly updated content that other sites link to, is hard. In fairness, Google has done a great job of responding to the evolving nature of Internet usage and protecting the integrity of its search engine, and it has made it much more difficult to game the system.

So here's where Penguin 4.0 comes in. One of the ways Google assesses where a particular webpage should rank for a search query is by looking at the inbound links pointing to that page. The theory goes that if The New York Times links to a page, that page must have the kind of credibility and value that warrants a high ranking. Links from lesser websites count too, just not as much as a link from The New York Times. However, by making inbound links an important ranking component, Google gave the mice an incentive to manufacture them: site owners arranged for lots of websites to link back to key pages, sometimes by buying links or engaging networks of link builders. That was the first unintended consequence.
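The link-counting idea described above is the intuition behind PageRank. Here is a deliberately simplified sketch of that iteration; Google's actual ranking signals are far more complex and proprietary, and the site names are placeholders:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: pages with more (and better-ranked) inbound
    links score higher. `links` maps each page to the pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page passes its rank, evenly, to the pages it links to.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank across all pages.
                for t in pages:
                    new_rank[t] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

# A page linked to by an authoritative hub outranks one that isn't.
links = {
    "nytimes": ["blog_a"],
    "blog_a": [],
    "blog_b": [],
}
scores = pagerank(links)
```

In this toy graph, "blog_a" ends up with a higher score than "blog_b" purely because of its inbound link, which is exactly the incentive that led to link buying.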

In response, Google launched the first Penguin update in April 2012 to better catch sites deemed to be spamming its search results, in particular those buying links or obtaining them through link networks designed primarily to boost Google rankings. At first, the Penguin algorithm simply identified suspicious links as it crawled the web and ignored them when ranking results. But sometime in mid-2012, Google began punishing websites with bad links, not just discounting the links but actively driving offenders' rankings down. That set off a mad scramble as sites needed to somehow get "unlinked" from the bad sites, which led to unintended consequence number two: webmasters now had to worry not only about being punished for their own bad links, but also about rivals deliberately pointing bad links at their sites to undermine their search results. Ugh!

So, in October 2012, Google tried to fix the problem it had created by offering a "Disavow Links" tool, which essentially tells Google's crawlers that when they find a bad inbound link, the website in question has "disavowed" it and should no longer be punished for it. Here's how Search Engine Land described the tool at the time: "Google's link disavowal tool allows publishers to tell Google that they don't want certain links from external sites to be considered as part of Google's system of counting links to rank web sites. Some sites want to do this because they've purchased links, a violation of Google's policies, and may suffer a penalty if they can't get the links removed. Other sites may want to remove links gained from participating in bad link networks or for other reasons."
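The disavow file itself is just a plain-text list uploaded through Google's Search Console. Each non-comment line is either a single URL to disavow or a `domain:` directive that disavows every link from that domain. The domains below are placeholders for illustration:

```text
# Paid links we could not get removed despite outreach
domain:spammydirectory.example

# A single bad page rather than the whole site
http://linkfarm.example/widgets/page1.html
```

Lines starting with `#` are comments; the `domain:` form is usually safer for link networks, since they tend to link from many pages at once.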

And that created yet another unintended consequence: the Penguin algorithm wasn't refreshed on a regular basis. So for websites trying to clean up their links, as Search Engine Land put it, "Those sites would remain penalized even if they improved and changed until the next time the filter ran, which could take months. The last Penguin update, Penguin 3.0, happened on October 17, 2014. Any sites hit by it have waited nearly two years for the chance to be free."

Penguin 4.0 addresses that by integrating the Penguin "filter" into the regular crawl that assesses websites on an ongoing basis. Waiting up to two years for a refresh is now a thing of the past; suspect pages will be flagged, or freed once they are clean, on a rolling basis.

What does this mean for websites? It's what we've been writing for half a dozen years now. Good SEO doesn't just happen, and it can't be manipulated. It takes hard work, an effective strategy, and a long-term view to create the kind of content and links that elevate your brand for your customers, prospects, and the Google search engine. For more tips on navigating Penguin, download our eBook now.
