Each year, Google makes hundreds of changes to search. In 2018 alone, it reported an incredible 3,234 updates, an average of almost 9 per day and more than 8 times the number of updates in 2009. Not all of them, however, have an equally strong impact on the SERPs. To help you make sense of Google's major algorithm changes and ranking factors, I've put together a cheat sheet of the most important updates and penalties rolled out in recent years, along with a list of hazards and prevention tips for each.
Let's start with the oldest one: the Panda update.
Launched: February 24, 2011
Goal: De-rank sites with low-quality content
Panda is a search filter meant to stop sites with poor-quality content from working their way into Google's top search results. (Poor-quality content is most commonly content created solely for SEO purposes, featuring things like keyword stuffing and scraped or duplicated content.)
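If you want a quick, do-it-yourself check for the most obvious Panda trigger, keyword stuffing, here is a minimal Python sketch that flags pages where a single keyword makes up an outsized share of the text. The 5% threshold and the simple tokenization are my own illustrative assumptions; Google has never published a "safe" keyword density.

```python
import re

def keyword_density(text, keyword):
    """Return the share of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

# Illustrative threshold only -- Google has never published a "safe" density figure.
STUFFING_THRESHOLD = 0.05  # flag pages where one keyword exceeds ~5% of all words

def audit_page(text, target_keyword):
    density = keyword_density(text, target_keyword)
    verdict = "review for keyword stuffing" if density > STUFFING_THRESHOLD else "looks reasonable"
    print(f"'{target_keyword}': {density:.1%} of words -- {verdict}")

audit_page("Buy cheap shoes. Cheap shoes here. Best cheap shoes online.", "cheap")
```

Run it over your own page copy and target keywords; anything that trips the flag is worth a manual read to see whether the text still serves the visitor rather than the crawler.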
Launched: April 24, 2012
Goal: De-rank sites with spammy, manipulative link profiles
Penguin is a search filter designed to catch sites deemed to be spamming Google's search results, in particular those doing so by buying links or obtaining them through link networks built primarily to boost Google rankings. When a new Penguin update is released, sites that have taken action to remove bad links may regain rankings.
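One practical way to spot Penguin-style risk is to look at the anchor-text distribution of your backlinks: a profile dominated by a single exact-match commercial anchor is the classic pattern the filter targets. Here is a rough sketch that summarizes anchors from a CSV export; the file name and the anchor_text column are assumptions, so adapt them to whatever your backlink tool actually exports.

```python
import csv
from collections import Counter

def anchor_distribution(path, top_n=10):
    """Count how often each anchor text appears in a backlink export."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["anchor_text"].strip().lower()] += 1
    return counts.most_common(top_n)

# A profile dominated by one exact-match commercial anchor is exactly
# the kind of pattern Penguin was built to catch.
for anchor, n in anchor_distribution("backlinks.csv"):  # hypothetical export file
    print(f"{n:5d}  {anchor}")
```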
Launched: August 2012
Goal: De-rank sites with copyright infringement reports
The Pirate update is a search filter designed to prevent sites with a large number of copyright infringement reports (filed through Google's DMCA system) from ranking well in Google's listings.
Launched: August 22, 2013
Goal: Produce more relevant search results by better understanding the meaning behind queries
Hummingbird helps Google better interpret search queries and provide results that match searcher intent (as opposed to the individual terms within the query). While keywords continue to be important, Hummingbird makes it possible for a page to rank for a query even if it doesn’t contain the exact words the searcher entered. This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms and synonyms.
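To make the idea concrete, here is a toy Python example of intent-style matching: the query is expanded with synonyms and compared to a page by cosine similarity, so the page can score even though it contains none of the exact query words. The tiny synonym map is invented for illustration and has nothing to do with Google's actual implementation.

```python
from collections import Counter
import math

# Toy synonym map -- an assumption for illustration only.
SYNONYMS = {"fix": {"repair", "mend"}, "bike": {"bicycle"}, "cheap": {"affordable", "budget"}}

def expand(tokens):
    """Count each token plus its synonyms, approximating 'meaning' rather than exact words."""
    counts = Counter()
    for t in tokens:
        counts[t] += 1
        for syn in SYNONYMS.get(t, ()):
            counts[syn] += 1
    return counts

def cosine(a, b):
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = expand("fix bike".split())
page = Counter("how to repair a bicycle at home".split())
print(f"similarity: {cosine(query, page):.2f}")  # > 0 even with no exact query words on the page
```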
Launched: July 24, 2014 (US)
Goal: Provide high quality, relevant local search results
The Pigeon update was launched in July 2014 to provide more useful, relevant, and accurate local search results that are tied more closely to traditional web search ranking signals.
Launched: April 21, 2015
Goal: Give mobile-friendly pages a ranking boost in mobile SERPs, and de-rank pages that aren’t optimized for mobile
Google’s Mobile Update (aka Mobilegeddon) ensures that mobile-friendly pages rank at the top of mobile search, while pages not optimized for mobile are filtered out from the SERPs or seriously down-ranked.
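A quick first-pass check you can run yourself is to confirm that your templates declare a viewport meta tag, one common prerequisite for a responsive, mobile-friendly page. This sketch assumes the requests package is installed and that the example URLs are replaced with your own; Google's Mobile-Friendly Test remains the authoritative check.

```python
import re
import requests  # assumes the `requests` package is installed

# A responsive page normally declares a viewport meta tag. This is only one
# signal -- Google's own Mobile-Friendly Test is the authoritative check.
VIEWPORT_RE = re.compile(r'<meta[^>]+name=["\']viewport["\']', re.IGNORECASE)

def has_viewport_tag(url):
    html = requests.get(url, timeout=10).text
    return bool(VIEWPORT_RE.search(html))

for url in ["https://example.com/", "https://example.com/blog/"]:  # replace with your own URLs
    print(url, "->", "viewport tag found" if has_viewport_tag(url) else "no viewport tag")
```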
Launched: October 26, 2015 (possibly earlier)
Goal: Deliver better search results based on relevance & machine learning
RankBrain is part of Google’s Hummingbird algorithm. It is a machine learning system that helps Google understand the meaning behind queries and serve the best-matching search results in response to those queries. Google calls RankBrain the third most important ranking factor. While we don’t know the ins and outs of RankBrain, the general opinion is that it identifies relevance features for web pages ranking for a given query, which are basically query-specific ranking factors.
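Nobody outside Google knows how RankBrain actually works, but the notion of "query-specific ranking factors" is easy to illustrate: the same page signals get weighted differently depending on the query. The signals, weights, and pages below are entirely made up for the sake of the example.

```python
# Invented signals and weights, purely to illustrate the concept of
# query-specific ranking factors -- not RankBrain's real workings.
PAGES = {
    "page_a": {"freshness": 0.9, "depth": 0.3, "proximity": 0.1},
    "page_b": {"freshness": 0.2, "depth": 0.9, "proximity": 0.4},
}

QUERY_WEIGHTS = {
    "latest iphone news": {"freshness": 0.7, "depth": 0.2, "proximity": 0.1},
    "how does tcp work": {"freshness": 0.1, "depth": 0.8, "proximity": 0.1},
}

def rank(query):
    """Score every page with the weights that apply to this particular query."""
    weights = QUERY_WEIGHTS[query]
    scored = {p: sum(weights[s] * v for s, v in signals.items()) for p, signals in PAGES.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for q in QUERY_WEIGHTS:
    print(q, "->", rank(q))
```

In this toy setup the news-style query favours the fresher page, while the explainer-style query favours the deeper one, which is all "query-specific ranking factors" really means.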
Launched: September 1, 2016
Goal: Deliver better, more diverse results based on the searcher’s location and the business’ address
The Possum update ensured that local results vary more depending on the searcher’s location: the closer you are to a business’s address, the more likely you are to see it among local results. Possum also resulted in greater variety among results ranking for very similar queries, like “dentist Denver” and “dentist Denver co.” Interestingly, Possum also gave a boost to businesses located outside the physical city area.
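Proximity is the easiest of these signals to reason about yourself. The sketch below uses the haversine formula to measure how far a searcher is from each business; the coordinates are hypothetical points in and around Denver, chosen only to mirror the "dentist Denver" example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical searcher and business coordinates (central Denver vs. a suburb).
searcher = (39.7392, -104.9903)
businesses = {"Downtown Dental": (39.7420, -104.9880), "Suburb Dental": (39.6133, -105.0166)}

for name, coords in businesses.items():
    print(f"{name}: {haversine_km(*searcher, *coords):.1f} km from the searcher")
```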
Launched: March 8, 2017
Goal: Filter out low-quality search results whose sole purpose is generating ad and affiliate revenue
The latest of Google’s confirmed updates, Fred targets websites that violate Google’s webmaster guidelines. The majority of affected sites are blogs with low-quality posts that appear to be created mostly for the purpose of generating ad revenue.
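If you run an ad-supported blog, a crude self-audit is to compare how often ad scripts appear against how much actual text a post contains. The ad markers and the saved post.html file in this sketch are illustrative assumptions, and there is no published "safe" ratio; the point is simply to surface pages that are far more ad than content.

```python
import re

# A rough content-vs-ads heuristic: count references to a few common ad scripts
# (this list is an illustrative assumption) against the amount of actual text.
AD_MARKERS = ("adsbygoogle", "doubleclick.net", "amazon-adsystem.com")

def ads_per_thousand_words(html):
    ad_hits = sum(html.lower().count(m) for m in AD_MARKERS)
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
    words = len(re.findall(r"[a-zA-Z0-9']+", text))
    return 1000 * ad_hits / words if words else float("inf")

with open("post.html", encoding="utf-8") as f:  # hypothetical saved copy of a blog post
    print(f"{ads_per_thousand_words(f.read()):.1f} ad references per 1,000 words")
```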
Well, those were the major Google search algorithm updates. As always, I'm looking forward to your comments and questions below. Have any of these updates had an impact on your rankings? If so, what tactics helped you recover? Please share your experience in the comments!