Every year, Google updates its search results thousands upon thousands of times. While the majority of these updates are small adjustments to Google’s algorithm, they can have big implications for you, your site and your potential revenue. Even a small update can cause a page that once ranked at the top of the results to move to page two or beyond, lessening the chances that a user will see it, let alone visit it.
How can you prevent a Google algorithm update from impacting your site? Let’s explore Google’s algorithm updates:
Looking to expand your SEO knowledge, or wondering whether Wix is good for SEO? Check out our SEO Hub. Whether you're just getting started with your SEO or consider yourself an advanced professional, we've got resources for everyone.
What is the Google Algorithm?
Since Google can’t manually sort through billions of web pages every time a user gives it a query, it uses a collection of integrated algorithms to offer the right results for a search term.
Originally, Google used an algorithm called PageRank that relied very heavily on the number and quality of links a site/page received to rank content on its search results page. The model viewed links as an affirmation of the page’s content quality—if high-quality sites linked to a page often, then Google assumed it must contain quality content.
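The core idea behind PageRank can be sketched as a simple iterative computation over a link graph: each page repeatedly passes a share of its score along its outbound links, so pages that many well-ranked pages point to accumulate the highest scores. The sketch below illustrates the published PageRank concept with a made-up three-page graph; it is not Google's actual implementation.

```python
# Minimal PageRank sketch: score flows along links and concentrates
# on pages that many other well-ranked pages point to.
# Illustrative only -- not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1 / len(pages) for page in pages}
    for _ in range(iterations):
        # Every page keeps a small base score...
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        # ...and distributes the rest of its score to the pages it links to.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A tiny hypothetical link graph: both "a" and "c" link to "b",
# so "b" ends up with the highest score.
graph = {"a": ["b"], "b": ["c"], "c": ["b"]}
scores = pagerank(graph)
```

Notice that page "a" receives no links at all, so it keeps only the base score. This is exactly the property that made the model gameable: manufacturing links directly manufactures score.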
The problem, however, is that links are a secondary signal. The number of links doesn’t tell the search engine about the actual content quality, only the probability that it contains quality content since many people linked to it. Websites could easily game the algorithm through spammy links and content practices.
As time went on, Google’s algorithm evolved to better understand search queries and content. In 2013, Google took a semantic approach in revamping its core algorithm and named it Hummingbird. It measured the substance of a page’s content using signals to better understand how relevant it was to a query. The update also deemphasized keyword placement.
Then in 2015, RankBrain’s release ushered in the age of machine learning. With it, Google could analyze a massive amount of user data to see what pages satisfied users’ intent when searching for a specific query. With this information, RankBrain could build models to understand which on-page factors or content signals were more or less important to a searcher, and use them to structure search results.
For example, when you searched for “buy car insurance” prior to RankBrain, the search results page would simply list car insurance providers. However, Google quickly realized that for such queries, users often wanted information about buying the product or service (in this case, car insurance). Now Google shows a healthy mix of providers and pages that educate users on how to choose the best insurance:
Similarly, because of RankBrain, Google found users wanted an image of the finished dish alongside the recipe. To now rank for a recipe keyword, all things being equal, a page must contain an image:
Moving beyond secondary signals, Google has implemented technology like neural matching, Bidirectional Encoder Representations from Transformers (BERT) and Multitask Unified Model (MUM) to better understand user queries.
Don’t get confused. While the SEO industry has long discussed specific ranking factors or “ranking signals,” a lot of misinformation exists out there on the topic and I strongly advocate for a holistic approach.
Google allegedly uses more than 200 official ranking signals to decide what content should or should not rank for a query. These factors include anything from page performance to links to content relevance. Many SEO professionals debate one ranking factor’s importance over another, while others will try to “optimize” for as many “factors” as possible.
This is a mistake.
For starters, there is no universal list of the most weighted factors. The factors Google uses vary from query to query, from vertical to vertical. Google uses a complex process to evaluate content, employing machine learning to understand language and better classify and profile content to determine its quality.
To this, Google’s John Mueller said:
"I mean it is something where if you have an overview of the whole web or kind of a large part of the web and you see which type of content is reasonable for which types of content then that is something where you could potentially infer from that. Like for this particular topic, we need to cover these subtopics, we need to add this information, we need to add these images or fewer images on a page. That is something that perhaps you can look at something like that. I am sure our algorithms are quite a bit more complicated than that."
For example, “good” content for medical information that can greatly impact health should look, sound, and feel different from a gossip column. So for health information, Google trains a machine learning model to identify a content profile based on the field’s leaders, such as the Mayo Clinic.
This assessment of good versus bad content happens in what I’ll call the “pre-algorithmic” or “meta-algorithmic” stage, and it impacts what kinds of pages rank beyond any specific ranking factor.
Simply put, don’t worry about “ranking factors.” Instead, write highly targeted, substantial content. Cater your content to your target market’s needs and knowledge level. Make sure your content genuinely helps them. That’s the most important thing.
What is a Google Algorithm Update?
Google’s competitive landscape is more varied than you might think. To keep users highly satisfied, Google needs to serve the best results possible. They consider many factors, including user expectations and technological updates.
To keep users satisfied and improve its results, Google will often update or “tweak” its algorithm to change what the SERP shows.
In the early days, Google would release updates to keep people from abusing and manipulating the algorithm. For example, Penguin targeted spammy link practices, and Panda protected against thin content. While Google still releases updates targeting spam, more recently the company is placing more emphasis on surfacing the highest content quality on the SERP.
While we often think of an algorithm update as reevaluating the weight of certain factors on a SERP, this is an oversimplification. Recently, many of Google’s algorithm changes have incorporated technological advancements, particularly in machine learning. As Google introduces new and better technology into the algorithm, it can better understand the quality and relevance of a page (or of a domain overall, since many of Google’s quality assessments look at the entire site, not just a single page). To that end, experts speculate that many of Google’s updates are not changes to the algorithm in the strictest sense, but rather machine learning recalibration and testing. These changes are perhaps behind a good number of Google’s unconfirmed algorithm updates.
Want to learn more? Check out this guide to assessing the impact of Google Algorithm Updates.
Confirmed and unconfirmed Google Updates
Google makes thousands of changes to its algorithm every year, yet only officially announces a minority of these updates. Instead, search marketers rely on “SEO weather tools” to track significant algorithm changes. When these tools detect more rank movement than usual, they flag an elevated rank volatility level.
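The idea behind these volatility trackers can be sketched as comparing a tracked keyword set's rankings day over day and reporting the average movement. The keywords, positions, and scoring below are all hypothetical; real tools like Semrush Sensor use their own unpublished methodology, and this only illustrates the concept.

```python
# Hypothetical rank-volatility sketch: compare today's SERP positions
# against yesterday's for a tracked keyword set and report the average
# absolute position change. Not how any real "SEO weather tool" scores.

def volatility(yesterday, today):
    """Average absolute rank change across keywords tracked on both days."""
    shared = yesterday.keys() & today.keys()
    if not shared:
        return 0.0
    return sum(abs(yesterday[k] - today[k]) for k in shared) / len(shared)

# Made-up positions for three tracked keywords (1 = top result).
yesterday = {"car insurance": 3, "buy car insurance": 1, "cheap insurance": 7}
today = {"car insurance": 8, "buy car insurance": 2, "cheap insurance": 4}

score = volatility(yesterday, today)  # (5 + 1 + 3) / 3 = 3.0
```

A tool tracking thousands of keywords would compute something like this across the whole set and translate unusually high averages into a "stormy" volatility reading.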
The Semrush Sensor, Semrush’s rank volatility weather tool, shows moderate levels of increased rank movement on the Google SERP
In rare instances, Google will officially announce a new algorithm update. Examples include the Page Experience update, which introduced performance metrics (known as Core Web Vitals) to the algorithm, along with the aforementioned iterations of the Panda and Penguin updates.
Google’s broad core updates have been the most commonly confirmed updates, but other confirmed updates include the Product Reviews Updates, which aim to ensure that only the best product reviews appear on the SERP.
As a rule, confirmed updates result in far more rank volatility than unofficial updates.
Google’s core algorithm updates
Broad core algorithm updates happen when Google implements wide-ranging changes to the algorithm’s operation. Rather than slight modifications to ancillary aspects, these updates signal a broad change in how Google’s algorithm ranks pages and sites.
While Google has long released broad core algorithm updates, Danny Sullivan, Google’s search liaison, began announcing core updates officially in March 2018. These updates tremendously impacted how search marketers think about content.
The most notable of these updates was known as the Medic Update, as it disproportionately impacted Your Money Your Life (YMYL) sites: finance, health, and other sites that, if they contained inaccurate information, could significantly harm a user.
In many ways, the Medic Update served as the prototype for the core updates that followed. It showed a clear qualitative leap in Google’s ability to understand and profile content. Those significantly impacted by the update included sites with a thin content experience and those that put marketing aims above substantial content. For example, if a user searched “how to eat better,” pages that leaned on heavy marketing language or showed bias toward their own products or services would likely rank lower after this update. On the other hand, Google’s algorithm rewarded authoritative, expert, and unbiased articles on the same topic.
Since then, Google’s core updates have shown an increased ability to understand what quality content looks and sounds like.