Google algorithm updates are moments in internet history.
If you work in digital marketing, they might even mark significant moments in your career history.
The biggest Google algorithm updates get their fair share of publicity thanks to our mile-a-minute news media.
Why is something that’s so behind-the-scenes such a big deal?
Nearly 75 percent of all desktop searches come from Google, followed by Baidu at a distant 11 percent. As if dominating desktops isn’t impressive enough, Google also dominates tablet and mobile search, owning more than 92 percent of the traffic. Baidu, at 4 percent, is again in second place.
Given Google’s search dominance, a Google algorithm update is like an internet tremor. Entire industries are shaken.
The bigger Google algorithm updates are often christened with a name by Google or the community at WebmasterWorld.
What’s the Google dance about? What does Google really want?
So what does the past, present and future of Google algorithm updates look like?
And how can you take advantage of updates?
Past: A Brief History of Major Google Algorithm Updates
Google wants the best user experience for searchers. That’s the motive behind every update.
Let’s explore the Google algorithm changes in chronological order. The pattern will become apparent, and you’ll have a much clearer idea of what Google looks for in terms of quality.
Vince | February 2009
In 2009, Google made an update they called Vince. This wasn’t perceived as a major update. Vince was intended to factor trust into generic searches, so it ranked big brands high for generic keywords.
According to Matt Cutts, the Vince change didn’t affect longtail keywords. Here’s a video of Matt explaining the change.
Though it wasn’t a major update, it was an early nod to authority—and to Google algorithm changes to come.
Panda | February 2011
Following its announcement on January 21, 2011, Google introduced the Panda algorithm in February of that year. The algorithm impacted 11.8 percent of search results in the US alone, a much larger percentage than its regular updates would impact.
Panda is Google’s algorithm for combatting low-quality content that ranks high on its search engine result pages (SERPs).
From April 28 through May 3 back in 2010, Google made algorithmic changes to address poor-quality content, but that focused on longtail keywords.
In 2011, Panda took this quality change further. It targeted content farms and scraper sites.
The image below is an excerpt from Google’s announcement on January 21, 2011 which clearly spelled out what the algorithm would target.
Google updated the Panda algorithm occasionally from 2011 until 2016, when it announced Panda was officially complete and wouldn’t be updated significantly again. Whenever these updates occurred over the years, websites had a variety of experiences:
- If the site was negatively affected by a Panda update but made no changes to improve quality, it continued to see negative effects.
- If the site was negatively affected by a Panda update but did make changes to improve quality, it got back into Google’s SERPs.
- If the site had poor-quality content that initially escaped a Panda update, it got caught.
- If the site was a false positive (flagged by Panda as low quality undeservingly) it was released from penalties.
- If the site only produced high-quality content, it didn’t experience any changes (or maybe even saw a positive change in the SERPs).
In short, Google wants the best, most relevant and most up-to-date content delivered to searchers for their search terms.
Panda weeds out content that violates this intent in any way.
Top Heavy | January 2012
The Top Heavy algorithm makes it difficult for websites that have their top (i.e., header, title, top nav, first paragraph area) heavy with ads to rank high in the SERPs.
Google announced the algorithm on January 19, 2012. In Google’s words:
sites that don’t have much content ‘above-the-fold’ can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
Just like with Panda, Google has updated Top Heavy from time to time.
This update doesn’t affect overlay ads, pop-ups and pop-unders. Top Heavy focuses on static ads that appear above the fold. An ads-heavy but not top-heavy site may escape impact from this algorithm.
Use a fold viewer extension to know if your site is prone to impact from Top Heavy. The extension helps you understand how much content on a page is immediately visible to visitors under various screen resolutions, in comparison to ads.
Penguin | April 2012
On April 24, 2012, the Google algorithm update Penguin hit search engine results pages.
Penguin penalizes websites that use spammy link building tactics to boost their search engine rankings. If a website builds links through SEO-purposed link networks or buys links, Penguin punishes it by tanking its rankings.
If you’ve participated in link networks or bought links but can’t remove them, you can use Google’s disavow tool. You can also use this tool if you notice that spammers are trying to attack your site with low-quality links.
Over-optimized sites and sites with other forms of unnatural link building are usually caught by Penguin. When Penguin launched, Google offered a channel for direct feedback, but that channel is no longer open. Webmasters who need help can ask their questions on Webmaster Central.
Pirate | August 2012
The Pirate Google algorithm update went live in August 2012. Google uses this algorithm to stop sites from ranking high on its SERPs if they have an excessive number of copyright infringement reports filed against them via Google’s Digital Millennium Copyright Act (DMCA) system.
Google updates Pirate from time to time. So new offending sites may be caught by Pirate, sites previously caught may be released, and false positives may be corrected.
On August 10, 2012, Google announced their Piracy algorithm launch, saying:
Starting next week, we will begin taking into account a new signal in our rankings: the number of valid copyright removal notices we receive for any given site. Sites with high numbers of removal notices may appear lower in our results. This ranking change should help users find legitimate, quality sources of content more easily
This algorithm is Google’s response to the spike in copyright infringement notices it receives. In the 30 days leading up to the August 10 announcement, Google had received more than 4.3 million URL removal requests. That’s more than all the requests it received in 2009 combined!
At the time of writing this post, Google has received removal requests for more than 4 billion URLs.
Exact Match Domain | September 2012
Exact Match Domain went live in September 2012. The algorithm stops poor-quality websites from ranking high on search engines just because they have exact matching search keywords in their domain names.
Minor weather report: small upcoming Google algo change will reduce low-quality "exact-match" domains in search results.
— Matt Cutts (@mattcutts) September 28, 2012
Payday | June 2013
The Payday Google algorithm update was launched to clean up search engine spam related to payday loans, pornography, pharmaceuticals, financial industries, casinos and other heavily-spammed query targets. The algorithm went live on June 11, 2013.
Since its initial launch in June 2013, Payday has been updated twice.
On May 16, 2014, Google launched Payday 2.0 to combat spammy payday sites.
Then, in June 2014, Payday 3.0 arrived to clean up spammy search queries and provide better protection against SEO attacks.
Hummingbird | August 2013
The Hummingbird Google algorithm update takes its name from being “fast and precise.” It aims to rank pages that match the meanings of a search better than those that match just the words of the search.
In short, Hummingbird is an overhaul of the Google search algorithm. With this Google algorithm update, the company officially named its search algorithm Hummingbird.
This update also put Google’s Knowledge Graph (introduced in 2012) at the heart of search.
Although Google announced the algorithm on September 26, 2013, they’d launched it a month before the announcement.
Pigeon | July 2014
The Hummingbird Google algorithm update, sadly, threw local search into disarray, especially given its reliance on the Google Knowledge Graph.
The Pigeon update fixed that disorder, along with other search issues related to synonyms and spelling.
This algorithm aims to give more relevant, useful and accurate results for local searches by tying them more closely to traditional web search ranking signals. Google uses this algorithm to improve its parameters for distance and location ranking.
Mobile Friendly | April 2015
Mobile Friendly is a Google algorithm update that prioritizes mobile-friendly pages when delivering answers to a query. This algorithm went live on April 21, 2015.
The algorithm is Google’s response to the dramatic increase in preference for accessing the internet via mobile devices instead of PCs.
Just a year after Mobile Friendly went live, more people began accessing the internet from mobile and tablet devices than from desktops. Mobile search traffic has spiked for Google, according to Hitwise.
Although Mobile Friendly sent some panic across the industry, most of that settled as SEO experts better understood and aligned with Google’s goals.
RankBrain | October 2015
The ultimate goal of a Google algorithm update is helping serve searchers exactly what they seek.
Search has gone beyond texts to include image and voice searches, so the need to understand context and meanings beyond words (i.e., semantic search) has grown dramatically.
About 15 percent of Google’s daily searches are so unique that no one has ever used those search terms before. RankBrain uses AI-powered machine learning to find the best answers to these queries.
To handle these queries better, Google developed RankBrain in April 2015. It wasn’t until October 26, 2015, that the company officially announced the algorithm.
According to the Bloomberg video:
RankBrain uses artificial intelligence to embed vast amounts of written language into mathematical entities — called vectors — that the computer can understand. If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the result accordingly, making it more effective at handling never-before-seen search queries.
Google says that RankBrain is its third most influential ranking factor. According to Google’s Search Quality Senior Strategist, Andrey Lipattsev, links and content are the other two most important ranking factors.
In a Backchannel interview, Google’s Jeff Dean said RankBrain has since expanded from just 15 percent of searches to participating in almost all of them. AI-powered machine learning is here to stay, and is probably used in more ways than Google is talking about.
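Bloomberg’s description can be made concrete with a toy sketch. The snippet below is purely illustrative and is not Google’s actual model: the phrases are made up, the three-dimensional vectors are hand-picked (real embeddings are learned from data and have hundreds of dimensions), but it shows the core idea of matching an unseen query to the familiar phrase whose vector points in the most similar direction.

```python
# Toy illustration of the idea behind RankBrain (NOT Google's actual model):
# phrases live as vectors, and an unseen query is matched to the known
# phrase whose vector points in the most similar direction.
import math

# Hypothetical 3-dimensional embeddings; real embeddings are learned and much larger
embeddings = {
    "cheap flights": [0.90, 0.10, 0.20],
    "best pizza near me": [0.10, 0.80, 0.30],
    "affordable plane tickets": [0.85, 0.15, 0.25],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def closest_known_query(query_vector):
    """Guess which familiar phrase a never-before-seen query resembles."""
    return max(embeddings, key=lambda phrase: cosine(embeddings[phrase], query_vector))

# A made-up vector for an unseen query that lands near "cheap flights"
print(closest_known_query([0.88, 0.12, 0.22]))  # prints: cheap flights
```

Notice that “cheap flights” and “affordable plane tickets” share no words, yet their vectors sit close together; that’s how a vector-based system can serve queries it has never literally seen before.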
RankBrain is still undergoing changes, including the most recent Google algorithm update, which occurred on March 7, 2018.
More on that in just a minute. Let’s finish our history lesson first!
Mobile Ad Interface | January 2017
Intrusive and interstitial mobile ads don’t add to a pleasant user experience. A Google algorithm update rolled out on January 10, 2017 set out to combat this issue.
Tweets from Gary Illyes and John Mueller confirmed these changes.
it's rolling out, yes
— Gary "鯨理" Illyes (@methode) January 10, 2017
it's rolling out. yay!
— John ☆.o(≧▽≦)o.☆ (@JohnMu) January 11, 2017
Google’s roll-out announcement initially received an enthusiastic welcome from webmasters, but that wouldn’t last long.
SEO experts have noted that this algorithm’s performance has been underwhelming. Big brands seem to be getting away with the ads, while smaller brands suffer.
So, what’s Google fixing with this penalty?
Just like any other Google algorithm update, this one will go through iterations to meet its goal.
Now, that about takes us through the major updates of the past. It’s time to move our focus to the present, the here and now—then, we’ll be able to take on the future of Google algorithm changes with eyes open.
Present: The Core, RankBrain and the 3/7 Google Dance
Since the March 7 update of this year, webmasters have been wondering when a Google algorithm update counts as a “core” or “broad” update.
It’s an interesting and relevant thing to consider right now. Here’s Google’s reply:
Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year….
— Google SearchLiaison (@searchliaison) March 12, 2018
A new Google algorithm update goes through several iterations.
An algorithm becomes part of the “core” when it doesn’t need any more attention from Google’s engineers and developers to do its job.
We can deduce an answer from the question Ammon Johns asked in a Google Q&A session in January 2018, and the answer that Google’s Andrey Lipattsev gave.
Ammon asked, “Once they forgot how it works, it is core?”
Andrey answered, “That is exactly right.”
But that doesn’t mean a new core update can’t affect the rest of what’s considered core.
For example, take our most recent core update in 2018.
March 7: The core update for under-rewarded content
On March 7, 2018, Google launched a core algorithm update that boosts “under-rewarded” content pages.
As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded….
— Google SearchLiaison (@searchliaison) March 12, 2018
Webmasters have been discussing what “under-rewarded” means and how the algorithm change is impacting their websites. A discussion on Marie Haynes’s website suggests that this change might be RankBrain-related.
Our post “SEO Mud Season: Google’s 3/7 Core Update Affects Rankings and Traffic” gives you the lowdown on how the Google algorithm update influences search results and how you can take advantage of it.
Experts have come to see the March 7 Google algorithm update as Google’s reward for past (unrewarded) effort, an alignment with today’s AI-powered search results, and a preparation for upcoming updates.
Speaking of preparations for the future, what’s the road ahead?
Future: Experts Forecast Coming Google Updates
It’s hard to say if these coming changes or updates will be part of the core algorithm, but they address the same goal: giving searchers a pleasant user experience and delivering the best information for every Google search.
The truthfulness algorithm (combats fake news)
Following the media’s attack on Google for promoting “fake news,” the company has been taking steps to combat the issue.
Removing fake news from Google’s search results has become so important that Danny Sullivan called it “Google’s biggest-ever search quality crisis.”
For example, when Google users search “are women evil,” they can stumble across imperfect results like “all women have some prostitute and evil in them.”
— Danny Sullivan (@dannysullivan) December 4, 2016
Of course, there’s good reason why this particular result turned up at the top of the SERPs. The writer of this article had played the SEO game correctly.
Still, Google remains dedicated to delivering the best information possible, and that’s probably not it.
Or the suggestion that Obama is planning a coup.
— Danny Sullivan (@dannysullivan) March 5, 2017
Again, since we know how Google’s algorithms work, we might understand why the best match was considered to be an article that strongly targets the given keywords, “is Obama planning a coup.”
The issue is that Google isn’t delivering the most valuable, fact-based results to these queries. And it’s not exactly a good look for Google.
With this in mind, some of the steps Google is now taking to combat fake news include:
1. Banning fake news ad promoters.
In 2015, Google removed 780 million ads. That number more than doubled to reach 1.7 billion ads in 2016. These ad removals have led Google to ban almost 200 ad publishers who published fake news or misleading content.
2. Enabling users to report offensive autocomplete predictions.
Although Google gives searchers an opportunity to report offensive autocomplete suggestions, the form is tucked away in Google’s support pages. The company is now experimenting with a more intuitive and easily accessible reporting method.
3. Devaluation of non-authoritative sources using algorithms.
Google is tuning its algorithms to devalue information that doesn’t come from authoritative sources. One example is the Holocaust-denial story that ranked number one on Google but gradually fell off search results after The Guardian reported on the situation.
On the use of algorithms to fight off non-authoritative news sources, Google said:
When non-authoritative information ranks too high in our search results, we develop scalable, automated approaches to fix the problems, rather than manually removing these one-by-one. We recently made improvements to our algorithm that will help surface more high quality, credible content on the web. We’ll continue to change our algorithms over time in order to tackle these challenges.
4. Using “fact check” labels.
Google announced in October 2016 that it had started applying a “Fact check” label to news articles. News stories that earn this label must come from sites aligned with Google’s “characteristics of fact-checking sites” or use the schema.org ClaimReview markup.
Google’s Head of News, Richard Gingras, said in announcing the “Fact check” tag:
Today, we’re adding another new tag, “Fact check,” to help readers find fact checking in large news stories. You’ll see the tagged articles in the expanded story box on news.google.com and in the Google News & Weather iOS and Android apps, starting with the U.S. and the U.K.
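For publishers, the schema.org ClaimReview markup is the actionable piece of this announcement. Below is a minimal sketch of what that JSON-LD might look like, assembled in Python for illustration; all URLs, organization names and the claim itself are hypothetical placeholders, and schema.org’s ClaimReview documentation lists the full set of supported properties.

```python
# A minimal sketch of schema.org ClaimReview markup, built as JSON-LD.
# All URLs, names and the claim itself are hypothetical placeholders.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.com/checks/example-claim",  # hypothetical
    "claimReviewed": "Example claim being checked",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Organization", "name": "Example Publisher"},
    },
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "alternateName": "False",
    },
}

# On a real page, this JSON-LD sits inside a <script type="application/ld+json"> tag
print(json.dumps(claim_review, indent=2))
```

The `reviewRating` block is where the verdict lives: search engines can read the `alternateName` (“False,” “Mostly true,” and so on) and surface it alongside the story.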
5. The Knowledge-Based Trust system.
Google’s interest in fighting off fake news using a truthfulness algorithm is not new. Back in 2015, the company’s researchers released a paper on Knowledge-Based Trust (KBT), as reported by the New Scientist.
The researchers explained that KBT, an alternative method for determining page authority, used information accuracy instead of link profile to determine the authoritativeness of a page.
The paper says:
The quality of web sources has been traditionally evaluated using exogenous signals such as the hyperlink structure of the graph. We propose a new approach that relies on endogenous signals, namely, the correctness of factual information provided by the source. A source that has few false facts is considered to be trustworthy.
The guest-post algorithm (combats manipulative guest-posting)
Google thinks that if your goal for guest blogging is to create links back to your site, then the link isn’t natural. If you do this on a large scale, then you’re outright violating Google’s guidelines.
If you publish guest posts on a large number of different sites or have lots of article contributions on only a few large sites, Google would perceive this as spam. This practice signifies to Google that your intent for publishing these articles is to gain links back to your site.
The implied link algorithm (for brand mentions)
Google’s patent filings suggest that “linkless” brand mentions, or implied links, are a component of its ranking systems. Although the company hasn’t announced any algorithm released for this purpose, experts are convinced it will happen eventually as AI-powered machine learning continues to influence rankings.
— Rand Fishkin (@randfish) March 26, 2014
Bing says it already uses linkless mentions as a ranking signal. This gives small businesses a huge advantage, especially those not capable of earning press mentions. These businesses can get involved in and spark conversations on the web, all to their benefit.
The mobile-first indexing algorithm
The Google algorithm update for mobile-first indexing is already being tested out on some sites, according to Google’s Gary Illyes. Google has a robust best practices page that details their goals and expectations on this switch.
Mobile-first indexing means Google will predominantly use the mobile version of the content for indexing and ranking. Historically, the index primarily used the desktop version of a page’s content when evaluating the relevance of a page to a user’s query. Since the majority of users now access Google via a mobile device, the index will primarily use the mobile version of a page’s content going forward.
Google acknowledges that this will be a big change. SEO experts expect this change to happen in 2018. Google says they’ll communicate their process to webmasters after the tests have been completed.
Factors like schema, content, multimedia, links and the like are under test.
How to Key In and Enjoy Any Google Dance
If you don’t know already, SEO isn’t a way to cheat Google. It’s a way to “dance” with Google, making your content more relevant and helpful to the people who need it and search for it on Google.
So how do you key in and dance with every Google algorithm update?
- Favor quality content over quantity and frequency
- Keep linkbuilding natural
- Get brand mentions, or implied links
- Keep fresh content only (and kill outdated or stale content)
Quality over quantity and frequency
Google doesn’t rank a site. It ranks pages. The March 7 update targets “under-rewarded” pages. This implies that Google doesn’t care if your website is laden with content as much as it cares that each page completely answers a searcher’s query.
This is why long-form content gets higher rankings. According to Moz, articles between 1800 and 3000 words attract more than 15 times more unique links than articles below 600 words. Searchers also spend more time on long-form content, which gives Google a positive signal about your content.
It doesn’t come as a surprise that the average article length of a Google first-page result is 1,890 words.
Build links naturally
What’s natural linkbuilding? This just means building links with your human visitors in mind, rather than only building links for SEO benefits. It also means not trying to manipulate the current Google algorithm to improve your site’s ranking.
No buying links. No building links on poor-quality sites. No spammy anchor texts. Minimal outreach. Focus on providing real value to readers on the sites that place your links.
The more relevant backlinks you attract, the more you’ll benefit from every Google algorithm update.
In short, Google doesn’t want you “building links.” It wants you attracting them, from the most natural and relevant sources, by virtue of your amazing content being highly attractive and linkworthy.
Active linkbuilding isn’t totally off limits, but if you’re going to build links, keep them natural.
This means you should only appeal to relevant, authoritative websites when building links actively. Look for good Domain Authority (>40), Trust Flow (>10) and Citation Flow (>10) at the very least before doing any outreach.
When you do build backlinks, use a healthy variety of anchor texts (rather than repeating the same exact-match keyword over and over), destination pages and nofollow/dofollow status.
The first thing to score you a penalty, in the event of another algorithm update, will be an ugly or unnatural backlink profile.
For this reason, you have to keep an eye on your backlink profile. Once backlinks are built, you can then confirm that they’re all high quality and look natural. Our very own Monitor Backlinks is our SEO tool of choice, of course—we built it for this very purpose.
In your Monitor Backlinks account, you can track keyword rankings, organic traffic and, most importantly, backlinks.
If you’re not a user, you can start your free trial and check out all these stats for yourself:
And if you spot any bad backlinks that are dragging down your SEO efforts—or might earn you a penalty in the next algorithm update—you can take action here to disavow them.
Disavow bad backlinks
Disavowing just means telling Google that you don’t endorse a backlink. Then, the search engine won’t factor this into your site’s SERP rankings.
Here’s a quick guide on how to disavow bad backlinks.
If you already know the backlinks you’d want to disavow, you can do that manually with the Google tool. Open the tool and select the site you want to disavow links on and click “Disavow Links.”
You’ll get the message below. Proceed if you’re all set.
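What you upload to the tool is a plain text file: one URL or `domain:` entry per line, with `#` marking comment lines. As a sketch (the domains and URLs below are placeholders), you could assemble such a file like this:

```python
# Sketch of assembling a disavow file in the format Google's tool accepts:
# one URL or "domain:" entry per line, with "#" lines as comments.
# The domains and URLs below are placeholders.

bad_domains = ["spammy-links.example", "link-farm.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Disavow file for example.com"]
lines += [f"domain:{d}" for d in bad_domains]  # drop every link from these domains
lines += bad_urls                              # drop only these specific URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

A `domain:` entry disavows every link from that site, while a bare URL disavows only that one page, so use the domain form for sites that are spammy through and through.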
There’s just one problem. It’s almost impossible to know for sure which negative backlinks you’ve gathered, deliberately or unintentionally. You’ll want to use the Monitor Backlinks tool to make this cleanup a breeze.
Head over to your Monitor Backlinks account to figure out the low-quality backlinks coming to your site. Your dashboard has a set of numbers representing your backlinks profile (as seen in the screenshot below).
Click on “Links with Warnings.”
Your dashboard will list only sites that have warnings on them.
Your goal isn’t to find all the warnings. You want to focus on warnings about spammy links. Use the filter function in the top-right-hand corner of your dashboard to narrow your list of warnings down to the most relevant data.
Click on “Filters” to get a list of all the warnings.
You want to make this easy to tackle. Start by unchecking all warnings except “High Moz Spam Score.” Moz Spam Score uses 17 factors with powerful predictive capabilities to spot sites that are prone to Google penalties, so it’s the best metric for finding bad links.
If you have a long list of backlinks to sift through, save time by extending the number of links displayed per page. Scroll down to the bottom of the page. On the bottom left-hand side, you’ll see a box to change the number to 100.
In the example here, I have more than 900 backlinks with high Moz Spam Score. So I opted to display 100 backlinks per page.
You may disavow these links right from your Monitor Backlinks dashboard. To do this, select the links and click “Disavow” from the top menu.
If you feel inclined to manually check the spammy sites linking to your site before disavowing them, go ahead. Just be ready, this is going to take a lot of time and energy.
You may also want to disavow sites that have Unnatural Anchor Text or High External Backlinks warnings. As for high external links, I suggest you manually choose the sites to disavow, since perfectly safe directories may fall into this category.
On your Monitor Backlinks dashboard, you can see the sites you’ve disavowed.
All done! That’s a big chunk of SEO risk eliminated.
Citations and brand mentions
Sometimes it’s okay to not have a link back to your site, since a brand mention may carry the same weight.
Discussions about your brand in forums and blogs are good for SEO today. Want to jump in and steer your brand discussions? Join industry-specific forums, get media features with HARO, and get on your industry’s most popular podcasts and vlogs.
In short, be known in your industry as an authority. As you grow your industry influence, you’ll be cited, featured and discussed. All these links, mentions and citations have juicy SEO benefits you don’t want to miss out on.
Fresh content only
If your content is already doing well in Google SERPs, you want it to stay that way.
The key here is to update your content from time to time. Cut irrelevant parts of the content and infuse fresh research and stats.
You don’t want your visitors giving Google negative behavioral signals about your content. So keep your pages fresh and relevant.
Ready for the Next Google Algorithm Update?
The next update isn’t far away in the future. It’s today or tomorrow! Literally.
Google confirms that they make daily changes to their algorithms and unannounced core updates a couple of times a year.
You can instantly start dancing with Google by making necessary changes and fixing any negative impact due to a Google algorithm update.
You can also get more Google SEO juice by improving where you’re already doing well.
Keep up the great work, and stay frosty.