All You Need to Know About the Google BERT Update

Iss Bautista | November 7, 2019

Performing a Google query has become second nature to us digital natives. It’s become the starting point for many activities online and even offline. Whether you’re looking for a place to eat or scouring the World Wide Web for rare information, Google can give you answers in the blink of an eye. 

But while the world’s most popular search engine can do wonders, it has its flaws too. For example, getting the most accurate results requires a certain mastery of keywords, which is an art form in itself. And of course, the answers you get are only as good as your vocabulary and level of curiosity.

Google’s core algorithm has its limitations as well, one of which is that it can’t always understand your questions. For example, if you search for “which country is south of Romania?” the search engine will show you the geography of Romania instead of the Wikipedia page for Bulgaria. And when you get down to the specifics, like what age Taylor Swift was when Kanye West interrupted her on stage, Google’s inherent flaws come out in the open.

But change has come in the form of another major update called BERT. Pandu Nayak, Google’s vice president of Search, calls it the biggest leap forward for ranking in the past five years. That’s big, considering that Google rolled out a broad core algorithm update just last year.

The update is said to significantly improve the way Google interprets queries. The technology at work is state-of-the-art natural language processing, which has a sharper ability to understand the semantic context of search terms. After the roll-out on October 24, 2019, the featured snippet can now tell you how old Taylor Swift was when Kanye rained on her parade, and which country lies south of Romania.

Given that BERT will affect 1 in 10 Google searches, it is the most significant step since the RankBrain update, which first introduced machine learning to Search. Now that the roll-out is complete and the dust has settled, we want to know how this will affect our profession and clientele.

A Big Leap in the History of Search 

What Does BERT Stand For?

BERT stands for Bidirectional Encoder Representations from Transformers. Its main task is to look at sequences of words in queries to better understand the intent behind them. Google has long been impressive at picking up syntactic and contextual cues, not to mention its fast and accurate voice recognition API. But according to Nayak, the previous algorithm essentially treated a query as a “bag of words”: it picked out the important words and retrieved results based on them, and when it came to complex sentences, it would only return the closest relevant result. The example Nayak gave during the briefing was this:

Before the BERT update, if you searched for “Can you get medicine for someone pharmacy?” you would get links to answers about getting a prescription filled. Not one result would tell you how you can pick up a prescription for someone else. This is how the search result looks now:

[Screenshot: the featured snippet Google now returns for the pharmacy query after the BERT update]

BERT in Action: How Does it Impact Search?

BERT traces its roots back to RankBrain. While the former doesn’t replace the latter, both are AI-based methods for understanding content and discerning a query’s true intent. Think of BERT as a more refined way to understand not just a string of words but language and sentence structure as a whole. Google built a language model and trained it to have a deeper sense of linguistic context and flow. The result is the first-of-its-kind “deeply bidirectional, unsupervised language representation, pre-trained using only a plain text corpus,” as Google described it in a blog post.

Unlike RankBrain and previous efforts, BERT is a bidirectionally trained language model, meaning it reads a text sequence both left to right and right to left, so every word is interpreted in light of the words on both sides of it. Training a model this deeply in both directions was previously considered impractical, which makes BERT a significant landmark not just for Google but for machine learning on natural language processing in general.

A paper published by Google shows that the BERT model makes use of a Transformer, an attention mechanism that learns and processes each word in relation to all the other words (and sub-words) in a sentence, rather than one by one in a left-to-right or right-to-left order. This means the search algorithm can weigh even the prepositions, such as “for” and “to,” that matter a lot to the meaning of a sentence or the intent of a query. Google will also be able to decipher the most complex, conversational queries and understand nuances in language.
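
To make “in relation to all the other words” concrete, here is a minimal sketch. It is not from the original post and not Google’s production system; it simply uses the open-source Hugging Face transformers library with the public bert-base-uncased checkpoint and two made-up sentences. It pulls the contextual vector for the word “bank” out of each sentence and shows that the two vectors diverge, because attention has mixed in the surrounding words:

```python
# Contextual embeddings: the same word gets a different vector in each sentence,
# because self-attention blends in information from every other word.
# Requires: pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's last-layer vector for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # shape: (num_tokens, 768)
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

v_river = word_vector("she sat on the bank of the river", "bank")
v_money = word_vector("he deposited his paycheck at the bank", "bank")
print("cosine similarity:", torch.cosine_similarity(v_river, v_money, dim=0).item())
```

A keyword-matching system would treat both occurrences of “bank” identically; a contextual model like this does not, which is the property BERT brings to Search.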

With a bidirectional model, it can be a challenge to assign a prediction goal, since every word could otherwise “see” itself from the other direction. So Google used two training strategies: Masked LM (MLM), where random words are hidden and the model learns to predict them from the surrounding context, and Next Sentence Prediction (NSP), where the model learns to judge whether one sentence plausibly follows another. Read the paper if you wish to know more about the inner workings of this technology.
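
As a rough illustration of what masked-word prediction looks like in practice, the snippet below is again a sketch built on assumptions: it uses the open-source Hugging Face library, a public checkpoint, and an example sentence of our own, not Google’s internal setup. It asks a BERT model to fill in a hidden word using context from both sides of the blank:

```python
# Masked-language-model demo: BERT predicts a hidden word from BOTH sides of it.
# Requires: pip install transformers torch
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "can you pick up a [MASK] for someone else at the pharmacy"
for prediction in fill_mask(sentence, top_k=3):
    print(f"{prediction['token_str']:>15}  score={prediction['score']:.3f}")
```

Whatever candidates the model proposes are driven by every other word in the sentence at once, which is exactly the property that lets Search read prepositions and qualifiers correctly.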

Achieving this meant that Google needed vast amounts of training data and had to expand its hardware as well. So, along with the integration of BERT into Search, Google now uses Cloud TPUs to serve search results, letting users get more relevant, useful information in a split second.

Given all this, it’s safe to say that BERT will not only affect how your pages rank based on ranking signals (as previous updates did) but also determine whether your content gets displayed at all for a given query.

The inclusion of BERT in Google’s ranking system is a major step toward the search giant’s main goal: delivering the most relevant, up-to-date, and accurate information to users. It should also weed out black-hat and non-compliant SEO content, or push it further down the SERPs, so companies that have somehow outsmarted the ranking systems should be on their toes. The change will most likely ripple down to all of Google’s products as well, including YouTube and Google Maps. Simply put, websites that rely on Google for traffic will feel the biggest blow if their content is not up to par with Google’s standards.

Updating Your SEO Strategy 

Google combs through billions of pages to rank the ones it believes to be best in the featured snippets and first pages of the SERPs. Companies have lived and died by these changes to the search engine’s ranking systems.

So what can you change and improve on to get in the good graces of Google? The simple answer is to continue crafting high-quality content, and that includes:

1. Optimizing for featured snippets 

Featured snippets occupy Position 0 in the SERPs and are displayed as paragraphs, lists, and tables. Content that answers the true intent of a query gets to take this coveted spot. If you haven’t been optimizing for this real estate, it’s time to put it on the front burner, because BERT is now live for featured snippets in 25 languages. But that’s not the only reason. Check out our previous blog post to learn why it’s crucial to have a featured snippet strategy.

2. Crafting more relevant and useful content 

While technical SEO does a good job of making sure your web pages get crawled and indexed, ranking high on the SERPs rests on how good your content is. In the world of Search, quality is always tied to intent and relevance. Does your content directly answer the query? Does it provide specific solutions to a problem? Whatever keyword you aim to rank for, make sure the intent behind it is clear to you before you start writing. Keep in mind the four types of search intent: informational, navigational, transactional, and commercial.

3. Updating high-ranking content

Re-optimizing existing content after a Google update is nothing new. Take this as an opportunity to audit your blog and improve posts that already draw significant traffic. Rebuild that momentum by refreshing dated information, adding new details that enhance the content, and weaving in high-volume keywords and related phrases.

4. Using semantic keywords strategically 

To make BERT work to your advantage, make your vocabulary as diverse as possible. Instead of sticking to your main keywords, enrich your copy with secondary keywords and related (LSI) terms you can surface through Search Console. But don’t get carried away and merely sprinkle your copy with synonyms. Semantics are important signals, but make sure they stay within context.

Currently, BERT impacts only about 10% of queries and, outside of featured snippets, applies to English-language searches in the U.S. But knowing Google and its tendency to roll out change incrementally to protect the businesses that rely on it (or the people who hate big change), you can expect BERT to become omnipresent across Search and Google’s network in the next couple of years.

Is your SEO strategy lagging behind and in need of an update? Connect with us today and discover your options.
