Will Google’s BERT change SEO Forever, Again?

Written by Giorgi-M | Published 2019/10/30
Tech Story Tags: google | seo | google-algorithm-update | what-is-google-bert | google-new-search-algorithm | google-algorithm-vs-seo | latest-tech-stories | contextual-search-results

TLDR: Bidirectional Encoder Representations from Transformers, or BERT for short, was first showcased in November 2018 as a new implementation from Google to display only relevant results for people searching on its platform. BERT’s main goal is to clearly understand the context of a query in order to determine the exact result a person is looking for. It calculates results mostly based on the query itself, but it may not be long before Google also factors in a specific device’s previous search history. For now, it only works with the English language.

Google has recently rolled out a new search algorithm that won’t revolutionize the way we type queries into its engine, but will dramatically change the way those queries are interpreted.
Bidirectional Encoder Representations from Transformers or BERT for short was first showcased in November 2018 as a new implementation from Google to display only relevant results for people searching on their platform.
So, what exactly will this new algorithm change first for the average user, and second for the SEO expert?
Let’s first take a look at what BERT brings to the table.

The understanding of context

BERT’s main goal is to clearly understand the context of a query in order to determine the exact result a person is looking for. Although it calculates results mostly based on the query itself, it may not be long before Google also factors in a specific device’s previous search history.
Either way, BERT tries to interpret every word in relation to all of the others, thus forming a clear understanding of the context.
Previous versions of the algorithm would try to comprehend the query from only one direction: they would read it either left to right or right to left, without correlating words in different positions of the query.
Take the query “which books should I as a student consider to buy?” The goal behind it is buying books. An older algorithm would still be relatively accurate in displaying good options, but it would comprehend the query differently: it would not correlate the second word (“books”) with the last one (“buy”) and conclude that the query is about buying books.
With BERT, every word in the query is weighed against every other word, in both directions at once, hence the name Bidirectional Encoder Representations from Transformers.
So, in the example query above, BERT would immediately identify the most likely intent as “buy student books” and display only the most relevant results.
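To make the difference concrete, here is a toy sketch (not Google’s implementation) of what each kind of model can “see” when interpreting a single word. A unidirectional model reading left to right only has access to the words before the current position; a bidirectional model has access to the whole query at once, so the word “books” can be connected to “buy” at the very end:

```python
query = "which books should I as a student consider to buy".split()

def left_to_right_context(tokens, i):
    """A unidirectional model sees only the words before position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """A bidirectional model sees every other word in the query at once."""
    return tokens[:i] + tokens[i + 1:]

i = query.index("books")
print(left_to_right_context(query, i))   # only 'which' precedes 'books'
print(bidirectional_context(query, i))   # also includes 'buy' at the end
```

The real model learns these relations through attention weights rather than raw word lists, but the visibility difference is the core idea.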

The most relevant results

The next issue that BERT tackles is words that have two or more meanings. For now, it only works with the English language, but Google says support for other languages will follow.
One such example is the word “crane”, which can mean both the construction machine and the bird.
So, if you were to Google search “where can I buy a crane in Florida?”, it could potentially display results for buying the machine as well as the bird, even when your intention was the machine.
BERT would tap into its training corpus of roughly 3.3 billion words, drawn from Wikipedia and books, and measure how often the word “crane” appears alongside “Florida” as well as alongside each of its two meanings.
It would then deduce that your query is most likely about the machine and not the bird (not many people Google ways to buy a crane, the bird), and display the nearest dealerships for the machine you’re looking for.
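The disambiguation step can be sketched as a simple scoring exercise. This is a deliberately simplified analogue: BERT uses learned contextual embeddings, not raw co-occurrence counts, and the numbers below are illustrative, not real corpus statistics:

```python
# Hypothetical co-occurrence counts: how often each sense of "crane"
# appears near other words in a corpus. The figures are made up
# purely for illustration.
cooccurrence = {
    "crane (machine)": {"buy": 120, "florida": 45, "construction": 300},
    "crane (bird)":    {"buy": 3,   "florida": 60, "wetland": 150},
}

def most_likely_sense(query_words, table):
    """Pick the sense whose typical contexts best match the query's words."""
    def score(sense):
        return sum(table[sense].get(w, 0) for w in query_words)
    return max(table, key=score)

print(most_likely_sense(["buy", "florida"], cooccurrence))
# "buy" strongly favors the machine sense, outweighing the bird's
# slight edge on "florida"
```

The takeaway is the same as in the article: the surrounding words (“buy”, “Florida”) tip the balance toward one sense, so only results for that sense are shown.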

Things to consider as an SEO expert

BERT is most definitely not a threat to your SEO strategy if you’ve been following best practices. If you’ve devoted around 70% of your efforts to building links to a specific landing page, you’ll still be good to go.
However, if your rankings for a specific keyword do take a hit, the only adjustment you’ll have to make is editing the page’s content slightly so that it suits BERT better. This doesn’t mean inventing new keywords, but rather placing the existing ones in better contextual syntax.
For example, if you have a landing page about buying cranes in Florida and phrase your keyword as “buying cranes in Florida is extremely easy”, it may be better to rephrase it as something like “you can buy cranes in Florida at the following dealerships: …”
This signals to BERT that your page directly answers the query the user searched for, which will earn you a better position in the rankings.
But remember: links still remain the dominant source of SEO juice; it’s just that Google wants all new content to be much more relevant to users.

Written by Giorgi-M | I'm a beginner Software developer from Georgia with a big love for all things blockchain!
Published by HackerNoon on 2019/10/30