What is Google BERT and What to Do About It
Google had been hinting for a while about an algorithm update, and on October 25, 2019 it formally announced that it is indeed dropping its biggest update since RankBrain in 2015: BERT. It’s rolling out this week, initially in English only.
And boy howdy, this is a big one. In fact, BERT will impact 1 in 10 search queries, according to Google. The SEO community is falling over itself to get ahead of the change, so I’m excited to share some initial ideas from the Upgrow SEO agency team.
What is BERT?
On the Google Blog, BERT's creation "was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries."
The blog also explains "Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it--BERT, for short."
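For the technically curious, that “in relation to all the other words in a sentence” idea can be sketched in a few lines of Python. The toy self-attention below is illustrative only (real BERT uses learned, multi-head attention over subword embeddings); it shows how each word’s representation becomes a weighted mix of every word in the sentence, before and after it:

```python
import numpy as np

def self_attention(embeddings):
    """Toy bidirectional attention over a (n_words, dim) matrix."""
    scores = embeddings @ embeddings.T             # similarity of every word pair
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax: each row sums to 1
    return weights @ embeddings                    # each word mixes ALL words

words = ["parking", "on", "a", "hill", "with", "no", "curb"]
rng = np.random.default_rng(0)
emb = rng.normal(size=(len(words), 4))  # random stand-in word vectors

out = self_attention(emb)
# The row for "curb" is now influenced by "no", even though
# "no" appears earlier in the sentence.
print(out.shape)
```

The key point is the direction-agnostic mixing: a left-to-right model would never let “no” inform “curb” this way.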
The best way to understand it, though, may be to look at some pre-BERT and post-BERT examples of what this means:
Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn't understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.
Let’s look at another query: “do estheticians stand a lot at work.” Previously, our systems were taking an approach of matching keywords, matching the term “stand-alone” in the result with the word “stand” in the query. But that isn’t the right use of the word “stand” in context. Our BERT models, on the other hand, understand that “stand” is related to the concept of the physical demands of a job, and display a more useful response.
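To see why plain keyword matching fails here, consider a naive scorer that just counts how many query words appear somewhere in a page. The two page snippets below are hypothetical, but they mirror the “stand-alone” mismatch Google describes:

```python
query = "do estheticians stand a lot at work"

# Hypothetical page snippets: the first mentions "stand-alone" in an
# unrelated sense; the second actually answers the question.
pages = {
    "stand-alone page": "estheticians often work at stand-alone stations in salons",
    "physical-demands page": "the job is physically demanding and you are on your feet all day",
}

def keyword_score(query, text):
    """Count how many query words appear as substrings of the page text."""
    text = text.lower()
    return sum(word in text for word in query.lower().split())

for name, text in pages.items():
    print(name, keyword_score(query, text))
```

The scorer happily matches “stand” inside “stand-alone” and ranks the wrong page first; the page that actually answers the question barely registers. Intent-aware models like BERT are built to avoid exactly this trap.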
Here are more examples of how BERT has helped grasp the nuances of language that computers don’t interpret the way humans do.
BERT will also affect rich snippets
Below is an example of Google showing a more relevant featured snippet for the search “Parking on a hill with no curb”. Pre-BERT, a search like this would confuse Google’s algorithm. Google said, “We placed too much importance on the word ‘curb’ and ignored the word ‘no’, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb.”
In examples like this, Google looked really silly before. They were serving “how to park on a hill with a curb” for a query of “parking on a hill with no curb”. Now they look much smarter...
Is BERT good or bad for SEOs?
As a Google user, you’ll more than likely enjoy the benefits of BERT.
Google’s use of Natural Language Processing (NLP) is a big step toward understanding natural human language. Search engines historically relied on keyword matching with little grasp of intent, but as the examples above show, they are making steady progress toward understanding it.
As an SEO professional, if you create quality content and have a burning desire for more relevant organic search traffic, then BERT is definitely good for you. It has always been Google’s intent to answer users’ questions (queries) with the most accurate, direct, and authoritative information available; the challenge has always been sifting through the billions of pages on the internet to rank. To put it simply, our goal as SEOs is to answer user questions by creating content and properly formatting it so search engines can easily digest and index it. BERT improves Google’s ability to understand our content.
Now, if you create a lot of low-quality content stuffed with keywords, then you might not like BERT. There are already reports of black hat SEOs complaining about major traffic drops in the last few days.
How do you plan for BERT?
Google has stated that there is no way to optimize for BERT. On one level, that’s true: we can’t directly optimize for a black-box NLP algorithm.
So what can we do?
Simply write content for users, like you always do. Then make sure that content is properly formatted for indexing.
To get you started, here are some tips to adjust your SEO strategy, not just around BERT but toward delivering an excellent answer to whatever your target user is seeking.
Optimize for keywords with search intent
This strategy may be older than BERT, but it’s more important now than ever and will only grow in necessity. SEOs need to move away from strictly looking at keywords and prioritize ranking for the queries that matter to their specific target customers.
As an example, let’s say you handle SEO for a bank and want more loan customers. When you’re planning your next blog post, you consider the problems your bank can solve for a customer. Maybe you offer the best farming equipment financing.
Where BERT could be a factor is in potentially giving higher priority to your “how to” article over a less intent-matched page about “farming equipment financing”. Based on what Google has shared about BERT, it should now understand the user’s intent behind a “how to” search.
That means there will be more benefit to creating a wider variety of content to address different queries, where pre-BERT just having a general page would have been enough.
A user searching for “how to get farming equipment financing” is closer to making a decision on a lender than one searching for “bank loans”. As Google and users get smarter, searches will get more specific. And the more specific the search, the more likely the searcher is to become a customer.
Directly answer queries early on the page
To earn a coveted rich snippet placement directly on the SERP (search engine results page), you need to concisely answer the query. With BERT, Google is moving away from dropping a user on a broad, general topic page and instead prioritizing content that directly addresses the query.
In the earlier example of “how to park on a hill with no curb”, the new top-ranking page very concisely answers the question with a single image and a short paragraph. To earn a rich snippet, it’s smart to directly answer the question at the top of your content and then provide additional details and information below. Of course, you also want to include structured data (rich snippet) markup.
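As a sketch of what that markup can look like, here is a minimal schema.org FAQPage object, built as a Python dict and serialized to the JSON-LD you would embed in a script tag of type application/ld+json on the page. The question and answer text are placeholders, not content from any real page:

```python
import json

# Minimal FAQPage structured-data sketch using schema.org vocabulary.
# The Q&A text below is a hypothetical example.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "How do you park on a hill with no curb?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Turn your wheels toward the edge of the road so the car "
                    "would roll away from traffic, and set the parking brake.",
        },
    }],
}

# This string is what goes inside the JSON-LD script tag.
print(json.dumps(faq, indent=2))
```

Pairing a direct on-page answer with markup like this gives Google both the content and the machine-readable structure it needs to consider your page for a rich result.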
Write for users
It’s the safest way to get long-term SEO results that only get better with every algorithm update. Simply think about the questions, or even better, the pain points, that you can solve. If you do that and then take care of the on-page SEO, you’ll be fine more often than not. Keyword stuffing, dynamically generated content, and churned-out low-quality, unhelpful blog posts are dying as effective strategies.
I hope this has given you a nice overview of BERT and some of the impacts to expect. There is still much to be revealed as BERT actively rolls out this week, but follow these white hat SEO strategies and you can expect a nice increase in organic traffic and conversions.
Ryder Meehan has been on a 15-year journey to master every aspect of online acquisition marketing. After a romp at digital agencies like Slingshot and Razorfish, he went on to roles at Fossil, Samsung Mobile, and Tatcha before becoming Co-Founder & CEO at Upgrow, a lead generation SEO and digital marketing agency.
He has been featured as a digital marketing leader on Forbes, PRNews, Business.com, and other outlets. As an industry expert he has also been a featured speaker and instructor at PRNews Summit, San Francisco State University, IndiBio, General Assembly, and AMA San Francisco.