Context is critical in search. That’s why Google’s latest update takes a big step toward bridging the gap between syntax and intent in queries. How will it work, and what will the impact on e-commerce look like? Let’s discuss.
The update is based on a form of natural language processing called Bidirectional Encoder Representations from Transformers, or BERT. The technology allows Google to process each word in a query in relation to all the other words in the sentence, rather than in sequential order. And make no mistake: it’s a huge leap forward for Google’s algorithm. So much so, in fact, that Google is touting it as one of the biggest algorithm updates of all time (up there with RankBrain and others).
Practically speaking, for longer, more conversational queries or searches where prepositions like “for” and “to” matter a lot to the meaning, BERT will be able to better understand the context of the words in the query.
It’s also important to underscore that BERT isn’t an actual ranking signal. Its function is simply to synthesize queries better and produce more contextually accurate results. Because of its nature, you cannot optimize for BERT. But you should ensure that your content uses simplified language and appropriate keywords, and accounts for the intent of users’ queries. Furthermore, rather than relying on rank-tracking tools to gauge this particular update’s impact, we suggest using Google Analytics and Search Console to ensure that you’re getting insightful data.
How Will BERT Impact E-Commerce Sites?
Where this will impact e-commerce sites is in their resource sections, buyer’s guides and, potentially, blogs. If a site has been using a Quick Answer strategy and its content wasn’t researched or written around how users actually search, but it was still ranking, it may see a drop. In that case, it’s a good opportunity to compare the sites that have overtaken its listings, flag the differences and build out content that is more relevant in the algorithm’s eyes.
If you’ve been following EXCLUSIVE for any amount of time, you know the value of unique, well-optimized content on category and product pages. Our Valuescaping methodology looks at long-tail terms that have converted in the past, meaning they’re terms that users have actually searched for (user intent!). On the product level, this is important when looking at multi-product listing pages with color variants. We see a lot of sites falling into the “nail polish trap” of giving their colors fun names that don’t say what the color actually is. You can give a color swatch a cute nickname, but don’t forget to also mention that “siren,” for example, is a bright, neon red.
On the category level, building a site architecture and accompanying content that is topically comprehensive enough to address more nuanced searches is likely to be a fruitful strategy in a post-BERT organic world. The technology will likely add some color to searches that were once monochromatic, and sites that employ a robust, content-centric strategy, like many of EXCLUSIVE’s clients, are positioned to be rewarded. Where we see the most success is with the creation of pages where there is both search volume and enough product selection to support the page. Roll these out sparingly, though, so you’re not cannibalizing rankings or traffic from other similar pages on your site.
Building a Winning Strategy for BERT
Make no mistake: there’s no gaming this update. It’s a technical leap forward that will reduce the percentage of queries that don’t produce sufficiently accurate results. And that’s really the big takeaway: continue to produce customer-centric content that speaks in your customers’ language. To have an e-commerce marketing agency with a history of breaking performance boundaries for clients inspect your organic search strategy, contact EXCLUSIVE.
Rebecca has been with EXCLUSIVE for more than seven years, and is currently the Director of Organic Search. She majored in Marketing at Bentley University, with a concentration in Global Studies. Her favorite part of her job is analyzing data to make successful site recommendations. She enjoys cooking (and especially eating), good food and drink, working out, shopping, golf, and travel.
As of September 2019, Google no longer supports unpublished and unsupported rules in the Robots Exclusion Protocol, a change that mainly affects the noindex directive in robots.txt.
What do these changes mean?
Put simply, Google is making this change because it wants to establish a formal standard for the Robots Exclusion Protocol (REP). This means its web crawlers will no longer look in robots.txt for a noindex directive, so if you want to keep a page from being indexed, you need to use a different method.
A few noindex alternatives
Fortunately, Google has offered a few alternatives if you’re looking to prevent your page from being indexed by their web crawlers.
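Google’s supported alternatives include the noindex robots meta tag, the `X-Robots-Tag` HTTP response header, 404/410 status codes, password-protecting the page, a `Disallow` rule in robots.txt, and the Remove URL tool in Search Console. The most direct replacement for a robots.txt noindex rule is the meta tag; a minimal sketch:

```html
<!-- Robots meta tag placed in the page's <head>: tells crawlers that
     fetch this page not to include the URL in their search index. -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the same directive can be delivered instead as an `X-Robots-Tag: noindex` HTTP response header from your server configuration. Keep in mind that a crawler has to be able to fetch the page in order to see either directive, so don’t pair them with a robots.txt `Disallow` rule for the same URL.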
Prepared for the future
It’s important that you keep these updates in mind when considering the indexation status of any page. Remember that this change only affects sites using the noindex directive in robots.txt; other directives and commands will remain as they are for the time being. Your main concern is making sure your noindex approach stays up to date.