The Technicality of BERT

Although BERT is quite technical on the backend, it's nothing to worry about. Its purpose is to help Google's search engine better understand the context of words and the nuances in searches, and to connect those queries with more relevant search results.

It's the kind of technicality anyone can take advantage of by rethinking how they answer searchers' questions. Simply put, BERT helps make human language clear and understandable to the computer.

According to Google, this breakthrough came out of its research on transformers: models that process a word in relation to all the other words in a sentence, rather than one by one in order.

BERT models can therefore consider a word's full context by looking at the words that come before and after it, which is especially helpful for understanding search intent.
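To make that idea concrete, here is a minimal sketch (not Google's production system) using the open-source Hugging Face transformers library and the publicly released bert-base-uncased checkpoint. It shows that the same word, "bank", gets a different vector depending on the words before and after it.

```python
# Minimal sketch: contextual word vectors from a public BERT checkpoint.
# This is an illustration with open-source tools, not Google's search stack.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# "bank" is spelled the same in all three sentences,
# but the words around it differ.
river  = embed_word("he sat on the bank of the river", "bank")
money  = embed_word("she deposited cash at the bank", "bank")
money2 = embed_word("the bank approved her loan", "bank")

cos = torch.nn.functional.cosine_similarity
print("river vs money :", cos(river, money, dim=0).item())   # lower similarity
print("money vs money2:", cos(money, money2, dim=0).item())  # higher similarity
```

In older, context-free approaches, "bank" would map to the same vector in every sentence; here the two money-related uses should score as noticeably more similar to each other than either is to the river one, because the model reads the whole sentence in both directions.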

But this isn't down to software advances alone: Google stated that new hardware was also needed. Some of the models it can build with BERT are so complex that they push past the limits of what traditional hardware can handle.

So, for the first time, Google is using its latest Cloud TPUs to serve search results and deliver more relevant information as quickly as possible.



Join the Discussion
Jukkah Premium Plus
Great information on BERT. I just read up on it the other day as my traffic definitely took a bit of a hit, but it seems to be bouncing back. Your training confirmed my understanding that the update is about understanding user intent and the actual context of the content.

This, of course, means that targeting single phrases is becoming redundant and you should cover a topic as thoroughly as possible. In particular, targeting misspelled or incomplete phrases will become redundant, as Google will know what the user actually means.
Dhind1 Premium Plus
It sounds like BERT will simply make things easier for all of us when we are searching for things on Google. When we use natural language we will get more relevant results.

For our websites, I think you have hit the nail on the head - we just need to keep producing quality content and offering answers to people's questions. Then we will be ok.

There will be more information soon about BERT as more people explore its capabilities. I look forward to hearing more.

Alex
Israel17 Premium
You just got it, Alex. Thanks for taking the time to read this training! Much appreciated! Producing more relevant, quality, and remarkable content will help you benefit from Google's latest search algorithm update. Thanks for dropping by!

Israel Olatunji
julesnp Premium
Thanks Israel for sharing this very interesting post.
Have a great weekend.

Jules
Israel17 Premium
Hey Jules, thanks for appreciating this post! Glad you found it interesting! Just follow Google's old rule: build out quality, relevant content that satisfies user and search intent. Thanks for taking the time to read this post!

Israel Olatunji
Claudiojuan Premium
Thanks Israel for keeping us informed of all the progress that Google is making.
Claudio
Israel17 Premium
Most welcome, Claudio! Thanks for taking the time to read this training! Much appreciated, my friend! Those who consistently produce relevant, quality content for users will ultimately benefit from both the BERT and RankBrain updates. The wise will concentrate on building out content more than anything else now. Thanks for your contributions!

Israel Olatunji
keishalina Premium
hey hi Israel --- appreciate the 'digging' research that you've done to get this post published .... thanks! ... ⭐️😊⭐️
Israel17 Premium
Hey Keisha, thanks for dropping by! Glad you appreciate this training tutorial, my friend! Bidirectional Encoder Representations from Transformers (BERT) is the latest update to Google's search algorithm, helping it better understand natural language and serve more relevant results than ever. Thanks for reading!

Israel Olatunji