Google is rolling out what it calls the biggest step forward for Search in the past five years, and one of the biggest steps forward in the history of Search altogether.
Google is using a new technology it introduced last year, called BERT, to understand search queries.
BERT stands for Bidirectional Encoder Representations from Transformers. Transformers are models that process each word in relation to all the other words in a sentence, rather than in isolation.
That means BERT models can interpret the intended meaning of a word by looking at the words that come before and after it. This leads to a better understanding of queries than processing words one by one in order.
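The "relation to all other words" idea can be sketched with a toy self-attention computation, the core operation inside transformers. This is a minimal numpy illustration under simplifying assumptions (no learned projection matrices, made-up word vectors), not Google's implementation: each word's output vector is a weighted mix of every word in the sentence, so context on both sides influences its representation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Each row of X is a word vector. Every word attends to every
    other word, so each output row mixes context from the whole sentence."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)       # similarity of every word pair
    weights = softmax(scores, axis=-1)  # attention weights; each row sums to 1
    return weights @ X                  # context-aware word representations

# Hypothetical 4-word "sentence", each word a 3-dimensional vector
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

out = self_attention(X)
print(out.shape)  # one context-mixed vector per word: (4, 3)
```

Because the attention weights span the whole sentence, the representation of a word like "bank" would shift depending on whether "river" or "loan" appears nearby, which is the behavior that helps Search interpret queries.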