Because BERT no longer processes queries word by word: it weaves links between the terms used in order to take the context of the search into account and grasp its “deep meaning”. To do so, it looks at all the terms used, including linking words and prepositions, and evaluates the “sentiment” that emerges from the query by assigning it a positive, negative or neutral score. At the time of its launch, the BERT algorithm (Bidirectional Encoder Representations from Transformers) was the technological culmination of Google's research in NLP.
It is based on two pillars: data (pre-trained models, i.e. sets of information to be analyzed using natural language processing) and methodology (how the algorithm uses these models). In other words, with BERT, Google intends to “read” users’ minds by understanding not only the query, but also what it does not explicitly say. It is also a lever for understanding new queries, those formulated for the first time, which Google estimated at the time at around 15% of daily searches.
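To make these two pillars concrete, here is a minimal sketch using the open-source Hugging Face transformers library rather than anything from Google's internal stack: a pre-trained BERT encoder turns a whole query into context-aware token vectors, and a classification head fine-tuned on top of such an encoder scores the sentiment of the text. The model names and the sample query are illustrative assumptions, not details from the article.

```python
# Sketch only: open-source stand-ins for the two pillars described above,
# assuming the Hugging Face `transformers` library (and PyTorch) is installed.
from transformers import AutoTokenizer, AutoModel, pipeline

# Pillar 1 - a pre-trained model: BERT reads the whole query at once
# (bidirectionally), so prepositions and linking words shape every token's vector.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

query = "can you get medicine for someone pharmacy"  # illustrative query
inputs = tokenizer(query, return_tensors="pt")
token_vectors = encoder(**inputs).last_hidden_state
print(token_vectors.shape)  # (1, number_of_tokens, 768): one contextual vector per token

# Pillar 2 - a methodology built on such models: a fine-tuned classification
# head assigns the text a positive or negative sentiment score.
sentiment = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(sentiment(query))  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```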

In 2021, Google's work on NLP intensified with MUM (Multitask Unified Model), an update to its algorithm that further improves the understanding of natural language and, in doing so, the relevance of the answers provided to Internet users. In particular, MUM focuses on what Google calls “complex queries”, characterized by their length and the inclusion of multiple propositions. The goal of MUM is to answer these queries in one go by relying on advanced capabilities: extracting information from several content formats, displaying resources drawn from results in other languages with instant translation, and supporting multiple tasks simultaneously.
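As a rough illustration of the cross-language capability in that list, the sketch below performs “instant translation” of a resource found in another language using an open-source MarianMT model from the Hugging Face Hub, not MUM itself; the model name and the French passage are assumptions made for the example.

```python
# Sketch only: translate a passage found in a French-language result so it can
# be shown to an English-speaking user. The passage is an invented example.
from transformers import pipeline

french_passage = "Le mont Fuji se gravit généralement entre juillet et septembre."

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
print(translator(french_passage)[0]["translation_text"])
# -> an English rendering the user can read alongside the original source
```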