Earlier in the year, Microsoft detailed the ways Bing has benefited from AI at Scale, an initiative to apply large-scale AI and supercomputing to language processing across Microsoft’s apps, services, and managed products. AI at Scale chiefly bolstered the search engine’s ability to directly answer questions and generate image captions, but in a blog post today, Microsoft says it has also led to Bing improvements in areas such as autocomplete suggestions.
Bing and its competitors have a lot to gain from AI and machine learning, particularly in the natural language domain. Search engines need to comprehend queries no matter how confusingly they’re worded, but they’ve historically struggled with this, leaning on Boolean operators (simple words like “and,” “or,” and “not”) as band-aids to combine or exclude search terms. But with the advent of AI like Google’s BERT and Microsoft’s Turing family, search engines have the potential to become perhaps more conversationally and contextually aware than ever before.
Bing’s autosuggest feature, for instance, which recommends relevant completed searches matching a partial search, now benefits from next phrase prediction powered by new language generation models. With next phrase prediction, full phrase suggestions are generated and surfaced in real time, meaning they’re not limited to previously seen data or the current word being typed.
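Bing’s production models aren’t public, but the idea can be sketched with an off-the-shelf generative language model: rather than looking up previously logged queries that share a prefix, the model generates short continuations of the partial query on the fly. The snippet below uses GPT-2 via Hugging Face purely as a stand-in; the model name, decoding settings, and `suggest_completions` helper are illustrative assumptions, not Bing’s implementation.

```python
# Minimal sketch of next phrase prediction for query autosuggest.
# GPT-2 is an off-the-shelf stand-in, not the model Bing actually uses.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def suggest_completions(partial_query: str, num_suggestions: int = 3) -> list[str]:
    """Generate full-phrase completions for a partial query, rather than
    retrieving previously seen queries that start with the same prefix."""
    inputs = tokenizer(partial_query, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=8,                   # keep continuations short and query-like
        num_return_sequences=num_suggestions,
        do_sample=True,                     # sample so suggestions are diverse
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

print(suggest_completions("best way to learn"))
```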
Another Bing feature, the related resource list People Also Ask, has improved thanks to recent AI and machine learning innovations. As before, the feature allows users to expand the scope of their search by exploring answers to questions related to their original search query. But now, it’s able to better understand questions it hasn’t seen before by leveraging a model trained on billions of documents that generates question-answer pairs from those documents. When the same documents appear on the search result page, Bing uses the previously generated pairs to populate the People Also Ask block alongside questions users have asked before.
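The key point is the offline/online split: pairs are generated per document ahead of time, then simply looked up when those documents surface on a results page. The sketch below illustrates that structure only; `generate_qa_pairs` is a toy heuristic stand-in for Bing’s learned generation model, and the index layout is an assumption for illustration.

```python
# Sketch of the offline/online split: question-answer pairs are generated
# per document in advance, then looked up at serving time for the documents
# that actually appear on the result page.
from collections import defaultdict

def generate_qa_pairs(document_text: str) -> list[tuple[str, str]]:
    """Toy stand-in for the generation model: turn simple 'X is Y.' sentences
    into ('What is X?', sentence) pairs. Bing's actual model is a learned
    generator trained on billions of documents."""
    pairs = []
    for sentence in document_text.split("."):
        sentence = sentence.strip()
        if " is " in sentence:
            subject = sentence.split(" is ")[0]
            pairs.append((f"What is {subject}?", sentence + "."))
    return pairs

# --- Offline indexing pass ---
qa_index: dict[str, list[tuple[str, str]]] = defaultdict(list)

def index_document(url: str, text: str) -> None:
    qa_index[url].extend(generate_qa_pairs(text))

# --- Online serving pass ---
def people_also_ask(result_urls: list[str], max_items: int = 4) -> list[tuple[str, str]]:
    """Fill the People Also Ask block from pairs pre-generated for the
    documents that appear on the search result page."""
    block: list[tuple[str, str]] = []
    for url in result_urls:
        block.extend(qa_index.get(url, []))
        if len(block) >= max_items:
            break
    return block[:max_items]

index_document("example.com/moon", "The Moon is Earth's only natural satellite.")
print(people_also_ask(["example.com/moon"]))
```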
Microsoft also says that Bing is now serving higher-quality searches in more than 100 languages and over 200 regions. That’s thanks to Microsoft’s Turing Universal Language Representation (T-ULR), which incorporates a pretrained model called InfoXLM that drives multilingual understanding and generation across tasks, scenarios, and workloads. The same model was reused to improve image captions in all Bing markets, and it’s powering semantic highlighting, a new feature that expands highlighting in captions beyond simple keyword matching. For example, instead of highlighting the words “Germany” and “population” in response to the search “German population,” semantic highlighting bolds the actual figure, 81,453,631, as Germany’s population in 2019.
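Conceptually, semantic highlighting behaves like extractive question answering over the result caption: the query is treated as a question and the span the model extracts is what gets bolded, instead of the query keywords. The sketch below stands in for Bing’s T-ULR-based system with an off-the-shelf SQuAD-tuned model from Hugging Face; the model choice, `highlight_caption` helper, and `<b>` markup are assumptions made for illustration.

```python
# Sketch of semantic highlighting: bold the span an extractive QA model
# identifies as the answer, rather than bolding query keywords.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

def highlight_caption(query: str, caption: str) -> str:
    """Wrap the model's predicted answer span in <b> tags within the caption."""
    result = qa(question=query, context=caption)
    start, end = result["start"], result["end"]
    return caption[:start] + "<b>" + caption[start:end] + "</b>" + caption[end:]

caption = "Germany had a population of 81,453,631 in 2019."
print(highlight_caption("German population", caption))
# e.g. "Germany had a population of <b>81,453,631</b> in 2019."
```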
“Advancements in natural language understanding continue to happen at a very rapid pace,” Microsoft wrote. “Bing users around the globe perform hundreds of millions of search queries every day. These queries are diverse in many ways, from the intent the users are seeking to fulfill, to the languages and regions where these queries are issued. To handle such a dynamic range of usage, AI models in Bing must continuously evolve.”
Author: Kyle Wiggers
Source: VentureBeat