Google Refines Search To Better Understand Sloppy Queries

29 Oct 2019
Google on Friday announced its “biggest leap forward” in years for its search algorithm, offering an unusually detailed public explanation of its secret formula. The world’s best-known internet search engine said its latest refinement uses machine learning to improve how it handles conversationally phrased English-language queries.
 
“We’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search,” Google search vice president Pandu Nayak said in an online post.
 
The California-based internet company introduced a neural network-based technique for processing natural language just last year. The new effort builds on what it calls Bidirectional Encoder Representations from Transformers (BERT), which seeks to understand query words in the context of the full sentence, according to Nayak.
 
Google’s software, like humans, has to work out what people are trying to say even when they are not expressing themselves clearly, or are not making much sense at all. Some of the BERT models used to interpret queries are so complex that they must run on high-powered processors designed specifically for the cloud, according to Google.
 
“By applying BERT models to both ranking and featured snippets in search, we’re able to do a much better job helping you find useful information,” Nayak said.
 
“In fact, when it comes to ranking results, BERT will help search better understand one in 10 searches in the U.S. in English.” He provided the example of Google software now understanding that the word “to” in a query like “2019 brazil traveler to usa need a visa” is about a Brazilian heading to the U.S. and not the other way around.
 
“Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil,” Nayak explained. “With BERT, search is able to grasp this nuance and know that the very common word ‘to’ actually matters a lot here, and we can provide a much more relevant result for this query.”
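 
The behaviour Nayak describes can be illustrated with the open-source tooling that grew out of the original BERT research. The sketch below is not Google’s production ranking system; it simply uses the Hugging Face transformers library and the publicly released bert-base-uncased checkpoint (both chosen here for illustration) to show that each query word, including “to”, receives a representation that depends on the whole query.
 
# Minimal sketch: contextual token representations from a public BERT
# checkpoint. This is NOT Google's ranking pipeline; it only illustrates
# that a bidirectional encoder represents "to" in the context of the
# entire query rather than as an isolated stopword.
import torch
from transformers import AutoModel, AutoTokenizer  # assumed: Hugging Face transformers is installed

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

query = "2019 brazil traveler to usa need a visa"
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state[0]  # one vector per token

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, vector in zip(tokens, hidden):
    # Each vector is conditioned on every other word in the query, so the
    # vector for "to" reflects the Brazil -> USA direction of travel.
    print(f"{token:>10s}  first dims: {vector[:3].tolist()}")
 
Running the same loop on a query phrased in the opposite direction would yield a different vector for “to”, which is the kind of contextual distinction the old keyword-matching approach could not capture.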
 
Google plans to extend the improvement to more languages and locations “over time.”
 
