Speak and ye shall be heard. Google has announced that its app has gotten smarter, understanding your complex questions to serve up better search results.
When Google started answering questions with voice search in 2008, the results were rudimentary: the system understood the basic meaning of individual words, but not the intent behind them.
“Now we’re ‘growing up’ just a little more. The Google app is starting to truly understand the meaning of what you’re asking. We can now break down a query to understand the semantics of each piece…”
Google can now parse the natural language of a search query much better. This includes time-based queries, superlatives, and even more complicated compound questions.
Here are some examples:
Time-Based Search Engine Queries
- What were the final Packers stats in 1975?
- What time is it in San Francisco?
- What will the temperature be at 11:57am today?
Superlative Search Engine Queries
- What is the largest city in New York?
- What is the best place to go on vacation?
- Who are the richest women alive?
Complicated Search Engine Queries
- Who was the president of the United States on May 11th, 1951?
- What was the Asian population when Steve Forbes was born?
- What are some of Anne Hathaway’s movies?
Here’s how it works:
Google can now understand the intent behind your questions. This makes the Knowledge Graph more reliable and tailors your search results toward more natural, human-like answers.
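To make "breaking down a query" concrete, here is a toy sketch (purely illustrative, not Google's actual pipeline) of splitting a question into hypothetical semantic pieces: a superlative, a time qualifier, and the remaining entity words:

```python
import re

def parse_query(query):
    """Toy semantic breakdown of a search query (illustration only).

    Extracts three hypothetical pieces: a superlative word, a
    four-digit year as a time qualifier, and the leftover entity words.
    """
    superlatives = {"largest", "best", "richest", "tallest"}
    stopwords = {"what", "who", "is", "are", "was", "were",
                 "the", "in", "on", "a", "of", "to", "some"}
    # Tokenize into lowercase words and four-digit numbers.
    words = re.findall(r"[a-z']+|\d{4}", query.lower())
    pieces = {"superlative": None, "time": None, "entity": []}
    for w in words:
        if w in superlatives:
            pieces["superlative"] = w
        elif w.isdigit() and len(w) == 4:
            pieces["time"] = int(w)
        elif w not in stopwords:
            pieces["entity"].append(w)
    return pieces
```

For example, `parse_query("What is the largest city in New York?")` picks out `largest` as the superlative and `city`, `new`, `york` as the entity, while a query containing `1975` would have that year captured as its time qualifier.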
Images courtesy of Business Insider and Google Inside Search.