According to a recent study, approximately 80 percent of Google Home answers come from featured snippets, yet snippets are not always the source Google Home turns to.
Digital agency ROAST released its Voice Search Ranking Report (registration is required to access it). The report sought to categorize and understand how Google actually processes voice queries before responding to them, and to determine when Google Home uses Answer Box and snippet results and when it doesn't.
The company used keyword analytics to compile a list of more than 600 search phrases in the United Kingdom that featured snippet answer boxes. It then sorted the top phrases by query volume across a variety of verticals, such as finance, travel, retail, and medical. The tests were all run in November and compared against both traditional search results and Google Home results.
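ROAST hasn't published its tooling, so the core comparison step can only be sketched. The following is a hypothetical illustration, not the agency's actual code: the data structure (query, snippet answer, spoken answer) and the loose substring match are my own assumptions.

```python
# Hypothetical sketch of a snippet-vs-voice comparison (not ROAST's code).
# Each test case pairs a query's Answer Box text with the spoken Google Home
# response; a loose substring match stands in for real answer matching.

def normalize(text):
    """Lowercase and collapse whitespace so superficial differences don't count."""
    return " ".join(text.lower().split())

def compare_results(cases):
    """Tally (query, snippet_answer, voice_answer) triples into three buckets:
    matched the snippet, answered differently, or not answered at all."""
    counts = {"match": 0, "different": 0, "no_answer": 0}
    for query, snippet_answer, voice_answer in cases:
        if voice_answer is None:
            counts["no_answer"] += 1
        elif (normalize(snippet_answer) in normalize(voice_answer)
              or normalize(voice_answer) in normalize(snippet_answer)):
            counts["match"] += 1
        else:
            counts["different"] += 1
    return counts

# Invented example data, purely for illustration.
cases = [
    ("what is an isa", "An ISA is a tax-free savings account.",
     "An ISA is a tax-free savings account."),
    ("flights to paris", "Flights to Paris from London airports.",
     "I found some flights on Google Flights."),
    ("obscure query", "Some snippet text.", None),
]
print(compare_results(cases))  # {'match': 1, 'different': 1, 'no_answer': 1}
```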
How many search phrases actually got answered on Google Home? Did the answers align with the Answer Box results? Which search phrases prompted Google to draw on something other than an answer box? Can visibility in voice search be compared with visibility in answer boxes, and is there a difference? In most instances, per the research, the Google Home result mirrored the Answer Box or snippet.
However, in some cases where a snippet existed, Google Home provided either a different answer or no answer at all.
I didn't do any testing of my own to confirm ROAST's findings, but in a pair of ad hoc cases where ROAST reported no answer, my own queries returned the same answer the snippet features. It's also true that rephrasing questions that initially drew the response "I can't help you with that yet" can yield results. The report also grouped Google Home answers into six distinct categories:
1) Standard answers, the most common kind of response, which reference sources or domains
2) Location results
3) Suggestions or action prompts
5) Flight searches
6) Similar queries, where Google Home wasn't sure of the answer but could respond to a related question
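As a rough illustration of how responses might be bucketed into categories like these, a rule-based tagger could key off characteristic phrases in the spoken reply. This is entirely my own sketch; the trigger phrases below are guesses, and ROAST's classification may well have been manual.

```python
# Hypothetical rule-based bucketing of Google Home responses into
# report-style categories; the trigger phrases are my own guesses.

CATEGORY_RULES = [
    ("no_answer", ["i can't help you with that yet"]),
    ("flight",    ["google flights", "departing"]),
    ("location",  ["i found a few places", "miles away"]),
    ("action",    ["would you like", "to do that, say"]),
    ("similar",   ["i'm not sure about that, but"]),
]

def categorize(response):
    """Return the first matching category, falling back to 'standard'."""
    text = response.lower()
    for category, triggers in CATEGORY_RULES:
        if any(trigger in text for trigger in triggers):
            return category
    return "standard"

print(categorize("I can't help you with that yet."))        # no_answer
print(categorize("According to Wikipedia, an ISA is a savings account."))  # standard
```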
Google Home answered more than 70 percent of the test queries. When Home gave an answer, roughly four out of five were identical to the Answer Box, per ROAST. The remaining 20 percent drew on other data sources, particularly for flight searches and local queries.
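To make the arithmetic concrete, the reported proportions can be applied to the roughly 600-query test set. The percentages here are the report's approximate figures; the resulting counts are back-of-the-envelope estimates, not numbers ROAST published.

```python
# Back-of-the-envelope arithmetic on ROAST's reported proportions.
# The ~600-query total and the 70%/80% figures come from the report;
# the derived counts are my own rough estimates.
total_queries = 600
answered = round(total_queries * 0.70)      # ~70% of queries got an answer
matched_box = round(answered * 0.80)        # ~4 of 5 answers mirrored the Answer Box
other_source = answered - matched_box       # remainder drew on other data sources

print(answered, matched_box, other_source)  # 420 336 84
```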
Here are a few of the report's conclusions:
Even if your featured snippet occupies the answer box, that doesn't mean you own the corresponding voice search result. At times the assistant provides no answer, or references a different domain entirely.
Be on the lookout for search phrases where Google is unlikely to use answer box information as the source of its results; locations, actions, and flights are common categories here.
Google My Business proves key to local searches. Start building a list of search phrases specifically for tracking voice search, as it will differ from your typical search phrase list. ROAST says it will produce further reports for other verticals over the 2018 calendar year. I'd also encourage others to run similar tests of their own and add their insights to the conversation.