Google Assistant No Longer Needs To Be Called First Thanks To The Look And Talk Feature
Google Assistant was also enhanced at Google I/O with the ability to understand conversations more naturally. When a person gives an instruction, the Assistant can now understand it even if the user pauses, sneezes, or misspeaks mid-sentence. This makes interacting with the Assistant feel more like a conversation between two humans.



For Nest Hub Max users, the Assistant will recognize the user's face and can understand voice commands even without the usual "Hey Google" wake phrase. The user simply looks at the device and gives a command such as "find a Thai restaurant nearby", and the Assistant will carry out the search.



This feature, called Look and Talk, begins rolling out to users in the United States first, starting today. Users need to activate Face Match and Voice Match to enjoy it.


The Quick Phrases feature has also been expanded to the Nest Hub Max. With Quick Phrases, commonly used instructions can likewise be given without calling on the Assistant first. For example, the user only needs to say the command to set an alarm or turn on a smart light, and the Assistant will understand it.
