Google appears to be working on a new way to communicate with Assistant on your Pixel Tablet, code-named "Look and Sign," according to strings discovered in the latest Google app update. It would apparently replace the previously rumored "Look and Talk" feature.
Here's the twist: the teardown also hints at "Look and Sign" privacy handling that stays behind the scenes. That matters, because a feature that relies on the camera watching for a face would inevitably capture whoever is looking at the tablet, in theory even the face of a thief who picked it up. The bigger question, though, is what "sign" actually means here: will you write your signature, or sign to an AI assistant?
One possibility is simple hand gestures. Imagine using the gestures you already make around a phone or tablet, such as pointing at the screen or giving a thumbs up, to activate Google Assistant instead of saying the words "Hey Google."
The more intriguing reading goes deeper. Look and Sign could mean not just broader language support but real accessibility: people with communication difficulties, whether deaf or non-speaking, could interact with an assistant using sign language. That would be a powerful capability, but it would require highly advanced machine learning to accurately detect and interpret a wide range of gestures, movements, and phrases.
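To make the technical challenge concrete, here is a minimal sketch of hand-gesture recognition using Google's open-source MediaPipe library. This is purely illustrative, not Google's actual implementation: the model file, the input image, the confidence threshold, and the idea of treating a thumbs-up as a wake gesture are all assumptions for the example.

```python
# A minimal sketch of hand-gesture recognition, the kind of building
# block a feature like Look and Sign might rely on. Uses MediaPipe's
# off-the-shelf GestureRecognizer; this is NOT Google's Assistant code.
import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

# Load the pretrained gesture model bundle (assumed to be downloaded
# locally as "gesture_recognizer.task"). It recognizes canned gestures
# such as "Thumb_Up", "Open_Palm", "Pointing_Up", and "Victory".
base_options = mp_python.BaseOptions(model_asset_path="gesture_recognizer.task")
options = vision.GestureRecognizerOptions(base_options=base_options, num_hands=1)
recognizer = vision.GestureRecognizer.create_from_options(options)

# Run recognition on a single still frame ("hand.jpg" is a placeholder).
image = mp.Image.create_from_file("hand.jpg")
result = recognizer.recognize(image)

# Hypothetically treat a confident thumbs-up as an Assistant wake gesture.
if result.gestures:
    top = result.gestures[0][0]  # best gesture for the first detected hand
    if top.category_name == "Thumb_Up" and top.score > 0.6:
        print("Wake gesture detected - Assistant would activate here")
```

Recognizing a small fixed set of gestures like this is the comparatively easy part; continuous sign language, with its own grammar and motion between signs, would demand far more sophisticated sequence models.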
A feature like this would also raise the stakes in the AI assistant race, where Google is pushing hard with Gemini. ARP (Android Runtime for Phones) will reportedly also run Assistant on Pixel tablets when docked.
Look and Sign may arrive alongside the Pixel Tablet 2, which many expect to be introduced next year. Google has not announced anything officially, but it will likely continue developing the feature, and refinements like these could make the docked tablet an even more capable smart home hub.