Like any other busy parent, I’m always looking for ways to make daily life a little bit easier. And Google Assistant helps me do that — from giving me cooking instructions as I’m making dinner for my family to sharing how much traffic there is on the way to the office. Assistant allows me to get more done at home and on the go, so I can make time for what really matters.
Every month, over 700 million people around the world get everyday tasks done with their Assistant. Voice has become one of the main ways we communicate with our devices. But we know it can feel unnatural to say “Hey Google” or touch your device every time you want to ask for help. So today, we’re introducing new ways to interact with your Assistant more naturally — just as if you were talking to a friend.
Get the conversation going
Our first new feature, Look and Talk, is beginning to roll out today in the U.S. on Nest Hub Max. Once you opt in, you can simply look at the screen and ask for what you need. From the beginning, we’ve built Look and Talk with your privacy in mind. It’s designed to activate when you opt in and both Face Match and Voice Match recognize it’s you. And video from these interactions is processed entirely on-device, so it isn’t shared with Google or anyone else.
Let’s say I need to fix my leaky kitchen sink. As I walk into the room, I can just look at my Nest Hub Max and say “Show plumbers near me” — without having to say “Hey Google” first.
There’s a lot going on behind the scenes to recognize whether you’re actually making eye contact with your device rather than just giving it a passing glance. In fact, it takes six machine learning models to process more than 100 signals from both the camera and microphone — like proximity, head orientation, gaze direction, lip movement, context awareness and intent classification — all in real time.
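To give a rough sense of what fusing signals like these might look like, here’s a minimal sketch that blends per-frame engagement scores into a single activation decision. The signal names, weights and threshold are illustrative assumptions on my part, not the models or values Assistant actually uses.

```python
from dataclasses import dataclass

# Hypothetical per-frame signals; names and thresholds are illustrative
# assumptions, not Google's actual models or values.
@dataclass
class FrameSignals:
    proximity_score: float         # how close the user is (0..1)
    head_orientation_score: float  # head facing the device (0..1)
    gaze_score: float              # gaze directed at the screen (0..1)
    lip_movement_score: float      # lips moving while audio is present (0..1)
    voice_match: bool              # on-device Voice Match result
    face_match: bool               # on-device Face Match result

def should_activate(signals: FrameSignals, threshold: float = 0.8) -> bool:
    """Toy fusion rule: require identity matches, then a weighted blend
    of the visual engagement signals to cross a threshold."""
    if not (signals.voice_match and signals.face_match):
        return False  # only activates for the opted-in, recognized user
    engagement = (
        0.2 * signals.proximity_score
        + 0.2 * signals.head_orientation_score
        + 0.4 * signals.gaze_score
        + 0.2 * signals.lip_movement_score
    )
    return engagement >= threshold
```

In practice this decision runs continuously and on-device, which is why distinguishing true eye contact from a passing glance takes several models working together rather than a single rule like the one above.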
Last year, we announced Real Tone, an effort to improve Google’s camera and imagery products across skin tones. Continuing in that spirit, we’ve tested and refined Look and Talk to work across a range of skin tones so it works well for people with diverse backgrounds. We’ll continue to drive this work forward using the Monk Skin Tone Scale, released today.
We’re also expanding quick phrases to Nest Hub Max, which let you skip saying “Hey Google” for some of your most common daily tasks. So as soon as you walk through the door, you can just say “Turn on the hallway lights” or “Set a timer for 10 minutes.” Quick phrases are also designed with privacy in mind. If you opt in, you decide which phrases to enable, and they’ll work when Voice Match recognizes it’s you.
Looking ahead: more natural conversation
In everyday conversation, we all naturally say “um,” correct ourselves and pause occasionally to find the right words. But others can still understand us, because people are active listeners and can react to conversational cues in under 200 milliseconds. We believe your Google Assistant should be able to listen and understand you just as well.
To make this happen, we're building new, more powerful speech and language models that can understand the nuances of human speech — like when someone is pausing, but not finished speaking. And we’re getting closer to the fluidity of real-time conversation with the Tensor chip, which is custom-engineered to handle on-device machine learning tasks super fast. Looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.
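As a rough illustration of what “not getting tripped up” by disfluencies could mean, here’s a toy sketch that strips filler words and only treats silence as the end of an utterance once the request already sounds complete. The heuristics, function names and thresholds are my own simplifications, not how Assistant’s speech models actually work.

```python
import re

# Hypothetical filler-word list; a real system models disfluencies far more richly.
FILLERS = {"um", "umm", "uh", "er", "hmm"}

def clean_transcript(words):
    """Drop filler tokens so the intended request is what gets understood."""
    return [w for w in words if re.sub(r"\W", "", w.lower()) not in FILLERS]

def utterance_finished(ms_since_last_word: float, sentence_complete: bool,
                       pause_threshold_ms: float = 700.0) -> bool:
    """Toy end-of-utterance rule: treat silence as 'done' only when the pause
    is long AND the partial transcript already looks complete, so a thinking
    pause doesn't cut the speaker off mid-request."""
    return sentence_complete and ms_since_last_word >= pause_threshold_ms

# Example: a hesitant request still resolves to the intended command.
words = ["set", "a", "timer", "for", "umm,", "ten", "minutes"]
print(" ".join(clean_transcript(words)))  # -> "set a timer for ten minutes"
```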
We're working hard to make Google Assistant the easiest way to get everyday tasks done at home, in the car and on the go. And with these latest improvements, we’re getting closer to a world where you can spend less time thinking about technology — and more time staying present in the moment.