Google today announced three new features for its voice-activated Assistant that aim to make interacting with it easier and more natural.
The first makes it easier to initiate a conversation with the Assistant: you simply look at a device with a built-in camera, like the Nest Hub Max, and start talking, no “Hey Google” wake word required. This will roll out later this week to users who pair their Nest Hub Max with an Android device, while iOS users will have to wait a few more weeks.

The second new feature is extended support for quick phrases — the ability to answer a phone call, turn off the lights or ask about the weather without using a wake word. That means going forward, you’ll be able to simply set a timer without saying “Hey Google.” Google notes that this is an opt-in feature and that it will use the company’s voice match feature that’s already available on the Nest Hub today.

Finally, Google is also making some changes to how the Assistant processes your requests so that it can better understand your intent, even if you have to correct yourself or pause briefly while you think about how to phrase your question.
“We realize when evaluating real conversations, they’re full of nuances,” said Nino Tosca, the director of product management for the Google Speech team and the Google Assistant. “People say ‘uhm,’ interruptions when two people are speaking back and forth, pauses, self-corrections — but we realized that with two humans communicating, these things are natural. They don’t really get in the way with people understanding each other. [ … ] We’re trying to bring these natural behaviors to the Google Assistant so that a user doesn’t have to think before they say a command — or actually process the command in their head, make sure they have every word right and then try to get it out perfectly. We want you to be able to just talk to the Google Assistant like you would with another human and we’ll understand the meaning and be able to fulfill your intent.”

Sadly, this feature is still in development but should roll out sometime in early 2023. Google has always used I/O to showcase upcoming features, even though some of them never launch, so we’ll just have to wait and see where this one goes.
Overall, though, these seem like worthwhile additions to the Google Assistant feature set. Saying “Hey Google” quickly gets old, after all, and continues to feel a bit weird. Indeed, I can’t help but think that the shine has worn off a bit from the Assistant (and its competitors). Personally, despite having a bunch of Nest Hubs and Google Homes at home, I don’t think I’ve used them for anything but turning on the lights using their touchscreen and setting the occasional cooking timer in recent months. Google has major ambitions around “ambient computing,” but when the Assistant doesn’t understand you and then randomly starts playing a Justin Bieber video on your TV, it feels like that future still needs some tuning. Anything to remove those barriers is welcome.

