Google is rolling out a wave of upgrades to Gemini Live, its real-time conversational AI assistant, making interactions more visual, useful, and natural.
See What Gemini Is Talking About
Starting next week, Gemini Live will be able to highlight objects directly on your screen when you share your camera. For instance, if you’re unsure which tool to grab for a project, you can point your phone’s camera at the options, and Gemini will visually mark the correct one.
This feature debuts on the newly announced Pixel 10, launching August 28th, and will then expand to other Android devices, followed by iOS “in the coming weeks.”
Deeper App Integrations
Google is also weaving Gemini Live into more native apps like Messages, Phone, and Clock. Imagine asking Gemini for directions, then quickly pivoting:
“This route looks good. Now, send Alex a message that I’ll be about 10 minutes late.”
Gemini will handle the switch seamlessly, drafting and sending the message without breaking the flow of your conversation.
More Natural Voice and Storytelling
To make interactions feel more human, Google is introducing a new audio model that improves intonation, rhythm, and pitch. Gemini will soon adjust its tone depending on context—for example, using a calmer delivery when discussing something stressful.
You’ll also gain control over Gemini’s speaking speed. And if you ask for a story told dramatically—from the perspective of a character or historical figure—Gemini may even adopt fitting accents for a more engaging narrative.