Google has started rolling out real-time AI video features for Gemini Live. These updates let the assistant “see” your screen and camera feed in real time. Some Google One AI Premium subscribers already have access.
What’s New in Gemini Live?
Google’s latest update adds two powerful features:
- Screen Reading: Gemini can now analyze your phone screen and provide instant insights.
- Live Video Analysis: Using your smartphone camera, Gemini can interpret live footage and respond in real time.
Both features come from Google’s “Project Astra,” which the company first demonstrated a year ago. Now the technology is finally reaching users.
How Does Gemini’s Screen Reading Work?
Imagine you’re browsing a website or reading an article. Instead of switching apps, you can ask Gemini to analyze the content. Need a quick summary? A definition? A suggestion? Gemini delivers answers without making you copy and paste text.
A Reddit user recently shared a video of this feature working on a Xiaomi phone. This confirms that some users already have early access.
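The consumer feature needs no code, but for readers curious what “analyzing a screen” roughly involves, here is a minimal, purely illustrative sketch that sends a saved screenshot to Google’s public Gemini API and asks for a summary. It assumes the google-generativeai Python package, an API key in a GOOGLE_API_KEY environment variable, the gemini-1.5-flash model, and a hypothetical screenshot.png file; it is not how the Gemini Live feature itself is implemented.

```python
# Illustrative sketch only: approximates "screen reading" by sending a saved
# screenshot to the Gemini API and asking for a summary. This is NOT the
# implementation of the Gemini Live feature described above.
import os

import google.generativeai as genai  # pip install google-generativeai
from PIL import Image                # pip install pillow

# Assumes an API key is available in the environment.
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Any multimodal Gemini model works; gemini-1.5-flash is an assumption here.
model = genai.GenerativeModel("gemini-1.5-flash")

# Hypothetical screenshot of the page you are reading.
screenshot = Image.open("screenshot.png")

response = model.generate_content(
    [screenshot, "Summarize this page in three bullet points and define any jargon."]
)
print(response.text)
```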
Live Video Assistance: What Can It Do?
The live video feature takes things even further. Instead of typing questions, you can show Gemini what you need help with.
Here are some examples (a rough code sketch of the underlying idea follows the list):
- Home Improvement: Need help picking a color? Show Gemini your freshly glazed pottery and get recommendations.
- Tech Troubleshooting: Struggling with a broken device? Point your camera at it and ask for possible fixes.
- Cooking Help: Show Gemini your ingredients, and it will suggest recipe ideas.
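As a similarly hedged illustration of the camera scenario, the sketch below sends a single photo to the same public Gemini API and keeps a chat session open so follow-up questions stay in context. The real feature streams live video rather than a still frame; the file name pottery.jpg, the model name, and the prompts are all hypothetical.

```python
# Illustrative sketch only: a camera photo plus follow-up questions, mimicking
# the back-and-forth of live video assistance (the real feature streams video;
# this sketch sends a single still frame).
import os

import google.generativeai as genai
from PIL import Image

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

model = genai.GenerativeModel("gemini-1.5-flash")
chat = model.start_chat()  # keeps the image in context for follow-up questions

# Hypothetical photo taken with the phone camera.
photo = Image.open("pottery.jpg")

first = chat.send_message([photo, "Which glaze colors would suit this pottery?"])
print(first.text)

follow_up = chat.send_message("Which of those options would hide fingerprints best?")
print(follow_up.text)
```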
How Does Gemini Compare to Alexa and Siri?
Amazon’s Alexa+ is preparing for early access, and Apple has delayed its upgraded Siri; both will likely include similar AI features. Samsung, meanwhile, still supports Bixby, but Gemini is now the default assistant on Samsung phones.
| Feature | Google Gemini | Alexa+ (Upcoming) | Siri (Upcoming Update) |
| --- | --- | --- | --- |
| Screen Reading | Available | Not Announced | Not Announced |
| Live Video Analysis | Available | Limited Features | Delayed Launch |
| Real-Time Responses | Yes | Yes (Limited) | Yes (Limited) |
What This Means for AI-Powered Devices
Google’s AI update sets a new standard. AI can now “see” and interpret the world in real time, making digital interactions more natural and intuitive: instead of describing a problem in text, you simply show it to Gemini.
This could change:
- Education: Students can point their cameras at math problems or history texts and get instant explanations.
- Shopping: Not sure if a product is right for you? Gemini can analyze and compare options.
- Accessibility: People with vision impairments could use it for real-time descriptions.