Google teases smart glasses with amazing Project Astra update

Google has been hard at work improving Project Astra since it was first shown during Google I/O this year. The AI bot that understands the world around you is slated to be one of the major updates to arrive with Gemini 2.0. Even more excitingly, Google says it’s “working to bring the capabilities to Google products like the Gemini app, our AI assistant, and to other form factors like glasses.”

What’s new in Project Astra? Google says language understanding has been given a big performance bump: Astra can now better understand accents and less commonly used words, and it can speak multiple languages, even mixing them within a single conversation. The result is a more conversational Astra that speaks more like we do every day. Astra “sees” the world around it and now draws on Google Lens, Google Maps, and Google Search to inform its answers.

A big part of Project Astra’s appeal is its ability to remember things, and Google has given it a 10-minute in-conversation memory, ensuring conversation flows and questions don’t need to be phrased in a certain way to be understood. Its long-term memory has also been improved.

The final update is around latency, which Google says now matches the flow of human conversation. Take a look at the video below to see how Project Astra is progressing. It’s looking very exciting indeed.

We were impressed with Project Astra when we saw it demonstrated at Google I/O, but frustrated that Google didn’t even tease, let alone announce, a pair of smart glasses that would work with it in the future. With the announcement of Gemini 2.0 and the Project Astra updates, it has now teased such a product.

Project Astra | Exploring the future capabilities of a universal AI assistant

The wearable is being tested in the real world, as you can see in the Project Astra video, where it’s obvious how well Astra suits hands-free use. Such a product can’t come soon enough as brands like Meta, with the help of Ray-Ban, and newcomers like Solos are already ahead of the game. Even Samsung is expected to launch some kind of smart eyewear in 2025.

What about Gemini 2.0? Google’s CEO Sundar Pichai calls it the company’s most capable model yet, adding: “If Gemini 1.0 was about organizing and understanding information, Gemini 2.0 is about making it much more useful.” While the updates and new features are mostly of interest to developers at the moment, we will see Gemini 2.0 in action on our phones and in our searches.

A chat-optimized, experimental version of Gemini 2.0 Flash — the first model in the Gemini 2.0 family — will be available as an option in the mobile and desktop Gemini app for you to try, while Gemini 2.0 will enable AI Overviews in your Google searches to answer more complex questions, including advanced math equations and coding queries.

This feature begins testing this week and will be more widely available in early 2025. There’s no information on when Project Astra or the prototype smart glasses will be more widely available.

Andy Boxall
Andy is a Senior Writer at Digital Trends, where he concentrates on mobile technology, a subject he has written about for…