I saw Google’s futuristic Project Astra, and it was jaw-dropping

Google presenting Project Astra at Google I/O 2024. (Image: Google)

If there’s one thing to come out of Google I/O 2024 that really caught my eye, it’s Google’s Project Astra. In short, Astra is a new AI assistant with speech, vision, text, and memory capabilities. You can talk to it as if it were another person in the room, ask it to describe things it sees, and even ask it to remember information about those things.

During the I/O keynote announcing Astra, one of the most impressive moments came when a person running Astra on a phone asked it to describe things in a room. When the person asked Astra where their glasses were, it quickly pointed out where they were in the room, even though no one had mentioned the glasses earlier in the video.

But does Project Astra actually work like that in the real world? I got to see it in action during a quick 10-minute demo at I/O, and you know what? I’m pretty impressed.

‘That’s a good stick figure!’

Google walked us through a handful of Astra demos: Alliteration, Pictionary, Storytelling, and Free Form. They all did what you’d expect, and they were all equally impressive. For some context, the Astra demo Google showed during its I/O keynote had the AI running on a phone. In the demo I saw (which I wasn’t allowed to take photos or videos of), Astra was running on a laptop and connected to a camera plus a touchscreen display.

In the Alliteration demo, members from the Project Astra team had it “look” at random objects (with the camera pointed at a demo table). It accurately identified what it was looking at — a stuffed animal, a toy apple, and a toy hotdog — and talked in alliterations the whole time it was describing what it saw. It was all a bit goofy, but Astra knew everything it was looking at, and it did put a smile on my face.

Another fun moment happened during the Pictionary demo. Someone from the Astra team used the connected touchscreen to draw a stick figure. As she was explaining that she was drawing the stick figure first, Astra — unprompted — exclaimed, “That’s a good stick figure!” with much enthusiasm.

Google Astra on a phone. (Image: Google)

It was a subtle moment, but it really drove home just how different Astra is from, say, Google Assistant. No one needed to ask, “Hey Astra, what do you think about this stick figure?” It saw the stick figure, heard the Googler talk about it, and provided feedback all on its own. It was kind of jaw-dropping. From there, the Astra team member put a skull emoji on the stick figure’s outstretched hand. When asked what play the drawing was supposed to represent, Astra immediately guessed Hamlet.

Storytelling and Free Form had their moments, too. For the Storytelling demo, Astra was shown a toy crab and asked to tell a story about it. Astra began telling a detailed story about the crab walking along a beach. A fidget spinner was then placed on the table, and Astra was asked to incorporate it into the story. It did so without skipping a beat.

As the name suggests, the Free Form demo put Astra in a position to do whatever was asked of it. It was shown three stuffed animals and told their names. Someone then asked Astra to recall the names of the various animals, and it got two out of three correct. Just like you and me, Astra remembers things it sees and hears. Google is still figuring out how much Astra should remember and how long it should retain that information, and those are critical details to be ironed out. But the fact that this happens at all is nothing short of magical.

Hearing is believing

Project Astra demonstration on a phone. (Image: Google)

Perhaps what stuck out to me the most during my demo was just how natural Astra felt. The Astra team members never needed to say "Hey Astra" or "OK Astra" to get its attention for voice commands. Once Astra was up and running, it continuously listened for questions, commands, and comments, and responded to them as if it were another person in the room.

The quality of its responses was just as impressive. Listening to Astra, I never once felt like I was hearing a virtual assistant speak to me. The voice inflections and natural speaking pattern Astra delivered were really something. If I closed my eyes, I might have been able to trick myself into thinking I was listening to someone else in the room with me, not a computer.

If we’re ever going to get to a point where AI feels like a friendly, helpful, and personable assistant, it needs to feel like you’re talking to a friend. Astra feels like it’s really close to that, and that’s infinitely more exciting than Gems, tokens, or any of the other AI jargon Google spent two hours talking about during its keynote.

Is Astra really the AI of the future?

The Google I/O sign at Google I/O 2024. (Image: Joe Maring / Digital Trends)

As the name “Project Astra” suggests, Astra is very much still a work in progress and not something Google is ready to ship anytime soon. Will Astra eventually replace the Google Assistant on my Android phone? Will I even need a phone if I can just have a pair of smart glasses with Astra integrated into them? Perhaps more importantly, are we anywhere close to Astra being ready for normal, everyday use?

Those are all very big questions Google still needs to address, and I imagine it will be a while before we have answers to any of them. But after experiencing Astra for myself and reflecting on my time with it, I can’t help but feel excited about its potential.

It’s very easy to feel bad about AI, and rightfully so. When Google spent parts of the I/O keynote bragging about AI image generation, using AI to create movies, or having AI summarize Google Search results — which could very well kill the modern internet as we know it — I couldn’t help but dread the AI-riddled future we’re rapidly barreling toward. But a smart, friendly, memorable, and easy-to-talk-to AI assistant that actually feels like something out of a sci-fi movie? That’s something to talk about.

I don’t know if Astra will ever be as cool or encompassing as I’m dreaming it up to be. But it really feels like there could be a future where that happens, and I hope that’s the AI future Google puts its efforts toward.
