
With Google Duplex and more, the Pixel 3’s A.I. brains are unrivaled


After an unprecedented level of leaks, Google’s New York event was light on surprises. The new Pixel 3 and Pixel 3 XL are exactly what we expected to see. Few phones have suffered so much criticism in the build-up to an unveiling. The deep notch gouged out of the top of the Pixel 3 XL’s OLED screen seems universally reviled, and our Pixel 3 XL hands-on did not change our mind: it’s ugly.

We’re glad the Pixel 3 doesn’t have a matching notch, though its big bezels definitely make it look dated. But there’s more to a phone than its looks. We’re often told that it’s what’s on the inside that counts, and Google’s groundbreaking artificial intelligence (A.I.) advancements remind us of that. The dull design definitely disappoints, but we’re genuinely excited about what the A.I. smarts inside can do for us, including Google Duplex, which is rolling out to some Pixel phones in Atlanta, New York City, Phoenix, and the San Francisco Bay Area.


“The big breakthroughs you’re going to see are not in hardware alone, they come at the intersection of AI, software, and hardware,” Rick Osterloh, senior vice president of Hardware at Google, said on stage at the event, in front of a huge screen bearing the words “AI + Software + Hardware.”

We can’t help feeling that the order is no accident. Google is all-in on A.I., and it has always prized the software experience above its hardware. This is what sets Google apart from competitors like Apple and Samsung. If you want a beautifully crafted smartphone, there’s a galaxy out there to choose from, but if you want a phone that can do things no other phone can, the Pixel 3 stands out.

The A.I. hype train

Everyone has been hyping artificial intelligence for so long now that it’s easy to get weary of the hyperbole. The sad truth is that the experience of using A.I. on most phones doesn’t come close to matching the promise.


Without naming names, we’ve seen A.I. in cameras consistently misidentify subjects and scenes, producing shots that are clearly worse than the normal auto mode. We’ve had countless suggestions for news stories or places to visit that the most cursory understanding of our tastes or location would reveal as erroneous. We’ve been repeatedly misunderstood when attempting to issue simple voice commands.

It feels as though, as soon as Google started to see some success from its investment in A.I., everyone else wanted to jump on the bandwagon. But they’re years behind, and there’s no shortcut to catching up.

Pre-emptive help

We’re used to things like predictive text, but the first time we can remember our phone offering helpful information without being asked was in 2012, when Google Now rolled out. Looking at your phone first thing in the morning, you’d see a card displaying your optimum commute. Pulling your phone from your pocket at a bus stop or train station would bring up a timetable.


It didn’t do a great deal else, beyond alerting you to the latest sports scores for the teams you supported, but it was an exciting first step. Being able to see at a glance if there was a traffic delay, or knowing exactly when the next bus would arrive, made life a little easier.

Google Now has become Google Assistant, and it’s easily our favorite digital butler. While Siri can set a reminder and Alexa will play the music you want, Google Assistant goes a bit deeper. It’s the only service that tries to anticipate your needs, and it does it well. Anecdotally, it’s also far better at understanding Scottish accents than the competition.

Google Assistant handling calls

When we saw the Google Duplex demo earlier this year, we were blown away. This is A.I. conducting a natural-sounding conversation to book a restaurant reservation or schedule a haircut appointment. It works within the parameters you set: you can stipulate that you want a reservation for 8 p.m. but that anything between 8 p.m. and 9 p.m. is fine, and it will go ahead and book the table for you, automatically adding the reservation to your Google Calendar once it’s confirmed.

If it didn’t tell you it wasn’t human, you wouldn’t guess. We don’t doubt it could pass the Turing test, provided the topic didn’t stray too far from booking your appointment.

This functionality begins rolling out to Google’s Pixel phones next month on a city-by-city basis, starting with New York City. Though it’s fairly limited in scope right now, we can see it growing into something we use often in our daily lives.
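To make “working within the parameters you set” a little more concrete, here’s a minimal, purely illustrative Python sketch of how a booking request with a preferred time and an acceptable window might be represented and resolved. Google hasn’t published a Duplex API, so every name below is an assumption made up for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class ReservationRequest:
    venue: str                 # hypothetical fields, not a real Duplex schema
    party_size: int
    preferred: datetime        # the time you would ideally like
    window: timedelta          # how far past the preferred time is still acceptable

def pick_slot(request: ReservationRequest, available: List[datetime]) -> Optional[datetime]:
    """Return the open slot closest to the preferred time that still falls
    inside the acceptable window, or None if nothing fits."""
    candidates = [s for s in available
                  if request.preferred <= s <= request.preferred + request.window]
    return min(candidates, key=lambda s: s - request.preferred) if candidates else None

# Ask for 8 p.m., but accept anything up to 9 p.m.
request = ReservationRequest("Example Bistro", 2,
                             preferred=datetime(2018, 11, 2, 20, 0),
                             window=timedelta(hours=1))
slot = pick_slot(request, [datetime(2018, 11, 2, 20, 30), datetime(2018, 11, 2, 21, 30)])
print(slot)  # 2018-11-02 20:30:00 -- a real agent would then confirm the booking
             # and add it to the user's calendar
```

The scheduling logic is trivial, of course; the hard part Duplex demonstrates is holding the phone conversation that produces those available slots in the first place.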

Another exciting exclusive for Pixel phones is the Call Screen feature. If you get an incoming call you can’t or don’t want to take, you can tap Screen Call and the caller will hear this:


“Hi, the person you’re calling is using a screening service from Google, and will get a copy of this conversation. Go ahead and say your name, and why you’re calling.”

As the caller explains why they’re calling, the transcribed text pops up on your screen in real time, and you can choose to pick up, send a quick reply, or mark the call as spam. If you do mark it as spam, it will automatically say:

“Please remove the number from your mailing and contact list. Thanks, and goodbye.”
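Strictly as a mental model (this is not Google’s implementation, and every function name below is hypothetical), the screening flow described above boils down to something like this:

```python
# Hypothetical sketch of the Call Screen flow described above. play_prompt(),
# transcribe_stream(), show_live_transcript(), and ask_user() stand in for the
# phone's speech synthesis, speech recognition, UI, and telephony layers.

GREETING = ("Hi, the person you're calling is using a screening service from Google, "
            "and will get a copy of this conversation. Go ahead and say your name, "
            "and why you're calling.")
SPAM_REPLY = "Please remove the number from your mailing and contact list. Thanks, and goodbye."

def screen_call(call, play_prompt, transcribe_stream, show_live_transcript, ask_user):
    play_prompt(call, GREETING)
    for partial_text in transcribe_stream(call):   # transcription arrives incrementally,
        show_live_transcript(partial_text)         # so the text appears in real time
    choice = ask_user(["pick_up", "quick_reply", "mark_spam"])
    if choice == "pick_up":
        call.answer()
    elif choice == "quick_reply":
        play_prompt(call, "Sorry, I can't talk right now. I'll get back to you.")
        call.hang_up()
    else:  # mark_spam
        play_prompt(call, SPAM_REPLY)
        call.hang_up()
        call.report_spam()
```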

We think the immediacy and convenience of this beats visual voicemail, which, it’s worth remembering, is carrier-specific and not available everywhere right now.

Amazing camera performance

One of the biggest arms races for smartphones in the last couple of years, and easily the biggest area of improvement, has been the camera. We’ve seen more and more dual-lens cameras and even triple-lens cameras as manufacturers struggle to outdo each other.

If you’re seeking proof of Google prioritizing A.I. over hardware, look no further than the Pixel camera. Google has stuck with a single-lens main camera, even reducing the megapixel count from the original Pixel to the Pixel 2, and yet it continues to outperform most of the competition.


The Pixel 2 is our reigning camera phone champion, because it most often takes the photos we want to keep or share.

“That’s not a fluke,” said Osterloh at the Google event. “We spent years researching computer vision technologies, analyzing hundreds of millions of photos.”


Google is getting better, more consistent results by employing artificial intelligence and computational photography than its competitors are getting by packing in more lenses. The Pixel 2 turns out awesome portrait shots with that coveted bokeh background blur. With HDR+, multiple images are merged to produce the best possible image every time.

The Pixel 3 brings more A.I. smarts with Super Res Zoom, which stitches together multiple shots to boost the resolution when you zoom in. Machine learning in Night Sight will re-color photos shot in low-light environments to brighten them without the need for a flash. Top Shot takes a burst of photos and then selects the group selfie where everyone is looking at the camera and smiling. Portrait shots in Photobooth mode can be triggered by a funny face or a smile, so you won’t miss out on the perfect photo.
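Google hasn’t detailed the Pixel’s processing pipeline here, but the multi-frame idea behind features like HDR+ and Super Res Zoom can be sketched in a few lines: capture a burst, align the frames, and merge them so noise averages out while detail survives. The toy Python snippet below illustrates only that principle, with simple averaging of a pre-aligned burst; it is not Google’s algorithm.

```python
import numpy as np

def merge_burst(frames):
    """Toy multi-frame merge: average an already-aligned burst of noisy frames.

    Real pipelines such as HDR+ align frames to sub-pixel accuracy, reject
    motion, and merge per tile; plain averaging only shows why capturing
    several frames beats capturing one.
    """
    return np.mean(np.stack([f.astype(np.float32) for f in frames]), axis=0)

# Simulate a burst of 8 noisy captures of the same static scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(64, 64))
burst = [scene + rng.normal(0, 25, size=scene.shape) for _ in range(8)]

merged = merge_burst(burst)
print("mean error, single frame:", np.abs(burst[0] - scene).mean())
print("mean error, merged burst:", np.abs(merged - scene).mean())  # roughly 1/sqrt(8) of the above
```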

All these A.I. features make the camera easier to use and help you get better photos. You may be able to achieve something technically better with the latest triple-lens camera from a competitor, but it will often require a little planning or tweaking. Google’s Pixel cameras are designed to be quick and easy, so you can just point and shoot, which is how most people really use their phone cameras.

Really useful A.I.

The A.I. innovation in the new Pixel continues with Smart Compose in Gmail, which offers to finish your sentences with contextual phrases, cutting down on repetitive typing such as entering addresses. It’s like a supercharged version of predictive text that could genuinely save you a lot of time.
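Smart Compose itself is powered by a neural language model trained on vast amounts of email, but the user-facing behavior is easy to picture with a toy prefix matcher: as you type, the system proposes a likely continuation you can accept with a single keystroke. The sketch below is only an illustration of that interaction, not how Gmail actually generates or ranks suggestions.

```python
from typing import Optional

# Hypothetical stand-in for a learned model: a handful of phrases this user types often.
COMMON_PHRASES = [
    "Thanks for getting back to me.",
    "Let me know if that works for you.",
    "My address is 123 Example Street, Springfield.",
    "Looking forward to seeing you.",
]

def suggest_completion(typed: str) -> Optional[str]:
    """Return the rest of the best-matching phrase, or None if nothing matches."""
    typed = typed.strip()
    if not typed:
        return None
    for phrase in COMMON_PHRASES:
        if phrase.lower().startswith(typed.lower()):
            return phrase[len(typed):]   # only the part the user hasn't typed yet
    return None

print(suggest_completion("My address is"))   # " 123 Example Street, Springfield."
print(suggest_completion("Let me know"))     # " if that works for you."
```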

We know that a lot of people find this stuff creepy or have legitimate concerns about privacy, but for us, the utility eclipses our disquiet.

From warnings about train delays and reminders of where you parked to capturing the best possible photo, Google is doing things with A.I. that no one else can right now. Google Assistant is far closer than any of its competitors to being the real-life personal assistant most of us can’t afford. If you want to explore the full potential of that, you’re going to need a Pixel phone. It might not be the best-looking choice, but it’s surely the smartest.

Simon Hill