Google wants Android O to make users of accessibility services more productive

The Android accessibility services team took the stage at Google’s I/O developer conference Wednesday to discuss a number of changes coming in Android O that aim to make the platform much more user-friendly for everyone.

Increasing productivity for accessibility services users was job one in preparing for Android O, according to Victor Tsaran, technical program manager on the Accessibility development team. To that end, the upcoming version of the mobile operating system delivers several critical improvements to TalkBack, an Android accessibility service that reads screen content to users who are visually impaired.

First, Android O introduces a separate volume stream for spoken feedback. In other words, media like music and YouTube videos no longer have to play at the same volume as TalkBack, so it's easier to distinguish between them.
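
For developers, that surfaces as a new audio stream alongside the familiar media and ring streams. Here's a minimal Kotlin sketch of adjusting it, assuming an Android O (API 26) device; the helper function name is hypothetical, but STREAM_ACCESSIBILITY is the constant Android O adds for this purpose.

```kotlin
import android.content.Context
import android.media.AudioManager
import android.os.Build

// Hypothetical helper: nudge the accessibility (TalkBack) volume up one step
// without touching the media stream. STREAM_ACCESSIBILITY only exists on
// Android O (API 26) and later.
fun raiseAccessibilityVolume(context: Context) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) return
    val audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    audioManager.adjustStreamVolume(
        AudioManager.STREAM_ACCESSIBILITY,   // separate from STREAM_MUSIC
        AudioManager.ADJUST_RAISE,           // one volume step up
        AudioManager.FLAG_SHOW_UI            // show the volume slider
    )
}
```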

An even bigger addition is support for multilingual text-to-speech. Tsaran demonstrated the feature by having the system read aloud an email containing phrases in several different languages. Android was intelligent enough to distinguish between them and switch languages on the fly.

Android O will also allow devices' fingerprint sensors to support basic gestures, so users can swipe to move between options. In tandem with TalkBack, this means a user who is unable to see the screen can swipe across the sensor to step through menu items, hearing each one read back in turn.
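
Under the hood, Android O exposes this to accessibility services through a fingerprint gesture controller. The following is a rough sketch of how a service might listen for those swipes, assuming the Android O APIs; the service class and the two navigation helpers are hypothetical placeholders.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.FingerprintGestureController
import android.os.Handler
import android.os.Looper
import android.view.accessibility.AccessibilityEvent

// Hypothetical accessibility service that maps fingerprint-sensor swipes to
// navigation. The service's accessibility-service XML must declare
// android:canRequestFingerprintGestures="true" and the app needs the
// USE_FINGERPRINT permission.
class SwipeNavigationService : AccessibilityService() {

    override fun onServiceConnected() {
        fingerprintGestureController.registerFingerprintGestureCallback(
            object : FingerprintGestureController.FingerprintGestureCallback() {
                override fun onGestureDetected(gesture: Int) {
                    when (gesture) {
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_RIGHT -> moveToNextItem()
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_LEFT -> moveToPreviousItem()
                    }
                }
            },
            Handler(Looper.getMainLooper())
        )
    }

    private fun moveToNextItem() { /* advance focus so TalkBack announces the next item */ }
    private fun moveToPreviousItem() { /* move focus back to the previous item */ }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {}
    override fun onInterrupt() {}
}
```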

Finding and triggering accessibility services was another major focus for Android O. The update brings a context-aware, dedicated accessibility button at the bottom-right of the navigation bar, which can trigger different actions depending on what's visible on screen and which services you have enabled.

For example, if you're browsing the home screen, pressing the button triggers magnification. If you're using text-to-speech, it brings up a remote control that lets you start and stop screen reading and set how fast the system reads to you.
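
Accessibility services opt into that button programmatically. Here's a minimal sketch, again assuming the Android O APIs, of a service requesting the button and reacting when it's tapped; the class name and the magnification action are hypothetical stand-ins for whatever a real service would do.

```kotlin
import android.accessibilityservice.AccessibilityButtonController
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.AccessibilityServiceInfo
import android.view.accessibility.AccessibilityEvent

// Hypothetical service that asks for the navigation-bar accessibility button
// and runs its own action whenever the user taps it.
class MagnifyOnTapService : AccessibilityService() {

    override fun onServiceConnected() {
        // Request the accessibility button in the navigation bar.
        serviceInfo = serviceInfo.apply {
            flags = flags or AccessibilityServiceInfo.FLAG_REQUEST_ACCESSIBILITY_BUTTON
        }

        accessibilityButtonController.registerAccessibilityButtonCallback(
            object : AccessibilityButtonController.AccessibilityButtonCallback() {
                override fun onClicked(controller: AccessibilityButtonController) {
                    toggleMagnification() // hypothetical: whatever this service does
                }

                override fun onAvailabilityChanged(
                    controller: AccessibilityButtonController,
                    available: Boolean
                ) {
                    // The button isn't guaranteed to be available on every device or screen.
                }
            }
        )
    }

    private fun toggleMagnification() { /* service-specific behavior */ }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {}
    override fun onInterrupt() {}
}
```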

The focus on making accessibility services easier to understand extends to the settings menu as well. Gone are vague category labels like “System” and “Services.” The menu now groups features by the actions they perform and includes a description of what each service does. What’s more, a new shortcut has been added to turn accessibility services on and off on the fly by pressing both volume buttons.

During the event, the development team stressed that Google arrived at many of these improvements by testing them, in iterative fashion, with real users. Likewise, the company is imploring third-party developers to perform their own accessibility research.

Last year, Google released an app called Accessibility Scanner that could examine developers’ apps and suggest changes to help enhance accessibility, like improving text contrast. Since that time, the company says developers have used the app to find over one million opportunities to improve their apps’ functionality for users with accessibility needs.

Adam Ismail
Former Digital Trends Contributor