Google I/O has been home to many of the company’s biggest announcements, including the Pixel 3a, the Nest Hub Max, the latest versions of Android, and Google Duplex. Due to the pandemic, Google canceled the event in 2020, but now Google I/O is back, and this time it’s free for everyone. While a trip to California to see everything in person might be fun, watching remotely online is the next best thing.
When is Google I/O 2021?
Google I/O runs from May 18 to 20 and is free for everyone who wants to attend. You can register now through the Google I/O website. Although it’s traditionally a developer conference, the event typically features big news that interests consumers, too.
How can you watch Google I/O 2021 live?
Google streamed the entire keynote on YouTube, and it is still available for viewing. Tune in to the official Google YouTube channel to stay up to date with the conference and catch further major announcements live.
What does Google I/O 2021 look like?
This year’s Google I/O is an entirely virtual conference.
The consumer and developer keynotes will focus on both company- and product-related news, and both can be watched and rewatched on demand, so don’t worry if you miss one live. Technical sessions will focus on product announcements, as well as how developers can adopt any new features. There is no set day for these; they’ll be spread throughout the entire I/O event.
There will also be workshops and Ask Me Anything sessions, all of which will be interactive. You must reserve a time slot to participate. The workshops will be led by a designated instructor, while the AMAs will offer chances to ask questions of experts on various Google products. If you’re interested in connecting with other attendees, Google will host Meetups, which Google describes as “casual, open, facilitated forums.” Again, you must be registered and make a reservation to attend these forums.
One major draw is the Interactive Sandbox, a part of I/O Adventure that allows developers to try Google’s latest products and features through a virtual hands-on experience. If you prefer a more solo experience, Codelabs and Learning Pathways are self-guided experiences that will help you adopt new Google technology.
Google just released the breakout session schedule to the public, and some interesting topics are on the docket for May 19. At 9:45 a.m. PT, the “What’s new in Google Assistant” session will cover changes and updates coming to Google Assistant, and may even include a few new feature announcements.
There are sessions throughout the day, but another of particular interest starts at 4:15 p.m. PT: “What’s new in smart home.” It’s unlikely that new Nest devices will be announced, but this session might touch on new software capabilities and better connectivity between different brands.
Another session worth considering is “Debugging the smart home” at 11:30 p.m. PT. That makes for a late night (or, at 2:30 a.m. ET, a very early wake-up for East Coast participants), but discussing the bugs that crop up in an expansive smart home is always enlightening.
Finally, for Android fans (as many Google users are), the “What’s new in Android” session will be held from 2:30 p.m. to 3 p.m. PT on day one of the conference, Tuesday, May 18.
Android 12 is the company’s most personal OS ever
Android 12 received a lot of love at Google I/O 2021. New color customization features allow users to combine a variety of colors and styles to create a unique look and feel for their mobile device. Minute details, including fonts, sizes, and even line widths, can be adjusted.
This concept, called Material You, is coming in the fall, and it applies not just to Android but to all of Google’s products. It doesn’t stop there, though: beyond the customization, Android 12 has been redesigned with a brand-new look.
The new UI has been overhauled from top to bottom. A feature called Color Extraction automatically builds a color palette from the photo you set as your wallpaper and applies it across the system, giving your phone a cohesive, personalized look.
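For developers curious what adopting this wallpaper-based theming looks like in code, here’s a minimal Kotlin sketch. It relies on the Material Components library’s DynamicColors helper, which arrived in a later Material Components release rather than being shown on stage, and the class name is hypothetical:

```kotlin
import android.app.Application
import com.google.android.material.color.DynamicColors

// Hypothetical Application subclass for illustration only.
class WallpaperThemedApp : Application() {
    override fun onCreate() {
        super.onCreate()
        // Recolors every activity with the palette Color Extraction derives
        // from the user's wallpaper; on pre-Android 12 devices this is a no-op.
        DynamicColors.applyToActivitiesIfAvailable(this)
    }
}
```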
Smaller details like dynamic lighting on the lock screen add to the experience, as do usability upgrades. Android 12 has been designed to be easier to use with controls accessible right at your fingertips. For example, you can trigger Google Assistant just by long-pressing the power button.
A new privacy dashboard shows a detailed timeline of which apps accessed sensitive permissions and when, and you can revoke an app’s permissions directly from the dashboard. New status bar indicators show when an app is using your camera or microphone, while added quick settings let you shut off camera and mic access with just a few taps.
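Those indicators tie into Android’s standard runtime-permission flow. As a rough Kotlin sketch (not anything Google showed at I/O, and with hypothetical activity and method names), an app requests microphone access only at the moment of use, which is also when Android 12’s status bar indicator appears:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class RecorderActivity : AppCompatActivity() {
    // Launcher for the standard runtime-permission dialog.
    private val micPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startRecording()
        }

    fun onRecordButtonTapped() {
        val hasMic = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED
        // Ask only when the user actually taps record; once recording starts,
        // Android 12 shows the microphone indicator in the status bar.
        if (hasMic) startRecording() else micPermission.launch(Manifest.permission.RECORD_AUDIO)
    }

    private fun startRecording() { /* hypothetical recording logic */ }
}
```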
An added benefit is that much of Android 12’s audio and language processing happens on-device, which keeps that data more private and secure than processing it in the cloud.
You can expect to see the complete release of Android 12 in September, as is traditional for Google. It’ll likely come alongside a new Pixel phone (or phones). But the good news is that the Android 12 beta is available today.
Google and Samsung join forces for Wear
The biggest update ever to Google’s smartwatch platform, Wear (formerly Wear OS), is on the way. Google has partnered with Samsung to create a unified platform between Wear OS and Tizen with better battery life and a more vibrant developer community.
New on-device navigation features make it easy to switch between apps for a more fluid experience. Users can double-tap to go back to the last-used app, as well as swipe left and right to navigate the app carousel.
Google is also bringing some of the most popular Fitbit features to Wear devices. Together, these updates could propel Google’s wearables to a more prominent position in the market, compared to the somewhat stagnant position the platform has held until now.
Google Maps updates with new data, more AR
Thanks to new Live View capabilities, Google Maps can now use AR in select cities to show a real-time overlay, helping you navigate areas that might otherwise be too complicated to understand in pure map form.
Google Maps has also added more granular details, such as the locations of sidewalks and crosswalks. Take Columbus Circle, one of Manhattan’s most complicated intersections: You can see exactly where to cross the streets and how to navigate the circle.
Area Busyness will show how busy a given neighborhood is at any time of day, an expansion of the existing feature that shows how busy a particular restaurant is. It will roll out in the coming months.
Workplace collaboration with Smart Canvas and Meet
Google has planned updates to Google Meet and Smart Canvas to better streamline remote work and collaboration between teams no matter where they are.
Google Meet will add noise cancellation, as well as intelligent camera zoom and lighting adjustments to ensure everyone can be seen and heard clearly. Companion Mode gives each person in a shared meeting room their own view of the call. The Smart Canvas and Google Meet updates will launch later this year.
Language and translation
Google’s machine learning-powered translation tools have been upgraded. Android users can now live caption any video running on their device. Google Lens can also be used to translate anything the camera is pointed at — even math problems.
Language interpretation has also been improved to provide more accurate results for searches, translations, and spoken queries. What does this mean for users? Google Assistant will be more accurate than before and better able to understand questions, including colloquial phrasing.
For example, if you say “I’m freezing,” Google will understand that you’re cold — not that you are literally freezing.
User privacy
Google announced new updates to user privacy built around three product guidelines: secure by default, private by design, and you’re in control. Auto-Delete is now the default for all users: After 18 months, your data is automatically removed from Google’s servers, unless you set it to delete sooner.
One example is Android’s Private Compute Core, a sandboxed part of the operating system that processes data for features like Live Caption on the device itself, keeping it walled off from apps, the network, and even Google.
Improved search
Google’s new MUM search capabilities, short for Multitask Unified Model, can provide more refined results for complex queries. Rather than typing a few keywords, users can ask multipart questions such as “I hiked Mt. Fuji, but now I want to hike Mt. Everest. What should I do to prepare?”
Google can provide results that answer all aspects of the question. MUM searches can also provide translated results from other languages; for instance, a lot of information about Mt. Fuji is in Japanese. Without translation, it would be out of reach for non-Japanese speakers.
Google has also implemented AR capabilities to demonstrate things through 3D models, like showing Simone Biles landing some of the world’s most impressive gymnastics techniques.
In addition, Google has introduced the “About this result” feature to help verify the credibility of search results in an effort to combat false information on the internet. The feature will be available later this month for all English results, and will come to other languages later.
Google Shopping updates
Google’s Shopping Graph is a real-time model of products, sellers, brands, reviews, and inventory that helps consumers find the items they’re looking for. It applies across all of Google’s platforms, including Lens, Images, and even YouTube. You can use Google Lens to take a picture of a chair you like and find that same chair somewhere on the web.
You can also search within your own photos in Google Photos, and even search specific parts of a photo, to find shopping results. Google Shopping will now alert you to the status of shopping carts at different retailers; if you forgot to check out, or have items sitting in several carts, Google will tell you.
Cinematic moments with Google Photos updates
Google Photos is receiving an update in the near future called “Little Patterns” that will identify less-than-obvious patterns in your photographs and surface them, grouped together by shape and color. For example, it can identify a backpack that appears across a series of photos, or a recurring shape.
If you take multiple photos to capture the perfect shot (and who doesn’t, honestly?), the new Cinematic moments feature will compare these near-identical photos and use neural networks to synthesize the frames between them, producing a single vivid, moving image.
New control features will allow users to remove certain photographs from a memory, or to block photos from being shown for a specific period of time.
Project Starline
Video conferencing has been a major part of daily life during the pandemic. Google has a new project, still early in development, called Project Starline that aims to make video conferencing feel more lifelike and immersive.
A combination of high-resolution cameras and custom-built depth sensors captures a person from multiple perspectives, then fuses those images together into an extraordinarily detailed, real-time 3D model. People who have tried Project Starline say it feels like being in the same room with the other person.
It’s only available in a few Google offices right now, but as development continues, it holds the potential to revolutionize the video conferencing experience.