
Apple is stepping up its Photos game with AI in iOS 10 and macOS Sierra


Photos on Apple products are about to get a whole heck of a lot more interesting. At its annual Worldwide Developers Conference keynote on Monday, Apple showed off its next-generation mobile and desktop operating systems. As part of this update, Apple introduced a collection of new features and tools to help you better sort through your images and get the most out of the photos and videos you capture.



Starting with iOS 10, Apple announced new Places and Faces features, which use “Advanced Computer Vision” and deep learning to recognize the faces of people in photographs and to place photos on a map using their built-in GPS metadata.


Much like Google has done with Google Photos, Apple is also using this new AI to break down the objects and scenes in images. Now, you’ll be able to search for individual objects, locations, and people in images via “Intelligent Search.”

Also taking a page out of Google’s book is a new feature called Memories. Using the new Faces and Places recognition in conjunction with the location metadata from photographs and videos, Memories can instantly create clever, contextualized videos of life’s special moments and adventures.

One of the more interesting aspects is that you can customize your memories. By choosing among a collection of moods and lengths, you can tweak the clip to fit the style you’re looking for. Specifically, you can choose between short, medium, and long clips, as well as moods like “Epic, Uplifting, Club, Extreme, Happy,” and more.

While Apple mainly showed off these features on an iPhone, both iOS 10 and macOS Sierra will get all of the above.

What sets Apple’s features apart from Google’s is that all of this creation and image recognition is done locally, on your iOS device or macOS computer. Whereas Google relies on its cloud-based platform, Apple hopes to appeal to privacy-conscious users by ensuring no one sees your content besides you and those you specifically share it with.

Something Apple teased but didn’t elaborate on is RAW photo editing in iOS 10. It wasn’t specifically mentioned in the keynote, but on the iOS developer overview slide, Apple noted that RAW photo support is on its way to iOS devices. This includes the ability to capture RAW and JPEG photos simultaneously, as many DSLRs can, as well as the ability to edit RAW photos in third-party apps as developers add support.

Other features teased in the iOS overview list include Live Photos stabilization, live filters for Live Photos, brilliance adjustment slider in Photos, Live Photos editing, and faster camera launch.

It’ll be exciting to see these new features in action. Once we get our hands on iOS 10 and macOS Sierra, we’ll be sure to give you a hands-on rundown of the new features.

Gannon Burgett
Former Digital Trends Contributor