Apple will pay up to $1M to anyone who hacks its AI cloud

Apple's Craig Federighi speaking about macOS security at WWDC 2022.

Apple just made an announcement that shows it means business when it comes to keeping Apple Intelligence secure. The company is offering a massive bug bounty of up to $1 million to anyone who manages to hack its AI cloud, known as Private Cloud Compute (PCC). These servers take over Apple Intelligence tasks when on-device AI capabilities aren't enough, but offloading requests to the cloud carries risks, which is why Apple's bug-squashing mission seems like a good idea.

According to a recent Apple Security blog post, Apple has created a virtual research environment and opened it to the public, letting anyone inspect the code and judge its security. PCC was initially available only to a select group of security researchers and auditors, but now anyone can take a shot at hacking Apple's AI cloud.

Many Apple Intelligence tasks are handled on-device, but for more complex requests, PCC steps in. Apple uses end-to-end encryption and makes the data available only to the user, ensuring that your private requests remain just that: private. Still, given the sensitive data AI might handle, whether on Macs or iPhones, users are right to be wary of that data leaving their devices and ending up in the wrong hands.

Apple's Craig Federighi discussing Apple Intelligence at the Worldwide Developers Conference (WWDC) 2024.

That's presumably part of why Apple is now extending this lucrative offer to anyone who's interested. The company provides access to the source code for some of the most important parts of PCC, making it possible for researchers to dig for flaws.

The $1 million bounty is not universal: it's the top reward, reserved for the person or team that manages to run malicious code on the PCC servers. The next-highest tier, $250,000, covers exploits that could let attackers extract user data from Apple's AI cloud. Smaller rewards, starting at $150,000, will be paid out for accessing user data from a "privileged network position."

Apple’s bug bounty program has previously helped it spot exploits ahead of time while rewarding the researchers involved. A couple of years ago, Apple paid a student $100,000 for successfully hacking a Mac. Let’s hope that if there are any bugs to be found in Apple’s AI cloud, they’ll be spotted before Apple Intelligence becomes widely available.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…