Apple will pay up to $1M to anyone who hacks its AI cloud
By Monica J. White | Published October 25, 2024
Apple just made an announcement that shows it means business when it comes to keeping Apple Intelligence secure. The company is offering a massive bug bounty of up to $1 million to anyone who is able to hack its AI cloud, referred to as Private Cloud Compute (PCC). These servers will take over Apple Intelligence tasks when the on-device AI capabilities just aren’t good enough — but there are downsides, which is why Apple’s bug-squashing mission seems like a good idea.
According to a recent Apple Security blog post, Apple has created a virtual research environment and opened the doors to the public to let everyone inspect the code and judge its security. Access to PCC was initially limited to a select group of security researchers and auditors, but now anyone can take a shot at trying to hack Apple's AI cloud.
Many Apple Intelligence tasks are said to be handled on-device, but for more complex requests, the PCC steps in. Apple offers end-to-end encryption and makes the data available only to the user, to ensure that your private requests remain just that — private. Still, given the sensitive data AI might handle, whether on Macs or iPhones, users are right to be concerned about that data leaving their device and ending up in the wrong hands.
That's presumably part of why Apple is now extending this lucrative offer to anyone who's interested. The company provides access to the source code for some of the most important components of PCC, making it possible for researchers to dig for flaws.
The $1 million bounty is not universal. That top reward goes to the person or team who manages to run malicious code on the PCC servers. The next-highest bounty, $250,000, covers exploits that could allow attackers to extract user data from Apple's AI cloud. Smaller rewards, starting at $150,000, will be paid to anyone who accesses user data from a "privileged network position."
Apple’s bug bounty program has previously helped it spot exploits ahead of time while rewarding the researchers involved. A couple of years ago, Apple paid a student $100,000 for successfully hacking a Mac. Let’s hope that if there are any bugs to be found in Apple’s AI cloud, they’ll be spotted before Apple Intelligence becomes widely available.