Apple releases unprecedented amount of information on new AI security

Company will pay up to $1 million to people who find issues with AI security

Andrew Griffin
Thursday 24 October 2024 13:55 EDT

Apple has released an unprecedented amount of information on the security of its new AI systems, along with a commitment to pay up to $1 million to anyone who finds a problem with them.

The new information, tools and rewards are part of Apple’s plan to ensure that its new Apple Intelligence systems are private and secure.

Apple said the publication is an invitation to security and privacy researchers as well as “anyone with interest and a technical curiosity” to dig deeper into the security technology and make sure it is safe.

When Apple introduced its new AI tools earlier this year, it said that they would rely on a mix of on-device processing and more powerful cloud computers to respond to particularly intensive requests. It also said that it had built an entirely new cloud computing system to ensure that those requests were handled with the same privacy and security that they would be if they were made on the device.

That system is called Private Cloud Compute and it means that personal data is not accessible to anyone other than the user, including Apple. The company has said that it “meant a whole bunch of technical invention” including building kinds of AI cloud processing systems that had not been made before.

Apple said that, in order to ensure people trusted the system, it would allow researchers to inspect and check the security and privacy promises of Private Cloud Compute. Now it has released more information on those promises.
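Apple has described Private Cloud Compute as being designed so that a user's device will only hand data to servers running software that has been published for public inspection. Purely as a rough illustration of that general idea, the Swift sketch below checks a server's software "measurement" against a published allow-list before sending anything; the type names, structure and logic here are invented for illustration and do not reflect Apple's actual code or APIs.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only: a client refuses to send data unless the
// hash ("measurement") of the server's software appears in a published
// allow-list. None of these types exist in Apple's frameworks.

struct Attestation {
    let softwareImage: Data   // bytes identifying the server's software build
}

struct PublishedLog {
    // Hex-encoded SHA-256 digests of software images made public for review.
    let measurements: Set<String>

    func contains(_ attestation: Attestation) -> Bool {
        let digest = SHA256.hash(data: attestation.softwareImage)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return measurements.contains(hex)
    }
}

func sendIfTrusted(_ request: Data, attestation: Attestation, log: PublishedLog) -> Bool {
    guard log.contains(attestation) else {
        print("Server software is not in the published log; refusing to send data.")
        return false
    }
    print("Measurement verified; sending \(request.count) bytes.")
    return true
}

// Example usage with made-up data.
let image = Data("example-server-build-1.0".utf8)
let knownDigest = SHA256.hash(data: image).map { String(format: "%02x", $0) }.joined()
let log = PublishedLog(measurements: [knownDigest])
_ = sendIfTrusted(Data("hello".utf8),
                  attestation: Attestation(softwareImage: image),
                  log: log)
```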

The new announcement really comprises three parts: first, Apple will release a new security guide that offers deep detail on how Private Cloud Compute was built; second, a new virtual research environment will allow security experts to recreate those cloud computers on their own Macs; and third, a bug bounty programme will incentivise that research.

That programme offers a reward of $1,000,000 to researchers who find the most dangerous kind of vulnerability, which would allow hackers to run their own code and break into the central parts of the system. Less dramatic bugs will receive smaller payouts, and Apple says that some bugs might not fit into existing categories but that it will evaluate them nonetheless.

The other major announcements offer ways for security researchers to find those bugs. The security document offers information on how the systems were built, while the virtual research environment will let people examine how the cloud compute systems work and look at their source code.

Apple said that the new announcements were part of its belief that privacy is a human right, and that security is part of that. It invited security researchers to test the system in the hope of making it stronger.

“We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time,” it wrote in a blog post.
