Apple said at launch that Private Cloud Compute's security would be open to third-party review. On Thursday, it made good on that promise.
In June, Apple unveiled Apple Intelligence alongside Private Cloud Compute, which was pitched as a secure and private way to handle Apple Intelligence requests, including Siri queries, in the cloud.
In addition to using cryptography and not storing user data, Apple insisted that these protections could be independently verified. On October 24, it delivered on that plan.
In a Security Research blog post titled “Security research on Private Cloud Compute,” Apple explains that it gave third-party auditors and select security researchers early access to resources built for the project, including PCC’s Virtual Research Environment (VRE).
The post also says that the same resources will be made publicly available starting Thursday. Apple says this allows all security and privacy researchers, “or anyone with an interest and technical curiosity,” to learn how Private Cloud Compute works and to conduct their own independent verification.
Resources
The release includes a new Private Cloud Compute Security Guide that explains how the architecture is designed to meet Apple’s core requirements for the project. It covers technical details of the PCC components and how they work together, how requests are authenticated and routed, and how the system’s security holds up against various forms of attack.
The VRE is the first research environment of its kind Apple has released for any of its platforms. It consists of tools for running the PCC node software in a virtual machine.
It’s not exactly the same code that runs on Apple’s servers, as there are “minor modifications” to make it run locally. Apple says the software otherwise behaves identically to a real PCC node, with changes limited to the boot process and the kernel.
The VRE also includes a virtual Secure Enclave Processor and uses macOS’s built-in support for paravirtualized graphics.
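That graphics support is not PCC-specific; it is exposed to any developer through macOS’s Virtualization framework. As a rough illustration of the mechanism the VRE builds on, here is a minimal Swift sketch using real Virtualization framework APIs. This is not Apple’s VRE code, and the rest of the VM setup (platform, boot loader, storage) is omitted:

```swift
import Virtualization

// Minimal sketch: attach a paravirtualized graphics device to a macOS
// guest VM configuration. These are real Virtualization.framework APIs;
// everything else a working VM needs (platform, boot loader, storage,
// memory and CPU sizing) is elided here.
func makeGraphicsDevice() -> VZMacGraphicsDeviceConfiguration {
    let graphics = VZMacGraphicsDeviceConfiguration()
    graphics.displays = [
        // A single 1920x1200 virtual display at 80 pixels per inch.
        VZMacGraphicsDisplayConfiguration(widthInPixels: 1920,
                                          heightInPixels: 1200,
                                          pixelsPerInch: 80)
    ]
    return graphics
}

func attachGraphics(to configuration: VZVirtualMachineConfiguration) {
    configuration.graphicsDevices = [makeGraphicsDevice()]
}
```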
Apple is also making the source code for some key components available for review. Offered under a limited-use license intended for analysis, the source code includes the CloudAttestation project for building and verifying PCC node attestations.
There is also the Thimble project, which includes a daemon that runs on the user’s device and works with CloudAttestation to enforce verifiable transparency.
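The authoritative interfaces are in Apple’s released source; purely to illustrate the flow those two components implement, here is a conceptual Swift sketch in which every type and function name is a hypothetical stand-in. The idea is that a client device only trusts a PCC node whose attestation both carries a valid signature and appears in the public transparency log:

```swift
import Foundation

// Hypothetical stand-ins; the real types live in Apple's CloudAttestation
// and Thimble source releases.
struct NodeAttestation {
    let measurements: Data   // signed software measurements of a PCC node
    let signature: Data
}

enum AttestationError: Error {
    case invalidSignature
    case notInTransparencyLog
}

// Device-side check in the spirit of Thimble's daemon: refuse to send a
// request to a node unless its attestation verifies cryptographically AND
// its measurements appear in the public transparency log.
func verify(_ attestation: NodeAttestation,
            signatureIsValid: (NodeAttestation) -> Bool,
            transparencyLogContains: (Data) -> Bool) throws {
    guard signatureIsValid(attestation) else {
        throw AttestationError.invalidSignature
    }
    guard transparencyLogContains(attestation.measurements) else {
        throw AttestationError.notInTransparencyLog
    }
    // Only after both checks pass would the client encrypt its request
    // to this node's attested keys.
}
```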
PCC Bug Bounty
In addition, Apple is expanding its Apple Security Bounty program, promising “significant rewards” for reporting security and privacy issues in Private Cloud Compute.
The new categories in the program directly correspond to the critical threats in the Security Guide. These include accidental data disclosure, external compromise due to user requests, and physical or internal access vulnerabilities.
The reward scale starts at $50,000 for an accidental or unexpected data disclosure due to a deployment or configuration issue. At the high end, demonstrating arbitrary code execution with arbitrary entitlements can earn participants up to $1 million.
Apple adds that it will consider any security issue that has a “significant impact” on PCC for a potential bounty, even if it doesn’t fit into one of the defined categories.
“We hope you’ll dive deeper into PCC’s design with our security guide, explore the code yourself with a virtual research environment, and report any issues you find through the Apple Security Bounty,” it says.
Apple says it designed PCC “to take an exceptional step forward in privacy in AI,” including verifiable transparency.
It concludes: “We believe Private Cloud Compute is the most advanced security architecture ever deployed for large-scale cloud AI computing, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time.”