Apple has assured customers that data sent to the cloud through Apple Intelligence is “never stored or made available to Apple,” and says that publishing the server code will allow independent experts to “test this privacy promise.”
Because most large language models run on remote cloud server farms, users are often reluctant to share personal data with AI companies. In its WWDC keynote, Apple emphasized that the new Apple Intelligence features integrated into its products will use “private cloud computing” to provide transparent and verifiable security for any data processed on its cloud servers.
Privacy-First Approach
“You won’t have to hand over every detail of your life to be stored and analyzed in someone else’s cloud by AI,” said Apple senior vice president of software engineering Craig Federighi. He noted that many of Apple’s generative AI models can run entirely locally on devices with A17 Pro or M-series chips, eliminating the risk of sending personal data to a remote server. If a larger cloud model is needed to fulfill a user’s request, it “will run on servers that we built specifically using Apple processors,” which allows Apple to apply the security tools built into the Swift programming language. Federighi added that Apple Intelligence sends to servers “only the data you need to complete your task,” rather than giving full access to all contextual information on a device.
Federighi emphasized that even this minimized data set will not be stored for future access or used to further train Apple’s back-end models. The server code Apple uses will be open, so independent experts can audit it to verify that user data remains confidential. The entire system is protected by end-to-end encryption, and Apple devices “will refuse to communicate with the server unless their software is publicly registered for verification.”
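The verification idea behind that last claim can be illustrated with a toy sketch: a client compares the server’s attested software measurement against a public transparency log and refuses to send anything if the measurement is not registered. All names and log contents below are hypothetical, for illustration only; this is not Apple’s actual protocol or API.

```python
import hashlib

# Hypothetical public transparency log: hashes ("measurements") of
# server software images that have been published for inspection.
PUBLIC_LOG = {
    hashlib.sha256(b"cloud-server-image-v1").hexdigest(),
    hashlib.sha256(b"cloud-server-image-v2").hexdigest(),
}

def server_is_verifiable(attested_measurement: str) -> bool:
    """True only if the server's attested software measurement
    appears in the public transparency log."""
    return attested_measurement in PUBLIC_LOG

def send_request(attested_measurement: str, payload: bytes) -> str:
    # Refuse to communicate with any server whose software
    # is not publicly registered for verification.
    if not server_is_verifiable(attested_measurement):
        raise ConnectionRefusedError("server software not in public log")
    return f"sent {len(payload)} bytes"

registered = hashlib.sha256(b"cloud-server-image-v1").hexdigest()
unregistered = hashlib.sha256(b"unreviewed-image").hexdigest()
print(server_is_verifiable(registered))    # True
print(server_is_verifiable(unregistered))  # False
```

In this sketch the log is a local set for simplicity; a real attestation scheme would involve signed measurements and an append-only, publicly auditable log.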
While Federighi’s keynote offered few technical details, the focus on privacy shows that Apple is, at least rhetorically, prioritizing security concerns in the generative AI space, notes NIXSolutions. Apple calls its approach “an entirely new standard for privacy and artificial intelligence.” We’ll keep you updated as independent security experts publish their conclusions on Apple’s new system.