Apple has launched a new security bounty program offering rewards of up to $1 million for identifying serious vulnerabilities in Private Cloud Compute (PCC), the system at the core of its cloud-based AI features. As reported by TechRadar, the initiative is meant to encourage security researchers to rigorously examine the PCC platform for potential flaws.
PCC is designed to safeguard sensitive user information so that it remains inaccessible to unauthorized parties, including Apple itself. By opening the platform to scrutiny from the security research community, Apple hopes to validate and strengthen those protections.
Participants in the program get access to a Virtual Research Environment (VRE) and source code for key PCC components, which they can use to analyze the platform and hunt for vulnerabilities. The largest rewards, up to $1 million, are reserved for the most significant findings, particularly those that could allow arbitrary code execution; less severe issues carry rewards ranging from $50,000 to $250,000.
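To give a sense of the kind of work involved, the sketch below shows how a researcher might script a basic integrity check on a downloaded PCC software image before loading it into the VRE, comparing its hash against a published measurement. This is purely illustrative: the file path, the expected digest, and the overall workflow are assumptions, not Apple's documented process, which is defined by Apple's own research tooling and security guide.

```swift
// Hypothetical sketch: verify a locally downloaded PCC release image against a
// published measurement before inspecting it in the Virtual Research Environment.
// All names and values here are placeholders for illustration only.
import Foundation
import CryptoKit

// Path to a locally downloaded PCC release image (hypothetical).
let imageURL = URL(fileURLWithPath: "/tmp/pcc-release.img")

// Measurement published for that release (hypothetical placeholder value).
let publishedDigestHex = "<digest published for this release>"

do {
    // Hash the image and render the digest as lowercase hex.
    let imageData = try Data(contentsOf: imageURL)
    let localDigest = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()

    if localDigest == publishedDigestHex {
        print("Image matches the published measurement; proceed with analysis.")
    } else {
        print("Mismatch: local \(localDigest) vs published \(publishedDigestHex)")
    }
} catch {
    print("Could not read image: \(error)")
}
```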
Apple describes Private Cloud Compute as a highly advanced security architecture for large-scale cloud AI computing. The company says it is committed to working with the research community to build trust in the system, and that it will continue to protect user data and strengthen the overall security of its AI services.