
Apple Offers $1 Million Bug Bounty for Hacking Apple Intelligence Servers

Apple is challenging security experts with a lucrative new bug bounty program, offering rewards of up to $1 million for successfully hacking its Apple Intelligence servers. As the company prepares to launch its AI-driven Apple Intelligence service, it has ramped up efforts to secure Private Cloud Compute (PCC), the servers that will process some Apple Intelligence requests.

While Apple Intelligence will mostly operate on users’ devices, certain complex requests will need to be processed by Apple’s servers. This means PCC must be highly resistant to any form of cyberattack. To safeguard user data, Apple has invited researchers and hackers to attempt to breach PCC’s defenses, providing a guide to PCC’s security features and access to a Virtual Research Environment (VRE) where researchers can examine PCC software.

Apple’s bounty program is open to anyone who can uncover vulnerabilities in three critical areas:

  • Accidental Data Disclosure: Exposing data due to configuration flaws or design weaknesses.
  • External Compromise via User Requests: Exploiting user requests to gain unauthorized access.
  • Physical or Internal Access Vulnerabilities: Accessing internal interfaces to compromise the system.

Payouts range from $50,000 to $1 million, with rewards based on the nature and impact of the discovered vulnerabilities:

  • $50,000 for exposing data through configuration issues.
  • $100,000 for executing unattested code.
  • $150,000 to $250,000 for unauthorized access to sensitive user data outside of secure boundaries.
  • $1,000,000 for achieving arbitrary code execution without user consent.

Apple has made source code for key PCC components available on GitHub, and the VRE offers a controlled environment for hackers to test the software, investigate security protocols, and simulate attacks. The company hopes the research community will engage deeply with PCC’s architecture, advancing Apple’s goal of building the most secure AI infrastructure possible.

For more details on the program and guidelines for submitting findings, researchers are directed to the Apple Security Bounty page. Apple has stated it is committed to refining the security of PCC and aims to build lasting trust in the AI system’s privacy and resilience.
