When Apple is publicly dishing out dosh, it’s usually related to one of the company’s compensation schemes. Users afflicted by the infamously failure-prone Butterfly keyboard, for example, were recently handed $400 each. But if you think you can hack into Apple’s AI servers, the company might give you considerably more than that.
Apple has made a big deal about the security and privacy of Apple Intelligence, and is now putting its money where its mouth is by offering up to $1M to security researchers who can hack Private Cloud Compute, the servers that power the company’s new AI features.
Apple has published a list of bounty amounts for security researchers, including $1M for those who can achieve “arbitrary code execution with arbitrary entitlements” or gain “access to a user’s request data or sensitive information about the user’s requests outside the trust boundary”.
In a blog post introducing Private Cloud Compute, Apple explained how “to build public trust in the system,” it has taken the “extraordinary step of allowing security and privacy researchers to inspect and verify the end-to-end security and privacy promises of PCC.”
“We designed Private Cloud Compute as part of Apple Intelligence to take an extraordinary step forward for privacy in AI,” Apple adds. “This includes providing verifiable transparency — a unique property that sets it apart from other server-based AI approaches.”