In an unexpected move, Apple disclosed that it used Google's Tensor Processing Units (TPUs) to develop core components of Apple Intelligence. The decision to train its AI models on 2,048 TPUv5p chips and 8,192 TPUv4 processors is particularly significant given Nvidia's historical dominance in the AI accelerator market. Apple seldom reveals its hardware choices, which makes the announcement all the more surprising.
The adoption of Google's TPUs is believed to let Apple engineers train large, complex AI models efficiently. Notably, while Nvidia sells its chips and systems as standalone products, Google provides access to TPUs through its cloud service, which requires customers to build their software within Google's ecosystem.
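To give a rough sense of what "building within Google's ecosystem" means in practice, the sketch below uses JAX, one of the TPU-native frameworks available on Cloud TPU VMs. It is purely illustrative and not Apple's actual training code; the toy model, shapes, and data are placeholders.

```python
# Illustrative only: the kind of JAX/XLA program a Cloud TPU customer writes.
# On a Cloud TPU VM this runs on TPU cores; elsewhere it falls back to CPU/GPU.
import jax
import jax.numpy as jnp

def main():
    # jax.devices() reports the accelerators attached to this host.
    devices = jax.devices()
    print(f"Backend: {jax.default_backend()}, device count: {len(devices)}")

    # A toy "model": a single dense layer applied to random data.
    key = jax.random.PRNGKey(0)
    w = jax.random.normal(key, (512, 512))
    x = jax.random.normal(key, (1024, 512))

    @jax.jit  # XLA compiles this for whichever backend is available.
    def forward(w, x):
        return jnp.tanh(x @ w).mean()

    print("loss-like scalar:", forward(w, x))

if __name__ == "__main__":
    main()
```

The point of the sketch is the workflow, not the model: training code is written against Google's software stack and compiled by XLA for the TPUs rented through its cloud, rather than shipped to hardware the customer owns outright.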
Apple has underscored its commitment to responsible data practices, affirming that no private user data was used in AI training. Separately, the company has unveiled plans to invest more than $5 billion in AI server upgrades over the next two years, an investment intended to strengthen its AI capabilities and reduce its reliance on external hardware vendors.
This collaboration exemplifies the fierce competition in the AI industry and the willingness of major technology firms to work with rivals when it serves their development goals.