The research paper, titled “Apple Intelligence Foundation Language Models,” is highly technical, detailing the already-disclosed data sources behind the language model at the core of the company’s new technology. However, a quote buried inside the paper hints that Apple may have used Google hardware in early development.
The paper says that the Apple Foundation Model (AFM) and the server technology that drives it were initially trained on “v4 and v5p Cloud TPU clusters” using Apple’s own software. The research contains a great deal of detail about how that was done and which data sources were used for training.
A CNBC report on Monday suggested that Apple rented time on existing Google-hosted clusters, but the research doesn’t directly support that claim, nor does it mention Google or Nvidia at all. What’s more likely is that Apple bought the hardware outright and used it within its own data centers.
Originally appeared here: Apple admits to using Google Tensor hardware to train Apple Intelligence