Local AI Models vs. Centralized AI Models
Chips that run locally on the phone will reduce the need for chips that run in the datacenter
Google (GOOGL) has been running advanced AI models at scale on its Pixel phones since 2018, and both those models and the TPU chips that run them have only become more powerful with time. Many personal AI use cases do not require a frontier model (e.g., "book my trip from SFO to DFW"); a second-tier model that runs on the phone is fine for most of them.
Google has a phone play. Apple (AAPL) has a phone play but is weak in AI. OpenAI plans to roll out a family of AI-powered personal devices. I believe local chips will reduce demand for datacenter compute over time, not only for consumer use cases but also for many, if not most, enterprise use cases.
I'm not sure where that leaves Microsoft (MSFT), whose best move was acquiring a 27% stake in OpenAI. MSFT once had a phone play, but Nadella didn't want to keep Nokia.
One example of local AI in action: Magic Cue, a suite of new AI-powered capabilities for Pixel that proactively suggests helpful information right as you need it.