According to 9to5Mac, a new report from The Information sheds light on Apple’s long-term AI strategy, a strategy that has drawn criticism since the company delayed several Siri upgrades earlier this year. The report suggests that while Apple is still developing its own internal AI models, a key view among its leadership is that large language models (LLMs) will become low-cost commodities in the years to come. This belief reportedly informs Apple’s decision not to spend a fortune now on its own massive models, in stark contrast to the heavy investments from competitors like OpenAI, Meta, and Google. The report also mentions Apple’s rumored partnership with Google to power a new version of Siri. If this outlook is correct, Apple’s future AI success would hinge less on building the best model and more on controlling the hardware, software, and services where AI runs.
The Commodity Mindset
Here’s the thing: this is a fascinating and deeply pragmatic stance. While the rest of Silicon Valley is in an all-out arms race to build the biggest, most powerful proprietary model, Apple’s leadership seems to be asking, “Why spend billions now on what might be cheap and widely available tomorrow?” It’s basically a bet on the democratization of the core technology. They’re looking at the AI landscape like the market for processors or memory—eventually, the foundational tech becomes a standardized, high-quality component you can license or acquire, not the crown jewel itself. This allows them to conserve an immense war chest and avoid the eye-watering compute costs their rivals are currently swallowing.
What This Means for Everyone Else
So, what does this mean for users and developers? For the average iPhone user, it probably means a more integrated, privacy-focused, and seamless experience, but one that might not always have the absolute bleeding-edge chatbot capabilities. Apple’s play would be to own the entire stack—the chip, the operating system, the app ecosystem—and slot in the best available “commodity” AI model that fits its needs for features like Siri. For developers, it reinforces that Apple’s ecosystem will remain the primary gatekeeper. The AI features that get privileged access to system-level functions and hardware will be Apple’s, even if the underlying model comes from a third party like Google. It’s a classic Apple move: control the experience, not necessarily the raw ingredient.
Where Apple Actually Wants to Win
This is where Apple’s strategy gets really interesting. If LLMs become true commodities, the competitive battleground shifts entirely. It’s no longer about who has the smartest model in a vacuum. The fight is about who has the most efficient silicon to run it on-device, the tightest operating system integration, and the most compelling services built around it. And let’s be honest, that’s a fight Apple is ridiculously well-equipped for. They design their own chips, they control iOS and macOS end-to-end, and they have a billion-device installed base. They’re betting the house that their real advantage is the ecosystem, not the model itself. Now, is that a risky bet? Maybe. But it’s also the one that plays to their historic strengths.
