This is why the iPhone 15 can’t run Apple Intelligence and the iPhone 15 Pro can

Those who shelled out the extra $200 for an iPhone 15 Pro instead of the iPhone 15 are probably walking around with a self-satisfied smile, happy to be rewarded for spending extra on a premium iPhone. That's because it turns out that the iPhone 15 and iPhone 15 Plus cannot support Apple Intelligence, the tech giant's new AI initiative. On the other hand, the iPhone 15 Pro and iPhone 15 Pro Max, along with all of the upcoming iPhone 16 series models, will support all of the new AI-based features.
This is unfortunate for those who purchased one of the non-Pro iPhone 15 models, but according to TF International's highly regarded Apple analyst Ming-Chi Kuo, there is a reason why Apple had to do this. Looking at which devices will run Apple Intelligence, Kuo noted that the feature is supported on devices powered by Apple's M1 chip, yet won't be supported on the iPhone 15 and iPhone 15 Plus, both of which are equipped with the 4nm A16 application processor (AP).

The M1 chip's AI hardware can handle 11 trillion operations per second (11 TOPS), while the A16 AP can perform up to 17 TOPS. This means that raw AI computing power is not the issue; Kuo says that DRAM is the differentiator. The iPhone 15 and iPhone 15 Plus carry 6GB of DRAM, less than the 8GB paired with the M1. More importantly, 8GB of RAM is not only used with the latter chip, it is also used with the A17 Pro SoC that powers the iPhone 15 Pro and iPhone 15 Pro Max.

As a result, the iPhone 15 and iPhone 15 Plus do not get Apple Intelligence while the iPhone 15 Pro and iPhone 15 Pro Max do. We hate to keep rubbing this in, but we are only the messenger (and yes, I do own an iPhone 15 Pro Max. Sorry!).

Based on these figures, Kuo estimates that the on-device Apple Intelligence LLM (Large Language Model), the type of model AI platforms use to recognize and generate text, requires about 2GB of DRAM or less. The analyst says that Apple Intelligence uses a 3 billion parameter LLM, and he writes, "After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM."
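Kuo's range checks out with some quick back-of-the-envelope math. The sketch below (an illustration of the arithmetic only; Apple's actual quantization scheme is not public) computes the raw weight storage for a 3 billion parameter model at 2 bits and 4 bits per parameter:

```python
# Sanity check of Kuo's 0.7-1.5GB figure for a 3B-parameter LLM
# compressed with a mix of 2-bit and 4-bit weights.
PARAMS = 3_000_000_000  # 3 billion parameters

def model_size_gb(params: int, bits_per_param: float) -> float:
    """Raw weight storage in gigabytes at a given bit width."""
    return params * bits_per_param / 8 / 1e9  # bits -> bytes -> GB

low = model_size_gb(PARAMS, 2)   # everything at 2 bits -> 0.75 GB
high = model_size_gb(PARAMS, 4)  # everything at 4 bits -> 1.50 GB
print(f"{low:.2f}-{high:.2f} GB")  # -> 0.75-1.50 GB
```

So a mixed 2-bit/4-bit configuration naturally lands somewhere between those two endpoints, right in the 0.7-1.5GB window Kuo cites.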

Kuo adds, "Microsoft believes the key specification for an AI PC is 40 TOPS of computing power. However, for Apple, integrated with cloud AI (Private Cloud Compute), 11 TOPS of on-device computing power is sufficient to start providing on-device AI applications." He also notes that Apple will eventually move to a 7B LLM for Apple Intelligence, which will require future iPhone models to carry even more DRAM. The question, as Kuo notes, is whether Apple will use that DRAM requirement to continue differentiating between non-Pro and Pro iPhone models. It would be a way for Apple to generate more revenue from iPhone buyers.
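The same arithmetic suggests why a 7B model raises the DRAM floor. Assuming, hypothetically, that Apple kept a similar 2-bit to 4-bit compression range for the larger model (Kuo does not specify), the reserved memory would roughly scale with parameter count:

```python
# Hypothetical extrapolation: weight storage for a 7B-parameter model
# at the same 2-bit and 4-bit widths cited for the current 3B model.
def weights_gb(params: int, bits: float) -> float:
    """Raw weight storage in gigabytes at a given bit width."""
    return params * bits / 8 / 1e9

low, high = weights_gb(7_000_000_000, 2), weights_gb(7_000_000_000, 4)
print(f"{low:.2f}-{high:.2f} GB")  # -> 1.75-3.50 GB
```

More than double the current reservation, on top of everything else iOS keeps in memory, which is consistent with Kuo's expectation that future iPhones will need more DRAM.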

The analyst does state, “Whether the user experience is as good as Apple claims still needs to be observed. Samsung S24’s AI capabilities are limited, and Microsoft’s AI PC still confuses consumers. Apple has successfully defined on-device AI (at least consumers are already aware of the rich AI features and selling points of Apple’s AI devices), which will accelerate competitors’ imitation and catch-up, thereby driving rapid growth in the on-device AI-related industries.”

