Here's When Apple Plans to Roll Out Its Biggest Apple Intelligence Features
Apple made a splash during last week's WWDC keynote when it announced Apple Intelligence, the company's official foray into the trendy AI features most tech companies have already adopted. But while Apple Intelligence might have generated the most headlines over the past week, many of its main features won't be present when you update your iPhone, iPad, or Mac this fall.
According to Bloomberg's Mark Gurman, Apple is staggering the rollout of these highly anticipated AI features. A key reason is simple: the features just aren't ready yet. Apple has been scrambling for over a year to implement generative AI in its products, ever since the tech exploded in late 2022. (Thanks, ChatGPT.) Many of these features are quite involved and will take more time to get right.
That said, Apple could probably release these features sooner and in larger batches if it wanted to, but there's a strategy here: By rolling out big AI features in limited numbers, Apple can root out any major issues before adding more AI to the mix (AI hallucinates, after all), and it can continue building up its cloud network without putting too much pressure on the system. It helps that the company is limiting these features to a small pool of devices: the iPhone 15 Pro and 15 Pro Max (and likely the iPhone 16 line), as well as M-series Macs and iPads.
Apple Intelligence in 2024
If you install the iOS 18 or macOS 15 beta right now, you might think no Apple Intelligence features will be ready in the fall. That's because Apple is holding these AI features back from beta testers until sometime this summer. Since the public beta is scheduled to drop in July, it seems like a safe assumption that Apple is planning to debut Apple Intelligence next month, though we don't know for sure.
There are some AI features in this first beta, even if they aren't strictly "Apple Intelligence" features: iOS 18 adds transcriptions for voice memos, enhanced voicemail transcriptions, and support for automatically calculating equations you type out. It's a limited experience, but seeing as this is only the first beta, more features should arrive soon.
In fact, Apple currently plans to roll out some flagship features with the first release of Apple Intelligence. That includes summaries for webpages, voice memos, notes, and emails; AI writing tools (such as rewriting and proofreading); and image generation, including the AI-generated emojis Apple is branding "Genmoji." You'll also receive AI summaries of notifications and see certain alerts first based on what the AI thinks is most important.
In addition, some of Siri's new updates will be out with iOS 18's initial release. This fall, you should notice the assistant's new UI, as well as the convenient new option for typing to Siri. But most of Siri's advertised features won't be ready for a while. (More on that below.)
The timeline for ChatGPT integration is also a bit up in the air: It may not arrive with the first release of iOS 18 in the fall, but Gurman believes it'll be here before the end of the year. For developers, Xcode's AI assistant, Swift Assist, likely won't be out until later this year.
Apple Intelligence's new Siri won't be here until 2025
The largest delay appears to affect Siri's standout upgrades, many of which won't hit iOS and macOS until 2025. That includes contextual understanding and actions: The big example from the keynote was a demonstrator asking Siri when her mom's flight is getting in, with the digital assistant answering by pulling data from multiple apps. This "understanding," which would power many convenient actions without you needing to explicitly tell Siri what to do, needs more time to bake.
Siri's ability to act within apps based on your commands is also being held until next year. When it's available, you'll be able to ask Siri to edit a photo, then add it to a message before sending it off. Siri will actually feel like a smart assistant that can do things on your iPhone, iPad, and Mac for you, but that takes time.
Siri also won't be able to analyze and understand what's happening on your screen until 2025. Once that arrives, you should be able to ask Siri a simple question based on what you're doing on your device, and the assistant should understand the context. If you're making movie plans with someone to see Inside Out 2, you could ask Siri "when is it playing?" and Siri should analyze the conversation and return showtimes in your area.
Finally, Apple Intelligence will remain English-only until at least next year, as Apple needs more time to train the AI on other languages. As with the other AI features, though, this is one that makes a lot of sense to delay until it's 100% ready.
AI might be the focus of the tech industry, but big AI features often roll out to disastrous effect. (Just look at Google's AI Overviews or Microsoft's Recall feature.) The more time Apple gives itself to get the tech right, the better. In the meantime, we can use the new features that are already available.