Apple Intelligence After One Year: What Changed and What Didn't
Apple's AI initiative has delivered writing tools, image generation, and smart notifications to hundreds of millions of devices. But the fully revamped Siri — the feature that generated the most excitement — remains a work in progress.
When Apple introduced Apple Intelligence at WWDC 2024, the company made its most ambitious AI promise to date: a suite of generative AI features that would be deeply integrated into every major Apple device, powered by on-device processing and a new privacy-focused cloud computing architecture. The initial rollout began in October 2024 with iOS 18.1, and additional features arrived through successive updates over the following months. Now, more than a year later, it is possible to assess what Apple Intelligence has actually delivered — and where it has fallen short.
What Shipped and Works
The core writing tools were among the first Apple Intelligence features to reach consumers, and they remain among the most practically useful. The ability to rewrite, proofread, and summarize text works across Mail, Messages, Notes, Pages, and third-party apps that have adopted the API. The tools are reliable for routine tasks — cleaning up a hastily typed email, condensing a long document into key points, or adjusting the tone of a message before sending it.
Image generation capabilities arrived with iOS 18.2, including Genmoji (custom emoji created from text descriptions), Image Playground (an AI image generation app), and Image Wand (which transforms rough sketches into polished images). These features have found an audience primarily among casual users who enjoy the novelty of generating personalized visual content, though professional designers have generally found the output quality insufficient for anything beyond fun experiments.
Priority Notifications, which use AI to scan incoming notifications and surface the most important ones, have proven to be a quiet but valuable quality-of-life improvement. The system is not perfect — it occasionally buries notifications that users consider important — but it generally does a credible job of reducing notification fatigue.
Visual Intelligence, which lets users point their device's camera at real-world objects to identify them, extract text, or take contextual actions, shipped with iOS 18.2 and has been steadily improved. The feature works well for tasks like identifying plants, translating text on signs, and adding event details from physical invitations to the calendar.
iOS 26, released in late 2025, added further Apple Intelligence features built around screenshots: creating calendar events from them, searching for visually similar images, and querying ChatGPT about their content. Live Translation, available in Messages, FaceTime, and phone calls, has been one of the more practically impactful additions for users who regularly communicate across languages.
What's Missing: The Siri Overhaul
The elephant in the room is Siri. Apple heavily promoted an enhanced, personalized version of Siri at both WWDC 2024 and the September 2024 iPhone launch event. The upgraded assistant was supposed to feature deep contextual awareness, the ability to understand and act on on-screen content, cross-app task execution, and personalized responses based on the user's own data — all while maintaining Apple's privacy standards.
None of those features have shipped as of April 2026. The enhanced Siri was delayed when Apple found that it performed reliably only about two-thirds of the time, a roughly one-in-three failure rate that the company apparently deemed unacceptable for a feature that would be used by hundreds of millions of people. Internal leadership changes followed, with long-time AI executive John Giannandrea stepping down and new leadership being brought in to oversee the AI organization.
In January 2026, Apple and Google announced a partnership under which Apple's next-generation foundation models are expected to incorporate Google's Gemini technology. Reports from early 2026 indicated that Apple was aiming to include the revamped Siri features in iOS 26.4, scheduled for a spring release. However, more recent reporting suggests that the company is now considering spreading the features across iOS 26.5 (expected in May) and iOS 27 (expected in September), rather than shipping everything at once.
Apple has stated publicly that the enhanced Siri features remain on track for 2026, but the company has not committed to a specific date. This ambiguity has fueled frustration among consumers and analysts who expected these capabilities much sooner, and has led to a class-action lawsuit alleging that, in marketing the iPhone 16, Apple promoted AI features that did not actually exist at the time of sale.
The Developer Ecosystem
One dimension of Apple Intelligence that has received less attention is its impact on third-party developers. Apple has opened its on-device foundation models to developers through the Foundation Models framework, which provides native Swift integration for intelligent features like smart search, text understanding, and contextual action suggestions. These capabilities run on-device at no cost per request, which is a significant differentiator from cloud-based AI APIs that charge per token or per call.
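For a sense of what that integration looks like in practice, the sketch below shows how a developer might call the on-device model through the FoundationModels framework introduced at WWDC 2025. It is a simplified illustration, not production code: it assumes Apple Intelligence-capable hardware, and exact type names and availability annotations may differ across OS releases.

```swift
import FoundationModels

enum SummaryError: Error {
    case modelUnavailable
}

// Sketch of on-device text summarization with Apple's Foundation
// Models framework (iOS 26+ / macOS 26+). Runs entirely on the
// user's hardware; there is no per-request cost.
@available(iOS 26.0, macOS 26.0, *)
func summarize(_ text: String) async throws -> String {
    // Check that the on-device model is actually usable on this
    // device (it may be disabled, unsupported, or still downloading).
    guard SystemLanguageModel.default.availability == .available else {
        throw SummaryError.modelUnavailable
    }

    // A session holds conversational state; instructions steer the
    // model's behavior for every prompt sent through it.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Because inference happens locally, an app can call this as often as it likes without metering requests, which is the economic contrast with per-token cloud APIs that the framework is designed to exploit.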
Developer adoption has been gradual but growing. Apps that have integrated Apple Intelligence features report improved user engagement for tasks like search and content summarization. The Shortcuts app, enhanced with Apple Intelligence models, has become a more powerful automation tool — though its complexity still limits adoption to more technically sophisticated users.
Privacy as a Differentiator
Apple's approach to AI continues to emphasize privacy as a foundational principle. The on-device processing model means that many Apple Intelligence features run entirely on the user's hardware, without sending data to external servers. For tasks that require more computational power, Apple uses Private Cloud Compute — a server infrastructure built on Apple silicon that processes requests without retaining user data.
This architecture represents a genuine technical and philosophical differentiation from competitors. Google, OpenAI, and other AI providers generally process user queries on centralized cloud servers, which offers more computational flexibility but creates data handling questions that Apple's approach largely avoids. Whether this privacy advantage is sufficiently valued by consumers to offset the capability gaps — particularly around Siri — remains an open question.
The Year Ahead
The next 12 months will be critical for Apple Intelligence. If the revamped Siri delivers on even a substantial portion of its original promise, it could validate Apple's cautious, privacy-first approach to AI and generate a meaningful upgrade cycle for iPhone. If the delays continue or the shipped features disappoint, the narrative around Apple and AI could shift from "carefully deliberate" to "falling behind."
Apple's financial position gives it time — the company holds more than $130 billion in cash and marketable securities, and iPhone sales remain robust regardless of AI capabilities. But in a market where competitors are shipping increasingly capable AI features at a rapid pace, patience has limits. The features that Apple Intelligence has already shipped are solid. What's still missing will determine whether the initiative is remembered as a success or a missed opportunity.