‘Apple Intelligence’ Generative Personal AI Unveiled for iPhone, iPad, and Mac

Apple at WWDC today announced Apple Intelligence, a deeply integrated, personalized AI system that uses generative models to enhance the user experience across iPhone, iPad, and Mac.

Apple says that Apple Intelligence “combines the power of generative models with personal context to deliver intelligence that’s incredibly useful and relevant.” Baked into iOS 18, iPadOS 18, and macOS Sequoia, it can create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks. And with Private Cloud Compute, Apple aims to deliver AI with privacy at its core, flexing and scaling computational capacity between on-device processing and larger, server-based models that run on dedicated Apple silicon servers.

“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products — and what our products can do for our users,” said Tim Cook, Apple’s CEO. “Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them. This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”

At the top of the features list are new ways for users to enhance their writing and communicate more effectively. Apple says that with brand-new systemwide Writing Tools built into ‌iOS 18‌, iPadOS 18, and ‌macOS Sequoia‌, users can rewrite, proofread, and summarize text nearly everywhere they write, including Mail, Notes, Pages, and third-party apps.

For example, with Rewrite, Apple Intelligence allows users to choose from different versions of what they have written, and then adjust the tone “to suit the audience and task at hand.” According to Apple, the Rewrite feature helps users deliver the right words to meet the occasion, whether that’s finessing a cover letter, or adding humor and creativity to a party invitation. A new Proofread option checks grammar, word choice, and sentence structure while also suggesting edits — along with explanations of the edits — that users can review or quickly accept. And with Summarize, users can select text and have it recapped in the form of a digestible paragraph, bulleted key points, a table, or a list.

In Mail, a new section at the top of the inbox called Priority Messages shows your most urgent emails, like a same-day dinner invitation or boarding pass. And instead of previewing the first few lines of each email, Mail shows summaries, so you can see what a message says without opening it. Meanwhile, for long threads, you can view pertinent details with a tap. There’s also a Smart Reply feature that offers suggestions for a quick response. The feature also identifies questions in an email to ensure everything is answered, according to Apple.

Elsewhere, a new Priority Notifications feature surfaces alerts at the top of the stack to show what’s most important to you, and summaries help you scan long or stacked notifications to show key details right on the Lock Screen, such as when a group chat is particularly active. To help users stay present in what they are doing, there’s also a new Focus called Reduce Interruptions that surfaces only the notifications that might need immediate attention, like a text about an early pickup from daycare.

In the Notes and Phone apps, Apple Intelligence allows users to record, transcribe, and summarize audio. When a recording is initiated while on a call, participants are automatically notified, and once the call ends, Apple Intelligence generates a summary to help recall key points. A new Image Playground feature also allows users to choose from a range of concepts from categories like themes, costumes, accessories, and places to create their own images for sharing in Messages, Notes, Keynote, Freeform, and Pages, as well as third-party apps that support the new Image Playground API.

Taking emoji to new places, users can also create an original Genmoji to express themselves. By simply typing a description, their Genmoji appears, along with additional options. Users can also create Genmoji of friends and family based on their photos. Just like emoji, Genmoji can be added inline to messages, or shared as a sticker or reaction in a Tapback.

Switching to the Photos app, natural language can be used to search for specific photos, such as “Maya skateboarding in a tie-dye shirt,” or “Katie with stickers on her face.” Search in videos also becomes more powerful with the ability to find specific moments in clips so users can go right to the relevant segment. Additionally, the new Clean Up tool can identify and remove distracting objects in the background of a photo, without accidentally altering the subject, according to Apple.

With Memories, users can create the story they want to see by simply typing a description. Using language and image understanding, Apple Intelligence will pick out the best photos and videos based on the description, craft a storyline with chapters based on themes identified from the photos, and arrange them into a movie with its own narrative arc. Users will even get song suggestions to match their memory from Apple Music. As with all Apple Intelligence features, user photos and videos are kept private on device and are not shared with Apple or anyone else.

Apple is also emphasizing cross-platform enhancements to Siri, which is said to be “more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks.” For example, it can follow along if users stumble over words and maintain context from one request to the next. Additionally, users can type to ‌Siri‌, and switch between text and voice to communicate with ‌Siri‌ however they like. ‌Siri‌ also has a brand-new design, suggestive of its new capabilities, with a glowing light that wraps around the edge of the screen when ‌Siri‌ is active.

‌Siri‌ can answer queries about how to do something on ‌iPhone‌, ‌iPad‌, and Mac, and with onscreen awareness, ‌Siri‌ will be able to understand and take action with users’ content in more apps over time, according to Apple. For example, if a friend texts a user their new address in Messages, the receiver can say, “Add this address to his contact card,” and ‌Siri‌ will do just that. Similar actions will extend across Apple and third-party apps.

Siri will be able to deliver intelligence that’s tailored to the user and their on-device information. For example, a user can say, “Play that podcast that Jamie recommended,” and Siri will locate and play the episode, without the user having to remember whether it was mentioned in a text or an email. Or they could ask, “When is Mom’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.

Thanks to on-device processing, Apple says its AI features are aware of your personal data without collecting it. And when it needs more compute capacity, it sends only the data needed to fulfill the request to Apple’s Private Cloud Compute servers, which share the same privacy and security capabilities of your ‌iPhone‌, according to Apple.

To verify privacy, Apple says that independent experts can inspect the code that runs on Apple silicon servers, and Private Cloud Compute cryptographically ensures that ‌iPhone‌, ‌iPad‌, and Mac do not talk to a server unless its software has been publicly logged for inspection. “Apple Intelligence with Private Cloud Compute sets a new standard for privacy in AI, unlocking intelligence users can trust,” says Apple.
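Apple hasn’t published the low-level protocol in this announcement, but the guarantee described above — a device refusing to talk to a server unless that server’s software has been publicly logged for inspection — follows the general shape of a transparency-log check. Here is a minimal illustrative sketch of that pattern; all names and the log structure are hypothetical and not Apple’s actual implementation:

```python
import hashlib

def measurement(software_image: bytes) -> str:
    """Cryptographic digest identifying an exact server software build."""
    return hashlib.sha256(software_image).hexdigest()

class TransparencyLog:
    """Stand-in for a public, append-only log of inspectable server builds."""
    def __init__(self):
        self._entries = set()

    def publish(self, digest: str):
        self._entries.add(digest)

    def contains(self, digest: str) -> bool:
        return digest in self._entries

def send_request(server_digest: str, log: TransparencyLog, payload: bytes) -> dict:
    """Device-side check: only send data to servers whose software is logged."""
    if not log.contains(server_digest):
        raise PermissionError("server software not publicly logged; refusing to send")
    return {"to": server_digest, "data": payload}

# A logged build is accepted; an unlogged (e.g. tampered) build is refused.
log = TransparencyLog()
good = measurement(b"pcc-build-1.0")
log.publish(good)
send_request(good, log, b"query")                       # succeeds
try:
    send_request(measurement(b"tampered-build"), log, b"query")
except PermissionError:
    print("refused unlogged server")
```

The key property is that the check happens on the client: the server cannot silently run unlisted software, because any build not in the public log is rejected before any user data leaves the device.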

Apple is integrating ChatGPT access into experiences within ‌iOS 18‌, iPadOS 18, and ‌macOS Sequoia‌, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to jump between tools. Apple says users are asked before any questions are sent to ChatGPT, along with any documents or photos, and ‌Siri‌ then presents the answer directly.

Additionally, ChatGPT will be available in Apple’s systemwide Writing Tools to help users generate content for anything they are writing about. For example, with Compose, users will be able to access ChatGPT image tools to generate images in a variety of styles to complement what they are writing.

Apple says that ChatGPT will come to ‌iOS 18‌, iPadOS 18, and ‌macOS Sequoia‌ later this year, powered by GPT-4o. Users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts and access paid features right from these experiences.

Tim Hardwick