
Everything we know about Apple Intelligence

Friday, August 22, 2025, 10:46 AM, from ComputerWorld

Apple devices support Apple Intelligence, the growing collection of artificial intelligence (AI) tools the company began to roll out in October 2024. These range from consumer-friendly features, such as automatic mixing in Apple Music, to increasingly powerful ways to get tasks done, including writing tools, image creation, visual intelligence, translation, and more.

Apple Intelligence supplements Apple’s existing machine-learning tools using generative AI (genAI) technology similar to that used by OpenAI’s ChatGPT.

Along the road to building these tools, Apple encountered unexpected obstacles; some of the more powerful services, such as a contextually aware Siri, remain unavailable well over a year after they were announced. Those tools are now expected in 2026.

These setbacks, which damaged the company’s reputation and morale, have not stopped Apple from pressing on; at WWDC 2025 it confirmed plans to introduce a smarter Siri next year and unveiled many additional Apple Intelligence tools and services.

Apple Intelligence currently combines two systems: the older one based on machine intelligence and pattern matching and a new genAI system based on large language models (LLMs). The newer technology can handle context changes in questions to Siri, for example. But the existence of two forms of AI makes successful deployment more complex.

Apple Intelligence aims “to make your most personal products even more useful and delightful.” (That’s how Apple CEO Tim Cook described it.) It uses Apple’s own self-trained genAI models, which are built for use across its platforms, capable of drawing on a user’s personal information, and private by design.

Why Apple Intelligence matters

Apple has worked with AI since its earliest days (more about this below), but in the last couple of years — since the arrival of ChatGPT and others — the company has been seen as falling behind its competitors. In part, this is because Apple made a major error by announcing AI features that had not yet been developed. That error was compounded by, and likely reflected, its secretive corporate culture and internal squabbles over precious R&D resources.

Apple decided to put its energy behind AI in late 2023, when Apple Senior Vice President for Software Craig Federighi tested GitHub Copilot code completion. Blown away by this experience, he directed Apple’s software development team to begin to apply LLMs across Apple products. The company now sees this work as foundational to future product innovation and has diverted vast resources to bringing its own genAI technologies to its devices.

Analysts note that with Apple Intelligence now available across the newer Macs, iPhones, and iPads, the company has become one of the most widely used AI ecosystems in the world. Here’s a look at the Apple Intelligence tools that exist and those expected in future operating system updates.


How Apple approaches Apple Intelligence

To deliver AI on its devices, Apple has found a way to keep to its longstanding commitment to user privacy, offering a three-point approach to handling queries using Apple Intelligence:

On device

Some Apple Intelligence features work natively on the device. This has the advantage of being faster while preserving privacy. Edge-based processing also reduces energy requirements, because no cloud communication or server-side processing is required. (More complex tasks must still be handled in the cloud.)

In the cloud

Apple is deploying what it calls Private Cloud Compute. This is a cloud intelligence system designed specifically for private AI processing and capable of handling complex tasks using massive LLMs.

The idea is that this system provides the ability to flex and scale computational capacity between on-device processing and larger, server-based models. The servers used for these tasks are made by Apple, use Apple Silicon processors, and run a hardened operating system that aims to protect user data when tasks are processed in the cloud. The advantage here is that more complex tasks can be handled while maintaining privacy.

Externally

Apple has an agreement with OpenAI to use ChatGPT to process AI tasks its own systems can’t handle. Under the deal, ChatGPT is not permitted to gather certain user data. But there are risks to using third-party services, and Apple ensures that users are made aware when their requests need to be handled by a third-party service.

Apple says it designed its system so when you use Private Cloud Compute, no user data is stored or shared, IP addresses are obscured, and OpenAI won’t store requests that go to ChatGPT. The focus throughout is to provide customers with the convenience of AI, while building strong walls around personal privacy.


What Apple Intelligence features exist?

Apple began a staggered introduction of Apple Intelligence tools across its devices late last year. In the background, Apple is not resting on its laurels; its teams are thought to be exploring additional ways Apple Intelligence can provide useful services to customers, with a particular focus on health.


At present, these are the Apple Intelligence tools Apple has announced:

Writing Tools

Writing Tools is a catch-all term for several useful features, most of which arrived in October 2024 with iOS 18.1 (and the iPad and Mac equivalents). These tools work anywhere on your device, including in Mail, Notes, Pages, and third-party apps. To use them, select a section of text and tap Writing Tools in the contextual menu.

Rewrite: This takes your selected text and improves it.

Proofread: This is like a much smarter spellchecker that checks for grammar and context.

Summarize: It takes any text and, well, summarizes it. (This also works for meeting transcripts.) 

Priority notifications: Apple Intelligence can figure out which notifications are most important to you.

Priority messages in Mail: The system will prioritize the emails it thinks are most important.

Smart Reply: Apple’s AI can generate email responses. You can edit these, reject them, or write your own.

Reduce Interruptions: A new Focus mode should be smart enough to let important notifications through.

Call transcripts: It is possible to record, transcribe, and summarize audio captured in Notes or during a Phone call. When a recording is initiated during a call in the Phone app, participants are automatically notified. After the call, Apple Intelligence generates a summary to help recall key points.

Search and Memory Movies in Photos

Search is much better in Photos. It will find images and videos that fit complex descriptions and can even locate a particular moment in a video clip that fits your search description.

Search terms can be highly complex; enter a description and Apple Intelligence will identify all the most appropriate images and videos, put together a storyline with chapters based on themes it figures out from within the collection, and create a Memory Movie. The idea is that your images are gathered, collected, and presented in an appropriate narrative arc; this feature debuted with iOS 18.1.


Clean Up tool in Photos

At least in my corner of social media, the Photos AI tool that most seemed to impress early beta testers was Clean Up. This super-smart implementation means Apple Intelligence can identify background objects in an image and let you remove them with a tap. I can still recall when removing items from within images required high-end software running on top-of-the-range computers equipped with vast amounts of memory. Now you can do it in an instant on an iPhone.

Image Playground for speedy creatives

Image Playground uses genAI to create animations, illustrations, and sketches from within any app, including Messages. Images are generated for you by Apple Intelligence in response to written commands. You can choose between a range of themes, places, or costumes, and also create an image based on a person from your Photos library.

Apple has made it so Image Playground can use ChatGPT to allow for new styles, such as oil painting or vector art. In another enhancement, it’s now possible to change facial expressions or features when making images inspired by family and friends using Genmoji and Image Playground.


Genmoji gets smarter

Genmoji uses genAI to create custom emoji. The idea is that you can type in a description of the emoji you want and select one of the automatically generated ones to use in a message. You are also able to keep editing the image to get to the one you want. (The only problem is that the person on the receiving end may not necessarily understand your creative zeal.)

Apple has also improved Genmoji by enabling users to mix different emoji and combine them with text descriptions to create new icons.

Image Wand

This Notes-only AI-assisted sketching tool can transform rough sketches into nicer images. In Notes, tap the pen icon and begin to sketch with your finger, then tap the new blue-purple-topped Image Wand tool in the tool palette. The AI will analyze the content to create an image based on what you drew. You can also select an empty space and Image Wand will look at the rest of your Note to identify a context from which to create an image for you.


Camera Control in the iPhone 16 line

The iPhone 16 line gained a new feature that relies on visual intelligence and AI to handle some tasks. You can point your camera, for example, at a restaurant to get reviews or menus. It is also possible to use this feature to access third-party tools for more specific information, such as ChatGPT.

Additional visual tools are coming. For example, Siri will be able to complete in-app requests and take action across apps, such as finding images in your collection and then editing them inside another app.

Live Translation will open up communication

Live Translation is profound. Built into Messages, FaceTime, and the Phone app, it runs entirely on device and can automatically translate messages between languages, offer translated live captions on FaceTime calls, or speak the translation aloud during a phone conversation.

It is widely believed this feature will come to AirPods at some point, effectively giving users the equivalent of a translator in their ears — a major win for travelers.

Communication tools

Starting in iOS 26, Messages gains automatic poll creation tools so you can get feedback during a conversation; you also benefit from natural language search and live translation. In the Phone app, summaries are now available automatically for voicemail transcripts. Smart Reply in Messages and Mail provides suggestions for a quick response and will identify questions to ensure everything is answered.


Siri now supports ChatGPT

ChatGPT integration in Siri means that when you ask Siri a question, it will try to answer using its own resources; if it is unable to do so it will ask whether you want to use ChatGPT to get the answer. You don’t have to, but you will get free access to it if you choose. Privacy protections are built in for users who access ChatGPT — IP addresses are obscured, and OpenAI won’t store requests. 

Following a host of problems in its development, Siri will eventually get significant improvements to deliver better contextual understanding and predictive intelligence based on what your devices learn about you. You might use it to find a friend’s flight number and arrival time from a search through Mail or to put together travel plans — or any other query that requires contextual understanding of your situation. 

Apple has said the contextual features should appear in 2026.


On-screen awareness

A new evolution in contextual awareness will give Siri the ability to take and use information on your display. The idea is that whatever is on your screen becomes usable in some way — you might add addresses to your contacts list, or track threads in an email, for example. It’s a profound connection between what you do on your device and wherever you happen to be; the feature is now expected to arrive in 2026.

Another, and perhaps even more powerful, improvement will allow Siri to control apps. And because it uses genAI, you’ll be able to pull together a variety of instructions and apps — such as editing an image and adding it to a Note without having to open or use any apps yourself. This builds on the accessibility tools Apple already has and leans into some of the visionOS user interface improvements.

It’s another sign of the extent to which user interfaces are becoming highly personal.

Visual Intelligence

While these contextual features are taking time to bake, Apple is seeding some to show what will be possible. In the OS 26 releases, Visual Intelligence, for example, now understands what is on screen, lets you ask ChatGPT questions about what you are looking at, and allows you to search for specific items or add events to your calendar.

Visual Intelligence also recognizes when you are looking at an event and suggests adding it to your calendar. Apple Intelligence will then gather the date, time, and location to create an event reminder. These tools are activated by pressing the same buttons you use to take a screenshot.

That’s a big step toward a wider contextual intelligence.


Shortcuts, now with Apple Intelligence

Another major step toward smart, responsive devices is the decision to add Apple Intelligence support to the Shortcuts app. This makes it possible to create your own workflows using AI. 

All of Apple’s own AI tools have been made available for use in Shortcuts, including summarization from Writing Tools. It is also possible to call AI models from third-party services, or run models on-device, from within a Shortcut.

Effectively, this means you can use Apple Intelligence in your own Shortcuts routines; third-party developers can access the same tools when putting together their own apps. An AI-tinged instruction can now be executed within a string of other actions, for potential productivity gains.

There does seem to be a lot of potential ready to be unlocked as productivity workers figure out how and where Apple Intelligence and Shortcuts can help them get things done. For example, a student might build a routine that uses Apple Intelligence to compare their own class notes to an audio transcript of a lecture to ensure they didn’t miss anything.

It is also possible to run one of your created Shortcuts from within Spotlight on your Mac, or make it available on spoken command using Siri.


Smarter than your average wallet

Apple Wallet uses AI to automatically identify, summarize, and display order-tracking details from emails sent by merchants or delivery carriers. Apple Pay lets you pay with rewards and installments for in-store purchases, and provides access to installment loans from eligible credit or debit cards when making a purchase with Apple Pay in the US. These features are being rolled out across selected providers in the US, UK, and Canada.

Apple Maps gains intelligence

The iPhone can use on-device intelligence to learn the routes people take between the places they frequently visit, like home and work. Maps then previews that commute to warn of delays and suggest alternative routes. Visited Places is a new personalization feature in which the device intelligently detects the places you spend time in, automatically saving those locations to Maps. It’s then much easier to find and share those places. (Apple promises this feature is highly secure, and it is possible to delete places.)

Better boarding passes

Boarding passes in Wallet have been enhanced with AI, providing lots of additional information, maps, and real-time flight data. You can also use a US passport to create a Digital ID in Apple Wallet, though this is only valid for domestic travel and cannot be used internationally.

Boarding passes in Wallet wield additional powers, including lost luggage tracking using Find My, access to airline services such as seat upgrades, and more. The latter features are being introduced across a limited number of airlines, including Air Canada, American Airlines, Delta Air Lines, JetBlue, Jetstar, Lufthansa Group, Qantas, Southwest Airlines, United Airlines, and Virgin Australia.

Apple Intelligence helps you stay more organized

Apple Intelligence will organize and categorize your Reminders lists for you. It can also create new reminders based on content from an email, website, or note, if you wish.

Apple Music boosted by AI

AutoMix mixes songs like a DJ in Apple Music. AI analyzes songs to build unique transitions, using time stretching and beat matching to enhance the listening experience.

Lyrics Translation helps listeners understand what songs mean, while Lyrics Pronunciation helps you sing lyrics correctly, even in another language.

Sing is a new karaoke system that transforms your iPhone into a handheld microphone for Apple TV, so you can have your voice amplified for impromptu singalongs, boosted by real-time lyrics and on-screen visual effects.

Apple Podcasts now provides an Enhance Dialogue feature to isolate voices and make them sound clearer. It’s also possible to slow down or speed up playback.


Apple Intelligence on Apple Watch

Some Apple Intelligence features also show up on Apple Watch, but Workout Buddy is unique to the device. This is an AI-powered wrist-worn fitness coach that uses your own workout data and fitness history to generate personalized inspirational encouragement during your workouts.  It will be available starting in English for the following types of workout: Outdoor and Indoor Run, Outdoor and Indoor Walk, Outdoor Cycle, HIIT, and Functional and Traditional Strength Training.

Where can I get Apple Intelligence?

Apple Intelligence is available across Apple’s ecosystem, with additional features planned for every Apple device. It’s an operating system feature, though not all of these tools will be available in every market. It supports the following languages: Chinese (Simplified), English (Australia, Canada, India, Ireland, New Zealand, Singapore, South Africa, UK, or US), French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish.

What devices work with Apple Intelligence?

Apple Intelligence requires an iPhone 15 Pro, iPhone 15 Pro Max, or iPhone 16 series device. It also runs on Macs and iPads equipped with an M1 or later chip, on Vision Pro, and on Apple Watch (when paired with an Apple Intelligence-enabled iPhone).


What about third-party apps?

One of the most important introductions at WWDC 2025, Apple’s Foundation Models framework gives developers the tools to tap Apple’s own on-device LLMs in their apps. Deployment within an app can require as little as three lines of Swift code. That means Writing Tools and translation tools can be built into apps used across Apple’s devices.
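
For a sense of what that looks like in practice, here is a minimal sketch based on the framework’s published LanguageModelSession API; the prompt text is a placeholder, and the calls must run in an async Swift context:

    import FoundationModels

    // Minimal sketch: ask Apple's on-device model to respond to a prompt.
    // The prompt is a placeholder; this must run in an async context.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize this note in one sentence: <note text>")
    print(response.content)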

That’s more than a convenience for developers and for customers; because these are Apple’s models, they work on the device, which means developers aren’t obliged to pay fees to access online services that provide equivalent functionality — plus privacy is protected.

Apple has also woven AI into Xcode 26, which has ChatGPT support built in; developers can use API keys from other providers or run local models on Apple Silicon-based Macs.

What AI is already inside Apple’s systems?

All these features are supplemented by the many AI tools Apple already has in place across its platforms, principally around machine vision and machine learning. You use these built-in capabilities each time you use Face ID, run facial recognition in Photos, or use the powerful Portrait Mode or Deep Fusion features when taking a photograph.

There are many more AI tools, from recognition of addresses and dates in emails for import into Calendar, to VoiceOver, all the way to Door Detection — even the Measure app on iPhones. What’s changed is that while Apple’s deliberate focus had been on machine-learning applications, the emergence of genAI unleashed a new era in which the contextual understanding available to LLMs uncovered a variety of new possibilities.

The omnipresence of various kinds of AI across the company’s systems shows the extent to which the dreams of Stanford researchers in the 1960s are becoming real today.


An alternative history of Apple Intelligence

Apple Intelligence might appear to have been on a slow train coming, but the company has, in fact, been working with AI for decades.

What exactly is AI?

Broadly speaking, AI is designed to enable computers and machines to simulate human intelligence and problem-solving capabilities. The hardware becomes smart enough to learn new tricks from the data it processes, and carries the tools needed to engage in such learning.

To trace the trail of modern AI, think back to 1963, when computer scientist and LISP inventor John McCarthy launched the Stanford Artificial Intelligence Laboratory (SAIL). His teams engaged in important research in robotics, machine-vision intelligence, and more.

SAIL was one of three important entities that helped define modern computing. Apple enthusiasts will likely have heard of the other two: Xerox’s Palo Alto Research Center (PARC), which developed the Alto that inspired Steve Jobs and the Macintosh, and Douglas Engelbart’s Augmentation Research Center. The latter is where the mouse concept was defined and subsequently licensed to Apple. 

Important early Apple luminaries who came from SAIL included Alan Kay and Macintosh user interface developer Larry Tesler — and some SAIL alumni still work at the company.

“Apple has been a leader in AI research and development for decades,” pioneering computer scientist and author Jerry Kaplan told me. “Siri and face recognition are just two of many examples of how they have put this investment to work.”

Back to the Newton…

Existing Apple Intelligence solutions include things we probably take for granted, going back to the handwriting recognition and natural language support in the 1990s Newton. That device leaned into research emanating from SAIL — Tesler led the team, after all. Apple’s early digital personal assistant first appeared in a 1987 concept video and was called Knowledge Navigator. (The video is still viewable online, but be warned, it’s a little blurry.)

Sadly, the technology couldn’t support the kind of human-like interaction we expect from ChatGPT and (eventually) Apple Intelligence. The world needed better and faster hardware, reliable internet infrastructure, and a vast mountain of research exploring AI algorithms, none of which existed at that time.

But by 2010, the company’s iPhone was ascendant, Macs had abandoned the PowerPC architecture to embrace Intel, and the iPad (which cannibalized the netbook market) had been released. Apple had become a mobile devices company. The time was right to deliver that Knowledge Navigator. 


When Apple bought Siri

In April 2010, Apple acquired Siri for $200 million. Siri itself is a spinoff from SAIL, and, just like the internet, the research behind it emanated from a US Defense Advanced Research Projects Agency (DARPA) project. The speech technology came from Nuance, and Apple completed the acquisition just before Siri would have been made available on Android and BlackBerry devices. Apple shelved those plans and put the intelligent assistant inside the iPhone 4S (dubbed by many the “iPhone for Steve,” given Steve Jobs’ death around the time it was released).

Highly regarded at first, Siri didn’t stand the test of time. AI research diverged, with neural networks, machine intelligence, and other forms of AI all following increasingly different paths. (Apple’s reluctance to embrace cloud-based services — due to concerns about user privacy and security — arguably held innovation back.)

Apple shifted Siri to a neural network-based AI system in 2014; it used on-device machine learning models such as deep neural networks (DNN), n-grams and other techniques, giving Apple’s automated assistant a bit more contextual intelligence. Apple Vice President Eddy Cue called the resulting improvement in accuracy “so significant that you do the test again to make sure that somebody didn’t drop a decimal place.”

But times changed fast.

Did Apple miss a trick?

In 2017, Google researchers published a landmark research paper, “Attention Is All You Need.” This proposed a new deep-learning architecture that became the foundation for the development of genAI. (One of the paper’s eight authors, Łukasz Kaiser, now works at OpenAI.)

One oversimplified way to understand the architecture is this: it helps make machines good at identifying and using complex connections between data, which makes their output far better and more contextually relevant. This is what makes genAI responses accurate and “human-like” and it’s what makes the new breed of smart machines smart.

The concept has accelerated AI research. “I’ve never seen AI move so fast as it has in the last couple of years,” Tom Gruber, one of Siri’s co-founders, said at the Project Voice conference in 2023.

Yet when ChatGPT arrived — kicking off the current genAI gold rush — Apple seemingly had no response. 

The (put it to) work ethic

Apple’s Cook likes to stress that AI is already in wide use across the company’s products. “It’s literally everywhere on our products and of course we’re also researching generative AI as well, so we have a lot going on,” he said. 

He’s not wrong. You don’t need to scratch deeply to identify multiple interactions in which Apple products simulate human intelligence. Think about crash detection, predictive text, caller ID based on a number not in your contact book but in an email, or even shortcuts to frequently opened apps on your iPhone. All of these machine learning tools are also a form of AI. 

Apple’s Core ML frameworks give developers powerful machine-learning tools they can use to power up their own products. Those frameworks build on the insights Adobe co-founder John Warnock had when he figured out how to automate the animation of scenes, and we will see those technologies widely used in the future of visionOS.
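
As a rough illustration (a sketch, not Apple’s documented quick-start), loading a bundled, compiled Core ML model and running a single prediction looks something like this; the model name “Regressor” and feature name “input” are hypothetical:

    import CoreML

    // Hedged sketch: load a compiled Core ML model from the app bundle and
    // run one prediction. "Regressor" and "input" are hypothetical names;
    // a real model defines its own feature names and types.
    guard let url = Bundle.main.url(forResource: "Regressor", withExtension: "mlmodelc") else {
        fatalError("Model not found in app bundle")
    }
    let model = try MLModel(contentsOf: url)
    let features = try MLDictionaryFeatureProvider(dictionary: ["input": 42.0])
    let prediction = try model.prediction(from: features)
    print(prediction.featureNames) // the model's output feature names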

All of this is AI, albeit focused (“narrow”) uses of it. It’s more machine intelligence than sentient machines. But in each AI application it delivers, Apple creates useful tools that don’t undermine user privacy or security.

The secrecy thing

Part of the problem for Apple is that so little is known about its work. That’s deliberate. “In contrast to many other companies, most notably Google, Apple tends not to encourage their researchers to publish potentially valuable proprietary work publicly,” Kaplan said.

But AI researchers like to work with others, and Apple’s need for secrecy acts as a disincentive for those in AI research. “I think the main impact is that it reduces their attractiveness as an employer for AI researchers,” Kaplan said. “What top performer wants to work at a job where they can’t publicize their work and enhance their professional reputation?” 

It also means the AI experts Apple does recruit subsequently leave for more collaborative freedom. For example, Apple acquired search technology firm Laserlike in 2018, and within four years all three of that company’s founders had quit. And Apple’s director of machine learning, Ian Goodfellow (another SAIL alumnus), left the company in 2022. (I imagine the staff churn makes life tough for former Google Chief of Search and AI John Giannandrea, who is now Apple’s senior vice president of machine learning and AI strategy.)

That cultural difference between Apple’s traditional approach and the preference for open collaboration and research in the AI dev community might have caused other problems. The Wall Street Journal reported that at some point both Giannandrea and Federighi were competing for resources to the detriment of the AI team; we’ve since learned of additional problems within the development team, with leadership replaced.

Despite these setbacks, the company has assembled a large group of highly regarded AI pros, including Samy Bengio, who leads company research in deep learning. Apple has also loosened up a great deal, publishing research papers and open source AI software and machine learning models to foster collaboration across the industry. Competition, however, remains intense, with highly-placed AI engineers regularly poached by competitors.

Apple has also hinted it may be prepared to use its financial muscle to acquire strategically useful companies. “We’re very open to M&A that accelerates our road map,” Cook said during the company’s July 25, 2025 earnings call. Cook has also told an all-hands meeting of Apple staff that “AI is ours to grab.”

Despite the challenges it’s faced, Apple is far from finished in the space, with its AI team leaders instructed to do “whatever it takes” to build the best available AI tools and services.

“Apple is one potential AI partnership away from breaking out,” Morgan Stanley said in August 2025.

What next?

History is always in the rear view mirror, but if you squint just a little bit, it can also show you tomorrow. Speaking at the Project Voice conference in 2023, Siri co-founder Adam Cheyer said: “ChatGPT style AI…conversational systems…will become part of the fabric of our lives and over the next 10 years we will optimize it and become accustomed to it. Then a new invention will emerge and that will become AI.”

At least one report indicates Apple sees this evolution of intelligent machinery as foundational to innovation. While that means more tools, and more advances in user interfaces, each of those steps leads inevitably toward A…
https://www.computerworld.com/article/3511199/everything-we-know-about-apple-intelligence.html
