Apple Intelligence FAQ: What it is, what it does, and when you’ll get it

Wednesday, September 18, 2024, 09:51 PM, from Mac 911
Macworld

At WWDC on June 10, 2024, Apple took the wraps off its ambitious project to inject generative AI features throughout its operating systems. Apple calls this Apple Intelligence, and it’s going to transform the way you use your iPhone, iPad, and Mac—but it also has significant limitations and caveats.

Here’s everything you need to know about Apple Intelligence before it lands on your devices: What it is, what it does, how it works, when it’s coming, and what you’ll need to be able to use it.

Updated December 12: iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2 are out now, bringing more Apple Intelligence features to compatible iPhones, Macs and iPads.

What is Apple Intelligence?

Apple Intelligence is Apple’s branded term for its suite of generative AI features that debuted in iOS 18, iPadOS 18, and macOS 15 Sequoia.

Since Apple has been using machine learning and “artificial intelligence” in its products for years (though not generative AI), and because Apple is a company that can never pass up a good branding opportunity, it took “AI” and gave it a snazzy new name.

Apple Intelligence comprises several features, including some centered on reading and writing, some for image generation, and a handful of others. It’s a way of getting things done more quickly through voice and text prompts that draw on personal context to deliver results privately.

When does Apple Intelligence come out?

The first set of Apple Intelligence features became available on October 28 with the release of iOS 18.1, iPadOS 18.1, and macOS 15.1. Additional features arrived in December with the x.2 releases, as outlined in How Apple Intelligence levels up with iOS 18.2 and macOS 15.2.

Yet more Apple Intelligence features will arrive in the spring with the x.3 or x.4 releases.

We have an article that gives you a detailed roadmap of Apple Intelligence features and their likely release dates.

When will Apple Intelligence work in my language?

The first release supported only US English and was available only in some regions. Localized English for the UK, Canada, Australia, South Africa, and New Zealand arrived in December 2024.

Support for German, Italian, Korean, Portuguese, Vietnamese, and some other languages will be added throughout 2025.

Regardless of language, Apple Intelligence will be released in the European Union in April 2025.

Apple Intelligence features: What does Apple Intelligence do?

Apple Intelligence, in this first release, is a broad set of features that can be loosely categorized into four groups: Siri, writing, images, and summaries/organization.

Siri

Siri will be vastly improved with Apple Intelligence. It will be more natural and easier to talk to using normal speech, even if you mess up your words. It will use context from throughout your iPhone (photos, messages, contacts, locations, and more) to give results that are specific to you, personally. It’s such a big upgrade that Apple calls it “the start of a new era for Siri.”

Siri will remember context from one command to the next, so you won’t need to summon it a second or third time to do more than one task. It can also perform lots of new actions within apps. Apple is expanding App Intents, the framework through which apps (including third-party apps) integrate with Siri. Siri will also be able to look at the screen and understand what’s on it, so you can give it commands related to what you’re looking at. If a friend messages you his address, you can say, “Siri, save this address to his contact info,” and Siri will see the address on the screen and know who “his” refers to from the context of the message.
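For developers, the relevant hook is Apple’s existing App Intents framework, which exposes in-app actions to Siri and Shortcuts. As a rough illustration (not an Apple sample—the intent name, parameters, and dialog are invented for this sketch), a third-party contacts app might declare an intent like this:

```swift
import AppIntents

// Hypothetical intent for illustration only: lets Siri save an address
// to a named contact inside a third-party app.
struct SaveAddressIntent: AppIntent {
    static var title: LocalizedStringResource = "Save Address to Contact"

    @Parameter(title: "Address")
    var address: String

    @Parameter(title: "Contact Name")
    var contactName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own logic for attaching the address to the contact
        // would go here (e.g. a call into its contacts store).
        return .result(dialog: "Saved \(address) to \(contactName).")
    }
}
```

Intents like this are how Siri learns which actions an app can perform; Apple Intelligence’s added personal and on-screen context is what should let it pick the right one from a natural request.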

The biggest improvements to Siri are not coming until an update in the spring of 2025.

Siri’s ability to perform multiple actions across apps will be a big step up. Image: Apple

Writing

Almost anywhere in the system where you write (Messages, Mail, Notes, web forums, you name it), you’ll be able to quickly call up new AI-powered writing tools that make it easier to say what you want, the way you want to say it. The tools can take selected text and change its style (friendly, professional, concise), create summaries or lists, or just proofread it for spelling and grammar.
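On the developer side, standard text controls are expected to pick up Writing Tools automatically. As a hedged sketch based on the UIKit property Apple introduced alongside iOS 18 (writingToolsBehavior; the view controller below is invented for illustration), an app could tune or opt out of that behavior like so:

```swift
import UIKit

// Minimal sketch: a hypothetical notes screen whose text view opts in
// to full Writing Tools support (assumes UIKit's iOS 18 API).
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            // .complete allows inline rewriting; .limited or .none would
            // restrict or disable the system Writing Tools UI here.
            textView.writingToolsBehavior = .complete
        }
    }
}
```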

If you want to generate new text, smart replies can take a few contextual bits of information you provide and craft an appropriate response. Say someone sends you an email inviting you to their holiday cookout. You can provide simple details, such as whether you’re going, when you’ll be there, or whether you’ll bring something, and the system will create a whole reply email for you.

New in iOS 18.2 is a text input field where you can describe any sort of change you want. For example, you could ask for a poem or for something scary.

Writing tools to change style, produce text, or proofread are central to Apple Intelligence. Image: Apple

Images

Apple is including image generation tools in Apple Intelligence. It can create new images in two styles, illustration and animation; the lack of realistic photographic depictions seems like a deliberate safety choice. You can type a description to get an image or start with a rough sketch of what you want. You can even use an image from your photos library as a starting point, or create a “Genmoji” out of people in your photos or contacts. Genmoji makes a new custom image in the style of Apple’s emoji: just describe the emoji you wish to see, and Apple’s AI will create various options. Read: How to create your own custom emoji with Genmoji.

There’s a new dedicated app called Image Playground where you can experiment with all these image-generation tools, but they’re also available throughout the system. For example, you can circle a rough sketch or even a blank space in Notes with the new Magic Wand tool and have Apple Intelligence generate an image that fits the surrounding context. In Messages, you can make an image of someone you know that’s relevant to your conversation. Apple is also creating APIs so third-party developers can use these tools in their own apps.
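As an assumption about what those developer APIs look like (Apple ships an ImagePlayground framework with iOS 18.2, but the exact modifier and parameters shown here should be treated as illustrative rather than authoritative), a SwiftUI app might present the same generation sheet roughly like this:

```swift
import SwiftUI
import ImagePlayground

// Rough sketch, assuming the ImagePlayground SwiftUI sheet (iOS 18.2+).
// The view, prompt text, and state handling are invented for illustration.
@available(iOS 18.2, *)
struct StickerMakerView: View {
    @State private var showPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Create an image") { showPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "a golden retriever wearing sunglasses" // text prompt
            ) { url in
                // The sheet hands back a file URL for the generated image.
                generatedImageURL = url
            }
    }
}
```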

Apple Intelligence’s enhanced image understanding shows up in other ways, too. You can get very specific when searching your photos, with prompts like “Show photos of Charlotte from last summer when she was wearing sunglasses.” Finally, a new Clean Up tool in the Photos app lets you instantly remove unwanted objects from the background of your photos.

Some image generation tools arrived as part of the second wave of Apple Intelligence features in December. You can now use AI to create an image by typing a description or using various modifiers, with the image created in your choice of animation or illustration style. You can select people from your Photos library to use as a subject.

The Image Playground app lets you experiment with AI art, but the tools are integrated throughout iOS. Image: Apple

Summaries and organization

Apple Intelligence has a much greater understanding of language, so it can do a better job of understanding and presenting all the text you deal with.

Your Mail inbox, for example, can show summaries of emails instead of just the first few lines, so it’s easier to find the one you’re looking for. Apple Intelligence understands the content of your emails and will automatically place them into categories (Primary, Transactions, Updates, and Promotions), and build “digests” of emails from the same sender. Important emails can be discovered and bumped up to a list at the top.

Safari’s reader mode can summarize web pages. Apple Intelligence can also find priority notifications and show them in a brief list, with summaries, at the top of your notification stack. You can even engage a Focus mode setting that checks notifications as they come in and silences most of them, but lets through the ones that seem like they could be important.

A great example of how AI will permeate the system can be found in the Phone app, where you’ll be able to record any call (both parties will be notified) and save a transcript and summary of it. That record also helps the AI find the call later, or recall what you talked about, to deliver more personal results.

How does Apple Intelligence differ from other AI?

Apple Intelligence is similar in many ways to other generative AI, but Apple stresses several qualities that make it stand out.

First is Apple’s focus on privacy. All the features described here run mostly on-device, using Apple’s advanced hardware and silicon, so in most cases your data never leaves your device. Apple is not “scooping up” your information to sell it or to train its AI models.

When something needs to be done that requires a bigger and more complex model than can be run on-device, Apple employs a new private cloud architecture called Private Cloud Compute that uses its own hardware. Only the very specific data necessary to complete your request is sent in a secure fashion, and after your request is completed the data is discarded. Apple has promised to make the server code accessible to outside security researchers who can audit it to make sure Apple is keeping its privacy promise.

Second, Apple Intelligence is personal rather than general. Because it runs mostly on the device and the cloud implementation is very private and fleeting, it can build a knowledge graph about you using all sorts of information on your iPhone (locations, photos, messages, mail, contacts, and much more). This enables it to give answers and produce results that are specific to your life and not just generalized.

And finally, it’s deeply integrated throughout the system: available in most of Apple’s apps, with APIs that let developers bring the same capabilities into their own apps.

What about ChatGPT, Gemini, Copilot, or Meta AI?

Apple acknowledges that those popular AI chatbots have a massive base of general knowledge and more information about things like current events. So it’s not locking them out—on the contrary, it’s inviting them in.

As of iOS 18.2, there’s a “Compose” button in the Writing Tools box that invites you to compose something with ChatGPT.

Siri will work seamlessly with ChatGPT (powered by OpenAI’s GPT-4o model) to answer complex questions or those that require broad general knowledge rather than personal info about you.

Just ask Siri anything and, if it needs to hand the query off to ChatGPT, it will first ask whether that’s OK (and, if you’re providing an image, whether it’s okay to send it). If you allow it, you’ll get an immediate ChatGPT response with no need to install an app, log in to anything, or register a ChatGPT account. If you do have a ChatGPT subscription, however, you’ll have access to the more advanced features you pay for.

ChatGPT integration in Siri, writing, and image tools is coming later this year. Image: Apple

While ChatGPT is the first AI integration, Apple promises to allow others in the future, such as Google Gemini; Google was rumored to be in talks with Apple ahead of WWDC. It wouldn’t be surprising to see other AI services join Apple Intelligence quickly.

Once again, we should note that Apple will always ask before sharing any data with ChatGPT and that the company’s arrangement with OpenAI stipulates that IP addresses will be obscured and no data will be saved.

Visual Intelligence

iOS 18.2 brings a new feature to the Camera Control button on the iPhone 16 series. Just point your camera at something, press and hold the Camera Control button, and Visual Intelligence will use Siri and ChatGPT to tell you about it.

What devices do I need to use Apple Intelligence?

Unfortunately, all this powerful local generative AI has a steep hardware cost when it comes to which devices are compatible with Apple Intelligence.

If you have an iPhone, you’ll need an A17 Pro (or newer) processor, which means only the iPhone 15 Pro, the iPhone 15 Pro Max, and all iPhone 16 models are compatible with Apple Intelligence.

For Macs and iPads, you’ll need an M1 or newer processor. That means no Intel Macs, no matter how powerful, and no iPads that run A-series processors. Nearly all Macs released in the last few years qualify, but on the iPad side only iPad Pro models from 2021 onward and the two most recent iPad Air models, which have M1 and M2 processors, make the cut.

Apple Intelligence is not coming to other operating systems just yet, either. You won’t find its features in tvOS 18 or visionOS 2, and while HomePod’s screenless state makes most of these features moot anyway, it’s not yet clear if or when the superior Siri experience will come to HomePod.
https://www.macworld.com/article/2362804/apple-intelligence-faq-features-release-date-hardware-suppo...
