WWDC: For developers, Apple’s tools get a lot better for AI

Thursday, June 12, 2025, 06:51 PM, from ComputerWorld
Apple announced one important, and immediate, upgrade at WWDC this week: support for third-party large language models (LLMs), such as ChatGPT, from within Xcode. It’s a big step that should benefit developers by accelerating app development.

“Developers play a vital role in shaping the experiences customers love across Apple platforms,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations. “With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we’re empowering developers to build richer, more intuitive apps for users everywhere.”

Xcode 26: GenAI inside

Apple explains that, as of now, the LLM integration means developers can connect models directly into their coding workflow to write code, tests, and documentation; iterate on a design; fix errors; and more.

[ Related: Apple WWDC 2025: News and analysis ]

ChatGPT support is built in, and developers can use API keys from other providers or run local models on Apple silicon Macs. Notably, developers can start using ChatGPT in Xcode without creating an account, though ChatGPT subscribers get more from the service.

Used alongside Apple’s new Foundation Models framework, which lets developers tap the on-device Apple Intelligence model within their apps with as few as three lines of code, it’s pretty clear that even if Apple Intelligence hasn’t yet met the company’s ambitions for AI, the era of artificial intelligence has certainly arrived on its ecosystem. After all, once developers build with AI, they will inevitably create AI services; the rest is an as-yet-unwritten history to be unveiled one application at a time.
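
To make the “three lines of code” claim concrete, here is a minimal sketch of what calling the on-device model through the Foundation Models framework can look like, based on Apple’s description at WWDC; the prompt text is invented for illustration, and exact API details may shift as the beta evolves.

    import FoundationModels

    // Ask the on-device Apple Intelligence model for a response (prompt is illustrative).
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Suggest three names for a travel journaling app.")
    print(response.content)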

How does the ChatGPT integration work?

When developers are working in Xcode, they can access ChatGPT from within the coding pane. The idea is that a developer simply types a prompt in the pane to have ChatGPT generate previews, fix coding errors, or create new functions. These tools should speed up routine coding and let developers focus their skills on more complex application development tasks.

One thing was missing from Xcode: a tool called Swift Assist. Apple announced that tool, which was intended to help developers write code using AI, at WWDC last year. That it hasn’t yet shipped may reflect some of the setbacks in Apple’s internal AI development projects, and the introduction of ChatGPT support hints that perhaps it never will.

In the run-up to WWDC, expectations had built that Apple might work with Anthropic to power the AI inside Xcode 26. That hasn’t happened, but perhaps the situation will change once the new ’26-branded Apple operating systems ship this fall. Developers equipped with an Anthropic API key can still access the service, however, and it is good that the company has chosen not to lock developers into a single AI approach.

What the developers are thinking

While it is a little early to say for sure, Apple developers seem to be embracing the integration. One developer quickly fired up Xcode 26 to build a fully on-device AI chatbot using Apple’s Foundation Models framework, making full use of ChatGPT’s code-completion help as he did. “I leaned on the new code-complete features in Xcode to scaffold the project ridiculously fast. There were bugs, of course, but it significantly sped up the development of boilerplate code,” he wrote.

His work confirmed some of the strengths of Apple’s approach, particularly that developers concerned about code privacy can hook Xcode up to their own internal AI models, including locally hosted ones. Developers curious to try ChatGPT in Xcode must also be running the macOS 26 beta, so they may want to wait a while before installing it on their primary machines.
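
As a rough sketch of what such an on-device chatbot can look like (this is not the developer’s actual code), a Foundation Models session can be reused across turns so the model keeps the conversation’s context; the instructions string and the command-line loop here are assumptions for illustration.

    import FoundationModels

    // A minimal command-line chat loop. Reusing one session keeps the
    // conversation transcript, so each turn can refer to earlier exchanges.
    let session = LanguageModelSession(
        instructions: "You are a concise, friendly assistant."
    )

    print("Type a message (empty line to quit):")
    while let line = readLine(), !line.isEmpty {
        do {
            let reply = try await session.respond(to: line)
            print(reply.content)
        } catch {
            print("Request failed: \(error)")
        }
    }

Note that code like this targets Apple’s on-device model at runtime; the ChatGPT integration described above lives in Xcode’s coding assistant and is separate from what a shipping app calls.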

These new GenAI code-creation features will make a difference to app developers. But Apple also introduced a host of supporting technologies and APIs to unleash machine learning across its ecosystem, including the Foundation Models framework, improved speech-to-text capabilities, Metal 4, App Intents, and welcome enhancements to MLX, the open source framework that can be used to train and run LLMs on Apple silicon.

There’s an excellent in-depth developer talk explaining some of the latter new features here.

The bottom line? Apple might not yet have made Siri into the smart assistant it wants it to become, but it has still decisively enriched its offering, enabling developers to build AI-informed applications on Apple silicon Macs that deliver some of the best computational performance in the industry.

It appears reports of the company’s demise might have been somewhat exaggerated.

You can follow me on social media! Join me on BlueSky, LinkedIn, and Mastodon.