
How Apple tech can deliver your very own private AI answers

Wednesday November 19, 2025, 05:40 PM, from ComputerWorld
Yesterday, we looked at how Macs can provide on-premises AI services for you; today we’re going to speculate a little more. Consider this: millions of people use iPhones to access public generative AI (genAI) tools such as ChatGPT, Gemini, and others. When you use those tools, you’re sharing your data with cloud providers, which isn’t necessarily a good thing. 

What if there were another way?

Well, there is another way, one in which your own AI Mac cluster becomes the first port of call for AI functions you can’t do natively on your Apple device. This article describes the installation of a working version of DeepSeek on a Mac for access from a remote iPhone. 

In business, this becomes an on-premises AI that can be accessed remotely by authorized endpoints (you, your iPhone, your employees’ devices). The beauty of this arrangement is that whatever data you share or requests you might make are handled only by the devices and software you control. 

How it might work

You might be running an open-source Llama large language model (LLM) to analyze your business documents and databases, combined with data privately gathered from the web, to give your field operatives access to up-to-the-minute analysis relevant to them.

In this model, you might have a couple of high-memory Macs (even an M1 Max Mac Studio, which you can get second-hand for around $1,000) securely hosted at your offices, with access managed by your choice of secure remote access solutions and your own endpoint security profiling/MDM tools. You might use Apple’s ML framework, MLX, to run the models you choose, or turn to other solutions, including Ollama. 
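To make that concrete, here is a minimal Swift sketch of how an iPhone app might query an Ollama server running on one of those Macs. The hostname, model name, and plain-HTTP transport are illustrative assumptions; in practice you would reach the Mac through whatever VPN or secure tunnel you have chosen, ideally over TLS. The /api/generate endpoint and request shape follow Ollama’s documented REST API.

```swift
import Foundation

// Sketch only: an iOS client asking a question of an Ollama server hosted on a
// Mac you control. The hostname and model name below are assumptions; swap in
// your own secure endpoint and whichever model you have pulled onto the Mac.
struct OllamaRequest: Codable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct OllamaResponse: Codable {
    let response: String   // Ollama's non-streaming reply text
}

func askPrivateAI(_ prompt: String) async throws -> String {
    // Reach the Mac over your LAN, VPN, or other secure remote-access layer.
    let url = URL(string: "http://mac-studio.local:11434/api/generate")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        OllamaRequest(model: "deepseek-r1", prompt: prompt, stream: false)
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(OllamaResponse.self, from: data).response
}

// Usage:
// let answer = try await askPrivateAI("Summarize this quarter's field reports.")
```

The pattern is the same for any endpoint you authorize: the request travels only between devices and software you control, which is the whole point of the arrangement.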

As the useful how-to guide noted above shows, people are already experimenting with usage models like this. When they do, they should find performance is pretty solid, thanks to the excellence of Apple Silicon and its efficient memory management. There are some limits, such as how many tokens of thought your Mac can produce, and things slow down as tasks become more complex. But it works pretty well up to a point. 

Apple continues to raise that point, delivering more performance and better efficiency.

Apple continues to improve its infrastructure

Apple is reportedly about to enable the creation of ad hoc Mac clusters over Thunderbolt 5, making it much easier to deploy teams of Macs. That means better performance and access to the combined memory of those machines.

That matters. LLMs demand a lot of resources, so the capacity to easily cluster multiple Macs makes it possible to use on-prem solutions for more complex AI questions. (macOS Tahoe 26.2 will also give MLX full access to the neural accelerators hosted on M5 chips, which should deliver immediate and dramatic speed improvements for AI inference.)

Developers, meanwhile, are making extensive use of Apple’s Foundation Models framework to access Apple Intelligence LLMs from within their apps. If you’d like to test the potential for yourself, you could use an app that supports it, or even explore a project called AFM, which lets you run those models from the command line. 
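For a rough sense of what that looks like in code, here is a hedged Swift sketch of prompting the on-device Apple Intelligence model through the Foundation Models framework (iOS 26 / macOS 26 and later). The function name and instructions string are my own; check Apple’s current documentation for the exact API surface before relying on it.

```swift
import FoundationModels

// Sketch only: ask the on-device Apple Intelligence model a question via the
// Foundation Models framework. Requires Apple Intelligence to be enabled on
// supported hardware; the wording of the instructions is illustrative.
@available(iOS 26.0, macOS 26.0, *)
func summarizeLocally(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You are a concise assistant. Answer in two sentences."
    )
    let response = try await session.respond(to: "Summarize: \(text)")
    return response.content
}
```

Because the model runs on-device, the prompt and the response never leave the hardware you own; it is the same privacy argument as the Mac cluster, just at a smaller scale.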

The steady democratization of artificial intelligence continues. It must, if we’re to break the stranglehold of elite ownership of the world’s leading AI models. To a great extent, the performance-per-watt advantages of Apple Silicon are really coming into their own, as well they might, given that AI was part of Apple’s goal when it designed Mac silicon. 

Now, imagine how these kinds of private, personal AI deployments could help when using a visionOS device to interrogate the Mac AI cluster you keep safe and sound in your office or home.

That’s just one possible end game in the drive toward local, on-device edge AI, and it’s closer to realization than you think.

You can follow me on social media! Join me on BlueSky, LinkedIn, and Mastodon.
https://www.computerworld.com/article/4093120/how-apple-tech-can-deliver-your-very-own-private-ai-an...
