Apple’s macOS: AI for the rest of us
Tuesday, November 18, 2025, 06:55 PM, from ComputerWorld
Apple is building the world’s best ecosystem for artificial intelligence. The Mac, supported by iPads and iPhones, ticks nearly every box for the inevitable future of the technology. After all, if that future is to be fully realized, a big part of it will involve on-premises, highly secure, private AI systems handling focused problems specific to the business or individual that runs them.
The ongoing eye-watering investments in hyperscalers cannot be the only way to unleash this tech; the levels of investment are unsustainable economically, politically, and environmentally. While there will always be a need for large cloud-based AI clusters to handle big problems, service providers and customers won't want to carry so much cost. They need AI they can run at more reasonable prices. That's why 73% of organizations using Macs are already using them to run AI in their business.

What do we want? Flexible AI

Logically, that means using independent systems that can run efficiently without consuming vast quantities of water or energy. Those systems also need to be flexible so they can be deployed for other uses when AI functions aren't required. That also implies these systems will be clustered, potentially using off-the-shelf parts and a combination of proprietary, open-source, and self-made data and models.

What will these systems be like? I think they'll be like the Mac, because Apple Silicon is already built for AI. Apple has created systems that deliver industry-leading performance per watt, make efficient use of memory (boosted by the use of unified memory), and can churn through most large language models (LLMs) at world-class speed.

"Whether you're working with AI or building it, the [Mac] is an amazing system for many of the things employees need from a performance perspective," Apple's Colleen Novielli, who focuses on MacBook product marketing, told me recently. "We're seeing this amazing spectrum of adoption across the Mac range."

Coming soon in macOS

Apple continues to be very focused on how Macs can support AI. The company's upcoming macOS 26.2 release, for example, brings significant software improvements for AI workloads, such as better utilization of the neural accelerators in Apple Silicon, resulting in up to 4.1x faster prompt processing for LLMs.
Even better, macOS 26.2 introduces low-latency communication over Thunderbolt 5, allowing developers to cluster multiple Macs (such as Mac Studios) for high-performance AI applications. That's going to permit the creation of ad hoc AI supercomputers using pro Macs. The newest Mac Studio supports up to 512GB of unified memory per device, and macOS 26.2 will let you cluster multiple Mac Studios using Thunderbolt 5 for low-latency, high-bandwidth communication. That enables them to run massive AI models (like a 1-trillion-parameter coding model) efficiently, using the combined 2TB of unified memory across a cluster of four such Macs.

The Thunderbolt clustering support makes it possible to use off-the-shelf software such as EXO, combined with Apple's open-source MLX framework, to run AI locally on clustered Macs. Doing so is highly efficient, deeply private (as no information need ever be shared outside your location), and, because of how Apple has designed its platforms, highly energy efficient. (The energy consumption is a fraction of that of traditional GPU-based solutions.)

AI for the rest of us

For all these reasons, Apple's Macs are seeing wide deployment across the AI industry, where thousands of researchers are using Apple Silicon and MLX. Macs also democratize access to AI. They are cost-effective, energy-efficient, and easy to set up, which means high-performance AI becomes accessible to small teams, businesses, and researchers.

With macOS 26.2, it becomes possible to grab your Thunderbolt 5-equipped Macs, connect them using EXO and any old Thunderbolt 5 cable, and build a system capable of running DeepSeek, Qwen, Llama, or Mistral for you: privately, securely, and without compromise. It's on-prem, without breaking the bank. You don't need a special cooling system and can run the whole thing using a domestic power supply.
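The cluster sizing above can be sanity-checked with quick arithmetic. This sketch estimates how many Mac Studios a model's weights would need; the 512GB ceiling comes from the Mac Studio spec cited above, while the bytes-per-parameter figures are illustrative assumptions, and the estimate deliberately ignores activation memory, KV cache, and runtime overhead:

```python
import math

MAC_STUDIO_MEMORY_GB = 512  # max unified memory per Mac Studio, per the article

def macs_needed(params_billions: float, bytes_per_param: float) -> int:
    """Rough lower bound on how many Mac Studios can hold a model's weights.

    Ignores activation memory, KV cache, and runtime overhead, so treat
    the result as a floor, not a deployment plan.
    """
    weights_gb = params_billions * bytes_per_param  # 1B params x 1 byte = 1 GB
    return math.ceil(weights_gb / MAC_STUDIO_MEMORY_GB)

# A 1-trillion-parameter model at 16-bit precision (2 bytes/param) needs
# ~2TB of weights, i.e. the four-Mac cluster the article describes.
print(macs_needed(1000, 2.0))  # 4

# The same model quantized to 4 bits (0.5 bytes/param) would fit in one Mac.
print(macs_needed(1000, 0.5))  # 1
```

The same arithmetic works in reverse: each Mac you add to the cluster raises the weight budget by another 512GB, which is why scaling out is as simple as plugging in another machine.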
As the needs of your LLM systems expand, you can just bring in another Mac, increasing the total number of CPU and GPU cores you have available on the combined system each time you do. All of this can be achieved once macOS 26.2 eventually appears.

The truth may be out there, but right here, right now, Apple's truth is that it has positioned its Windows-eating PC platform as the very best tool for the creation, delivery, and consumption of AI, making a powerful case for the future of its platforms. It is making systems that promise AI for the rest of us.

You can follow me on social media! Join me on BlueSky, LinkedIn, and Mastodon.
https://www.computerworld.com/article/4092162/apples-macos-ai-for-the-rest-of-us.html