Apple Intelligence has exposed HomePod as a not-so-smart speaker

Monday, July 1, 2024, 04:14 PM, from Macworld UK

During WWDC24’s opening keynote, Apple previewed some of the generative AI features rolling out on select devices later this year. From text and notification summaries to artificially generated images and emojis, Apple Intelligence will equip compatible iPhones, iPads, and Macs with a plethora of handy tools that work at the system level.

Notably, several devices are omitted from that list—Apple Watch, Vision Pro, and perhaps most glaring of all, HomePod. For the time being, Apple’s smart speakers ironically won’t be getting any of the Apple Intelligence perks. And they probably never will.

HomePod’s hardware constraints

Perhaps the main reason Apple Intelligence won’t be supported on current HomePod models is their internal specs. For reference, the upcoming AI features will only work on some of the highest-end Apple products, including M-series powered iPads and Macs, in addition to the iPhone 15 Pro and 15 Pro Max—thanks to the A17 Pro chip.

What all of these compatible devices have in common is at least 8GB of RAM and a powerful Neural Engine. So, while the iPhone 15 and 15 Plus launched alongside the 15 Pro models, they won’t support Apple Intelligence features. That’s presumably due to the A16 Bionic chipset and the lower 6GB of RAM it packs.

The latest HomePod 2 is fueled by an S7 chip, featuring just 1GB of RAM and no Neural Engine. It’s the same SoC found in 2021’s Apple Watch Series 7. If the most capable, S9-equipped Apple Watch Series 9 and Ultra 2 can’t handle Apple Intelligence, then neither can devices powered by inferior processors—such as the HomePod lineup.

Private Cloud Compute isn’t the answer

Apple Intelligence, by design, prioritizes on-device processing. However, Apple has also developed a fallback, cloud-based infrastructure to process more advanced queries that consumer devices can’t handle locally. Dubbed “Private Cloud Compute,” these server-based models utilize Apple silicon chips and their Secure Enclave to analyze encrypted user data. So, why not bake support for Private Cloud Compute into HomePods?

The HomePod mini is the perfect vehicle for Apple Intelligence—but it won’t get it. Foundry

Well, as mentioned earlier, Apple wants Private Cloud Compute to act as a fallback option when the device can’t perform a certain task. It’s not meant to be the default or sole engine powering Apple’s AI features on incompatible products. There could be several reasons for this, such as the server overload that would result if millions of underpowered devices actively sent requests for cloud processing.

John Gruber at Daring Fireball explains in greater detail: “The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT.” He adds that the on-device processing component of Apple Intelligence “isn’t just nice to have, it’s a keystone to the entire thing.”

That’s not to say this vision couldn’t change down the road, once Apple polishes Private Cloud Compute and upgrades its servers. After all, Apple Intelligence is launching as a beta, and the company is only getting started in this domain. Additionally, we’ve read rumors about Apple possibly relying on cloud computing in the future to power lighter wearables. So, things may change once Apple eventually masters both on-device AI and cloud processing.

A future AI HomePod

Beyond potentially adopting Private Cloud Compute down the road, which seems somewhat unlikely, Apple’s most straightforward path for bringing Apple Intelligence to the HomePod is to introduce a new form factor. According to reputable leakers, the company has been working on an overhauled HomePod that features a display and a FaceTime camera. While we aren’t expecting this long-rumored HomePod with a display to launch this year, it could address the current hardware limitations when it does.

Could Apple open up Private Cloud Compute to one day include less-powerful devices such as HomePod and Apple Watch? Possibly not. Foundry

Given that the added screen and camera would unlock new HomePod capabilities, Apple will naturally have to adopt a more powerful processor with more RAM. This could consequently elevate the HomePod’s hardware to reach the minimum specs required to run Apple Intelligence. However, Mark Gurman of Bloomberg reports that HomePod is “too low-volume a product to waste the engineering time” on building a device that can support Apple Intelligence. Rather, Apple is focused on building the next generation of HomePod, including “an entirely new robotic device with a display that includes Apple Intelligence at its core.”

That makes sense, because a display is somewhat central to Apple Intelligence, as many of its AI features are primarily visual. So, for example, a HomePod in its current form factor won’t be able to generate images or compose emails on your behalf. The Siri it packs revolves around voice commands for controlling music playback and HomeKit accessories, along with answering basic questions (or at least trying to). Apple Intelligence, on the other hand, seemingly excels at productivity and entertainment tasks that a smart speaker can’t necessarily deliver. For the most part, I think using Apple Intelligence features on the current HomePods would be an unintuitive experience—except, of course, for Siri.

The best-case scenario for current HomePod users is Apple expanding integration with ChatGPT to include the HomePod. By doing so, the underpowered HomePods would be able to answer more advanced questions without needing to change the way Apple’s Private Cloud Compute works, as OpenAI would be handling the queries instead.

Although this would be technically possible, it’s unlikely Apple will go that route. ChatGPT and whatever other third-party AI chatbots Apple partners with are meant to fill the gaps in Apple Intelligence, not serve as the main AI engine. Nevertheless, Apple needs to find some way to upgrade the current subpar Siri experience, especially when Siri on our iPhones, iPads, and Macs will be so much better. Perhaps Siri can break free from the larger Apple Intelligence experience so current HomePods can get an upgrade too. Until then, we’re all going to keep hearing that all-too-familiar Siri response: “I found some web results. I can show them if you ask again from your iPhone.”

Update 7/1: Mark Gurman of Bloomberg claims Apple isn’t planning to update the current HomePod speakers to support Apple Intelligence.

https://www.macworld.com/article/2376876/apple-intelligence-homepod-siri-hardware-specs-chatgpt-priv...

