
Apple embraced Meta’s vision (and Meta embraced Apple’s)

Friday, September 20, 2024, 12:00 PM, from ComputerWorld
During an earnings call in the summer of 2021, Facebook CEO Mark Zuckerberg first publicly used the M-word. “In the coming years,” Zuck told investors, “I expect people will transition from seeing us primarily as a social media company to seeing us as a metaverse company.”

“Um, what?” said every cyberpunk sci-fi fan on Earth in unison. 

Until that moment, the “metaverse” was best known from Neal Stephenson’s 1992 science fiction novel “Snow Crash,” in which he coined the word. Stephenson described it as a virtual reality (VR) platform controlled by the wealthy leaders of powerful corporations, a platform that exacerbated social inequality and was so addictive that people spent all their time there, neglecting their real lives in the real world.

The “metaverse” was a warning, not a business plan. 

Still, in October 2021, Zuckerberg announced that Meta would replace Facebook as the company’s name, and the “metaverse” would be its main focus henceforth.

His essential vision back then was a new internet anchored in VR. Just as today we shop, learn, and find entertainment on the internet, the “metaverse” version would do all those things in 3D environments, in which we would move around as avatars. Sure, elements of this VR world would be accessible via augmented reality (AR) and even phones, tablets, and PCs. But Meta’s essential belief was that the future is VR. 

Not so fast, said Apple

“AR is going to become really big,” Apple CEO Tim Cook said in 2016. “VR, I think, is not going to be that big compared to AR… and we will wonder… how we lived without it.” 

Back then, Apple was hard at work in its labs creating what it hoped would be the future of consumer technology — AR. And Meta was working on what it hoped would be the future of consumer technology — VR.

Apple envisioned business meetings, random social interactions, professional conferences, and family get-togethers happening in person in the real world, with everyone wearing Apple glasses that displayed digital information based on the context of the interaction.

Meta envisioned business meetings, random social interactions, professional conferences, and family get-togethers happening in virtual spaces in the “metaverse,” with everyone wearing Meta goggles that immersed them in a believable 3D world.

Apple envisioned ordinary-looking eyeglasses. Meta envisioned big, bulky headsets. 

Given these respective inclinations, something unexpected happened: Meta released ordinary-looking eyeglasses, and Apple released a big, bulky headset.

Specifically, a year ago, Meta replaced its lackluster Ray-Ban Stories glasses with Ray-Ban Meta glasses, which took off in popularity. They look like regular Ray-Ban glasses but contain high-quality microphones, speakers, and a camera. Best of all, they provided access to AI, including (later) multimodal AI that works through the camera.

It’s likely that Meta was surprised by the success of Ray-Ban Meta glasses as a product and thrilled that Meta alone provided a compelling daily mobile use case for its AI. 

Then, in January, Apple shipped Apple Vision Pro. Let’s be very clear about what Apple Vision Pro hardware is — it’s VR hardware. It’s a big, heavy, bulky headset that delivers incredible visuals and features unique to Apple Vision Pro. But it’s VR delivering an AR experience. 

Apple has gone to great lengths to position Apple Vision Pro as spatial computing, not AR or VR. The spatial features are among the best things about the device. But the augmented reality feel of Apple Vision Pro is achieved through pass-through video: you don’t actually see the room you’re in; you see a video of the room. Others don’t actually see your eyes; they see an avatar of your eyes.

It took a lot of VR hardware for Apple to create that AR experience. The company eventually wants to sell spatial-computing AR glasses that look like ordinary eyeglasses, but that technology is still a few years away, which is why Apple’s AR vision currently runs on VR hardware.

Meta, meanwhile, also seems super excited about augmented reality glasses — something like Ray-Ban Meta glasses, but with spatial-computing visuals. It seems less enthusiastic about VR these days: Meta’s Reality Labs division has lost tens of billions of dollars and laid off thousands of employees in the past few years.

Enter Project Nazare

Instead of going big on the “metaverse,” Meta is now focusing more on AR and AI.

Project Nazare is its first big hope in that space. Zuckerberg has described the project as the company’s first attempt at true AR glasses. The device Meta is working on sounds like Ray-Ban Meta glasses, plus holographic displays and sensors that map the physical environment for spatial computing (the placement of virtual objects in relation to the physical world).

As with Apple Vision Pro, Nazare glasses would facilitate interaction with holographic avatars mapped to real people in real time, showing facial expressions, mouth movements, and hand gestures. 

Meta is also focusing on a problem critics have raised with Apple Vision Pro, Microsoft HoloLens, and Magic Leap: the narrow field of view. Nazare reportedly targets a 200- to 220-degree field of holographic visuals.

The company is also working on multimodal AI that uses the camera for image recognition.

And that maps with Apple’s glasses

Meanwhile, Apple is reportedly focused on something similar. Bloomberg’s Mark Gurman reported that Apple is working on lightweight AR glasses that could be worn all day and could be launched as early as 2027 (but are more likely to arrive in 2028 or 2029). 

Both Apple and Meta face immense hurdles in reducing the size and cost of these glasses. Battery size and weight are an enormous issue, and the miniaturization of all components remains a major focus. 

But both companies are moving in the same direction. The disparate visions of the future they once held appear to have converged.

Even though Apple’s current face computer is essentially VR hardware and Meta’s is essentially AR hardware (minus the light engine for holographic imagery), both companies appear to be well on their way to realizing what used to be Apple’s vision — everyday, all-day AR glasses that will one day replace the smartphone as our main device.
https://www.computerworld.com/article/3523752/apple-embraced-metas-vision-and-meta-embraced-apples.h
