The AI glasses market comes into focus

Monday, September 2, 2024, 12:00 PM, from ComputerWorld
According to credible rumors, Snap and Meta will soon unveil their next-generation AI glasses.

Snap might introduce its fifth-generation Spectacles at the Snap Partner Summit on Sept. 17. Features like a wider field of view and improved battery life could grace a production run of fewer than 10,000 units for developers. (The current 4th-generation Spectacles are also for developers only; Snap never sold them to the public.)

Meanwhile, Meta could well unveil its “Orion” project at its Connect conference, scheduled for Sept. 25-26. According to the latest tech chatter, Orion glasses are expected to be highly advanced augmented reality (AR) glasses with immersive technology and a design that makes them look like standard glasses. While Orion glasses won’t be available for sale right away, Meta is prepping around 1,000 units for demonstration and early developer exploration.

Both companies hope to kickstart a third-party developer ecosystem for high-quality AI-driven AR glasses that anyone can wear every day in polite society as ordinary eyeglasses. And while the developers are working on apps, the companies will work hard to bring down the costs of manufacturing the products.

This generation of glasses combines an AI voice assistant with AR holographic visuals plus all the features and functionality of Bluetooth earbuds (calls, podcasts, etc.). In theory, this is the Holy Grail of wearables — fantastic power, rich user interfaces, and invisible and inaudible to everyone around you. Wearing such glasses will make you feel like you know everything and are constantly aided by powerful AI.

Meanwhile, Meta’s Ray-Ban Meta glasses are the surprise hit of the year. The glasses look more or less like regular Ray-Bans but offer multimodal AI and an audio interface. They’re inexpensive because they don’t attempt visual output, only sound.

While the Ray-Ban Meta glasses are well-designed, well-made, and equipped with quality speakers and microphones, the basic concept is easily replicable. Chinese companies with access to lower-cost components can make much cheaper glasses containing batteries, Bluetooth connectivity, speakers, microphones, and a companion app that connects to the hardware and reaches generative AI (genAI) chatbots through APIs.

In other words, good-enough AI glasses are relatively easy and inexpensive to produce. That’s why the success of Ray-Ban Meta glasses has Chinese companies taking notice.
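
To illustrate just how thin that software layer can be, here is a minimal sketch of the core loop in Python. It assumes the glasses’ companion app has already transcribed the wearer’s speech on the phone and holds a key for a hosted chatbot service; OpenAI’s Chat Completions endpoint is used as an example, but any similar API would do.

# Minimal sketch: a glasses companion app relaying a spoken query to a hosted
# genAI chatbot over its HTTP API and returning text to be spoken back.
# Speech-to-text and text-to-speech are assumed to happen elsewhere on-device.
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"  # any chatbot API with a similar shape would work
API_KEY = os.environ["OPENAI_API_KEY"]

def ask_assistant(transcribed_speech: str) -> str:
    """Send the wearer's transcribed question to the chatbot and return the reply text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-4o",
            "messages": [
                {"role": "system",
                 "content": "You are a voice assistant running on smart glasses. Keep answers short."},
                {"role": "user", "content": transcribed_speech},
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # In a real product, the text would come from on-device speech recognition
    # and the reply would be read aloud through the glasses' speakers.
    print(ask_assistant("What kind of tree has white bark?"))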

Wait, what’s happening in China?

Smaller Chinese companies are focusing on the growing market for AI-powered smart glasses, aiming to compete directly with Ray-Ban Meta. These products come from companies you’ve probably never heard of:

Superhexa: Backed by Xiaomi, Superhexa is a Chinese startup that has launched “Jiehuan” branded AI glasses that provide access to large language models (LLMs) and offer voice-guided navigation and AI chat features.

Solos: Hong Kong-based Solos has introduced its AirGo Vision smart glasses, which enable voice access to OpenAI’s ChatGPT. The glasses also have a detachable camera, which enables multimodal AI via GPT-4o.

Even Realities: This Shenzhen-based startup makes G1 glasses featuring LED microdisplays. Although they don’t have speakers, they do output information in the form of visible green text.

Liweike: Based in Hangzhou, China, this company developed smart AR glasses, unfortunately branded as Meta Lens S3 glasses. They integrate sports functionality with voice interaction powered by the company’s own AI chatbot. One standout feature is an integrated 120-degree, ultra-wide, 2K high-definition sports camera.

Sharge: This company’s OptoX AI Glasses have a camera, speakers, and all the trimmings. Users can talk to GPT-4o and hear its replies. The glasses can also function like a dashcam, constantly recording and deleting video while retaining only the last 30 minutes, which you can review or save. (A sketch of that kind of rolling buffer appears after this list.)
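
That “keep only the last 30 minutes” behavior is essentially a time-bounded buffer of short video segments. Here is a minimal sketch of the idea in Python; the one-minute segment files and in-memory index are illustrative assumptions, not Sharge’s actual implementation.

# Minimal sketch of a dashcam-style rolling buffer: record in short segments
# (assumed here to be one-minute files) and evict anything older than 30 minutes.
import time
from collections import deque

RETENTION_SECONDS = 30 * 60   # keep only the most recent 30 minutes

segments = deque()  # (capture_timestamp, segment_filename) pairs, oldest first

def add_segment(filename, now=None):
    """Register a newly written segment and drop anything outside the retention window."""
    now = time.time() if now is None else now
    segments.append((now, filename))
    while segments and now - segments[0][0] > RETENTION_SECONDS:
        segments.popleft()  # a real device would also delete the file from storage

def last_half_hour():
    """Return the filenames that make up the saved clip, oldest first."""
    return [name for _, name in segments]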

As you can tell from this list, some companies are making audio-only AI glasses, some of which will cost less than $100. Others add holographic heads-up displays and could cost a few hundred dollars a pair. At least two of these products offer both cameras and interaction with the advanced GPT-4o chatbot, essentially able to do everything shown in OpenAI’s May 13 Spring Update (everything, of course, except get Scarlett Johansson’s voice), but through glasses instead of a smartphone.
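
For a sense of what “camera plus GPT-4o” means in practice, the pattern is to send a captured frame along with the wearer’s question in a single request. The sketch below assumes OpenAI’s Chat Completions API and a frame the glasses have already saved; the file name and question are placeholders.

# Minimal sketch: send one camera frame plus a spoken question to GPT-4o.
# Assumes OpenAI's Chat Completions API; image path and question are examples.
import base64
import os
import requests

def describe_scene(image_path: str, question: str) -> str:
    """Encode a captured frame and ask the multimodal model about it."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o",
            "messages": [{
                "role": "user",
                "content": [
                    {"type": "text", "text": question},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
                ],
            }],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# Example usage: describe_scene("frame.jpg", "What am I looking at?")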

Great glasses or cheap glasses?

To oversimplify the coming AI glasses market: the American companies will make them great, and the Chinese companies will make them cheap. The result will be an incredible selection of features, quality levels, and styles.

As a result, we’ll quickly arrive at a place where the question won’t be, “Why buy AI glasses?” It’ll be: “If you’re going to buy glasses, why wouldn’t you buy AI glasses?”

The AI glasses revolution will also eviscerate the in-the-ear earbud market and might damage the smartwatch industry. With audio in your glasses, why put plastic in your ears? And with apps, notifications, and information hovering holographically in space in front of your eyeballs, who needs it on your wrist?

Smartphones could even be affected. If a big holographic display in the lenses provides the main interface, then a big-screen smartphone might be unnecessary.

The emergence of AI glasses as a ubiquitous category creates interesting and valuable possibilities for enterprise and business apps running on these platforms, everything from factory training and instructions to boardroom presentation teleprompters.

Beyond that, we might eventually see the rise of BYOG — bring your own glasses — policies. Companies will also need to cope with the privacy and security implications of this generation of glasses. The prescription glasses employees rely on to see clearly will often have cameras and microphones capable of secretly recording anything. (Ray-Ban Meta glasses do have a bright light that indicates when the camera is taking a picture or recording video, but that safeguard is easily foiled, according to hundreds of how-to videos on TikTok.)

Social implications will abound. Today, we’re still trying to figure out the social norms around looking at a smartphone during a conversation. What happens when people can be looking right at you but secretly distracted by online content only they can see?

AI glasses are about to significantly impact business, society, and culture. That will become clear when Snap and, more importantly, Meta are expected to demonstrate the future of AI glasses to developers and the public.
https://www.computerworld.com/article/3496284/the-ai-glasses-market-comes-into-focus.html
