
Apple may have just spilled a bunch of details on iOS 16

Wednesday, May 18, 2022, 01:15 PM, from Mac 911
In case you missed it yesterday, Apple just gave us our first official look at iOS 16. Of course, it didn’t actually call it that—as it did last year, the company highlighted several upcoming accessibility improvements to its operating systems, saying only that they are coming “later this year with software updates across Apple platforms.” That’s basically code for iOS 16, iPadOS 16, watchOS 9, and macOS 13.

While the features Apple announced are fantastic improvements for those with various vision, speech, or motor impairments, they also speak to some overall improvements—especially in AI and machine learning—that we will likely see throughout the next generations of Apple’s operating systems. If we read into the announcements, here are a few of the major advancements we can expect to see in iOS 16:

Live captions = Better speech recognition

Android has had a live captions feature since version 10, and three years later Apple is finally catching up. With this setting enabled, your iPhone or Mac (if it has Apple silicon) will automatically produce captions in real time for literally any audio content, including videos, FaceTime calls, phone calls, and more. It’s a natural extension of the on-device speech processing that was introduced last year in iOS 15, but it speaks to a big improvement in the sophistication of that feature.

We hope this means an improvement in Siri’s understanding of your commands and dictation, but one could easily see these features showing up in other places. Take, for example, the Notes app, where one can imagine a “transcribe” feature that creates text from any audio recording or video. If Apple’s billing it as an accessibility feature, Live Caption’s transcription will need to be rock-solid, and it opens up a world of possibilities for the rest of iOS 16.
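Developers already have access to this kind of on-device transcription through the Speech framework, which is presumably part of the foundation Live Captions builds on. Below is a minimal sketch of the hypothetical Notes-style “transcribe” helper imagined above; the function name, locale, and completion handling are all illustrative, not anything Apple has announced.

```swift
import Speech

// Minimal sketch: transcribe a local recording entirely on-device using the
// Speech framework. Assumes SFSpeechRecognizer.requestAuthorization has
// already been granted and the locale supports offline recognition.
func transcribe(fileAt url: URL, completion: @escaping (String?) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        completion(nil)
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: url)
    if recognizer.supportsOnDeviceRecognition {
        // Keep audio on the device instead of falling back to Apple's
        // servers -- the capability iOS 15 expanded.
        request.requiresOnDeviceRecognition = true
    }

    _ = recognizer.recognitionTask(with: request) { result, error in
        guard let result = result, error == nil else {
            completion(nil)
            return
        }
        if result.isFinal {
            completion(result.bestTranscription.formattedString)
        }
    }
}
```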

Apple Watch mirroring = AirPlay improvements

Another accessibility feature coming later this year will let you mirror your Apple Watch on your iPhone and use your iPhone’s display to control your watch. It’s designed to make elements easier to manipulate for those with motor function problems and to allow disabled users to enjoy all of the iPhone’s extra accessibility features.

Apple will allow Apple Watch mirroring later this year, thanks to new AirPlay advancements. (Image: Apple)

However, Apple Watch mirroring also has intriguing implications. Apple says the feature “uses hardware and software integration, including advances built on AirPlay.” That doesn’t necessarily mean that we’ll see something like AirPlay 3, but it sounds like there are some improvements coming to AirPlay, probably in the way of new frameworks for developers.

Notably, this seems like it allows devices to communicate control intent in a way that AirPlay doesn’t right now. AirPlay pushes audio and video out to devices, and allows for simple controls (play/pause, volume, and so on), but allowing AirPlay-compatible devices to signal advanced touch controls seems new and could lead to some incredible new features.
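For a sense of why that would be new, here is roughly the AirPlay surface third-party developers get today: a system route picker plus a flag that lets an AVPlayer hand its video to a receiver, with only basic transport commands traveling between devices. This is a sketch with a placeholder stream URL, not Apple’s mirroring code:

```swift
import AVKit
import UIKit

// Sketch of today's developer-facing AirPlay: the receiver gets the media
// stream and simple transport control, nothing like full touch interaction.
final class PlayerViewController: UIViewController {
    // Placeholder HLS URL for illustration only.
    let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)

    override func viewDidLoad() {
        super.viewDidLoad()

        // Standard system control for handing playback to an AirPlay receiver.
        let routePicker = AVRoutePickerView(frame: CGRect(x: 20, y: 60, width: 44, height: 44))
        view.addSubview(routePicker)

        // Let this player's video play on the external device.
        player.allowsExternalPlayback = true
        player.play()
    }
}
```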

Here’s a killer scenario: If Apple can mirror your Apple Watch to your iPhone and allow you to fully interact with it, it could probably mirror your iPhone to your Mac or iPad and do the same! That alone would be a game-changing feature.

Door Detection = Real-world AR object recognition

Apple’s been quietly improving its object recognition for some time now. For example, you can search for all sorts of things in the Photos app and get images containing them, and iOS 15 added a neat visual lookup feature that uses the camera to identify plants and animals, famous landmarks, artwork, and other things.
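Some of that capability is already exposed to third-party developers via the Vision framework. Here’s a minimal sketch of on-device image classification, the same general machinery behind Photos search; the 0.3 confidence cutoff is an arbitrary choice for illustration.

```swift
import Vision

// Sketch: classify the contents of an image entirely on-device using
// Vision's built-in classifier, similar in spirit to Photos search.
func classify(_ image: CGImage) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.3 }   // keep reasonably confident labels
            .map { $0.identifier }
        print("Detected: \(labels.joined(separator: ", "))")
    } catch {
        print("Classification failed: \(error)")
    }
}
```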

Now Apple has announced it will be adding the ability to detect doors in real time using the Magnifier app, including judging their distance and reading text on them. It’s only for devices with LiDAR (which is how it measures range), but it speaks to a broader improvement in object recognition.
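Developers can already reach the LiDAR data that a feature like this presumably depends on: on supported devices, ARKit will attach a metric depth map to every frame. A minimal sketch, assuming an ARSCNView-based session:

```swift
import ARKit

// Sketch: turn on LiDAR scene depth, the raw ranging data a feature like
// Door Detection could use to judge the distance to a door.
func startDepthSession(in view: ARSCNView) {
    let config = ARWorldTrackingConfiguration()
    // Scene depth is only supported on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    view.session.run(config)
}

// Each ARFrame then carries a CVPixelBuffer of per-pixel depth in meters:
// let depthMap = view.session.currentFrame?.sceneDepth?.depthMap
```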

The iPhone’s camera will soon be able to detect whether doors are open. (Image: Apple)

The most obvious use case is augmented reality glasses or goggles, which are not expected to be released until next year at the earliest. But Apple already has a robust ARKit framework for developers, which is used for AR apps, and it includes the ability to recognize and track certain everyday items. And it wouldn’t be out of character for Apple to preview new technology that’s not launching for a while.

It seems reasonable to presume that the Door Detection feature is a natural extension of work Apple’s already doing in augmented reality scene and object detection. So don’t be surprised if you see a demo at WWDC of new ARKit framework features for developers. It might start in iOS 16 with new AR apps, but it’s also bound to show up in much bigger projects as Apple continues to march its AR software tools toward eventual integration in AR glasses.
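For reference, here’s what ARKit’s existing object detection looks like, the sort of foundation a developer-facing version of Door Detection could extend. The “Scans” asset group of pre-captured ARReferenceObjects is hypothetical; Apple hasn’t said the feature works this way.

```swift
import ARKit

// Sketch: ARKit's current real-world object detection. "Scans" is an assumed
// asset-catalog group of ARReferenceObjects captured ahead of time.
final class ObjectDetectionController: NSObject, ARSCNViewDelegate {
    func start(in view: ARSCNView) {
        view.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.detectionObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "Scans", bundle: nil) ?? []
        view.session.run(config)
    }

    // Called when ARKit anchors a recognized object in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        if let objectAnchor = anchor as? ARObjectAnchor {
            print("Detected \(objectAnchor.referenceObject.name ?? "object")")
        }
    }
}
```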
Source: https://www.macworld.com/article/702442/ios-16-airplay-speech-recognition-augmented-reality.html