
iPhone and iPad will get support for eye tracking and other accessibility features later this year


Apple has announced a slew of new accessibility features that will make their way to its devices later this year. For starters, the company is preparing a feature called Eye Tracking for iPhone and iPad that people can use to navigate their devices “with eyes only.”

Eye Tracking uses the front-facing camera as eye-tracking hardware and leverages on-device machine learning to do the job. In other words, it doesn’t need any additional hardware to work across iPadOS and iOS apps. Users can move between app elements with their eyes and use Dwell Control to activate each element, accessing functions such as physical buttons, swipes, and other gestures.
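Apple hasn’t said whether Eye Tracking will come with a developer API, but ARKit’s existing on-device face tracking gives a rough idea of the kind of gaze signal such a feature works from. The sketch below is illustrative only, not how Apple’s system feature is implemented:

```swift
import ARKit

// Illustrative only: Eye Tracking is a system-level accessibility feature,
// and Apple has not announced a developer API for it. This sketch uses
// ARKit's existing on-device face tracking to show the kind of gaze
// signal such a feature works from.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let face = anchor as? ARFaceAnchor else { continue }
            // lookAtPoint is an estimated gaze target in face coordinate
            // space; a dwell-style control would map this to screen space
            // and fire an action once the gaze rests on an element long enough.
            print("Gaze target:", face.lookAtPoint)
        }
    }
}
```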

Apple said its Music Haptics feature is designed to let people who are deaf or hard of hearing experience music on the iPhone. The accessibility feature will use the iPhone’s Taptic Engine to play taps, textures, and refined vibrations in sync with songs on Apple Music. The feature will support millions of songs on Apple Music and will be available to developers as an API.
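Since the Music Haptics API itself isn’t public yet, here’s a minimal sketch of the underlying idea using Core Haptics, Apple’s existing framework for driving the Taptic Engine; the beat timestamps are made up for illustration:

```swift
import CoreHaptics

// A minimal sketch of taps driven by the Taptic Engine via Core Haptics.
// The actual Music Haptics API has not been published yet, so the beat
// times here are hypothetical stand-ins for real song data.
func playBeats() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // One transient "tap" per (hypothetical) beat timestamp, in seconds.
    let beatTimes: [TimeInterval] = [0.0, 0.5, 1.0, 1.5]
    let events = beatTimes.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: t
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```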

Apple is also working to reduce motion sickness for people using an iPhone or iPad in a moving vehicle. Vehicle Motion Cues will use the device’s built-in sensors to detect when a user is in a moving vehicle and display animated dots at the edges of the screen. The dots change direction as the vehicle accelerates, brakes, or turns.

Explaining the reason behind motion sickness, Apple said:

Studies show that motion sickness is usually caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using an iPhone or iPad while traveling in a moving vehicle. With Vehicle Motion Cues, animated dots at the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content.
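Apple hasn’t detailed the algorithm behind Vehicle Motion Cues, but Core Motion’s device-motion data illustrates the kind of sensor signal the dots could follow. A rough sketch, assuming simple accelerometer readings stand in for the real detection logic:

```swift
import CoreMotion

// A rough sketch, not Apple's implementation: Vehicle Motion Cues is a
// system feature whose algorithm Apple has not described. Device-motion
// data from Core Motion shows the kind of signal the dots could follow.
let motionManager = CMMotionManager()

func startMotionCues() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        // userAcceleration is acceleration minus gravity, in g's;
        // sustained non-zero values along an axis suggest the vehicle is
        // accelerating, braking, or turning, which could steer the dots.
        let a = motion.userAcceleration
        print(String(format: "x: %.2f  y: %.2f  z: %.2f", a.x, a.y, a.z))
    }
}
```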

Furthermore, users can assign custom sounds and phrases to activate shortcuts and perform complex tasks using a feature called Vocal Shortcuts. Another feature, Listen for Atypical Speech, uses on-device machine learning to recognize a wider range of speech patterns and improve speech recognition.
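Neither Vocal Shortcuts nor Listen for Atypical Speech has a published API, but the existing Speech framework shows roughly what on-device recognition of a custom trigger phrase could look like. The function below is a hypothetical illustration, not Apple’s implementation:

```swift
import Speech

// Illustrative sketch only: Apple has not published APIs for these
// features. A real app must first call SFSpeechRecognizer.requestAuthorization.
func listenForTrigger(in audioFile: URL, phrase: String, action: @escaping () -> Void) {
    guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else { return }
    let request = SFSpeechURLRecognitionRequest(url: audioFile)
    // Keep recognition on device, as Apple says the new features do.
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true
    }
    _ = recognizer.recognitionTask(with: request) { result, _ in
        guard let result, result.isFinal else { return }
        if result.bestTranscription.formattedString
            .localizedCaseInsensitiveContains(phrase) {
            action() // e.g. run the assigned shortcut
        }
    }
}
```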

What’s on the table for Apple Vision Pro and CarPlay?

The Cupertino giant is also preparing some accessibility features for the Vision Pro headset and CarPlay. Vision Pro will get system-wide Live Captions support to help users “follow spoken dialogue from live conversations and audio from apps.”

During Apple Immersive Video, users can move the captions using the window bar. Vision Pro will also support MFi (Made for iPhone) hearing aids and vision accessibility features such as Reduce Transparency, Smart Invert, and Dim Flashing Lights.

Meanwhile, CarPlay will get support for three new accessibility features. Voice Control lets users navigate the CarPlay interface with their voice, while Sound Recognition displays visual alerts to notify users of car horns and sirens. Color Filters are designed to make the CarPlay interface visually easier to use for colorblind users.

Apple also announced updates to existing accessibility features, including new voices for VoiceOver, Hover Typing, Personal Voice in Mandarin, and a new Reader Mode in Magnifier. You can read the company’s blog post to learn more about these updates.
