Door detection, voice control and captions are some of the accessibility features coming to Apple this year
Technology giant Apple has revealed a raft of innovative accessibility features that will be available later this year to give disabled people new tools for navigation, health, communication, and more.
The customisable tools can support people with a wide range of disabilities, including vision impairments, physical and motor disabilities, and hearing impairments.
Among the new accessibility features that will be available later this year are:
- Door Detection: This feature enables people who are blind or have vision impairments to use an iPhone or iPad to navigate the last few feet to their destination
- Voice Control and Switch Control: These assistive features are designed for those with physical and motor disabilities to fully control an Apple Watch from an iPhone with Apple Watch Mirroring
- Live Captions: This is simple captioning technology for people with hearing impairments on iPhone, iPad, and Mac
Apple is also expanding support for its screen reader, VoiceOver, with over 20 new languages and locales.
“Apple embeds accessibility into every aspect of our work, and we are committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives. “We’re excited to introduce these new features, which combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives.”
Vision accessibility features
Door Detection, described by Apple as “cutting-edge”, can help users locate a door upon arriving at a new destination, understand how far they are from it, and describe door attributes, including whether the door is open or closed and, if it is closed, whether it can be opened by pushing, turning a knob, or pulling a handle.
Door Detection can also read signs and symbols around the door, like the room number at an office, or the presence of an accessible entrance symbol. This new feature combines the power of LiDAR, camera, and on-device machine learning, and will be available on iPhone and iPad models with the LiDAR Scanner.
Door Detection will be available in a new Detection Mode within Magnifier, Apple’s built-in app supporting blind and low vision users. Apple highlights that Door Detection, along with People Detection and Image Descriptions, can be used alone or in combination within Detection Mode, giving users with vision impairments a go-to place with customisable tools to help them navigate and access rich descriptions of their surroundings.
In addition to navigation tools within Magnifier, Apple Maps will offer sound and haptic feedback to help VoiceOver users identify the starting point for walking directions.
Physical and motor accessibility features
Apple Watch will also become more accessible for those with physical and motor disabilities later in 2022 with Apple Watch Mirroring, which helps users control Apple Watch remotely from their paired iPhone.
With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display.
In addition, Apple is making it easier for users to control an Apple Watch with simple hand gestures. A new double-pinch gesture can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, or start, pause, or resume a workout.
This builds on the innovative technology used in AssistiveTouch on Apple Watch, which gives users with upper body limb differences the option to control Apple Watch with gestures like a pinch or a clench without having to tap the display.
Live Captions for people with hearing impairments
For people who are deaf or have hearing impairments, Apple is introducing Live Captions on iPhone, iPad, and Mac. Users can follow along more easily with any audio content, whether they are on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them. Users can also adjust font size for ease of reading.
Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, so group video calls become even more convenient for users with hearing disabilities. When Live Captions are used for calls on Mac, users have the option to type a response and have it spoken aloud in real time to others who are part of the conversation.
New languages for Apple’s screen reader
VoiceOver, Apple’s screen reader for blind and low vision users, is adding support for more than 20 additional locales and languages, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese.
Users can also select from dozens of new voices that are optimised for assistive features across languages. These new languages, locales, and voices will also be available for Speak Selection and Speak Screen accessibility features.
Additionally, VoiceOver users on Mac can use the new Text Checker tool to discover common formatting issues such as duplicated spaces or misplaced capital letters, making it easier to proofread documents or emails.