[Image: Apple accessibility features]

Among the features Apple will roll out are the ability for people who cannot speak to type to speak during calls and conversations, and a feature that identifies text on physical objects and reads it aloud for people who have vision impairments.

Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesised voice that sounds like them for connecting with family and friends.

For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.

“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

The technology giant says it works closely with community groups representing a broad spectrum of users with disabilities to develop assistive features that make “a real impact” on people’s lives.

Assistive Access

Assistive Access uses innovations in design to distil apps and experiences to their most basic and essential features to lighten cognitive load on iPhones (Apple’s smartphones) and iPads (Apple’s tablets).

The feature reflects feedback from people with cognitive disabilities and their trusted supporters, so that they can easily enjoy activities like connecting with loved ones, capturing and browsing photos, and listening to music.

Assistive Access includes a customised experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos, and Music. The feature offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support.

For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.

Live Speech and Personal Voice

With Live Speech on iPhone, iPad, and Mac (Apple’s range of desktop computers and laptops), users can type what they want to say to have it be spoken out loud during phone and FaceTime calls as well as in-person conversations.

Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.

For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.

Users can create a Personal Voice by reading along with a randomised set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and it integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.

Detection Mode in Magnifier introduces Point and Speak

Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels.

For instance, while using a household appliance, such as a microwave, Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.

Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.

Additional features

Other features that Apple will introduce later this year include the following:

  • Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customise them for their hearing comfort.
  • Voice Control adds phonetic suggestions for text editing, so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.” Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
  • Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favourite games on iPhone and iPad.
  • For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
  • Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
  • For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customise the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.

Last year, Apple introduced features supporting people with a wide range of disabilities, including vision impairments, physical and motor disabilities, and hearing loss.

These included Door Detection, which enables people who are blind or have vision impairments to use an iPhone or iPad to navigate the last few feet to their destination; Apple Watch Mirroring, which lets those with physical and motor disabilities fully control an Apple Watch from a paired iPhone using assistive features such as Voice Control and Switch Control; and Live Captions, a simple captioning technology for people with hearing impairments on iPhone, iPad, and Mac.
