Amazon rolls out feature to tablets that helps people carry out tasks using their eyes
Earlier this year, Amazon announced Eye Gaze on Alexa, a feature that enables people with mobility or speech disabilities to use Alexa, Amazon’s virtual assistant, with their eyes. The feature has now rolled out to Amazon users in the UK on Fire Max 11 tablets.
Eye Gaze on Alexa gives individuals with speech and mobility disabilities a simple way to act more independently and carry out specific actions, such as playing music or turning on lights. Caregivers can also customise Eye Gaze on Alexa dashboards, choosing the Alexa actions, colours, and icons shown as tiles on the tablet screen to fit an individual’s needs.
Robin Christopherson, Head of Digital Inclusion at AbilityNet, said: “We can’t underestimate the importance of the ease of use of smart speakers, for people with a range of difficulties, who would otherwise have to have a much more complicated interaction. Just to be able to ask something and instantly get the right response and a really succinct response or transaction is amazing.
“For me, the utility and the ease of use of the smart speaker is light years better than interacting with a website, for example, which is a really complicated proposition for someone like myself who can’t see. We can’t just glance at the middle of a webpage, we have to plough through lots of words, lots of links and buttons and things like that. The simplicity of interacting with Alexa is important, and for me that interaction is gold.”
Eye Gaze on Alexa can be accessed on a Fire Max 11 tablet by going to Settings > Accessibility > Alexa. This feature is now available at no additional cost.
According to research conducted by Amazon Devices and Services, 77 percent of adults with a disability use technology to help with everyday tasks; those who use devices for tasks that might otherwise be difficult do so an average of 13 times per day.
In addition to Eye Gaze, Alexa can be used for a variety of day-to-day tasks, including connecting with friends and family, telling the time, setting handy reminders (including medication reminders), and more.
Alexa can also be used to control the smart home hands-free, including turning on the lights, monitoring the home with smart cameras, turning on appliances such as kettles, adjusting the heating, and managing entertainment, which facilitates greater independence.
There is a raft of other Amazon Alexa accessibility features. These include Show and Tell, which helps people who are blind or partially sighted use any Echo Show to identify common packaged food goods that are hard to distinguish by touch, such as canned or boxed foods.
VoiceView Screen Reader is a screen reader for Echo devices with a screen. When enabled, VoiceView lets people use gestures to navigate the device while the feature reads aloud the actions performed on screen.
Tap to Alexa, available on Echo Show devices, allows people to use Alexa without their voice by tapping the touchscreen to access features such as the weather, news, timers, and other information.
In addition, Adaptive Listening gives users more time to finish speaking before Alexa responds, making interactions easier. Preferred Speaking Rate lets users adjust Alexa’s speaking rate to suit their preferences.
In early 2023, Amazon teamed up with Cochlear to pioneer a solution that helps to make watching television more accessible for people with hearing implants.