New collaboration will help make AI more inclusive and useful for people with sight loss
Microsoft has announced a new collaboration with Be My Eyes to bring high-quality, disability-representative data to help train AI systems.
The partnership aims to make Microsoft’s AI models more inclusive and useful for people with sight loss.
AI systems require large amounts of data to train effectively and be genuinely useful. However, Microsoft says that disability is often underrepresented or incorrectly categorised in datasets.
In Microsoft Research’s most recent paper on AI performance when describing images taken by blind or visually impaired people, objects associated with disability, such as braille devices, appeared less frequently in popular large-scale image-text datasets. As a result, models recognised those objects around 30 percent less accurately. Microsoft describes this as a “disability data desert”, which can limit the utility of a technology, reinforce existing stereotypes, and magnify bias.
To help address this disability data desert, the Microsoft and Be My Eyes partnership aims to ensure disabled people are represented in datasets so that AI is accessible, representative, and inclusive.
Transparency and user control are the guiding principles for data privacy in this agreement. Be My Eyes will provide video datasets featuring the distinctive objects, lighting, and framing that realistically represent the lived experience of people with sight loss. Be My Eyes will remove personal information from metadata prior to sharing.
Microsoft will then use the data to improve the accuracy and precision of scene understanding and descriptions with the goal of increasing the utility of AI applications for blind and visually impaired individuals.
Earlier this year, Be My Eyes announced that its AI visual assistant app became available for free on any Windows 10 or 11 PC through the Microsoft Store. The Be My AI app is designed to provide rich AI-powered visual descriptions for people who are blind or visually impaired.