Envision Glasses image

Assistive technology specialist Envision has announced the upcoming 2.5 version of its Envision Glasses, which will enable people who are blind or visually impaired to gain a richer understanding of their surroundings.

Set to launch on 22 January 2024, the 2.5 version of the Envision Glasses incorporates an upgrade to its Describe Scene feature, powered by OpenAI’s GPT-4 Vision.

This “breakthrough” smart glasses upgrade aims to give people who are blind or visually impaired greater independence by providing them with a richer understanding of their surroundings.

The upgraded Describe Scene feature on the Envision Glasses offers immediate, succinct, and accurate descriptions of users’ surroundings, with the help of Envision’s virtual assistant, Ask Envision.

This upgrade allows users to request detailed, focused descriptions on demand, ranging from concise overviews to rich, detailed narratives, giving users the freedom to choose how they perceive the world around them.

Additionally, this feature enables users to gain targeted information about specific elements in their environment, further enhancing their interaction and understanding.

Karthik Kannan, the CTO and co-founder at Envision, explained: “With the 2.5 version of the Envision Glasses, we’re proud to use OpenAI’s GPT-4 Vision, our most advanced model to date. This iteration ensures that users are not overloaded with a long description when they hear the first description of the image they capture but rather receive concise and essential information.

“With Ask Envision, they then have the option to delve into more detailed or specific information as needed. This approach aligns with our philosophy of giving users control over their experience, ensuring they get exactly the level of detail they require.”

The 2.5 version also introduces improvements in multilingual voice commands. Users can now activate features seamlessly by simply speaking the name of the feature, with the glasses recognising commands in multiple languages.

“The introduction of multilingual voice commands in the Envision Glasses 2.5 is a leap towards a future where our technology becomes more conversational and personal,” said Kannan. “This feature underlines our commitment to breaking down language barriers, making our assistive technology more accessible and intuitive for users globally.”

Recently, Envision hosted an online workshop in coordination with the Braille Institute to demonstrate the Envision Glasses and Envision AI app.
