UC Berkeley researchers create assistive gesture-recognition device

Engineers at the University of California, Berkeley (UC Berkeley) in the USA have developed a wearable biosensing device that can recognise hand gestures from the electrical signals it detects in the forearm.

The system, which couples wearable biosensors with artificial intelligence (AI), could one day be used to control prosthetics or to interact with almost any type of electronic device, the university says.

“Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers,” said Ali Moin, who helped design the device as a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences. “Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”

Ali is co-first author of a new paper describing the device, which appears in the journal Nature Electronics.

To create the hand gesture recognition system, the team collaborated with Ana Arias, a professor of electrical engineering at UC Berkeley, to design a flexible armband that reads electrical signals at 64 different points on the forearm. These signals are then fed into an electronic chip programmed with an AI algorithm that associates the forearm's signal patterns with specific hand gestures.

The team taught the algorithm to recognise 21 distinct hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers.
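
In rough outline, and only as a hedged sketch, the path from raw forearm signals to a gesture label might look like the Python below. The windowing, feature choice, normalisation and gesture list are assumptions made for illustration, not details taken from the paper.

```python
# Illustrative signal path only; the real system runs on the device's
# own chip. Window shape, feature choice and normalisation are assumed.
import numpy as np

def features(window):
    """Reduce a (n_samples, 64) window of forearm signals to one value
    per electrode. Mean absolute value is a common sEMG feature; the
    paper's actual feature extraction may differ."""
    mav = np.mean(np.abs(window), axis=0)   # shape (64,)
    return mav / (np.max(mav) + 1e-9)       # normalise to [0, 1]

# A few of the 21 gestures named in the article; the full list and its
# ordering are not given here, so this is a placeholder.
GESTURES = ["thumbs_up", "fist", "flat_hand", "index_finger"]

def recognise(window, classifier):
    """Map one window of raw signals to a gesture name, using any
    classifier that exposes predict(feature_vector) -> class index."""
    return GESTURES[classifier.predict(features(window))]
```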

“When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibres in your arms and hands,” Ali said. “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibres were triggered, but with the high density of electrodes, it can still learn to recognise certain patterns.”

Like other AI software, the algorithm must first "learn" how the electrical signals in the arm correspond to individual hand gestures. To do this, each user wears the cuff while making the hand gestures one by one.
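
A minimal sketch of that per-user calibration step might look as follows; read_window() is a hypothetical stand-in for pulling one feature vector from the cuff, and the repetition count is an assumption.

```python
def calibrate(gestures, read_window, reps=20):
    """Record labelled examples of each gesture in turn.

    The user holds one gesture at a time while `reps` feature vectors
    are captured; the pairs are returned for training a classifier.
    """
    samples, labels = [], []
    for label, name in enumerate(gestures):
        input(f"Hold the '{name}' gesture and press Enter to record...")
        for _ in range(reps):
            samples.append(read_window())
            labels.append(label)
    return samples, labels
```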

However, the new device uses a type of advanced AI called a hyperdimensional computing algorithm, which is capable of updating itself with new information, the university states.

For instance, if the electrical signals associated with a specific hand gesture change because a user’s arm gets sweaty, or they raise their arm above their head, the algorithm can incorporate this new information into its model.
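
To make that concrete, below is a minimal, hedged sketch of a hyperdimensional classifier with this kind of online updating. The dimensionality, encoding scheme and quantisation are common choices from the HD computing literature, assumed here for illustration; they are not the authors' published implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000        # hypervector dimensionality, a typical HD choice
N_CHANNELS = 64   # electrode count from the article
N_LEVELS = 16     # assumed amplitude quantisation levels

# Random bipolar "item memory": one hypervector per channel, one per level.
channel_hv = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hv = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(sample):
    """Encode one 64-channel feature vector (values in [0, 1]) into a
    bipolar hypervector: bind each channel's hypervector with its
    quantised-level hypervector, bundle across channels, threshold."""
    sample = np.asarray(sample)
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bundled = np.sum(channel_hv * level_hv[levels], axis=0)
    return np.where(bundled >= 0, 1, -1)

class HDClassifier:
    def __init__(self, n_classes):
        self.acc = np.zeros((n_classes, D))  # running prototype sums

    def train(self, samples, labels):
        for x, y in zip(samples, labels):
            self.acc[y] += encode(x)

    def predict(self, sample):
        sims = np.sign(self.acc) @ encode(sample)  # similarity per class
        return int(np.argmax(sims))

    def update(self, sample, label):
        """Online adaptation: fold a fresh example (e.g. recorded after
        the arm got sweaty or changed position) into the existing class
        prototype, with no retraining from scratch."""
        self.acc[label] += encode(sample)
```

Because each class prototype is just a running sum of encoded examples, incorporating new information is a single vector addition, which is one reason this style of model is attractive for updating on a small wearable chip.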

The university says the assistive device could become commercially available with a few tweaks.
