Wednesday 23 November 2016

Can someone give me a hand?

Written by Dr Ed Chadwick (Senior Lecturer in Biomedical Engineering, ISTM)

----------------

A couple of weeks ago I broke my collarbone. This has meant keeping my arm in a sling for a while, and making do with only one hand. While slightly inconvenient, it’s not the end of the world. In a few weeks I’ll be back to normal operation. It has, however, given me a new understanding of how difficult it is to get things done with only one hand. Previously simple tasks like tying my shoelaces are now impossible!

As I said, for me this will be short-lived. For some people, though, who were born missing a limb or have lost one in an accident, this is something they have to live with day in, day out. As a biomedical engineer, I use engineering principles to tackle clinical problems, hopefully improving the quality of life of people with disabilities.

All of which is a rather long-winded introduction to a paper we have recently published on our research to make better artificial (prosthetic) hands.

Computer modelling


The focus of my research has been understanding human movement (in particular involving arms and hands) by building computer models. We use these models to help us understand what goes wrong in certain diseases or injuries, and to design better medical devices and treatments. Our latest paper shows how we might use a computer model of the hand to design a better prosthetic hand.

A better prosthetic hand


Over the last few years, prosthetic hands have become better and better. They now have individually moving fingers and thumbs that are almost as good as the real thing. But there are still a couple of problems. People find it difficult to control several actions at the same time: they can open or close the fingers, bend or rotate the wrist, or move the thumb in and out, but not all at once, and that makes using the hand feel unnatural. And it takes too much effort. Together with our partners at RIC / Northwestern and Cleveland State Universities, we are trying to make a better interface for a prosthetic hand that will make using it feel really natural.

We have built a computer model of the hand, or ‘virtual hand’, that predicts how the missing hand of an amputee would behave if it were still there. The vision is that this could be used to control a prosthetic hand worn by an amputee. We would know what the user wanted to do with their hand by recording the signals from the remaining nerves in the arm; the virtual hand would then tell us what the prosthetic hand should do. Of course, for this to work, the movements of the ‘virtual’ hand would have to be computed as quickly as they would occur in a real hand. This is known as ‘real-time simulation’.
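To make the idea of real-time simulation concrete, here is a minimal sketch in Python of such a control loop. The function names, the synthetic nerve signal and the toy hand dynamics are invented for the sketch (they are not the model from the paper); the point is simply that every simulation step must finish within its own timestep.

```python
import time
import numpy as np

DT = 0.005  # 5 ms per step: the model must finish each step within this budget

def read_nerve_signals(t):
    """Stand-in for recorded nerve/muscle activity; here a slow, synthetic
    'open and close the hand' rhythm, scaled between 0 and 1."""
    level = 0.5 * (1 + np.sin(2 * np.pi * 0.5 * t))
    return level * np.ones(4)  # the same command sent to four fingers

def virtual_hand_step(angles, activations, dt):
    """One step of the 'virtual hand'. A real model solves the hand's
    equations of motion; this toy version just relaxes each finger toward
    the commanded flexion with a 100 ms time constant, so the sketch runs."""
    target = activations * (np.pi / 2)  # full activation -> 90 degrees of flexion
    return angles + (target - angles) * (dt / 0.1)

angles = np.zeros(4)  # flexion angle of four fingers, in radians
for step in range(400):
    tic = time.perf_counter()
    u = read_nerve_signals(step * DT)
    angles = virtual_hand_step(angles, u, DT)  # these would drive the prosthesis
    elapsed = time.perf_counter() - tic
    if step % 100 == 0:
        print(f"t={step * DT:.2f}s  flexion(deg)={np.degrees(angles).round(1)}  "
              f"step took {elapsed * 1e6:.0f} us (budget {DT * 1e6:.0f} us)")
```

In a real system the recorded nerve signals would replace the synthetic rhythm, and the computed joint angles would be sent to the motors of the prosthesis instead of printed.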

The real-time hand model showing off postures from American Sign Language.

The paper describes how the model works, how it simulates the actions of the missing fingers, and how it does it in real time. You can access the full text of the publication here if you want to know how it works (warning: lots of equations!).
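For the curious, musculoskeletal models of this kind typically solve, at every timestep, equations of motion of the general form below. The notation here is generic rather than copied from the paper:

$$ M(q)\,\ddot{q} + c(q,\dot{q}) = R(q)\,f_m(a, q, \dot{q}) $$

Here $q$ holds the joint angles of the hand, $M(q)$ is the mass matrix, $c$ collects velocity-dependent and gravity terms, $R(q)$ maps muscle forces to joint torques through the muscles’ moment arms, and $f_m$ gives the forces the muscles produce for activations $a$ (derived, in our case, from the recorded nerve signals). ‘Real time’ means solving this for $\ddot{q}$ and integrating it forward faster than the wall clock ticks.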

Closing the loop


As well as allowing the user to ‘talk to the hand’ (telling the hand what it should be doing), this approach allows the hand to talk to the user! That is, we can simulate the signals from the missing hand that would tell the user where their fingers are, how fast they are moving and whether they are gripping something.
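As an illustration of what that could look like, the virtual hand’s state can be turned into feedback signals like this. The simple firing-rate encodings below are made up for the sketch; the actual feedback scheme used in the project is more sophisticated and not described here.

```python
import numpy as np

def simulate_feedback(angles, velocities, grip_force):
    """Turn the virtual hand's state into illustrative feedback signals:
    where the fingers are, how fast they are moving, and how hard they
    are gripping. The encodings are invented for this sketch."""
    position_rate = 20 + 60 * angles / (np.pi / 2)  # spikes/s, grows with flexion
    velocity_rate = 40 * np.abs(velocities)         # movement-sensitive channel
    grip_rate = 80 * np.tanh(grip_force / 5.0)      # saturating grip-force channel
    return position_rate, velocity_rate, grip_rate

# Example: a half-closed hand, still closing slowly, lightly gripping an object
pos, vel, grip = simulate_feedback(np.full(4, np.pi / 4), np.full(4, 0.2), 2.0)
print("position channels (spikes/s):", pos.round(1))
print("velocity channels (spikes/s):", vel.round(1))
print(f"grip channel (spikes/s): {grip:.1f}")
```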

A prosthesis user testing a new type of control for an artificial hand. Image courtesy of Newcastle University.

Closing the loop of the control system in this way has the potential to take artificial hands to the next level. This is what the Senseback project aims to achieve, and it will be the subject of our next publication.

...

Thanks to Dimitra Blana, Wendy Murray, Ton van den Bogert & Kia Nazarpour.

Useful links



...

Originally published on Medium.