Mechanical engineer Marcia O’Malley talks to Rice Magazine about designing and building better wearable prosthetic devices.
There are lots of sophisticated prosthetic devices on the market for people who have lost limbs, but no matter how well they replicate the intricate movements of the human hand, they can’t provide much feedback to an amputee. You can have a really nimble robotic hand that’s controlled by a person’s muscle and nerve impulses, but that person still won’t feel anything the hand touches. If they want to pick up a water bottle, they’ll need to look at the way the hand interacts with the bottle to know when to squeeze, or infer how firmly they’re grasping it.
So how do we give them back the information they used to get from touching an object? One approach that has been proposed is to implant electrodes in an amputee and then stimulate their nerves electronically when sensors on their prosthetic limb detect contact with objects. It’s an invasive procedure, however, and it’s extremely expensive. Even if it were to become widely available, not everyone would want that sort of experience.
In my lab, we design and build wearable devices that fit on the wrist, forearm or upper arm. They can provide vibration, squeeze and other tactile sensations that give a user haptic — or touch — feedback from a prosthetic device without having to look at it. The problem is that we’re still not quite certain how to map what a prosthesis is “feeling” — its touch interactions with the environment — and turn it into information that’s useful and intuitive for an amputee.
Most of us are able to do this sort of thing without even thinking. We can easily sort objects or stack blocks just by using our sense of touch, but despite the engineering advances that have produced today’s sophisticated prostheses, we have a long way to go before amputees can do the same thing with ease.
Illustration by Alex Eben Meyer