Now closer to reality: Prosthetics that can feel

Humans do a lot of things with their hands: We squeeze avocados at the grocery store, scratch our dogs behind the ears and hold our significant others’ hands. These are things that many people who have lost limbs can’t do.

CU Boulder biomedical engineer Jacob Segil is working to bring back that sense of touch for amputees, including veterans of the wars in Iraq and Afghanistan.

Segil, an instructor in the Engineering Plus Program and program director for the Center for Translational Research at CU Boulder, is part of a long-running effort to achieve the stuff of science fiction: designing artificial limbs that may one day allow amputees to feel the world around them through electronic sensors. Picture Luke Skywalker twitching after he gets poked in his robotic hand.

“In my field, we have a gold standard, which is the physiological hand,” said Segil, also a research healthcare scientist at the U.S. Department of Veterans Affairs. “We’re trying to recreate it, and we’re still so far off.”

Far off, but closer than you might think. Through an effort led by Case Western Reserve University and the VA, Segil and his colleagues have used a unique “neural interface” to give a small number of amputees back the sense of touch in their missing fingers. In a new study, published this month in the journal Scientific Reports, the team demonstrated just how effective this sensory restoration technology can be—helping one amputee to feel his hand adopt a series of postures, such as a gesture resembling the thumbs-up sign.

For Segil, who recently received a $1 million Career Development Award from the VA to continue his work, the project is a chance to use his engineering skills to help real people.

“A lot of us engineers would just be happy building stuff,” Segil said. “But as a VA researcher, your work can help the people who served our country. It’s a powerful motivator.”

Body sense

Segil grew up in a family of physicians and had long been fascinated by the idea of studying the human body as a machine—albeit a pretty complicated one.

“It’s really just a bunch of wires and actuators, with a huge microprocessor in your head,” Segil said.

That interest led him to earn his Ph.D. in mechanical engineering from CU Boulder and to take on a postdoctoral research position split between two VA centers: the Louis Stokes Cleveland VA Medical Center in Ohio and the Rocky Mountain Regional VA Medical Center in Aurora, Colorado. In Cleveland, Segil worked with biomedical engineer Dustin Tyler and explored the benefit of prosthetic limbs that can feel.

In the 2000s, Tyler invented a way to, essentially, hotwire the human nervous system.

His interface, called a nerve cuff electrode, surrounds the nerves and zaps them with electronic pulses. Adjust those signals just right, and they will travel to the brain, tricking it into thinking that it can feel fingers, even if there are no fingers to feel.

“We’re tapping into that wire before it gets to the brain, and then the brain can’t tell whether it’s coming from the finger or from our artificial system,” said Tyler, a professor at Case Western and a VA researcher.

To date, only four patients have undergone the surgery needed to receive that sensory feedback. But the project is a huge breakthrough in prosthetics, Segil said. He explained that while artificial hands have grown more high-tech in recent years, many amputees still choose not to use them—in large part because these devices are numb.

“All prosthetic devices that have ever been used are ‘disembodied,’” he said. “They are tools, external to the body. They’re the equivalent of a tennis player and their racket.”

In the recently published case study, Segil and his colleagues began to probe whether sensing prosthetics could do more—becoming a meaningful part of a person’s body.

The researchers worked with one volunteer, a man in his 40s who had lost his arm below the elbow six years earlier. They fed his neural interface varying patterns of sensory information—say, cues that he was picking up a penny. Then, with his prosthetic hidden from view, the group asked the man to identify which position his hand was in from a menu of seven postures.

https://youtube.com/watch?v=wZcCFK9S5ac

The interface did the trick. With enough practice, the man was able to identify the seven postures with up to 95% accuracy.

“When you have five points of information, the user is able to synthesize those to get a broader view of what the state of the hand is,” Tyler said.

The team is still a long way from their Star Wars moment. But it’s a good start.

“I think we need to go further into this embodiment space where the artificial and the physical are blurred,” Segil said. “That’s the stuff that goes beyond prosthetic limbs and redefines the interface between man and machine.”

Stories to tell

Segil is planning to push his mind-bending research further with his new funding, which will be split between the VA centers in Ohio and Colorado. His current work will focus on the psychology of prosthetics as much as the engineering—what will it take for amputees to think of their artificial limbs as real body parts? He also hopes that patients in Colorado will soon be able to receive their own neural interfaces.

For now, the engineer takes his inspiration from people like the (anonymous) human subject who volunteered to participate in his recent study.
