Through 3D-Printed Prosthetic, Illinois Students Lending a Hand in Ecuador

11/4/2014 Mike Koon, Engineering Communications Office

Tim Bretl's group combines electromyographic control research with 3D-printing to develop an artificial hand.

Neuroscience graduate student Aadeel Akhtar and aerospace engineering graduate student Mary Nguyen traveled to Ecuador in August and plan to return in January with their latest prototype.
The 3D-printed prosthetic hand can be programmed to recognize a hand at rest, open-faced, closed (tool grip), a three-finger grip, and a fine pinch.
For most amputees, the road to a more functional prosthetic device is slow and costly. Thanks to a research group at the University of Illinois, however, that may be changing, which is especially good news for those most in need: residents of the developing world.

In August, the group’s team leader Aadeel Akhtar, an MD/PhD candidate in neuroscience from the College of Medicine at Urbana-Champaign, and Mary Nguyen, a master’s student in aerospace engineering, traveled to South America. They put their latest creation, an open-source dexterous artificial hand, to the test on an Ecuadorian man.

Akhtar, Nguyen, and six other engineering undergraduate students operate out of the research group advised by Tim Bretl, an associate professor in aerospace engineering who specializes in robotics and neuroscience. The group has been conducting research on electromyographic (EMG) control of prosthetics and sensory feedback for the past three years, but it was the use of 3D-printing to create the model last November that moved things along at a faster rate.

The group has created one of the first 3D-printed prosthetic hands with pattern-recognition capability. A machine-learning algorithm allows it to do more than simply open and close: it learns additional hand positions for greater functionality. It can also be built for a mere $270, compared with the average myoelectric prosthetic, which retails for between $30,000 and $40,000. Even taking markup into account, that represents a significant cost decrease for the patient.

The hand is trained to replicate several motions by taking the electrical signals from muscles in the arm and sending them to an EMG board, which passes them on to a microprocessor running a machine-learning algorithm. Based on those signals, the microprocessor sends commands to motor drivers, which turn the motors and make the hand move.

Although the EMG board used in the current prototype is the size of a standard audio mixing board, it will eventually shrink to a size that fits into the socket of a residual limb.
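
In software terms, that pipeline amounts to a loop: read a short window of EMG samples, reduce it to a few features, ask a classifier which grip is intended, and command the motors. The following is a minimal sketch in Python, with the hardware I/O stubbed out and every name and parameter chosen for illustration rather than taken from the team’s code:

    import numpy as np

    NUM_CHANNELS = 8     # assumption: number of EMG electrodes on the residual limb
    WINDOW_SIZE = 200    # assumption: samples per decision window (e.g., 200 ms at 1 kHz)
    GRIPS = ["rest", "open", "tool_grip", "three_finger", "fine_pinch"]

    def read_emg_window():
        # Stand-in for the EMG board: one window of raw samples per channel (simulated here).
        return np.random.randn(WINDOW_SIZE, NUM_CHANNELS)

    def extract_features(window):
        # Mean absolute value per channel, a common EMG feature.
        return np.mean(np.abs(window), axis=0)

    def drive_motors(grip):
        # Stand-in for the motor drivers: would set finger motor positions for this grip.
        print("commanding grip:", grip)

    def control_loop(classifier, steps=10):
        # Read a window, decide which grip is intended, command the motors.
        for _ in range(steps):
            features = extract_features(read_emg_window())
            grip = GRIPS[int(classifier.predict(features.reshape(1, -1))[0])]
            drive_motors(grip)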

Akhtar’s team has created a mathematical model of five actions – a hand at rest, open-faced, closed (tool grip), a three-finger grasp, and a fine pinch. The initial training takes about one to two minutes and involves a patient going through each one of the gestures.

“Using the machine-learning algorithm based off the signals it picks up from the muscles, it can figure out which of these grips he is actually doing,” explained Akhtar. “The microcontroller with the machine-learning algorithm will then replicate the grip he’s trying to make.”
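
The training itself can be thought of as fitting a standard pattern-recognition model to short labeled recordings of each grip. As an illustration only, and continuing the sketch above, a simple linear discriminant analysis, a common choice for EMG pattern recognition though not necessarily the algorithm the team uses, could be trained like this:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def train_classifier(calibration_windows, labels):
        # calibration_windows: EMG windows recorded while the patient holds each grip in turn.
        # labels: the index into GRIPS for each window.
        X = np.array([extract_features(w) for w in calibration_windows])
        y = np.array(labels)
        clf = LinearDiscriminantAnalysis()
        clf.fit(X, y)
        return clf

    # A short calibration run with simulated data: a couple of dozen windows per grip.
    windows, labels = [], []
    for grip_index in range(len(GRIPS)):
        for _ in range(20):
            windows.append(read_emg_window())
            labels.append(grip_index)

    control_loop(train_classifier(windows, labels))   # the hand now mirrors the predicted grip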

A connection last spring with David Krupa, an Illinois alumnus, accelerated the project even more. Krupa co-founded the Range of Motion Project (ROMP), a non-profit organization in Guatemala and Ecuador that provides prosthetics and orthotics to those without access to rehabilitative care. When Krupa was back on campus to receive the International Young Humanitarian Award from the U of I, Akhtar met with him to discuss his team’s research.

After hearing about the team’s work, Krupa approached the U.S. embassy in Ecuador about sponsoring team members’ travel to Ecuador to test the device on a patient as soon as August. That put the project on a much quicker timeline.

Akhtar and Nguyen spent two weeks in Quito, Ecuador, putting the final touches on the prototype, demonstrating it for members of the embassy, and working with patient Juan Suquillo, who has a below-elbow amputation of his left arm from an injury suffered 33 years ago in a war with Peru. The team demonstrated the device first by having Adam Namm, the U.S. ambassador to Ecuador, successfully control the arm.

“The goal of the trip was to get it to work with a patient,” Akhtar said. “Although it took some debugging, we were successful.”

The event attracted its share of attention, including coverage from Ecuador’s national media.

The hand itself takes about 30 hours to print, then another two hours to assemble. All the electronics that are necessary to convert the neural signals into movements are located within the hand.

This semester the team is working on the third iteration of the design, in which the palm will be thinner and the fingers stronger. They plan to return to Ecuador in early January to leave the new hand with a patient.

“In the next version, the finger will use a four-bar linkage,” said team member Patrick Slade, a mechanical science and engineering major who is in charge of the mechanical design of the hand. “Rather than the tendons having to bend one joint at a time, the four-bar linkage will allow the joints to bend more smoothly and naturally. It will also be more robust and simpler to maintain.”

“The next version will also have linear actuators which will allow for printing on a very small circuit board,” added Michael Fatina, an electrical and computer engineering major.

“Replacing the motors with linear actuators will also make it stronger, more energy efficient and increase the battery life.”

While Fatina and Edward and Alvin Wu (ECE) continue to improve the EMG and logic circuits that make the motors work, Sam Goldfinger (ECE) and Joseph Sombeck (Bioengineering) handle the soldering work and sensory feedback. It’s the sensory feedback that Akhtar says will set the next prototype apart from anything else on the market.

“No commercial prosthetic device has any sort of feedback,” Akhtar said. “We’re going to put sensors in the fingers. Based on the amount of force that the fingertips are detecting, we are going to send a proportional amount of electrical current across your skin to stimulate your sensory nerves. By stimulating your sensory nerves in different ways with different amounts of current, we can make it feel like vibration, tingling, pain, or pressure.”
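
That proportional mapping, fingertip force in, stimulation current out, is straightforward to express. Below is a sketch in the same vein, with the force range and current ceiling invented purely for illustration; in a real device the stimulation levels would be set and bounded clinically:

    MAX_FORCE_N = 10.0      # assumed full-scale fingertip force, in newtons
    MAX_CURRENT_MA = 2.0    # assumed upper bound on stimulation current, in milliamps

    def force_to_current(force_newtons):
        # Clamp the sensor reading, then scale it linearly into the allowed current range.
        clamped = min(max(force_newtons, 0.0), MAX_FORCE_N)
        return (clamped / MAX_FORCE_N) * MAX_CURRENT_MA

    print(force_to_current(1.0))   # light touch -> 0.2 mA
    print(force_to_current(8.0))   # firm grip   -> 1.6 mA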

Through a mechanical connection from one of the artificial fingers directly to the skin, the patient will also be able to better feel the position of their hand without looking at it.

“We did some initial experiments in the lab and found that with only six minutes of training, users could distinguish between six different grips with 88 percent accuracy without looking. With that kind of result, imagine how well someone could operate it in even a week’s time.”

“It’s really awesome to be able to help people,” Nguyen said. “I didn’t imagine doing something that has this direct impact on the world while still in college.”
 



This story was published November 4, 2014.