Xianta Jiang uses machine learning and wearable-sensor technology to study how the brain directs the body to act and move. Currently, he’s applying these techniques to a health-care problem for people who have prosthetic hands. But he doesn’t work in a hospital setting; rather, he works in a computer lab.
A professor of computer science at Memorial University, Jiang works with a team of students to study all of the different ways humans use their hands. With that information, they’ll use machine learning to help those who wear prosthetics control them.
“Controlling the prosthetic hand is a super difficult problem to solve, because when a human’s hand is amputated, the brain has a harder time communicating to the arm about how to maneuver the prosthetic limb,” Jiang says. “We are trying to help people control them as naturally as possible, and without surgery. To try to solve the problem, we use muscular sensors attached to a part of the arm to infer movement intentions from the area of the brain used to control the hand.”
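The idea behind that myoelectric approach can be sketched in a few lines. This is a hypothetical, heavily simplified illustration, not the lab's actual pipeline: the EMG signals below are synthetic, the four gesture labels are invented, and a real system would read multi-channel sensor hardware and train far more capable models on much more data.

```python
# Simplified sketch of myoelectric control: classify short windows of
# forearm muscle (EMG) signals into intended hand movements.
# All signals here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

def emg_window(gesture, n_channels=8, n_samples=200):
    """Simulate one EMG window: each gesture activates a different channel."""
    w = rng.normal(0.0, 0.1, (n_channels, n_samples))           # baseline noise
    w[gesture % n_channels] += rng.normal(0.0, 1.0, n_samples)  # active muscle
    return w

def features(window):
    """Two classic EMG features per channel: mean absolute value
    and waveform length (total signal variation)."""
    mav = np.abs(window).mean(axis=1)
    wl = np.abs(np.diff(window, axis=1)).sum(axis=1)
    return np.concatenate([mav, wl])

# Small labelled dataset: four placeholder movement intentions.
gestures = [0, 1, 2, 3]  # e.g. rest, power grip, pinch, point (invented labels)
X = np.array([features(emg_window(g)) for g in gestures for _ in range(30)])
y = np.array([g for g in gestures for _ in range(30)])

# Nearest-centroid classifier: predict the gesture whose average
# feature vector is closest to the incoming window's features.
centroids = np.array([X[y == g].mean(axis=0) for g in gestures])

def predict(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

accuracy = np.mean([predict(x) == label for x, label in zip(X, y)])
print(f"training accuracy: {accuracy:.2f}")
```

The synthetic data is deliberately easy to separate; the point is only the shape of the problem — windowed sensor readings in, an inferred movement intention out.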
However, it’s difficult to make the connections that allow the brain to send the right signals to the hand.
“For that, we attach a camera to the prosthetic and when the camera can identify the target, the hand can configure correspondingly, just like self-driving,” he says.
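At its simplest, the camera step he describes amounts to a mapping from a recognized object to a grasp the hand should pre-shape into. The sketch below is an assumption-laden toy: the object labels, grasp names, and the idea of a plain lookup table are placeholders standing in for whatever vision model and control scheme the lab actually uses.

```python
# Hypothetical sketch: once the camera identifies the target object,
# the hand pre-shapes into a matching grasp configuration.
# Object labels and grasp names are invented for illustration.
OBJECT_TO_GRASP = {
    "mug": "medium wrap",
    "coin": "precision pinch",
    "key": "lateral pinch",
    "ball": "power sphere",
}

def preshape(detected_object, default="open palm"):
    """Return the grasp the hand should configure for the detected object."""
    return OBJECT_TO_GRASP.get(detected_object, default)

print(preshape("mug"))      # → medium wrap
print(preshape("pencil"))   # unrecognized object → open palm
```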
That’s the hands-on part of the research, but, in the end, much of the work is done by his students in a computer lab using the high-performance computing resources of the Digital Research Alliance of Canada and ACENET. They identify the grasp types the prosthesis user will need and use computer modelling to fine-tune the fit.
“It’s quite a basic question, but we need a lot of computing resources for it,” he says. “We are working to cover 95 per cent of daily life grasps with a total of 16 movements and need a lot of data to train this model, so we use high-performance computing.”
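The coverage question in that quote — how few grasp types account for 95 per cent of everyday hand use — can be illustrated with a greedy tally. The frequencies below are invented placeholders; the lab derives real numbers from its own data.

```python
# Illustrative sketch (invented frequencies): how many grasp types are
# needed before their combined share of daily use reaches 95 per cent.
GRASP_SHARE = {
    "medium wrap": 0.18, "precision pinch": 0.15, "lateral pinch": 0.12,
    "power sphere": 0.10, "tripod": 0.09, "index point": 0.08,
    "thumb-two finger": 0.07, "lateral tripod": 0.06, "extension type": 0.05,
    "hook": 0.04, "platform": 0.03, "other": 0.03,
}

def covering_set(shares, target=0.95):
    """Greedily take the most frequent grasps until the target is covered."""
    chosen, covered = [], 0.0
    for name, share in sorted(shares.items(), key=lambda kv: -kv[1]):
        if covered >= target:
            break
        chosen.append(name)
        covered += share
    return chosen, covered

chosen, covered = covering_set(GRASP_SHARE)
print(f"{len(chosen)} grasp types cover {covered:.0%} of use")
```

Even in this toy version, reaching the last few percentage points requires adding many rarer grasps — one reason training a model over the full movement set takes so much data and compute.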
He says he couldn’t do his work without ACENET’s services.
“It’s not affordable to purchase the computing power we’d need,” he says. “We need a lot of memory.” In 2022 alone, Jiang’s group used 128 CPU years and 23 GPU years of compute power.
Another of Jiang’s projects involves monitoring human activity using wearable devices such as smartwatches. The goal is to find a way for rehabilitation staff or sports coaches to monitor their patients’ or athletes’ progress digitally. This work is in the early collaboration stage and not yet commercialized. “For this one, we’ll collaborate with industry over the next few years,” he says.