In 2016, at the White House Frontiers Conference held at the University of Pittsburgh, then-U.S. President Barack Obama shared a fist bump with Nathan Copeland, a robotic-arm user, while touring innovative projects. A robotic arm that delivers direct tactile feedback through a brain-computer interface can allow paralyzed patients to pour water from one cup into another more quickly and naturally.

On May 21, a research team from the University of Pittsburgh published a study, “A brain-computer interface that evokes tactile sensations improves robotic arm control,” in the journal Science. According to the study, when a person controls a robotic arm with their mind, the arm can provide tactile feedback directly to the person’s brain. Previously, robotic arms could only be guided by vision.

The team has been working with Nathan Copeland. Fifteen years ago, the teenage Copeland was paralyzed in an accident. He has since learned to control the movement of a robotic arm through a brain-computer interface. Copeland said, “When I only have visual feedback, I can only see that the hand touches the object. If I use it to pick up things, sometimes things will fall off.” Without tactile feedback, a typical grasping task took Copeland about 20 seconds. “With sensory feedback, he can do it in 10 seconds,” said Jennifer Collinger, an associate professor in the Department of Physical Medicine and Rehabilitation at the University of Pittsburgh.

Collinger said that tactile information is essential for using a prosthetic robotic arm, “because it is difficult for you to grasp an object that you can’t feel.” Even simple actions, such as picking up a cup and moving it while maintaining the right amount of pressure, depend to a large extent on tactile feedback from the hand.
Collinger and a team of researchers therefore spent years looking for ways to add sensory feedback to robotic arms and hands. The team used a bidirectional brain-computer interface that records neural activity in the motor cortex and generates tactile sensations through microstimulation of the somatosensory cortex.

In the experiment, the researchers first placed electrodes in the area of Copeland’s brain that processes sensory information, so that electrical impulses could be used to evoke a range of sensations. Collinger said, “It turns out that the sensation produced by stimulating the fingertip-related areas of the brain feels as if it comes from one’s own hand.” Next, the Pittsburgh team studied how to generate these signals when the robotic arm makes contact with an object. The final step was to time how long Copeland took to complete tasks such as picking up a stone or pouring water, with and without tactile feedback. The results showed that Copeland completed some manual tasks at roughly the same speed as people using their own hands.

Copeland explained, “The intensity of this sensation actually varies according to the amount of force the hand exerts on the object, so I can also tell whether I have grasped it. There is an additional benefit: with the added tactile sensation, using the robotic arm feels more natural. The control is very intuitive, so I basically just think about things, and it feels as though I am moving my own arm.”

Jeremy D. Brown, John C. Malone Assistant Professor in the Department of Mechanical Engineering at Johns Hopkins University, said that the significance of the results goes far beyond this one robotic arm. “High-tech prostheses also work better when they simulate the sense of touch. Some achieve this through vibration or other forms of tactile feedback.
This is the same way that many smartphones help users type on the screen.” The latest prostheses operate much like natural limbs: their elbows bend, their wrists rotate, and their fingers grasp. But most of their sensors still have only basic capabilities, such as detecting resistance or temperature. Without direct tactile feedback, they remain clumsy. A natural hand touching the surrounding world is far more capable, as Brown put it: “I can feel pressure, feel sliding, feel whether the object is wet or dry. I can feel its texture; I know whether it is rough or smooth.” Scientists are only beginning to learn how to make artificial hands and fingers that can detect these subtle features of objects.

Brown said that as prosthetics and robotic arms provide more sensory feedback, they will become more useful. “The sense of touch is not just for dexterity. It’s not just the ability to reach into your pocket for a key. It can also let you hold your lover’s hand and feel the emotional connection.”
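The force-dependent sensation Copeland describes implies some mapping from the robotic hand’s measured grip force to the intensity of cortical stimulation. The sketch below is a minimal, hypothetical illustration of such an encoding: the function name, the force and amplitude ranges, and the linear mapping are all assumptions chosen for clarity, not the encoding the Pittsburgh team actually used.

```python
def force_to_stim_amplitude(force_n,
                            f_min=0.1, f_max=10.0,
                            amp_min=10.0, amp_max=60.0):
    """Map a grip-force reading (newtons) to a stimulation pulse
    amplitude (microamps). Hypothetical linear encoding: forces
    below f_min evoke no stimulation, and forces at or above
    f_max saturate at amp_max."""
    if force_n < f_min:
        return 0.0  # light contact below threshold: no sensation
    frac = min((force_n - f_min) / (f_max - f_min), 1.0)
    return amp_min + frac * (amp_max - amp_min)
```

A mapping of this shape would reproduce the behavior Copeland reports: a firmer grasp yields a stronger sensation, and the absence of any sensation signals that the object has not been grasped at all.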