In the science-fiction anthology "Love, Death & Robots", there is an episode in which characters wear head-mounted devices and use their minds to control monsters that fight each other. A device that lets people control a machine with thought alone is a brain-computer interface (BCI).

The protagonist controls a monster with his mind through a head-mounted device | Stills from "Love, Death & Robots"

The mind can control, but cannot feel

Some of today's brain-computer interfaces let people type with their eyes, or let monkeys play games just by watching a screen. But more delicate operations, such as controlling a fighting monster as in the show, or even just catching a falling cup, are hard to pull off without a sense of touch.

We are so used to the sensations from our limbs and hands that we rarely notice how important they are. Using a brain-computer interface without tactile feedback is like operating a claw machine: Did you grab the right spot? Is your grip tight enough? Will it slip? Will you poke the doll? We don't know. It is hard to hold something you cannot feel at all.

Patients who have lost the ability to move because of spinal injuries can use brain-computer interfaces to achieve simple conscious control. Existing interfaces rely mainly on vision to guide a robotic arm to grasp and move objects. Just as when we grab a doll from the claw machine, there is no real sensation of grasping, so the movements are clumsy and slow. One patient in a brain-computer interface experiment put it this way: "With only visual feedback, I can just see that the hand is touching the object. When I use it to pick things up, sometimes they fall."

How can consciousness control a machine and still feel something real? The answer is a bidirectional brain-computer interface, that is, brain-machine-brain.
After the brain controls the machine, the machine feeds tactile stimulation back to the brain, enabling real-time control.

Brain-machine-brain is better

Recently, a research team from the University of Pittsburgh in the United States published the latest results on a bidirectional brain-computer interface in Science: it lets a person control a robotic arm with thought to pick up objects while receiving direct tactile feedback to the brain.

Nathan Copeland, the study participant, was paralyzed in a car accident in 2004; he retains feeling only in his wrists, a few fingers, and part of his shoulder. He began taking part in the laboratory's brain-computer interface research six years ago.

Copeland uses his mind to control a robotic arm to shake hands with a person | UPMC/Pitt Health Sciences

For this experiment, he became the first person in the world to have microelectrode arrays implanted in both the motor cortex and the somatosensory cortex, the region of the brain that processes sensory information from the body. The brain-computer interface can therefore not only decode his movement intentions to operate the robotic arm, but also let him feel tactile feedback.

When Copeland first joined the experiment, he received conventional, vision-based training in operating the robotic arm. Three years later, he could skillfully use his mind to make the arm move a sphere or grab a cube. But a healthy person needs only about 5 seconds to complete the grasping task, while Copeland usually took about 20 seconds. Training based purely on visual control had hit a bottleneck, so the researchers decided to give Copeland tactile feedback from the robotic arm, opening a feedback pathway from the machine back to the brain. With tactile feedback, the same task takes far less time.
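The machine-to-brain half of this loop can be sketched in code: read a force signal from the robotic hand, map it to a stimulation intensity, and deliver it to the implanted array. This is a minimal illustration, not the study's actual implementation; the function names, gain, cap, and sample values below are all hypothetical.

```python
def torque_to_stim_amplitude(torque_nm, gain=400.0, max_ua=100.0):
    """Map a fingertip torque reading (N*m) to a stimulation
    amplitude in microamps, capped at a safe maximum.
    The gain and cap here are illustrative, not the study's values."""
    return min(abs(torque_nm) * gain, max_ua)

# Simulated sensor readings, one per feedback cycle, as the
# robotic hand closes on an object: a firmer grip produces a
# stronger (but capped) stimulus.
readings_nm = [0.0, 0.0625, 0.125, 0.25, 0.5]
stimuli = [torque_to_stim_amplitude(t) for t in readings_nm]
print(stimuli)  # [0.0, 25.0, 50.0, 100.0, 100.0]
```

In a real system this mapping would run continuously inside a timed loop, one reading per cycle, so the stimulus tracks the grip force in real time.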
At his best, he could complete the task in under 4 seconds | Reference [1]

Torque sensors at the base of the robotic arm's fingers detect the rotational force generated as the fingers move. The system then mimics the electrical signals a human finger produces when it feels pressure and sends them, via the brain-computer interface, to the microelectrode array implanted in Copeland's brain. In this way, Copeland can feel the objects the robotic arm touches.

Copeland experienced a curious sensation, something like pressure with a slight tingling. The intensity of this tactile feedback varies with the force the hand applies to the object, which greatly improved his training results. In the past, he would fumble hesitantly, trying to make sure he had really grasped the object; with tactile feedback, he needs only about 10 seconds to complete the grasping task, half the previous time.

The tactile sensors on the robotic arm achieve "real-time conduction" comparable to a biological body's: a touch signal takes about 30 milliseconds to travel from the hand to the brain, and the sensor sends a signal to the brain-computer interface every 20 milliseconds. This real-time feedback feels very natural to Copeland. "The control is so intuitive that I'm basically just thinking about things, yet it feels like I'm moving my own arm."

The touch isn't real, and the brain is still wired

A bidirectional brain-computer interface lets a person make complete, natural movements by controlling a robotic arm with thought, a major step toward helping paralyzed patients regain motor function. However, the technology still faces many limitations; for one, it requires bulky wired equipment connected to the subject's brain. Last year, Musk demonstrated a compact, lightweight wireless connection device.
If similar equipment can be put into use, it will save a great deal of trouble.

Musk's monkey plays a game with its mind; the brain-computer interface device here is wireless | Neuralink

In addition, the realism of tactile feedback needs improvement: existing sensors still cannot modulate subtle forces or handle tasks as complex as a real human hand can. Scientists are also exploring restoring motor ability by implanting electrodes in patients' own limbs and then using exoskeletons. Perhaps one day, paralyzed patients will control not mechanical arms, but their own hands and feet.

In the future, this technology may give many paralyzed patients a measure of freedom, such as being able to pour themselves a glass of water and bring it to their mouth without relying on a caregiver. Such "small things" will be "big things" that change their lives.

References
[1] Flesher SN, Downey JE, Weiss JM, et al. A brain-computer interface that evokes tactile sensations improves robotic arm control. Science, 2021, 372(6544): 831-836.
[2] https://www.wired.com/story/this-brain-controlled-robotic-arm-can-twist-grasp-and-feel/

Author: Jia Lan
Editor: Xiao Towel, You Shiyou