Even from 300 kilometers away, a human can be immersed in a remote scene: conveying touch, reproducing facial expressions, visiting an art exhibition, and completing competition tasks. This breakthrough was achieved by a robot called iCub3, which brings a fully immersive experience and new possibilities to human-robot interaction.

Figure | The iCub3 robot.

Over the past four years, the research team at the Artificial and Mechanical Intelligence laboratory of the Italian Institute of Technology has developed advanced avatar technology, built the iCub3 robot system, and continuously refined it. Today, using a virtual reality headset, haptic gloves, and body sensors, a remote operator can drive iCub3 with precise motion control, tactile feedback, and expression reproduction.

During the ANA Avatar XPrize competition, iCub3 delivered tactile feedback to the remote operator, conveying the texture of rocks, a relief vase, and other objects, while also mirroring the operator's facial expressions, including smiles and eye shape (for example, squinting under bright stage lights). The robot also demonstrated walking and object manipulation during the competition, completing a puzzle together with a human. The research team described iCub3 as "the only system to complete the task at the final stage using bipedal locomotion and a suite of lightweight manipulation devices, including commercial and custom wearable devices."

This milestone was published today in the journal Science Robotics under the title "iCub3 Avatar System: Enabling Remote Fully-Immersive Embodiment of Humanoid Robots". The team says iCub3 lays the foundation for safer and more reliable teleoperated robotic systems, opening new possibilities for disaster response (supporting relief work while keeping people out of danger), exploration of hazardous environments, and extended remote experiences in hard-to-reach corners of the physical world.

There are so many robots, so why iCub3?

According to the paper, the breakthrough of iCub3 lies in its fully immersive sensory experience. Through the combination of virtual reality, haptic gloves, and body sensors, a remote operator 300 kilometers away can feel present in the scene, transmit their sense of touch, and have their facial expressions reproduced. Compared with other studies, the system uses deep learning to process multimodal information, efficiently recognizing the user's intentions and converting them into robot motion, which makes teleoperation more flexible and accurate (a minimal sketch of such a mapping follows below).

Several features make iCub3 stand out. First, it achieves teleoperation in real environments: synchronized multisensory feedback gives the user a richer and more realistic experience. Second, deep learning makes the system adaptive, so it can better understand and respond to the user's intentions, improving the accuracy and flexibility of control. Finally, unlike many comparable studies, iCub3 has been deployed in real scenarios, including art exhibitions, stage performances, and an international competition, showing strong adaptability and stability in complex environments. This gives it a distinctive position in the development of future teleoperation technology.
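The article does not detail the retargeting pipeline, but the core idea of converting the operator's tracked motion into robot commands can be illustrated with a minimal sketch. Everything here is hypothetical: the joint names, limits, and smoothing factor are illustrative placeholders, not values from the iCub3 software.

```python
import numpy as np

# Hypothetical joint limits in radians; a real humanoid's limits come
# from its kinematic model and differ per joint.
ROBOT_JOINT_LIMITS = {
    "shoulder_pitch": (-1.6, 1.6),
    "shoulder_roll": (-0.5, 2.0),
    "elbow": (0.1, 1.8),
}

def retarget(operator_angles, previous=None, smoothing=0.2):
    """Map tracked operator joint angles onto robot joint targets.

    A real avatar system solves this with whole-body inverse
    kinematics or a learned model; this sketch only clamps angles to
    the robot's limits and low-pass filters them to avoid jerky
    commands.
    """
    targets = {}
    for joint, angle in operator_angles.items():
        lo, hi = ROBOT_JOINT_LIMITS[joint]
        clamped = float(np.clip(angle, lo, hi))
        if previous is not None:
            # Exponential smoothing stabilizes the streamed commands.
            clamped = (1 - smoothing) * previous[joint] + smoothing * clamped
        targets[joint] = clamped
    return targets

# One frame of (made-up) tracker data streamed from the operator suit:
frame = {"shoulder_pitch": 0.9, "shoulder_roll": 2.4, "elbow": 1.2}
print(retarget(frame))  # shoulder_roll is clamped to its 2.0 limit
```

In a real system a mapping of this kind runs continuously at the sensor rate, which is why the communication delay discussed later in the article matters so much.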
How was iCub3 made?

In 2004, the research team began studying robot intelligence and designed the first generation of iCub, a robot that simulates the cognitive and learning processes of infants. Its capabilities were relatively limited, and it mainly performed specific tasks. After years of research and technical iteration, the team recently launched the latest generation of the series: iCub3.

iCub3 has been comprehensively upgraded. Battery capacity has been significantly increased, the battery has been moved into the torso, and the leg motors have been improved, making the robot more agile and stable when walking and moving. A depth camera further enhances iCub3's perception of its surroundings.

Particularly noteworthy is that iCub3 is embedded in an avatar platform, the Avatar System. Through the iFeel equipment kit, the operator's body movements are synchronized in real time, allowing iCub3 to imitate them remotely, including body movements and facial expressions, over a fiber-optic connection. Specifically, the operator wears a virtual reality headset, haptic gloves, and body sensors that transmit their body movements, hand movements, and facial expressions to the robot in real time. Virtual reality gives the operator a 360-degree view of the remote environment, the haptic gloves capture hand movements and touch through sensors, and the body sensors track the operator's overall motion. This information is processed by a deep learning model that maps the operator's motion onto the robot, allowing it to closely reproduce the operator's movements, touch, and facial expressions.

To verify iCub3's performance in real scenarios, the team conducted a series of experiments. In November 2021, iCub3 made its debut at the Biennale di Venezia art exhibition. The operator remotely drove the robot through the exhibition, demonstrating stable remote communication and locomotion in a complex environment. Wearing the iFeel sensing suit, the operator precisely controlled the robot's various parts, while a custom haptic device let the operator feel the touch of a remote hug. The experiments showed a communication delay of only about 100 milliseconds over a standard fiber-optic Internet connection.

Figure | Facial expression retargeting and iFeel walking. A: the operator directly controls the robot. B: the emotions displayed. C: the iFeel shoes, which measure the forces and torques the operator exchanges with the ground and, when paired with a set of trackers, can also detect their position. D: an example of the intention mechanism of the locomotion interface (sketched after this section).

In June 2022, at the We Make Future digital innovation festival, the operator controlled the robot to receive and transport objects on stage, performing well despite strong electromagnetic interference. Specially designed haptic devices allowed the operator to sense the weight of the objects the robot carried, and the robot interacted with the audience in richer ways.

Figure | iCub3 interacting with a recipient at the We Make Future digital innovation festival.

In November 2022, the robot performed well in the ANA Avatar XPrize competition, completing load-bearing and time-limited tasks.
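The figure caption above notes that the iFeel shoes measure the forces the operator exchanges with the ground and feed an intention mechanism for the walking interface. A deliberately simple, hypothetical version of such a detector can threshold the vertical ground-reaction forces to guess which foot the operator is lifting; the data layout, threshold, and command names below are all assumptions, not part of the iFeel kit's actual API.

```python
from dataclasses import dataclass

@dataclass
class ShoeReading:
    left_fz: float   # vertical ground-reaction force, left shoe (newtons)
    right_fz: float  # vertical ground-reaction force, right shoe (newtons)

def step_intention(reading, unload_threshold=150.0):
    """Guess which foot the operator intends to swing.

    When one shoe is nearly unloaded while the other carries weight,
    the operator is lifting that foot, which a walking interface can
    translate into a step command for the robot. Real systems fuse
    this with tracker positions and torques; this sketch uses the
    vertical forces alone.
    """
    if reading.left_fz < unload_threshold <= reading.right_fz:
        return "step_left"
    if reading.right_fz < unload_threshold <= reading.left_fz:
        return "step_right"
    return None  # double support or an ambiguous reading

print(step_intention(ShoeReading(left_fz=40.0, right_fz=620.0)))  # step_left
```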
The robot's hands are covered with sensorized skin that can feel the texture of the objects it manipulates. Throughout the competition, the operator drove the robot's locomotion through a walking interface, and the robot demonstrated efficient and intuitive manipulation.

Figure | iCub3's performance at the XPrize. A: iCub3 manipulating a puzzle piece. B: iCub3 checking the texture of a vase.

These results show that iCub3 can adapt to different scenarios with stability, successfully delivering a fully immersive sensory experience for teleoperated robots. The significance of the iCub3 system lies not only in advancing teleoperation technology, but also in pushing robotics from the laboratory into real-world settings. The system's limitations are mainly the delay of remote transmission and electromagnetic interference in specific environments, but these do not diminish its broad application prospects in fields such as art, entertainment, and emergency response.

Although iCub3 has made significant breakthroughs, several challenges remain on the path to a fully immersive teleoperation experience. First, operating the iCub3 system requires detailed training to master complex teleoperation skills, which places demands on the user's background and learning ability and limits the technology's accessibility. Second, even over a standard fiber-optic Internet connection, the system still has a communication delay of about 100 milliseconds; in applications with strict real-time requirements, this delay may degrade the operating experience. Moreover, the fully immersive experience depends on advanced equipment such as a virtual reality headset, haptic gloves, and body sensors, and this dependence may make the system hard to deploy flexibly in some environments, especially in special tasks or emergencies. In addition, running iCub3 demands a large energy supply, particularly in long-duration remote scenarios, which poses challenges for the sustained operation of teleoperated robots. Finally, although iCub3 successfully reproduces the operator's movements and expressions, the experience still falls short of face-to-face interaction with a real human in some situations, especially complex social settings.

iCub3, a promising future

Beyond iCub3 itself, a series of notable results have recently emerged in teleoperated robotics and immersive sensory experiences. In the medical field, for example, Sakura Sikander's team at the University of Central Florida has developed telemedicine robots that can perform surgical and examination tasks under a doctor's remote control, opening new possibilities for the allocation of medical resources.

With the launch of iCub3, future research will focus on improving the system's real-time performance and stability to meet more complex and demanding applications. In-depth work on human-robot collaboration, and its wide application in medicine, education, entertainment, and beyond, will also be important research directions.
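One limitation discussed above is the roughly 100-millisecond communication delay. Independent of any particular robot, such a figure can be sanity-checked by echoing timestamped packets between the operator and robot stations. The sketch below assumes a service on the robot side that simply echoes each UDP datagram back unchanged; the endpoint address is a placeholder, not a real host.

```python
import socket
import time

ROBOT_ADDR = ("robot.example.org", 9000)  # placeholder endpoint

def measure_rtt(samples=50):
    """Return the mean round-trip time in milliseconds.

    Assumes a service at ROBOT_ADDR echoes each datagram unchanged;
    lost packets are simply dropped from the estimate.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for i in range(samples):
        sent = time.perf_counter()
        sock.sendto(str(i).encode(), ROBOT_ADDR)
        try:
            sock.recvfrom(64)
        except socket.timeout:
            continue
        rtts.append((time.perf_counter() - sent) * 1000.0)
    sock.close()
    return sum(rtts) / len(rtts) if rtts else float("nan")
```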
The successful experience of iCub3 offers useful lessons for the development of future robot systems and will help bring robotics into wider use in everyday life.

Paper link: https://www.science.org/doi/10.1126/scirobotics.adh3834
https://www.sciencedirect.com/science/article/pii/S2667099223000269