The robots have arrived at the International Space Station. How do they do their jobs?

Yesterday, the first day of 2025, the crew of Shenzhou 19 sent holiday greetings from China's space station. Many viewers remarked that the Chinese space station has started to look like science fiction, and were especially taken with astronaut Wang Haoze's new partner, Xiaohang, an intelligent in-cabin assistant. In this article, let's look at the robots that have worked aboard the International Space Station!

The operation of the International Space Station has opened up broad opportunities for space robotics, and a variety of robotic arms, free-flying camera robots, and satellite test platforms have appeared on board.

The International Space Station is the largest space facility ever built.

ROTEX Project

The ROTEX project of the German Aerospace Center (DLR) was the starting point of German space automation and robotics. Initiated in 1988, ROTEX flew in 1993 as part of the Spacelab D2 mission aboard the space shuttle Columbia.

The core of the project was a small six-axis robotic arm fitted with a complex multi-sensor gripper. The main experimental goal was to achieve sensor-based autonomous operation as far as possible, while ground-based teleoperation was also fully supported. The sensors included two six-axis force-torque sensors (based on strain gauges and on optical sensing), a tactile array for grip-force control, and an array of nine laser rangefinders. In addition, a pair of miniature stereo cameras looked outward from inside the gripper, and a pair of fixed cameras provided stereo images of the robot's working area.

Robotic arm of the ROTEX project

There were three specific tasks: assembling a mechanical truss structure, connecting and disconnecting an orbital replaceable unit (ORU) with the gripper, and catching a free-floating object. Flight verification showed that the ROTEX arm could run automatically from ground pre-programming, be teleoperated by the shuttle crew or from the ground, and have reprogrammed task sequences uploaded from the ground.

ROKVISS

ROKVISS is a two-degree-of-freedom robotic testbed that Germany operated on the outside of the International Space Station. It is an interesting robot with an unusual structure: it was built not to perform robotic-arm tasks, but to verify key technologies and components for more intensive and complex space activities in the future. ROKVISS stands for Robotik-Komponenten-Verifikation auf der ISS ("verification of robotic components on the ISS"); it was used to test the reliability of telepresence control concepts and of its joint electronics. Its long-term goal is robots that can work on space stations, or in even more distant orbits, in the future.

ROKVISS was installed on January 26, 2005, by station crew members during a roughly six-hour spacewalk, after a universal work platform had been attached to the exterior of the Russian Zvezda service module.

The main part of ROKVISS is a two-jointed robotic arm with a metal "finger" at its tip, plus a stereo camera and a mono camera. The universal platform carries electronics boxes for power distribution and image processing, as well as an odd-looking metal structure with several holes cut into it, a spring inside, and a hook hanging from it: a special fixture for robot-dynamics experiments and for identifying joint parameters. The robot's joints and cameras are controlled by the central experiment computer inside the International Space Station.

The arm can operate in two different modes. The automatic mode is used when neither the ground nor the astronauts intervene: experiments are run by the experiment computer on the ISS and the data are stored for later evaluation. There is also a telepresence mode, because the envisaged satellite maintenance and repair tasks are unpredictable and hard to pre-program, and therefore still require a human operator in the control loop. Given the ISS's proximity to the ground and sufficient tracking coverage, ground operators could in principle run it around the clock. The German operators, however, preferred to use DLR's own antenna to control the robot directly whenever the station flew over Weilheim in southern Germany, where there is no significant time delay. In this way, scientists operating the arm on Earth received visual and force feedback about its movements in almost real time.

During the test, ground operators could intuitively feel the special force conditions of the robot in the space environment through force feedback. Engineers also tested how it absorbed energy during movement and the friction performance of bearings and gears during long-term operation in space. Cameras recorded the test process and transmitted the video to the control room in real time, giving scientists a real impression of the status and function of the experiment.

Ground experiment with the German ROKVISS robot

Space radiation was a major challenge for ROKVISS, as frequent ion strikes can damage electronic components. To keep the electronics from being destroyed, a function was integrated into the control module that automatically shuts off the power supply and dumps stored energy in the event of a short circuit. Because of extreme temperature swings, the robot's joints had to withstand temperatures between -20 and +60 degrees Celsius.

On-orbit test of the German ROKVISS robot

The main purpose of this experiment is to develop future lightweight robots that can perform more complex maintenance or assembly work and can be operated directly by operators on the ground.

The ROKVISS experiment cost €11.5 million, of which €3.5 million covered launch to the ISS, assembly, and operations.

SPHERES

SPHERES is a microsatellite testbed inside the International Space Station. Its full name is "Synchronized Position Hold, Engage, Reorient, Experimental Satellites", and it is the predecessor of Astrobee. It was developed by the Massachusetts Institute of Technology's Space Systems Laboratory for NASA and the US military as a low-risk, scalable platform for developing metrology, formation-flying, rendezvous, docking, and autonomy algorithms.

SPHERES experiments inside the space station

Initial development of SPHERES began in 1999, when Professor David Miller challenged his students to build a free-flying device like the lightsaber-training remote seen in Star Wars, which led to the concept of a small, self-contained free-flying satellite.

After the initial development, the SPHERES program was taken over by MIT's Space Systems Laboratory, which built six flight-ready satellites, three of which were sent to the International Space Station. They can operate inside the station and in ground laboratories, but cannot fly in open space.

Each SPHERES satellite resembles an 18-faced polyhedron. The satellite's aluminum structure is enclosed in a translucent plastic shell, colored red, blue, orange, or black to aid identification; the three satellites on the International Space Station are red, blue, and orange. Each unit has a maximum diameter of 22.9 cm and a mass of 4.16 kg including consumables. The satellites communicate with each other over a 916.5 MHz, 16 kbit/s radio link, and with the control station (a laptop) over an 868.35 MHz, 16 kbit/s radio link.

SPHERES satellite in the cabin

The SPHERES satellite determines its position and attitude by using 23 ultrasonic receivers (Murata MA40S4R) and 5 external ultrasonic reference beacons, supplemented by data from accelerometers (3x Honeywell QA-750 single-axis accelerometers) and gyroscopes (3x Systron Donner QRS14 single-axis rate gyroscopes).
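
The principle behind this ultrasonic positioning can be sketched in a few lines: each beacon's time of flight gives a range, and the ranges are combined into a least-squares position fix. The following Python sketch is illustrative only; the beacon layout and numbers are made up, not SPHERES flight values.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in cabin air

def trilaterate(beacons, tof):
    """Least-squares position fix from time-of-flight to known beacons.

    Subtracting the first beacon's range equation from the others
    removes the quadratic term, leaving a linear system in (x, y, z).
    """
    d = SPEED_OF_SOUND * np.asarray(tof)   # ranges in metres
    b = np.asarray(beacons, dtype=float)
    # For each i > 0: 2*(b_i - b_0) . p = |b_i|^2 - |b_0|^2 - d_i^2 + d_0^2
    A = 2.0 * (b[1:] - b[0])
    rhs = (np.sum(b[1:] ** 2, axis=1) - np.sum(b[0] ** 2)
           - d[1:] ** 2 + d[0] ** 2)
    p, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return p

# Five hypothetical beacons spanning a 2 m working volume
beacons = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2), (2, 2, 2)]
true_pos = np.array([0.7, 1.1, 0.4])
tof = [np.linalg.norm(true_pos - np.array(bc)) / SPEED_OF_SOUND
       for bc in beacons]
print(trilaterate(beacons, tof))  # recovers the true position
```

In practice the accelerometer and gyroscope data mentioned above are fused with such fixes, since ultrasonic updates arrive too slowly for control on their own.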

Each SPHERES satellite is powered by two non-rechargeable 12 V battery packs that must be swapped out when depleted; this was improved in later models. For maneuvering and attitude control it uses 12 cold-gas thrusters fed from a small tank of liquid carbon dioxide. The satellite's maximum linear acceleration is 0.17 m/s², with a position accuracy of 0.5 cm; its maximum angular acceleration is 3.5 rad/s², with an attitude accuracy of 2.5 degrees. Venting carbon dioxide inside the cabin is not ideal, however, and later models such as Astrobee did away with it.
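
For a feel of the forces involved, the quoted limits and mass imply a net thrust of under one newton. A back-of-the-envelope check, where the solid-sphere inertia model is an added assumption rather than a SPHERES specification:

```python
# Figures quoted in the text; solid-sphere inertia is an added assumption.
mass = 4.16          # kg, including consumables
max_lin_acc = 0.17   # m/s^2
max_ang_acc = 3.5    # rad/s^2
radius = 0.229 / 2   # m, from the 22.9 cm maximum diameter

net_thrust = mass * max_lin_acc          # F = m * a  ->  ~0.71 N
inertia = 0.4 * mass * radius ** 2       # I = (2/5) m r^2 (solid sphere)
net_torque = inertia * max_ang_acc       # tau = I * alpha

print(f"net thrust ~ {net_thrust:.2f} N")
print(f"net torque ~ {net_torque:.4f} N*m")
```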

Astrobee

The Astrobee system consists of three cube-shaped robots, their software, and a docking station for recharging. Each robot is a cube about 12.5 inches (31.75 cm) on a side and can return autonomously to the dock to recharge its batteries. Astrobee replaced SPHERES as the station's free-flying robotic test facility. The robots use electric fans for propulsion and no longer vent carbon dioxide, letting them fly freely in the station's microgravity environment. Cameras and sensors help them "see" and navigate their surroundings. Each Astrobee also carries a perching arm that lets it grip a handrail and stay motionless to save energy, or grasp and hold items.

Mission scientists can use Astrobee to conduct research that helps mature hardware and software for future missions. Because the robots are modular and upgradable, the system lets researchers run a wide variety of experiments inside the station. Such robots could also one day serve as "caretakers" for future spacecraft, monitoring systems and keeping them running smoothly while astronauts are away.

The Astrobee support facilities were launched to the International Space Station on the Cygnus NG-10 (CRS-10) commercial resupply mission on November 17, 2018, and installed in the Japanese Experiment Module on February 15, 2019.

An Astrobee robot during testing

On April 17, 2019, the first two Astrobee robots, "Bumble" and "Honey", were launched to the station on the Cygnus NG-11 (CRS-11) resupply mission.

On July 25, 2019, the third free-flying robot, "Queen", and three perching arms arrived at the International Space Station on SpaceX's 18th Commercial Resupply Services mission (CRS-18).

One of Astrobee's main jobs is taking photos inside the cabin, but it has another research mission as well, one of great significance for technology outside the station.

NASA has run an experiment called ROAM (Relative Operations for Autonomous Maneuvers), which demonstrates how a robotic servicer rendezvouses with space debris. Most large debris objects are "dead satellites": some could be repaired, while others need to be deorbited into the atmosphere for disposal. Most of them are tumbling, however, and it is very risky for a robotic satellite to approach and capture them. NASA therefore used the ROAM experiment to have Astrobee simulate a tumbling, uncontrolled satellite inside the cabin, observe and identify it, and test rendezvous-and-capture planning algorithms.

The main idea of ROAM is as follows. The chaser satellite first approaches the tumbling target and establishes a coordinate frame; a three-dimensional (3D) time-of-flight camera and a visual estimation algorithm then remotely estimate the target's rotational state, its rotational inertia parameters, and their covariances. Offline simulations of possible tumbling motions are used to generate lookup tables, which are queried on orbit with the estimated data. A motion planner based on nonlinear programming, which accounts for the known target geometry and practical constraints such as field-of-view requirements, generates a motion plan in the target's rotating frame. The uncertainty in the estimated tumble is then propagated so that a disturbance bound can be attached to the reference trajectory in the inertial frame. Finally, this bound is handed to a robust tube model predictive controller, which guarantees the chaser's ability to track the reference trajectory in translation.
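
The core idea of planning in the target's rotating frame can be illustrated with a deliberately simplified planar sketch. This is not the ROAM flight algorithm, and the spin rate and standoff distance are made-up values: the chaser holds a fixed approach point in the tumbling target's body frame, and that point is expressed in the inertial frame.

```python
import math

def inertial_reference(omega, standoff, t):
    """Approach point fixed in the target's frame, seen from inertial axes.

    omega    -- estimated target spin rate (rad/s)
    standoff -- distance held from the target's centre (m)
    t        -- time since the start of the approach (s)
    """
    theta = omega * t
    return (standoff * math.cos(theta), standoff * math.sin(theta))

# A chaser tracking this reference co-rotates with the target, so the
# capture point stays stationary in the target's own frame.
omega_hat = 0.1   # rad/s, pretend output of the vision pipeline
for t in (0.0, 10.0, 20.0):
    x, y = inertial_reference(omega_hat, 2.0, t)
    print(f"t={t:5.1f} s  reference = ({x:+.2f}, {y:+.2f}) m")
```

The uncertainty propagation described above would, in effect, inflate this single reference curve into a tube of possible trajectories for the robust controller to track.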

The ROAM experiment requires one or two astronauts to oversee the setup of the Astrobees and one ground operator to control the two robots. The astronauts first place the Astrobees in their initial orientations, while the ground controller performs final positioning and command checks before execution begins. Data are collected on the two robots using the Robot Operating System (ROS) and then manually transferred back to the ground. In addition, ground operators and researchers watch a live video stream of the experiment to gauge its progress and view real-time information provided by the Astrobee ground station.

The Astrobee system is also meant to free astronauts from simple chores so they can focus on things only humans can do. Astrobees can work autonomously or be remotely controlled by astronauts, flight controllers, or researchers on the ground. They can take inventory of supplies, film experiments in the cabin, and even help move cargo. The system also serves as a research platform for new hardware and software, running experiments in microgravity and accumulating experience and data for the development of future space robots.

Japan's Int-Ball

Astronauts must not only run experiments and maintain equipment on the space station, but also take photos and videos documenting its operation. These images have real scientific value and make good outreach material, but spending an astronaut's precious time on photography seems wasteful: keeping a person in orbit is very expensive. That is why NASA developed and flew two generations of in-cabin robots, the SPHERES described above and the Astrobees now flying on the station.

Beyond the United States, Japan is also very interested in in-cabin robots. The International Space Station hosts a robot called the JEM Internal Ball Camera, "Int-Ball" for short: a free-floating, remotely operated camera. The Japan Aerospace Exploration Agency (JAXA) first launched it to the station in 2017. Its job is to autonomously shoot video and photos of research activities, saving astronauts' time, and it can also serve as a testbed for other missions in the future.

Int-Ball really is a ball. It floats freely in the station's microgravity, flying around under the control of ground personnel or astronauts to film the interior. Int-Ball weighs only 1 kg and is just 15 cm in diameter, so how does it control its position and attitude? Much like a drone on Earth: inside it are 3 reaction wheels and 12 small fan thrusters, which let it fly, hover, and change attitude in microgravity while using less energy than a terrestrial drone. Int-Ball's main components and shell are 3D-printed. Two simulated "eyes" sit on the outside of the ball, and lights on the shell indicate its current status, such as recording. It even charges the way consumer electronics do, through a USB port.
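
The role of the reaction wheels can be illustrated with a toy one-axis simulation. This shows only the general principle; Int-Ball's actual controller and gains are not public, and all numbers below are assumptions. Spinning a wheel one way torques the body the other way, and a simple PD law drives the pointing error to zero.

```python
def simulate(theta0, body_inertia=1e-3, kp=4e-4, kd=1.2e-3,
             dt=0.01, steps=5000):
    """Integrate a one-axis body under a PD reaction-wheel torque law.

    All parameters are assumed values for illustration only.
    """
    theta, omega = theta0, 0.0              # pointing error (rad) and rate
    for _ in range(steps):
        torque = -kp * theta - kd * omega   # wheel spins up; body reacts
        omega += (torque / body_inertia) * dt
        theta += omega * dt
    return theta

# Starting 0.5 rad off target, the body settles to nearly zero error
print(f"final pointing error: {simulate(0.5):.6f} rad")
```

The fans then handle translation, just as a drone's rotors do, while the wheels handle fine pointing without disturbing the air around the camera.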

Astronauts interact with Int-Ball

JAXA later developed and launched Int-Ball2, which adds expansion slots to enable more complex functions.

According to JAXA, photography accounts for about 10% of the station crew's daily working time, which is not a small number. If, through experience with Int-Ball, shooting can be handed off to ground personnel by remote control, astronauts' time can be saved. Such technology is useful not only on the International Space Station but also in future lunar and deep-space missions. The communication delay between a low-orbit station and the ground is small, so remote control is relatively easy; for the Moon, asteroids, or Mars, the delay grows dramatically, and how ground personnel can still control camera equipment is one of many technical problems yet to be solved.
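
The delay argument is easy to quantify: one-way radio delay is simply distance divided by the speed of light. A quick sketch, using rough representative distances:

```python
C = 299_792_458.0  # m/s, speed of light

targets_km = {
    "ISS (low orbit)": 400,
    "Moon": 384_400,
    "Mars (closest)": 54_600_000,
    "Mars (farthest)": 401_000_000,
}

for name, km in targets_km.items():
    round_trip = 2 * km * 1000 / C   # seconds there and back
    print(f"{name:16s} round trip ~ {round_trip:10.2f} s")
```

The round trip to the ISS is milliseconds, to the Moon about 2.6 seconds, and to Mars anywhere from roughly 6 to 45 minutes, which is why direct joystick-style remote control stops being workable beyond low Earth orbit.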
