How do humanoid robots perceive the external world? What types of sensors are they typically equipped with?
Okay, regarding how humanoid robots perceive the world and the sensors they have, let me break it down for you.
You can imagine a humanoid robot as an "Iron Man," but its "suit" is packed with various electronic components, which are its "senses." Just as humans use eyes, ears, and skin to perceive the world, robots rely on these sensors.
Below, I'll categorize and explain what "superpowers" they possess:
I. "Eyes" - Visual Sensors
This is the robot's most crucial sense, allowing it to "see" the world. There are several main types:
- Standard Camera (2D Camera): Like the camera on our phones, it captures colorful, flat images. Robots use it to identify objects, recognize faces, understand road signs, and so on. However, with only one camera, its judgment of distance isn't very accurate.
- Stereo Camera (3D Camera): This is comparable to human eyes. By simultaneously capturing images from two cameras at different positions, the robot can calculate the distance to objects, forming stereo vision. This is vital for walking, avoiding obstacles, and grasping objects.
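The stereo idea above boils down to one formula: a point that appears shifted ("disparity") between the two images is closer when the shift is larger. A minimal sketch, with illustrative numbers that aren't from any specific robot:

```python
# Depth from stereo disparity: Z = f * B / d
# f: focal length in pixels, B: baseline (distance between the two cameras)
# in meters, d: disparity (pixel shift of the same point between images).
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Assumed example values: 700 px focal length, cameras 6 cm apart,
# and a point shifted 35 px between the left and right images.
z = depth_from_disparity(700, 0.06, 35)
print(round(z, 2))  # 1.2 (meters)
```

Note the trade-off this exposes: a wider baseline or a finer camera gives better distance resolution, which is why depth accuracy drops off for far-away objects.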
- Depth Sensors (LiDAR and ToF): These are more advanced "eyes."
- LiDAR (Light Detection and Ranging): You can imagine a small, constantly rotating device on its head. As it spins, it emits laser beams in all directions and calculates the precise distance and shape of all surrounding objects based on the time it takes for the laser to return. Many self-driving cars use this technology, which can generate a very detailed 3D environmental map.
- ToF (Time-of-Flight) Camera: It emits a burst of infrared light and determines the distance to an object by measuring how long the light takes to travel to the object and bounce back.
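Both LiDAR and ToF cameras rely on the same arithmetic: light travels out and back, so the distance is half the round trip. A tiny sketch:

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance(round_trip_s):
    return SPEED_OF_LIGHT * round_trip_s / 2

# A return after 10 nanoseconds corresponds to roughly 1.5 meters.
print(round(tof_distance(10e-9), 3))  # ~1.499 (meters)
```

The tiny time scales involved (nanoseconds per meter) are why these sensors need very fast, specialized timing electronics.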
II. "Ears" - Auditory Sensors
- Microphone/Microphone Array: Robots certainly need microphones to receive our voice commands. But typically, they install a "microphone array" (several microphones combined). The advantage of this is that it can not only "hear" but also "localize" where the sound is coming from. This allows it to accurately turn towards you when you speak to it in a crowd.
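The "localize" trick works because sound reaches one microphone slightly before another. A minimal far-field sketch for two microphones (the spacing and delay below are made-up illustrative values):

```python
import math

# Direction of arrival from the time delay between two microphones:
# delay = (spacing * sin(theta)) / speed_of_sound, so
# theta = asin(delay * speed_of_sound / spacing).
SPEED_OF_SOUND = 343.0  # m/s in room-temperature air

def direction_of_arrival(delay_s, mic_spacing_m):
    ratio = delay_s * SPEED_OF_SOUND / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Mics 20 cm apart; sound reaches one mic 0.29 ms earlier than the other.
print(round(direction_of_arrival(0.29e-3, 0.2), 1))  # roughly 30 degrees off-center
```

Real arrays use more microphones and cross-correlation to estimate the delay, but the geometry is the same.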
III. "Skin" and "Joints" - Tactile and Force Sensors
Being able to see isn't enough; to interact with the environment, tactile sensation is essential.
- Tactile Sensors (Skin): These are usually installed on the robot's fingertips, palms, and other areas. When it picks up a cup, these sensors can tell it the object's shape, material (soft or hard), and whether it's slipping. This allows it to grasp with just the right amount of force, rather than crushing the cup.
- Force/Torque Sensors (Joints): These sensors are installed in the robot's "joints," such as wrists, ankles, and waist. Their function is to perceive the forces exerted on the robot during each movement. For example, when walking, if its foot steps on the ground, the sensors will tell its "brain" how hard the ground is and whether it has a stable footing; when it pushes a door, it can feel whether the door opened or is stuck.
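The "stable footing" check mentioned above can be sketched very simply: the controller watches the vertical force at the ankle and only shifts the robot's weight once the foot is genuinely loaded. The threshold and numbers here are illustrative assumptions, not values from any real controller:

```python
# A minimal sketch of using an ankle force sensor to decide whether
# a foot has a stable footing before shifting weight onto it.
def footing_is_stable(vertical_force_n, robot_weight_n, min_fraction=0.3):
    """Treat the footing as stable once the foot carries at least
    a minimum fraction of the robot's total weight."""
    return vertical_force_n >= min_fraction * robot_weight_n

print(footing_is_stable(50.0, 600.0))   # foot barely touching -> False
print(footing_is_stable(250.0, 600.0))  # weight has shifted  -> True
```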
IV. "Inner Ear" - Balance and Posture Sensors
This is key for humanoid robots to walk like humans without falling.
- Inertial Measurement Unit (IMU): This small device is very important; it's like the balance organ in our inner ear. An IMU typically includes:
- Accelerometer: Measures changes in the robot's speed (acceleration) in three directions: front-back, left-right, and up-down.
- Gyroscope: Measures the robot's rotation around three axes (pitch, yaw, roll).
By combining this information, the robot can instantly know if its body is tilted, accelerating, or turning, and then quickly adjust its steps to maintain balance.
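One classic way to combine the two readings is a complementary filter: the gyroscope is accurate over short intervals but drifts over time, while the accelerometer gives an absolute (but noisy) tilt angle from gravity. A minimal sketch with simulated readings:

```python
# Complementary filter: integrate the gyro for fast changes, and slowly
# pull the estimate toward the accelerometer's gravity-based angle.
def complementary_filter(angle_deg, gyro_rate_dps, accel_angle_deg,
                         dt, alpha=0.98):
    gyro_angle = angle_deg + gyro_rate_dps * dt  # short-term gyro estimate
    return alpha * gyro_angle + (1 - alpha) * accel_angle_deg

# Simulated scenario: the robot is actually tilted 5 degrees, the gyro
# reports no rotation, and the accelerometer consistently reads 5 degrees.
angle = 0.0
for _ in range(200):  # 200 steps at 100 Hz = 2 seconds
    angle = complementary_filter(angle, 0.0, 5.0, dt=0.01)
print(round(angle, 2))  # converges toward 5 degrees
```

Real balance controllers use more sophisticated estimators (e.g., Kalman filters), but the division of labor between the two sensors is the same.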
In Summary
The process of a robot perceiving the world is a "data fusion" process.
It gathers all information—images seen by its "eyes," sounds heard by its "ears," pressure felt by its "skin," forces experienced by its "joints," and its own posture perceived by its "inner ear"—and sends it all to its central processor (its "brain"). Then, through complex algorithms (increasingly using artificial intelligence), it analyzes and makes decisions, ultimately instructing various parts of its body to perform appropriate actions.
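At its core, "data fusion" means weighting each sensor's reading by how much you trust it. A minimal sketch of fusing two noisy distance estimates, say from stereo vision and LiDAR (the numbers are assumed for illustration; this is the core idea behind Kalman-style fusion, not any specific robot's implementation):

```python
# Fuse two independent estimates by weighting each inversely to its
# variance: the more certain sensor (smaller variance) counts for more.
def fuse(estimate_a, var_a, estimate_b, var_b):
    w_a = var_b / (var_a + var_b)
    fused = w_a * estimate_a + (1 - w_a) * estimate_b
    fused_var = (var_a * var_b) / (var_a + var_b)  # always below either input
    return fused, fused_var

# Stereo says 2.1 m (variance 0.04); LiDAR says 2.0 m (variance 0.01).
dist, var = fuse(2.1, 0.04, 2.0, 0.01)
print(round(dist, 2), round(var, 3))  # fused estimate leans toward the LiDAR
```

Note that the fused variance is smaller than either sensor's alone, which is exactly why combining sensors beats relying on any single one.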
Therefore, an advanced humanoid robot is essentially a walking supercomputer integrated with various high-precision sensors.