During the Spring Festival Gala for the Year of the Snake, Unitree Robotics' H1 humanoid robot shone in the program "Yangge BOT". It danced in close coordination with human performers, blending traditional yangge with modern technology and impressing audiences worldwide. The performance was not only a visual feast but also a powerful demonstration of the technical strength of domestic humanoid robots, letting everyone feel the momentum of technological progress first-hand.
The 360° panoramic depth sensing integrated into H1 acts as a comprehensive sensory system, giving the robot perception beyond what a human performer can manage. It can instantly track the music's rhythm, changes in the surrounding environment, and even the positions of its teammates, and adjust its own movements and placement precisely; it truly observes everything around it.
AI technology gives H1 first-rate "dancing talent". It not only completes complex limb movements with ease but also conveys the distinctive charm of yangge while holding a tight formation. It works seamlessly with the human dancers, and every movement is full of vitality.
High-precision 3D laser SLAM positioning and navigation serves as a highly accurate stage "GPS" for H1: however busy the stage becomes, the robot can quickly locate itself. Combined with a multi-agent cooperative planning and control system, all the robots move in unison and the rhythm of the entire stage stays firmly under control.
Notably, leading robot R&D institutions generally use motion capture systems to build their basic training libraries. According to insiders, more than 2,000 sets of motion capture data from human yangge dancers were collected during H1's early motion-learning stage. This kind of high-precision data collection is precisely CHINGMU's core technical field. With its advanced optical motion capture system, CHINGMU achieves high-precision, low-latency data collection, providing a solid data foundation for robot motion learning and making robot movements smoother and more natural.
Humanoid Robot (Shanghai) Co., Ltd., a CHINGMU partner, conducting humanoid robot training and testing
For humanoid robots to achieve flexible, natural movement, motion capture technology is indispensable. Motion capture acts as a "movement tutor" for robots: it converts human movements into accurate data that robots can learn from and imitate, leading to smoother, more intelligent motion. When Tesla trained Optimus, it used motion capture to obtain real human motion data, making the robot's movements more natural and flexible, which underscores the key role of motion capture in the humanoid robot field.
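As a rough illustration of the "learn and imitate" step described above, the minimal sketch below maps captured human joint angles onto a robot's joint ranges. The joint names, limits, and clamping approach are illustrative assumptions, not any vendor's actual pipeline or API.

```python
# Minimal sketch: retarget captured human joint angles onto a robot whose joints
# have different limits. Joint names and limits here are hypothetical.
import numpy as np

# Assumed human joint-angle sample from one motion capture frame (radians).
human_frame = {"shoulder_pitch": 1.2, "elbow": 1.9, "wrist_yaw": 0.4}

# Assumed robot joint limits (radians); real robots publish these in their URDF.
robot_limits = {
    "shoulder_pitch": (-2.0, 2.0),
    "elbow": (0.0, 2.4),
    "wrist_yaw": (-1.0, 1.0),
}

def retarget(frame: dict, limits: dict) -> dict:
    """Clamp each captured human joint angle into the robot's joint range."""
    commands = {}
    for joint, angle in frame.items():
        lo, hi = limits[joint]
        commands[joint] = float(np.clip(angle, lo, hi))
    return commands

print(retarget(human_frame, robot_limits))
```

Real retargeting also accounts for differing limb proportions and dynamics, but the core idea is the same: capture data in, feasible robot commands out.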
Tesla Optimus
As a leading optical motion capture brand, CHINGMU's independently developed optical motion capture system plays an irreplaceable role in embodied intelligence and humanoid robotics, offering high precision, low latency, and a wide field of view.
Motion Ability Evaluation and Detection System
CHINGMU's motion capture system captures the kinematic parameters of people, objects, and intelligent agents in three-dimensional space, such as position, posture, and velocity, with millisecond-level latency and very high precision (positional accuracy ≤ 0.1 mm, angular accuracy ≤ 0.1°). In-depth analysis of these data allows the motion abilities of humanoid robots and embodied agents to be evaluated scientifically and comprehensively, so that potential problems can be identified and corrected early, ensuring stable and efficient operation in complex scenarios.
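To make the evaluation idea concrete, the sketch below derives speed and a smoothness proxy (jerk) from captured position samples. The sampling rate and the synthetic trajectory are illustrative assumptions, not CHINGMU specifications.

```python
# Minimal sketch: turn captured 3D position samples into simple evaluation
# metrics (peak speed, mean jerk). All numbers here are synthetic.
import numpy as np

fs = 240.0                       # assumed capture rate in Hz
t = np.arange(0, 2, 1 / fs)      # 2 s of samples
positions = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])  # fake trajectory (m)

velocity = np.gradient(positions, 1 / fs, axis=0)       # m/s via finite differences
speed = np.linalg.norm(velocity, axis=1)
acceleration = np.gradient(velocity, 1 / fs, axis=0)
jerk = np.gradient(acceleration, 1 / fs, axis=0)         # proxy for motion smoothness

print(f"peak speed: {speed.max():.3f} m/s")
print(f"mean |jerk|: {np.linalg.norm(jerk, axis=1).mean():.3f} m/s^3")
```

A real evaluation pipeline would add joint-level tracking error, balance margins, and repeatability statistics, but all of them start from the same captured kinematic time series.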
Intelligent Training Data Workshop
CHINGMU's training workshop is a "treasure trove of data". By capturing the full range of human motion states and behaviors, such as walking gaits and jumping, it provides large volumes of high-quality data for training embodied intelligence and humanoid robots. These data meet the diverse needs of multi-scenario teaching, evaluation, and training, effectively improving robot motion performance. They can also be organized into high-quality motion datasets, giving strong data support to robots' deep learning and intelligent decision-making and helping them achieve more intelligent, flexible motion control.
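As a sketch of what one record in such a motion dataset might look like, the example below pairs a metadata entry with a per-frame joint-position array. The field names and file layout are assumptions for illustration, not an actual CHINGMU data format.

```python
# Minimal sketch: one clip of a motion dataset = metadata JSON + joint positions.
import json
import numpy as np

clip = {
    "action_label": "yangge_step",         # e.g. walking, jumping, a dance step
    "capture_rate_hz": 240,
    "num_frames": 480,
    "joint_positions_file": "clip_0001_positions.npy",  # (frames, joints, xyz)
    "subject_id": "S01",
}

np.save("clip_0001_positions.npy", np.zeros((480, 23, 3)))  # placeholder data
with open("clip_0001_meta.json", "w") as f:
    json.dump(clip, f, indent=2)
```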
Dexterous Hand Research
In humanoid robot research, developing and optimizing hand movements requires extensive experimental training, and motion capture technology can accelerate this work. CHINGMU's optical finger-tracking technology provides more reliable and complete data for hand motion collection: it accurately captures the 6DOF and kinematic data of both human and robot hands, supplying real, complete data for dexterous-hand control research and algorithm verification. This makes robot hand movements more flexible and precise, enabling delicate operations such as grasping and pinching, much as humans do.
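To show how 6DOF fingertip data might feed a dexterous-hand study, the sketch below turns thumb and index fingertip poses into a pinch aperture and a normalized grasp command. The marker names, poses, and aperture-to-command mapping are illustrative assumptions.

```python
# Minimal sketch: 6DOF fingertip poses (position + quaternion) -> pinch width
# -> normalized gripper command. All values are hypothetical.
import numpy as np

thumb_tip = {"pos": np.array([0.02, 0.00, 0.10]), "quat": np.array([1.0, 0, 0, 0])}
index_tip = {"pos": np.array([0.05, 0.01, 0.11]), "quat": np.array([1.0, 0, 0, 0])}

aperture = np.linalg.norm(thumb_tip["pos"] - index_tip["pos"])  # pinch width (m)

max_aperture = 0.09   # assumed fully open pinch width for this hand
gripper_cmd = float(np.clip(aperture / max_aperture, 0.0, 1.0))  # 0 closed .. 1 open
print(f"pinch aperture: {aperture * 1000:.1f} mm -> gripper command {gripper_cmd:.2f}")
```

A full dexterous hand would use the complete set of fingertip poses and the orientation quaternions as well; the aperture mapping here is only the simplest case.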
Selected cooperation cases of CHINGMU
Robot Teleoperation
CHINGMU's motion capture system captures the movements of each joint, each body part, and the whole body with high precision. After data conversion and analysis, the results are output synchronously, providing key support for humanoid robot teleoperation research and enabling real-time synchronization and precise control of human-robot movements.
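The capture-convert-stream loop described above can be sketched as follows. The capture source and robot interface are stubs standing in for real SDKs, which this article does not name, and the control rate is an assumption.

```python
# Minimal teleoperation-loop sketch: read a capture frame, convert it, and
# stream joint commands at a fixed rate. Capture and robot I/O are stubbed.
import time
import math

def read_capture_frame(t: float) -> dict:
    """Stub for a motion capture stream; returns fake joint angles (radians)."""
    return {"shoulder_pitch": 0.5 * math.sin(t), "elbow": 1.0 + 0.3 * math.sin(2 * t)}

def send_to_robot(commands: dict) -> None:
    """Stub for a robot command interface."""
    print(commands)

RATE_HZ = 100                # assumed control rate
period = 1.0 / RATE_HZ
start = time.time()
for _ in range(5):           # a few iterations for demonstration
    now = time.time() - start
    frame = read_capture_frame(now)
    send_to_robot(frame)     # a real system would retarget and safety-check here
    time.sleep(period)
```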
CHINGMU & Yingshi Robot Teleoperation Demonstration
Training and Simulation: Complex Scene Modeling
A major difficulty in humanoid robot training today is the lack of rich data, and simulation training keeps training costs to a minimum. In a simulated environment, scenarios can be rebuilt 1:1, from simple obstacle avoidance to complex driving tasks. Simulation overcomes the limitations of physical space, improves training efficiency, and covers scenarios that are impractical or too costly to reproduce in the real world. Moreover, comparing simulation data against real-world data helps optimize the design and control strategies of humanoid robots.
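The sim-to-real comparison mentioned above can be as simple as a trajectory-error metric. The sketch below computes RMSE between a "captured" and a "simulated" trajectory; both trajectories here are synthetic stand-ins.

```python
# Minimal sketch: quantify the gap between a simulated trajectory and a
# captured real-world trajectory with RMSE. Data below is synthetic.
import numpy as np

t = np.linspace(0, 1, 100)
real_traj = np.column_stack([t, np.sin(2 * np.pi * t)])              # e.g. from capture
sim_traj = real_traj + np.random.normal(0, 0.01, real_traj.shape)    # e.g. from simulator

rmse = np.sqrt(np.mean(np.sum((sim_traj - real_traj) ** 2, axis=1)))
print(f"sim-to-real RMSE: {rmse:.4f}")   # a large gap flags model or control issues
```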
CHINGMU currently cooperates with many universities and enterprises, including Honor, Humanoid Robot (Shanghai) Co., Ltd., the Zhejiang Humanoid Robot Innovation Center, Tsinghua University, Peking University, Zhejiang University, and Soochow University, which use its motion capture system for research on high-precision finger tracking, dexterous hands, and teleoperation. Through close industry-academia-research cooperation, CHINGMU keeps exploring new applications of motion capture technology in the humanoid robot field and helps drive the industry forward.
As the humanoid robot industry develops rapidly, CHINGMU will stay at the forefront, continuing to explore and innovate and to expand the application of motion capture technology in humanoid robots, embodied intelligence, and related industries, while playing an active role in fields such as industrial automation, home services, medical rehabilitation, and transportation and rescue.