Client: Robotics and Intelligent Equipment Research Institute, Xiangcheng Campus, Suzhou University
Applications: Motion Capture, Floor Cleaning Robot Position Tracking, Autonomous Navigation, Algorithm Verification, PTS Measurement Experiment
With the advancement of technology, more and more small intelligent robots have entered the homes of ordinary people, with floor cleaning robots being among the most common. In addition to household floor cleaning robots, commercial floor cleaning robots are also poised to enter the market as technology matures. One critical indicator of whether a floor cleaning robot can be considered a qualified product is its cleaning coverage rate.
The cleaning coverage rate refers to placing a floor cleaning robot in a simulated home or commercial environment and allowing it to move around a room. By tracking the robot’s movement trajectory, the area covered by the robot within a set time frame is calculated. This covered area is then divided by the total area of the environment, resulting in the coverage rate, which indicates how much area the floor cleaning robot can clean during its operational time.
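The coverage calculation described above can be sketched with a simple grid-occupancy approach: discretize the floor into cells, mark every cell swept by the robot's cleaning footprint along the tracked trajectory, and divide the marked area by the total area. This is a minimal illustration, not the institute's actual algorithm; the cell size, footprint radius, and room dimensions below are assumed values.

```python
import numpy as np

def coverage_rate(trajectory, robot_radius, room_w, room_h, cell=0.05):
    """Estimate cleaning coverage from a tracked trajectory.

    trajectory  : iterable of (x, y) robot positions in metres
    robot_radius: radius of the cleaning footprint in metres
    room_w/h    : room dimensions in metres
    cell        : grid resolution in metres
    """
    nx, ny = int(room_w / cell), int(room_h / cell)
    covered = np.zeros((nx, ny), dtype=bool)
    r_cells = int(np.ceil(robot_radius / cell))
    for x, y in trajectory:
        ci, cj = int(x / cell), int(y / cell)
        for i in range(max(0, ci - r_cells), min(nx, ci + r_cells + 1)):
            for j in range(max(0, cj - r_cells), min(ny, cj + r_cells + 1)):
                # mark the cell if its centre lies inside the robot footprint
                cx, cy = (i + 0.5) * cell, (j + 0.5) * cell
                if (cx - x) ** 2 + (cy - y) ** 2 <= robot_radius ** 2:
                    covered[i, j] = True
    return covered.sum() / covered.size

# Example: one straight pass across a 2 m x 1 m room with a 0.15 m footprint
path = [(x, 0.5) for x in np.linspace(0.1, 1.9, 200)]
rate = coverage_rate(path, 0.15, 2.0, 1.0)
```

The trajectory must be sampled densely enough (spacing smaller than one cell) that no cells are skipped between consecutive positions, which a motion capture system's high frame rate readily provides.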
To measure the coverage rate, the robot's movement trajectory, position, and heading must be tracked while it operates. Traditional methods rely on camera imaging, target tracking, and graphical processing to display the robot's path. However, these methods demand many cameras, substantial data-processing capability, and a large budget, and their slow computation makes real-time preview difficult. This led the Robotics and Intelligent Equipment Research Institute of Suzhou University to seek alternative solutions.
Besides the visual imaging method mentioned earlier, another method involves using signal-emitting equipment to transmit positional information of the robot while it moves, which is then received by a receiver. However, this method presents two main issues that can affect experimental results: first, the accuracy of signal transmission is low; second, the transmitter mounted on the robot can interfere with its movement.
The industry also uses laser positioning for such measurements. While laser positioning offers high precision, it comes at a high cost of over three million RMB. For an experiment that only requires precision within 0.5mm, such an expensive system is clearly not the optimal choice. The limitations of these three methods created a unique opportunity for collaboration between CHINGMU and the Robotics and Intelligent Equipment Research Institute of Suzhou University.
CHINGMU’s optical 3D motion capture system typically uses marker balls for motion tracking. However, this method can also interfere with the movement of floor cleaning robots. To meet the specific needs of the robotics research institute’s project, CHINGMU provided an adhesive marker-dot tracking method, which allows the infrared optical cameras to capture accurate positional data without interfering with the robot’s movement.
In addition to addressing hardware issues, the precision of the system was also a key consideration in the experiment. The Robotics Institute compared the precision of CHINGMU’s motion capture system to that of laser positioning and other optical camera systems. In a 20m² test area, the space was divided into 9 regions for measuring relative displacement and angular precision. The results showed that the average error of relative displacement in CHINGMU’s system was below 0.42mm, and the angular measurement error was within 0.036 degrees. These results placed CHINGMU’s motion capture system in the top tier of motion capture products and met the client’s precision and budgetary requirements.
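A relative-displacement test of this kind can be sketched as follows: move the robot between grid points, then compare the displacement reported by the motion capture system against a reference (ground-truth) displacement and average the absolute differences. This is an illustrative sketch of the evaluation idea only; the coordinates below are synthetic and are not the institute's measurements.

```python
import math

def mean_displacement_error(mocap_pts, reference_pts):
    """Average absolute difference between the displacement the motion
    capture system reports between consecutive grid points and the
    reference displacement, in the same units as the inputs."""
    errors = []
    for (m0, m1), (r0, r1) in zip(zip(mocap_pts, mocap_pts[1:]),
                                  zip(reference_pts, reference_pts[1:])):
        errors.append(abs(math.dist(m0, m1) - math.dist(r0, r1)))
    return sum(errors) / len(errors)

# Illustrative check: reference grid positions in mm, with the "measured"
# positions offset by a small synthetic error.
reference = [(0.0, 0.0), (1000.0, 0.0), (1000.0, 1000.0)]
measured = [(0.0, 0.0), (1000.3, 0.0), (1000.3, 1000.2)]
err = mean_displacement_error(measured, reference)  # mean error in mm
```

Averaging such per-segment errors across the nine test regions yields a single figure comparable to the 0.42mm result cited above.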
Laser vs. Optical Capture Comparative Test on-site
Experimental Area Divided into Nine Grid Points, with Movement Measured in Both Horizontal and Vertical Directions
Beyond hardware issues, the software aspect of the Pose Tracking System (PTS) also needed to be addressed. While the Robotics Institute had previously experimented with foreign optical motion capture systems, they found that the software provided by those systems lacked open interfaces, preventing algorithm traceability and customization. Additionally, the pricing was not competitive, and the institute’s specific needs could not be met.
CHINGMU’s MC1300 Camera
CHINGMU offers independently developed optical cameras and motion capture software that support secondary development tailored to the Robotics Institute’s measurement requirements. This customization lets the institute implement its own application logic based on various parameters and industry standards.
The Robotics Institute conducted experiments in two separate laboratories: a 20m² residential scene simulation and a 160m² commercial scene simulation.
CHINGMU’s optical 3D motion capture system was able to easily track the position of a single or multiple floor cleaning robots in large spaces, recording data on their direction, position, and speed while providing real-time trajectory previews. This solution effectively solved the problems associated with traditional methods, such as high costs, low precision, and the inability to process image data in real time.
In the 20m² residential experiment room, four MC1000 optical cameras were used
In the 160m² large laboratory, 32 MC1300 optical cameras were set up
CHINGMU’s motion capture system not only met the needs of the current experiment but is also capable of providing multidimensional, high-precision experimental data for future experiments, such as robot and workbench space tracking, continuing to play a crucial role in subsequent research.
Chingmu Vision MC series optical motion capture system use case