This thesis deals with robot tracking using a point cloud and the calculation of the robot's joint angles. We assembled a system consisting of a LIDAR depth camera and a robot and developed a program capable of tracking the robot. While the robot was moving, point clouds were repeatedly captured by the camera. Using the iterative closest point (ICP) algorithm, we aligned models of the robot's two main segments with each captured point cloud. The joint angles were then calculated from the transformation matrices returned by the algorithm. We tested the program at different robot speeds and with two camera positions. The experimental results show that the software functions properly for slow robot movements and gives good results at higher speeds as well. Errors occur when the segments are not visible to the camera.
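As a minimal sketch of the angle-extraction step described above (the function names are illustrative, and it assumes the joint axis is aligned with the z-axis of the reference frame; the actual setup may differ):

```python
import numpy as np

def rotation_z(theta):
    """Homogeneous 4x4 transform: rotation by theta about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0, 0.0],
                     [s,  c, 0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def joint_angle(T):
    """Recover the rotation angle about z from a 4x4 transformation
    matrix such as the one returned by an ICP registration step
    (assumes the joint rotates about the z-axis)."""
    return np.arctan2(T[1, 0], T[0, 0])

# Round-trip check: build a transform for a known angle and recover it.
T = rotation_z(np.deg2rad(30.0))
print(np.rad2deg(joint_angle(T)))  # ≈ 30.0
```

In practice the transform from ICP combines rotation and translation of the whole segment, so the joint angle is obtained from the relative transform between the two segment models rather than from a single matrix.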