In recent years, industrial robotics has shifted towards increased use of collaborative robots. Currently, safety is ensured mainly by power and force limiting, but development is also ongoing in the area of speed and separation monitoring. Separation distance can be measured using external depth cameras, and the use of multiple individual lidars attached to the robot has also been demonstrated.
Instead of using multiple individual lidars, the same effect might be achieved with a centralized lidar whose fields of view (FOVs) are statically redistributed along the robot's segments using mirrors. These FOVs provide perspectives from multiple points on the robot's segments. We developed a multi-channel lidar that provides insight into all internal signals. With it, we examined the influence of mirrors and prisms on distance measurement, testing different types of mirrors in both clean and dusty conditions. Additionally, we investigated the effects of light redirection between multiple adjacent channels.
Findings indicate that the type of mirror is not significant, but dust on the mirror has a significant impact. Part of the emitted light reflects off the dust, and when this return combines with the reflection from the target, the signals merge into a distorted pulse, resulting in a negative measurement error. Because new light paths are established, errors can also occur when mirrors redirecting adjacent light paths lie within the FOV of the observed channel. If the additional light path is shorter than the primary one, the measurement error is negative; otherwise, it is positive. In both cases, errors occur only in part of the measurement range.
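The pulse-mixing mechanism behind the negative error can be illustrated with a toy simulation. All numbers here (pulse shapes, amplitudes, distances) and the centroid-based range estimator are illustrative assumptions, not the actual lidar's signal processing:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def pulse(t, dist, amplitude, width=1e-9):
    """Gaussian return pulse for a target at the given distance."""
    t0 = 2 * dist / C  # round-trip time of flight
    return amplitude * np.exp(-((t - t0) ** 2) / (2 * width ** 2))

def centroid_distance(t, signal):
    """Estimate range from the intensity-weighted centroid of the return."""
    t_est = np.sum(t * signal) / np.sum(signal)
    return 0.5 * C * t_est

t = np.linspace(0, 100e-9, 2000)       # 100 ns observation window

target = pulse(t, 5.0, 1.0)            # true target at 5 m
dust = pulse(t, 0.5, 0.3)              # weak spurious return from dust on the mirror
mixed = target + dust                  # the two returns merge into one distorted pulse

d_clean = centroid_distance(t, target)
d_mixed = centroid_distance(t, mixed)
# the near dust return pulls the centroid earlier, so d_mixed < d_clean:
# a negative measurement error, as observed with dusty mirrors
```

With these toy values the estimate drops from about 5 m to roughly 4 m, mirroring the sign of the error reported above; the magnitude depends entirely on the assumed amplitudes.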
In addition to the technical implementation of FOV redistribution, we also investigated possible applications. A good reconstruction of the robot's surroundings requires knowing the positions and shapes of the FOVs of the sensors used, which is a non-trivial requirement. However, a good reconstruction of the environment is not necessarily required to increase safety. We developed a system for real-time detection of unexpected intrusions into the robot's workspace. To determine the corresponding reference value, it relies on knowledge of the current positions and velocities of the robot's joints. The system has been tested, and its proper operation is also demonstrated in collaborative applications, where a person moves in close proximity and repeatedly intrudes into the robot's immediate vicinity.
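The detection principle described above can be sketched as a per-channel comparison against a pose-dependent reference range. Everything below is a hypothetical illustration: the lookup table, the velocity-dependent tolerance, and all thresholds are assumptions, not the system's actual parameters:

```python
import numpy as np

def reference_distance(channel, joint_positions):
    """Hypothetical lookup of the expected free-space range for a lidar
    channel at the current robot pose (in the real system this would come
    from a calibrated model of the workspace and FOV geometry)."""
    base = {0: 2.0, 1: 1.5, 2: 2.5}  # toy expected ranges in metres
    return base[channel]

def intrusion_detected(channel, measured, joint_positions, joint_velocities,
                       tol=0.10, vel_gain=0.05):
    """Flag an intrusion when the measured range falls noticeably short of
    the reference. The tolerance widens with joint speed, since the
    reference lags the true pose during fast motion (assumed heuristic)."""
    ref = reference_distance(channel, joint_positions)
    margin = tol + vel_gain * np.max(np.abs(joint_velocities))
    return measured < ref - margin

q = np.zeros(6)                        # joint positions (rad)
qd = np.array([0.2, 0, 0, 0, 0, 0])    # joint velocities (rad/s)

near = intrusion_detected(0, 1.2, q, qd)   # person at 1.2 m vs 2.0 m expected
clear = intrusion_detected(0, 1.95, q, qd) # measurement within tolerance
```

The check runs per channel and per scan, so no environment reconstruction is needed; only the reference ranges and the current joint state must be available in real time.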