Autonomous navigation on a mobile robot requires a map of the environment that captures the obstacles within it. In our work, the map is built from a depth camera and a tracking camera, whose data feed into the RTAB-Map simultaneous localisation and mapping (SLAM) algorithm. This work addresses the development of a robotic system that receives all of its odometry exclusively from the tracking camera, without using the wheel encoders. Additional functionality and ease of use is provided by the detection and visualisation of ArUco tags, which lower the localisation error. The developed system performs well under good lighting conditions, but degrades in more difficult conditions such as dark rooms, or when stains and dirt on the camera lenses prevent it from reliably detecting objects.
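The idea of using ArUco tags to lower the localisation error can be illustrated with a minimal sketch: odometry from the tracking camera drifts over time, while a tag whose pose is known in the map frame provides an absolute fix that bounds the error. The 1-D example below is an illustrative assumption, not the actual pipeline described in this work; the function name, the blending weight, and the scalar poses are all hypothetical.

```python
# Illustrative 1-D sketch (hypothetical, not the system's actual fusion method):
# camera odometry drifts; observing a tag with a known map-frame pose yields an
# absolute pose estimate that can be blended in to reduce the accumulated error.

def fuse_marker_fix(odom_pose, marker_map_pose, marker_rel_pose, weight=0.5):
    """Blend drifting odometry with an absolute pose implied by a marker.

    odom_pose       -- robot position estimate from odometry (drifts)
    marker_map_pose -- known position of the tag in the map frame
    marker_rel_pose -- measured offset from the robot to the tag
    weight          -- trust placed in the marker observation (0..1)
    """
    # Absolute robot pose implied by the marker observation
    pose_from_marker = marker_map_pose - marker_rel_pose
    # Weighted blend of the two estimates
    return (1 - weight) * odom_pose + weight * pose_from_marker

# True pose is 10.0 but odometry has drifted to 10.8; the tag sits at 12.0
# in the map and is measured 2.0 ahead of the robot, implying pose 10.0.
corrected = fuse_marker_fix(10.8, 12.0, 2.0, weight=0.5)
print(corrected)  # 10.4 -- the 0.8 drift is halved
```

A full system would perform this correction in SE(3) with proper covariance weighting (as RTAB-Map's graph optimisation does), but the scalar version shows why an occasional absolute observation keeps unbounded odometry drift in check.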