Mobile robots represent an increasing share of logistics units in modern industry, and smart factories are already increasingly equipped with autonomously guided robots. When navigating a robot, its current position must be known at all times, and obstacles that restrict movement in the surroundings must be detected. Since reliable position detection using only sensors mounted on the robot is demanding, external sensors are often used to solve this problem. In this thesis, navigation methods based on camera-based machine vision are presented. The development of the software and the simulation environment in the ROS framework and the Gazebo simulator is also described. Augmented reality (AR) tags, detected with the Alvar library, are used to track a system of multiple robots in an environment with obstacles. The system is tested and evaluated through scenarios of varying complexity, in environments with and without obstacles. It was found that the robots reach the required end positions in all scenarios with appropriate accuracy (less than 10 cm of deviation). Deviations from the ideal path are smaller in the simulator than on the test site, because camera-based tracking is more demanding under real conditions. As the complexity of the scenarios increases, the deviation of the mobile robot's path from the ideal path also increases.
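The path-deviation metric used in the evaluation above can be illustrated with a short sketch. The following is a minimal, hypothetical example (the function names and the assumption of a straight-line ideal path between start and goal are mine, not taken from the thesis) that computes the maximum deviation of a recorded trajectory from the ideal path as the largest point-to-segment distance:

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from 2-D point p to the segment a-b (all in metres)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        # Degenerate segment: start and goal coincide
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping the parameter t to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def max_path_deviation(trajectory, start, goal):
    """Maximum deviation of recorded robot poses from the ideal straight path."""
    return max(point_segment_distance(p, start, goal) for p in trajectory)

# A trajectory that bulges 5 cm sideways on a 2 m straight run
recorded = [(0.0, 0.0), (1.0, 0.05), (2.0, 0.0)]
deviation = max_path_deviation(recorded, (0.0, 0.0), (2.0, 0.0))
```

With this definition, the acceptance criterion from the abstract corresponds to checking `max_path_deviation(...) < 0.10` for each scenario; in the example above the deviation is 0.05 m, so the run would pass.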