FIDUCIAL MARKERS EVALUATION USING ONBOARD CAMERA OF SERVOSILA ENGINEER MOBILE ROBOT IN INDOOR SETTINGS

  • T.G. Tsoy, Kazan Federal University
Keywords: Camera calibration, fiducial marker systems, mobile robot

Abstract

Modern applications in specialized areas of robotics, including search and rescue operations in urban areas, pose a number of challenges for mobile robotics, where autonomous execution of various functions by mobile robots remains a key task. One of the important requirements for a robot's algorithms and software is the ability to make autonomous decisions and to perform various low- and high-level functions automatically, based on embedded algorithms and information received from the robot's on-board sensors. To date, the most common on-board robot sensors are cameras of various types, which provide visual information in the form of digital images, owing to their technical capabilities and lower cost relative to lidars and other sensors. Camera calibration is a necessary process for extracting accurate information from digital images: it establishes an exact correspondence between the three-dimensional object space and the pixel space of the image, which enables the subsequent use of computer vision algorithms, data aggregation, and information processing. Calibration of digital cameras is an integral part of many practical machine vision tasks, such as navigation of mobile robotic systems, medical imaging, reconstruction of dense and sparse three-dimensional maps of the environment, video surveillance and visual inspection, visual simultaneous localization and mapping, etc. The relevance of the camera calibration problem stems from the existence of many different calibration methods and calibration templates. Each individual solution is suitable only for particular conditions, e.g., poor lighting, bad weather, the presence of third-party objects blocking visibility, etc. In most cases, each calibration method uses a specific calibration pattern. Camera calibration is usually associated with the use of special calibration templates, which make it possible to achieve the most accurate results owing to their known geometric structure. Currently, camera calibration of robotic systems is carried out in laboratory conditions using the classic “chessboard” method (see the illustrative sketch after the abstract), and only a few alternative approaches exist, both in Russia and abroad, which are still in their infancy. At the same time, research into camera calibration methods continues and new alternatives are emerging. One of the new directions is the use of fiducial marker systems as a reference object. A variety of parameters, such as the size of the calibration template, the size of the calibration data set, the distribution of distances from the camera to objects in the scene, etc., creates a vast space for the experimental search for optimal camera calibration parameters. This paper presents a study of automatic camera calibration using fiducial marker systems (FMS) located on the surface of the robot. Based on the results of virtual experiments with FMS in the Gazebo simulation environment of the Robot Operating System (ROS), two types of FMS were selected that are optimal relative to the other FMS types covered by our previous studies in terms of their resistance to systematic occlusion of the marker area and the effect of marker size on the quality of its recognition. The selected FMS were tested using the on-board camera of the Russian mobile robot Servosila Engineer in indoor settings to assess the correlation between results in virtual and real environments.
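
As a brief illustration of the calibration pipeline discussed in the abstract, the sketch below shows the classic “chessboard” procedure with OpenCV: known three-dimensional board corner coordinates are matched against their detected pixel coordinates, and cv2.calibrateCamera estimates the intrinsic matrix and distortion coefficients from these 3D-2D correspondences. The board dimensions, square size, and image folder name are illustrative assumptions rather than parameters taken from the paper; a fiducial-marker-based calibration replaces the chessboard detection step with marker corner detection (e.g., ArUco or AprilTag), while the final estimation step stays the same. This is a minimal sketch, not the authors' implementation.

# Minimal chessboard camera calibration sketch with OpenCV (illustrative only;
# not the pipeline used in the paper). Assumes a set of images of a board with
# 9x6 inner corners and 25 mm squares captured by the camera to be calibrated.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)      # inner corners per row and column (assumed board)
SQUARE_SIZE = 0.025   # square side in meters (assumed)

# 3D coordinates of the board corners in the board's own coordinate frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
image_size = None

for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    # Refine detected corner locations to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

assert obj_points, "no chessboard detections found in calib_images/"

# Estimate the intrinsic matrix and distortion coefficients from the
# accumulated 3D-2D correspondences.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)
print("Distortion coefficients:", dist_coeffs.ravel())

The RMS reprojection error reported by cv2.calibrateCamera is the usual quality metric for such a calibration; with marker-based templates the same metric allows comparing results across marker types and conditions.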

References

1. Alishev N., Lavrenov R., Hsia K.H., Su K.L., and Magid E. Network failure detection and
autonomous return algorithms for a crawler mobile robot navigation, 2018 11th International
Conference on Developments in eSystems Engineering (DeSE). IEEE, 2018, pp. 169-174.
2. Safin R., Lavrenov R., and Martínez-García E.A. Evaluation of visual SLAM methods in USAR
applications using ROS/Gazebo simulation, Proceedings of the 15th International Conference on
Electromechanics and Robotics "Zavalishin's Readings". Springer, Singapore, 2021, pp. 371-382.
3. Tsai R. A versatile camera calibration technique for high-accuracy 3D machine vision metrology
using off-the-shelf TV cameras and lenses, IEEE Journal on Robotics and Automation,
1987, Vol. 3, pp. 323-344.
4. Zhang Z. A flexible new technique for camera calibration, IEEE Transactions on Pattern
Analysis and Machine Intelligence, 2000, Vol. 22, pp. 1330-1334.
5. Fiala M. Comparing ARTag and ARToolkit Plus fiducial marker systems, IEEE International
Workshop on Haptic Audio Visual Environments and their Applications. IEEE, 2005, p. 6.
6. Rojtberg P., and Kuijper A. Efficient pose selection for interactive camera calibration, 2018
IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2018,
pp. 31-36.
7. Romero-Ramirez F.J., Muñoz-Salinas R., Medina-Carnicer R. Speeded up detection of squared
fiducial markers, Image and Vision Computing, 2018, Vol. 76, pp. 38-47.
8. Atcheson B., Heide F., Heidrich W. CALTag: High Precision Fiducial Markers for Camera
Calibration, VMV. Citeseer, 2010, Vol. 10, pp. 41-48.
9. Hu D., DeTone D., and Malisiewicz T. Deep ChArUco: Dark ChArUco marker pose estimation,
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019,
pp. 8436-8444.
10. Mantha B.R.K., and Garcia de Soto B. Designing a reliable fiducial marker network for autonomous
indoor robot navigation, Proceedings of the 36th International Symposium on Automation
and Robotics in Construction (ISARC), 2019, pp. 74-81.
11. Annusewicz A., and Zwierzchowski J. Marker detection algorithm for the navigation of a mobile
robot, 2020 27th International Conference on Mixed Design of Integrated Circuits and
System (MIXDES). IEEE, 2020, pp. 223-226.
12. Liu X., Madhusudanan H., Chen W., Li D., Ge J., Ru C., and Sun Y. Fast eye-in-hand 3-d
scanner-robot calibration for low stitching errors, IEEE Transactions on Industrial Electronics.
IEEE, 2020, Vol. 68 (9), pp. 8422-8432.
13. Lee T.E., Tremblay J., To T., Cheng J., Mosier T., Kroemer O., and Birchfield S. Camera-to-robot
pose estimation from a single image, 2020 IEEE International Conference on Robotics
and Automation (ICRA). IEEE, 2020, pp. 9426-9432.
14. dos Santos Cesar D.B., Gaudig C., Fritsche M., dos Reis M.A., and Kirchner F. An evaluation
of artificial fiducial markers in underwater environments, OCEANS 2015-Genova. IEEE,
2015, pp. 1-6.
15. Westman E., and Kaess M. Underwater AprilTag SLAM and calibration for high precision
robot localization. Tech. rep. Carnegie Mellon University, 2018.
16. Korthals T., Wolf D., Rudolph D., Hesse M., and Rückert U. Fiducial Marker based Extrinsic
Camera Calibration for a Robot Benchmarking Platform, 2019 European Conference on Mobile
Robots (ECMR). IEEE, 2019, pp. 1-6.
17. Shabalina K., Sagitov A., Svinin M. and Magid E. Comparing Fiducial Markers Performance
for a Task of a Humanoid Robot Self-calibration of Manipulators: A Pilot Experimental Study,
International Conference on Interactive Collaborative Robotics. Springer, Cham, 2018,
pp. 249-258.
18. Shabalina K., Sagitov A., Sabirova L., Li H., Magid E. ARTag, AprilTag and CALTag fiducial
systems comparison in a presence of partial rotation: Manual and automated approaches, Lecture
Notes in Electrical Engineering, 2020, Vol. 495, pp. 536-558.
19. Zakiev A., Shabalina K., Magid E. Pilot Virtual Experiments on ArUco and AprilTag Systems
Comparison for Fiducial Marker Rotation Resistance, International Conference on Artificial
Life and Robotics. ICAROB, 2019, Vol. 495, pp. 132-135.
20. Wahid Z. and Nadir N. Improvement of one factor at a time through design of experiments,
World Applied Sciences Journal, 2013, Vol. 21 (1), pp. 56-61.
21. Moskvin I., Lavrenov R., Magid E., Svinin M. Modelling a Crawler Robot Using Wheels as
Pseudo-Tracks: Model Complexity vs Performance, IEEE 7th International Conference on Industrial
Engineering and Applications (ICIEA 2020), 2020, pp. 235-239.
22. Safin R., Lavrenov R., Saha S.K., and Magid E. Experiments on mobile robot stereo vision
system calibration under hardware imperfection, MATEC Web of Conferences. EDP Sciences,
2018, Vol. 161, p. 03020.
23. Magid E., and Sagitov A. Towards robot fall detection and management for Russian humanoid
AR-601, KES International Symposium on Agent and Multi-Agent Systems: Technologies and
Applications. Springer, Cham, 2017, pp. 200-209.
Published: 2022-08-09
Section: SECTION IV. ELECTRONICS, NANOTECHNOLOGIES AND INSTRUMENTATION