BUILDING A MAP OF REFERENCE SURFACES TO SOLVE THE PROBLEM OF PLANNING THE MOVEMENT OF A GROUP OF GROUND ROBOTS

  • B.S. Lapin, FSUE "VNIIA"
  • O.P. Goydin, FSUE "VNIIA"
  • S.A. Sobolnikov, FSUE "VNIIA"
  • I.L. Ermolov, Ishlinsky Institute for Problems in Mechanics of the Russian Academy of Sciences
Keywords: Mapping, vision system, motion planning, clustering, sensor fusion

Abstract

The purpose of the study is to form a geometric model of the environment containing information
about the parameters of the underlying surface for use in a system for planning the movements
of a group of robots in formation at high speed. The article examines the problem of constructing a map of support surfaces. An analysis of existing research on the topic of determining
the characteristics of supporting surfaces by mobile robots is presented. A classification of methods
for assessing the characteristics of a supporting surface into remote and contact ones is given.
Based on an analysis of the advantages and disadvantages of known remote and contact methods,
the work proposes a combined approach that exploits the advantages of both. The approach remotely
divides space into clusters whose external parameters of the underlying surface indicate potentially
identical internal properties, simultaneously determines the internal parameters of the underlying
surface by the contact method, and then combines the two. The surface parameters are continuously
updated during movement.
The approach uses a limited set of standard on-board equipment of a mobile robot and does not incur
large computational costs compared to machine learning methods. The remote determination of
the external parameters of the underlying surface is described; it is based on point cloud segmentation
algorithms that do not require preliminary training. The segmentation features are the coordinates
of the cloud points, the color of each point, and the height difference in the neighborhood of
each point. An algorithm for determining the internal characteristics of a
surface using the contact method is described. The friction coefficients between each wheel and
the current surface are considered as internal parameters. These coefficients make it possible to
determine the maximum accelerations for each robot in the group, which are necessary to implement
the motion planning system. The paper presents the results of experimental studies of remote
determination of the parameters of the underlying surface within the framework of the proposed
approach using data from the public KITTI dataset. The results of the study confirm the possibility
of forming a geometric model of the environment, segmented into areas with different characteristics
of the supporting surface, without training and using only the standard on-board hardware of the robot.
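
As a minimal sketch of the remote stage described above, the code below clusters a colored point cloud by point coordinates, per-point color, and the local height difference using DBSCAN (a density-based clustering algorithm cited in the references that requires no preliminary training). The feature weights, neighborhood size, and function names are illustrative assumptions, not the authors' implementation.

# Illustrative sketch: training-free segmentation of a colored point cloud
# into candidate surface clusters. Feature weights and DBSCAN parameters
# are assumptions for illustration only.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

def local_height_difference(points_xyz, k=8):
    """Max-min height (z) among the k nearest neighbours of each point."""
    nn = NearestNeighbors(n_neighbors=k).fit(points_xyz)
    _, idx = nn.kneighbors(points_xyz)
    z = points_xyz[:, 2]
    neighbour_z = z[idx]                      # shape (N, k)
    return neighbour_z.max(axis=1) - neighbour_z.min(axis=1)

def segment_surface(points_xyz, colors_rgb,
                    w_xyz=1.0, w_rgb=2.0, w_dz=5.0,
                    eps=0.5, min_samples=10):
    """Cluster points by coordinates, color and local height difference."""
    dz = local_height_difference(points_xyz)
    features = np.hstack([
        w_xyz * points_xyz,
        w_rgb * colors_rgb,                   # RGB scaled to [0, 1]
        w_dz * dz[:, None],
    ])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(features)
    return labels                             # -1 marks noise points

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xyz = rng.uniform(0, 10, size=(500, 3))
    rgb = rng.uniform(0, 1, size=(500, 3))
    print(np.unique(segment_surface(xyz, rgb)))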

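The contact stage treats the wheel-surface friction coefficients as the internal parameters from which the maximum accelerations of each robot are derived. A minimal sketch under the common friction-limited-traction assumption (acceleration bounded by mu*g per wheel, with a formation bounded by its weakest robot) is given below; the variable names and this specific bound are illustrative assumptions, not the authors' formulation.

# Illustrative sketch: conservative acceleration bound for a robot group
# from per-wheel friction coefficients (friction-limited traction assumption).
G = 9.81  # gravitational acceleration, m/s^2

def max_acceleration(per_robot_wheel_mu):
    """Return (group bound, per-robot bounds).

    per_robot_wheel_mu: list of lists, the friction coefficient estimated
    between each wheel and the current surface, one inner list per robot.
    """
    per_robot = [G * min(wheel_mu) for wheel_mu in per_robot_wheel_mu]
    # A formation moving together must respect the weakest robot's limit.
    return min(per_robot), per_robot

if __name__ == "__main__":
    mu = [[0.62, 0.60, 0.58, 0.61],   # robot 1, four wheels
          [0.35, 0.33, 0.36, 0.34]]   # robot 2 on a slippery patch
    group_a, per_robot_a = max_acceleration(mu)
    print(f"group limit: {group_a:.2f} m/s^2, per robot: {per_robot_a}")
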
References

1. Dorigo Marco. Swarmanoid: A Novel Concept for the Study of Heterogeneous Robotic
Swarms, IEEE Robotics and Automation Magazine, 2013, Vol. 20, No. 4, pp. 60-71.
2. Ermolov I.L., Lapin B.S. Raspredelennoe planirovanie dvizheniya dlya gruppy sovmestno
perenosyashchikh gruz robotov s uchetom svoystv opornykh poverkhnostey [Distributed motion
planning for a group of robots jointly carrying a load, taking into account the properties of
supporting surfaces], Mekhatronika, avtomatizatsiya, upravlenie [Mechatronics, automation,
control], 2023, Vol. 24, No. 6, pp. 327-334.
3. Cadena C., Carlone L., Carrillo H., Latif Y., Scaramuzza D., Neira J., Reid I., Leonard J.J.
Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-
Perception Age, IEEE Transactions on Robotics, 2016, Vol. 32, No. 6, pp. 1309-1332.
4. Vazaev A.V., Noskov V.P., Rubtsov I.V., Tsarichenko S.G. Raspoznavanie ob"ektov i tipov
opornoy poverkhnosti po dannym kompleksirovannoy sistemy tekhnicheskogo zreniya
[Recognition of objects and types of supporting surfaces according to data from an integrated
technical vision system], Izvestiya YuFU. Tekhnicheskie nauki [Izvestiya SFedU. Engineering
Sciences], 2016, No. 2, pp. 127-139.
5. Robert E.K., Gary W. Terrain Understanding for Robot Navigation, Proceedings of the 2007
IEEE/RSJ International Conference on Intelligent Robots and Systems, 2007, pp. 895-900.
6. Angelova A., Matthies L., Helmick D., Perona P. Fast Terrain Classification Using Variable-
Length Representation for Autonomous Navigation, IEEE Conference on Computer Vision
and Pattern Recognition, 2007, pp. 1-8.
7. Cristian D., Nicolas V., Martial H. Classifier Fusion for Outdoor Obstacle Detection, Proceedings
of International Conference on Robotics and Automation, 2004, Vol. 1, pp. 665-671.
8. Jean-Francois L., Nicolas V., Daniel H., Martial H. Natural terrain classification using three-dimensional
ladar data for ground robot mobility, Journal of Field Robotics, November 2006,
Vol. 23, No. 10, pp. 839-861.
9. Wolf D.F., Sukhatme G.S., Fox D., Burgard W. Autonomous Terrain Mapping and Classification
Using Hidden Markov Models, Proceedings of the IEEE International Conference on Robotics
and Automation, 2005, pp. 2026-2031.
10. RobotEye REHS25 Hyperspectral Ultrafast Broadband Spectral Scanner: Product Datasheet,
OCULAR Robotics, 2015. Available at: http://www.ocularrobotics.com/wp/wp-content/uploads/
2015/12/RobotEye-REHS25-Hyperspectral-Datasheet.pdf (accessed 04 June 2020).
11. Vishwanath S., Aswin C.S. Programmable Spectrometry -- Per-pixel Classification of Materials
using Learned Spectral Filters, arXiv preprint, 2019.
12. Lloyd W., Rishi R., Arman M., Richard J.M. Hyperspectral CNN Classification with Limited
Training Samples, BMVC 2016.
13. David B., Scott T., Anthony S., Peter R. Vegetation Detection for Mobile Robot Navigation,
Tech. Report, CMU-RI-TR-04-12, Robotics Institute, Carnegie Mellon University, 2004.
14. Wang S. Road Terrain Classification Technology for Autonomous Vehicle, 2019, 107 p.
15. Bekker M.G. Theory of Land Locomotion. The University of Michigan Press, 1962.
16. Mashkov K.Yu., Naumov V.N., Rubtsov V.I. Sistema avtomaticheskogo opredeleniya
kharakteristik grunta pri dinamicheskom vzaimodeystvii dvizhitelya MRK s opornoy
poverkhnost'yu [System for automatic determination of soil characteristics during dynamic interaction
of the MRC propulsion device with the supporting surface], Mater. Vos'moy
Vserossiyskoy nauchno-prakticheskoy konferentsii «Perspektivnye sistemy i zadachi
upravleniya» [Proceedings of the Eighth All-Russian Scientific and Practical Conference “Advanced
Systems and Control Problems”]. Taganrog, 2013, pp. 87-95.
17. Mashkov K.Yu., Rubtsov V.I., Shtifanov N.V. Avtomaticheskaya sistema obespecheniya
opornoy prokhodimosti mobil'nogo robota [Automatic system for ensuring the support trafficability
of a mobile robot], Vestnik MGTU im. N.E. Baumana. Ser. Mashinostroenie. Vyp.
Spetsial'naya robototekhnika [Bulletin of MSTU im. N.E. Bauman. Ser. Mechanical engineering.
Vol. Special robotics], 2012, pp. 95-106.
18. Ovchinnikov A.M., Platonov A.K. Tekhnicheskoe zrenie v sistemakh upravleniya mobil'nymi
ob"ektami-2010 [Technical vision in mobile object control systems-2010], Tr. nauchno-tekhnicheskoy
konferentsii-seminara [Proceedings of the scientific and technical conference-seminar],
Issue 4, ed. by R.R. Nazirova. Moscow: KDU, 2011, pp. 216-229.
19. Taheri S., Sandu C., Taheri S., Pinto E., Gorsich D. A technical survey on terramechanics
models for tire-terrain interaction used in modeling and simulation of wheeled vehicles, Journal
of Terramechanics, 2015, Vol. 57, pp. 1-22.
20. Upadhyaya S.K., Wulfsohn D., Mehlschau J. An instrumented device to obtain traction related
parameters, Journal of Terramechanics, 1993, Vol. 30, pp. 1-20.
21. Cao P., Hall E., Zhang E. Soil Sampling Sensor System on a Mobile Robot, in Proceedings of
SPIE Intelligent Robots and Computer Vision XXI: Algorithms, Techniques, and Active Vision,
2003, Vol. 5267.
22. Väljaots E., Lehiste H., Kiik M., Leemet T. Soil sampling automation using mobile robotic
platform, Agronomy Research. Estonian University of Life Sciences. Institute of Technology,
2018, Vol. 16, No. 3, pp. 917-922.
23. Wills B. The design and development of a hydraulic bevameter, Journal of Terramechanics,
1964, Vol. 1, pp. 91-97.
24. Nam J.S., Park Y.J., Kim K.U. Determination of rating cone index using wheel sinkage and
slip, Journal of Terramechanics, 2010, Vol. 47, pp. 243-248.
25. Iagnemma K., Dubowsky S. Terrain estimation for high-speed rough-terrain autonomous vehicle
navigation, In Proc. SPIE Conference on Unmanned Ground Vehicle Technology IV, 2002.
26. Marius M., David G.L. Fast Approximate Nearest Neighbors with Automatic Algorithm Configuration,
International Conference on Computer Vision Theory and Applications
(VISAPP'09), 2009.
27. Martin A.F., Robert C.B. Random Sample Consensus: A Paradigm for Model Fitting with
Applications to Image Analysis and Automated Cartography, Communications of the ACM, 1981,
Vol. 24, No. 6, pp. 381-395.
28. Shepel' I.O. Postroenie modeli prokhodimosti okruzhayushchey sredy po oblaku tochek
stereokamery s ispol'zovaniem ierarkhicheskoy karty vysot [Construction of an environment
traversability model from a stereo camera point cloud using a hierarchical height map],
Inzhenernyy vestnik Dona [Engineering Bulletin of the Don], 2018, No. 1 (48), pp. 94-107.
29. Neuhaus F., Dillenberger D., Pellenz J., Paulus D. Terrain Drivability Analysis in 3D Laser
Range Data for Autonomous Robot Navigation in Unstructured Environments, Proceedings of
12th IEEE International Conference on Emerging Technologies and Factory Automation
(ETFA), Sep. 2009, pp. 4-9.
30. Fleischmann P., Berns K. A Stereo Vision Based Obstacle Detection System for Agricultural
Applications, 2016, pp. 217-231.
31. Li N., Ho C.P., Xue J., Lim L.W., Chen G., Fu Y.H., Lee L.Y.T. A Progress Review on Solid-
State LiDAR and Nanophotonics-Based LiDAR Sensors, Laser Photonics Rev., 2022, 16,
2100511. Available at: https://doi.org/10.1002/lpor.202100511.
32. Martin E., Hans-Peter K., Jörg S., Xiaowei X. A density-based algorithm for discovering clusters
in large spatial databases with noise, Proceedings of the Second International Conference
on Knowledge Discovery and Data Mining (KDD-96), ed. by Evangelos Simoudis, Jiawei Han,
Usama M. Fayyad. AAAI Press, 1996, pp. 226-231.
33. Lapin B.S., Ermolov I.L., Sobolnikov S.A. The simply integrated approach for surface parameters
detection by UGV, Extreme Robotics, 2019, Vol. 1, No. 1, pp. 137-144.
34. Forrest R.M., Neal S., Alonzo K. Continuous Vehicle Slip Model Identification on Changing
Terrains, Proceedings of RSS 2012 Workshop on Long-term Operation of Autonomous Robotic
Systems in Changing Environments, July 2012.
35. Ojeda L., Cruz D., Reina G., Borenstein J. Current-Based Slippage Detection and Odometry
Correction for Mobile Robots and Planetary Rovers, IEEE Transactions on Robotics, April
2006, Vol. 22, No. 2, pp. 366-378.
36. Andreas G., Philip L., Christoph S., Raquel U. Vision meets Robotics: The KITTI Dataset,
International Journal of Robotics Research (IJRR), 2013.
Published: 2024-04-15
Section: SECTION I. PROSPECTS FOR THE APPLICATION OF ROBOTIC COMPLEXES