Comparative Review of Navigation Systems for Indoor Autonomous Unmanned Aerial Vehicles
https://doi.org/10.32603/1993-8985-2024-27-4-6-18
Abstract
Introduction. In recent years, unmanned aerial vehicles (UAVs) have been advancing rapidly. Positioning accuracy is of particular importance in all areas of UAV application. In outdoor environments, satellite navigation systems (such as GPS) are the standard solution. Indoors, however, attenuation of the GPS signal poses a serious obstacle to determining the UAV location. A number of studies have developed indoor positioning technologies that meet the criteria of compactness and light weight, making them suitable for small aircraft; these include optical flow, inertial measurement systems, ultrasound, and others. However, comparative studies reviewing indoor positioning technologies for autonomous UAVs are lacking: existing reviews fail to provide a comprehensive assessment of such technologies and their operational principles against the main criteria. This paper therefore aims to review modern indoor positioning technologies and their operational principles, evaluating them against such criteria as accuracy, operating range, and cost. An assessment of promising machine vision-based technologies is also carried out.
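Among the vision-based approaches named above, optical flow is the most established. As a minimal illustration (a sketch of the classical Lucas-Kanade formulation, not the implementation assessed in this review), motion between two frames can be estimated by solving the brightness-constancy equations Ix·u + Iy·v = −It in the least-squares sense. The NumPy sketch below recovers a known one-pixel horizontal shift between two synthetic frames:

```python
import numpy as np

def lucas_kanade_global(prev, curr):
    """Estimate a single (dx, dy) translation between two frames by solving
    the optical-flow equations Ix*u + Iy*v = -It in the least-squares sense."""
    # Spatial gradients (np.gradient returns the axis-0, i.e. y, gradient first)
    # and the temporal brightness difference.
    Iy, Ix = np.gradient(prev)
    It = curr - prev
    # One brightness-constancy equation per pixel, stacked into A @ (u, v) = b.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic test scene: a smooth pattern shifted one pixel to the right.
x, y = np.meshgrid(np.arange(64), np.arange(64))
frame0 = np.sin(0.2 * x) + np.cos(0.15 * y)
frame1 = np.sin(0.2 * (x - 1)) + np.cos(0.15 * y)  # same scene moved +1 px in x

u, v = lucas_kanade_global(frame0, frame1)  # should recover u close to 1, v close to 0
print(f"estimated flow: u = {u:.3f}, v = {v:.3f}")
```

A real onboard system would apply this per image patch (and typically with image pyramids) rather than globally, and would integrate the resulting velocities over time, which is why the abstract groups optical flow with the drift-prone self-contained methods.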
Aim. To classify modern indoor navigation technologies for UAVs and to assess them according to various criteria.
Materials and methods. Current UAV indoor positioning technologies were classified by the type of signal used for communication and by their capability to process information without external signals. The technologies were assessed against the following criteria: accuracy, operating range, and cost, as well as their advantages and disadvantages.
Results. А classification and evaluation table of UAV indoor positioning technologies is proposed; a review of the current developments in the field is given.
Conclusion. A review of UAV indoor positioning technologies has been carried out. In addition, the prospects of machine vision-based technologies are outlined.
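To make the proposed classification concrete, the sketch below organizes representative technologies along the paper's own axis (external-signal vs. self-contained) and the stated criteria. The accuracy, range, and cost entries are hypothetical ballpark figures commonly cited in the indoor-positioning literature, not the authors' evaluation table:

```python
# Illustrative (hypothetical) classification/evaluation table. Groups follow
# the review's criterion: technologies needing external infrastructure vs.
# those processing information onboard without external signals. The figures
# are typical literature ballparks, NOT the paper's results.
TECHNOLOGIES = {
    "external-signal": [  # require transmitters/cameras installed in the room
        {"name": "Ultra-wideband (UWB)", "accuracy": "~0.1 m",  "range": "tens of m", "cost": "medium"},
        {"name": "Ultrasound",           "accuracy": "~0.01 m", "range": "~10 m",     "cost": "low"},
        {"name": "Wi-Fi RSSI",           "accuracy": "1-5 m",   "range": "building",  "cost": "low"},
        {"name": "Bluetooth RSSI",       "accuracy": "1-3 m",   "range": "~10 m",     "cost": "low"},
        {"name": "Motion capture",       "accuracy": "<1 mm",   "range": "one room",  "cost": "high"},
    ],
    "self-contained": [  # rely only on onboard sensing
        {"name": "Inertial (IMU)", "accuracy": "drifts with time", "range": "unlimited", "cost": "low"},
        {"name": "Optical flow",   "accuracy": "drift-prone",      "range": "onboard",   "cost": "low"},
        {"name": "LiDAR SLAM",     "accuracy": "0.01-0.1 m",       "range": "onboard",   "cost": "high"},
    ],
}

for group, rows in TECHNOLOGIES.items():
    print(f"== {group} ==")
    for r in rows:
        print(f"  {r['name']:<22} accuracy {r['accuracy']:<16} range {r['range']:<10} cost {r['cost']}")
```

Laid out this way, the trade-off the review highlights is visible at a glance: the most accurate options demand fixed infrastructure, while the self-contained (machine vision-based) options avoid it at the price of drift.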
About the Authors
A. M. Boronakhin
Russian Federation
Alexander M. Boronakhin – Dr Sci. (Eng.) (2013), Professor (2020), Professor of the Department of Laser Measuring and Navigation Systems, Dean of the Faculty of Information Measuring and Biotechnical Systems,
5 F, Professor Popov St., St Petersburg 197022.
Quoc Khanh Nguyen
Viet Nam
Nguyen Quoc Khanh – Master of Science in Instrumentation Engineering (2020), PhD student,
236, Hoang Quoc Viet, Co Nhue, Bac Tu Liem, Hanoi.
Trong Yen Nguyen
Viet Nam
Nguyen Trong Yen – PhD in Electronics, Photonics, Instrumentation and Communications (2023), member of the On-Board Control System Department,
18, Hoang Quoc Viet, Cau Giay, Hanoi.
For citations:
Boronakhin A.M., Nguyen Q.Kh., Nguyen T.Ye. Comparative Review of Navigation Systems for Indoor Autonomous Unmanned Aerial Vehicles. Journal of the Russian Universities. Radioelectronics. 2024;27(4):6-18. (In Russ.) https://doi.org/10.32603/1993-8985-2024-27-4-6-18