Feature-Based Stereo Vision Relative Positioning Strategy for Formation Control of Unmanned Aerial Vehicles
Yousef Yaghoobi1, Muhammad Rijaluddin Bahiki2, Syaril Azrad3

1Yousef Yaghoobi, Department of Aerospace Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia.
2Muhammad Rijaluddin Bahiki, Department of Aerospace Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia.
3Syaril Azrad*, Department of Aerospace Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia.

Manuscript received on November 14, 2019. | Revised Manuscript received on 25 November, 2019. | Manuscript published on December 10, 2019. | PP: 1613-1617 | Volume-9 Issue-2, December 2019. | Retrieval Number: B7345129219/2019©BEIESP | DOI: 10.35940/ijitee.B7345.129219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Birds flying in flocks rely heavily on vision to respond to the motion of their neighbours. Inspired by this, a novel approach to developing a Vision System as the primary sensor for relative positioning in a Leader-Follower flight formation is introduced in this paper. To run the system in real time on board unmanned aerial vehicles (UAVs) with a payload capacity of up to 1.5 kilograms, several computing platforms are reviewed and evaluated; the study shows that the NVIDIA Jetson TX1 is the platform best suited to this project. Several techniques and approaches for developing the algorithm are discussed as well. Based on the system requirements and the conducted study, the algorithm developed for this Vision System uses a Tracking and On-Line Machine Learning approach. Flight tests were performed to check the accuracy and reliability of the system, and the results indicate a minimum accuracy of 83% for the vision system against ground-truth data.
Keywords: Flight Formation, Unmanned Aerial Vehicle, Vision System, On-Line Machine Learning, Leader-Follower.
Scope of the Article: Machine Learning
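For illustration only, and not the authors' implementation: the Python sketch below shows how a follower's relative position to a tracked leader could be triangulated from a matched feature in a rectified stereo pair under a pinhole camera model. The function name and all calibration values (focal lengths, principal point, baseline) are placeholder assumptions.

import numpy as np

def relative_position_from_stereo(u_left, u_right, v, fx, fy, cx, cy, baseline):
    # Triangulate the leader's relative position from a feature matched
    # across a rectified stereo pair (pinhole model; units in pixels/metres).
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("Non-positive disparity: feature at or beyond infinity")
    z = fx * baseline / disparity   # forward distance to the leader
    x = (u_left - cx) * z / fx      # lateral offset
    y = (v - cy) * z / fy           # vertical offset
    return np.array([x, y, z])

# Example with placeholder calibration values:
# disparity = 42 px, so depth = 700 * 0.12 / 42 = 2.0 m
pos = relative_position_from_stereo(
    u_left=652.0, u_right=610.0, v=400.0,
    fx=700.0, fy=700.0, cx=640.0, cy=360.0, baseline=0.12)
print(pos)  # relative [x, y, z] of the tracked leader in metres

In a Leader-Follower scenario such as the one described in the abstract, an estimate of this kind would typically be fed to the follower's formation controller as the relative-position measurement; the feature coordinates themselves would come from the tracking and on-line learning stage.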