Image-based locating and guiding for unmanned aerial vehicles using scale invariant feature transform, speeded-up robust features, and oriented FAST and rotated BRIEF algorithms


Bal B., Erdem T., Kul S., Sayar A.

CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, vol. 34, no. 9, 2022 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 34 Issue: 9
  • Publication Date: 2022
  • DOI: 10.1002/cpe.6766
  • Journal Name: CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Applied Science & Technology Source, Communication Abstracts, Compendex, Computer & Applied Sciences, INSPEC, Metadex, zbMATH, Civil Engineering Abstracts
  • Keywords: geographical positioning, geolocation, image processing, scale invariant feature transform, speeded-up robust features, oriented FAST and rotated BRIEF, UAV
  • Kocaeli University Affiliated: Yes

Abstract

This study presents a method for determining location information from a photograph taken by an Unmanned Aerial Vehicle (UAV). In today's defense industry, the goal is to design a reliable navigation system that can operate even when the Internet and GPS are unavailable. The proposed system assumes an image database in which each image carries metadata in JavaScript Object Notation (JSON) format defining the geolocation of that image. In real time, the UAV captures an image during flight and uses a matching algorithm to compare it with every photo in the database. The database image with the highest match rate and its metadata are returned, and the UAV extracts its own location information from that metadata. For image matching, the Scale Invariant Feature Transform, Speeded-Up Robust Features, and Oriented FAST and Rotated BRIEF algorithms are used. The proposed system is tested and evaluated on a real-world dataset of city images obtained from Flickr, and the approach is shown to achieve a success rate of 97.14%.
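As a rough illustration of the pipeline described in the abstract, the sketch below pairs each database image with a JSON file holding its geolocation and uses ORB (one of the three feature matchers the paper evaluates) via OpenCV to find the best-matching reference image. The directory layout, file names, JSON fields, and distance threshold are assumptions made for illustration only, not the authors' implementation; SIFT or SURF descriptors could be swapped in with a floating-point matcher (e.g. NORM_L2) instead.

```python
import json
from pathlib import Path

import cv2

# Assumed layout: image_database/img_0001.jpg has a sibling img_0001.json
# with hypothetical geolocation fields, e.g. {"lat": 40.7656, "lon": 29.9406}.
DATABASE_DIR = Path("image_database")

orb = cv2.ORB_create(nfeatures=1000)                         # ORB keypoints + binary descriptors
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)   # brute-force matcher for binary descriptors


def locate(query_path: str):
    """Return the geolocation metadata of the database image that best matches the UAV frame."""
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, query_desc = orb.detectAndCompute(query, None)

    best_score, best_meta = -1, None
    for img_path in sorted(DATABASE_DIR.glob("*.jpg")):
        ref = cv2.imread(str(img_path), cv2.IMREAD_GRAYSCALE)
        _, ref_desc = orb.detectAndCompute(ref, None)
        if query_desc is None or ref_desc is None:
            continue

        matches = matcher.match(query_desc, ref_desc)
        # Count low-distance matches as a simple stand-in for the paper's match rate.
        score = sum(1 for m in matches if m.distance < 40)
        if score > best_score:
            best_score = score
            best_meta = json.loads(img_path.with_suffix(".json").read_text())

    return best_meta


if __name__ == "__main__":
    # Hypothetical UAV frame; in practice this would come from the onboard camera.
    print(locate("uav_frame.jpg"))
```

In this sketch the highest-scoring reference image determines the returned geolocation, mirroring the abstract's description of returning the metadata of the best match; a production system would also need an indexing scheme to avoid comparing against every database image exhaustively.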