A SLAM Approach to Combine Optical and Sonar Information From an AUV

Research output: Contribution to journal › Article › peer-review

Abstract

This article proposes a simultaneous localization and mapping (SLAM) solution that combines inputs from sonar and optical images. The navigation problem is solved by matching objects detected by the AUV's camera with objects identified within a synthetic aperture sonar (SAS) image. In particular, upon detecting an object with the AUV's camera, e.g., an underwater structure or even a rock, we rank the similarity of this object to all objects found within the SAS image. Taking a SLAM approach, the decision for the best match, and thus the position of the AUV within the SAS image, is based not only on the currently detected object but also on the similarity rankings of previously detected objects, while accounting for the AUV's self-measured heading. The solution is handled by a tracking mechanism that treats the possible positions of the objects within the SAS image as problem states and models the state relations using a hidden Markov chain. Experimental results on real sonar and optical images, evaluated within a simulated environment, show high accuracy in matching the location of an AUV within an SAS image across multiple scenarios.
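The tracking mechanism described in the abstract decodes the most likely sequence of object matches over a hidden Markov chain, which is what the Viterbi algorithm (listed in the keywords) computes. The sketch below is a minimal, generic Viterbi decoder, not the authors' implementation: the emission scores standing in for camera-to-SAS similarity rankings and the transition matrix standing in for heading-consistency weights are illustrative assumptions.

```python
import numpy as np

def viterbi(log_emission, log_transition, log_prior):
    """Most likely state sequence of a hidden Markov chain.

    log_emission:   (T, N) log-likelihood of each observation under each state
                    (here: log-similarity of each camera detection to the N
                    candidate objects in the SAS image -- an assumption).
    log_transition: (N, N) log-probability of moving from one SAS object to
                    another; in the paper's setting this would encode the
                    AUV's self-measured heading (also an assumption).
    log_prior:      (N,) initial state log-probabilities.
    """
    T, N = log_emission.shape
    delta = log_prior + log_emission[0]          # best log-score ending in each state
    backptr = np.zeros((T, N), dtype=int)        # argmax predecessor per (time, state)
    for t in range(1, N if False else T):        # forward pass over observations
        scores = delta[:, None] + log_transition # (from-state, to-state) scores
        backptr[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emission[t]
    # Backtrack from the best final state to recover the full match sequence.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy usage: 2 candidate SAS objects, 3 camera detections whose similarity
# scores first favor object 0, then object 1.
emis = np.log(np.array([[0.9, 0.1], [0.1, 0.9], [0.1, 0.9]]))
trans = np.log(np.array([[0.7, 0.3], [0.3, 0.7]]))   # sticky transitions
prior = np.log(np.array([0.5, 0.5]))
print(viterbi(emis, trans, prior))  # → [0, 1, 1]
```

Decoding jointly over all detections, rather than taking the best match per frame, is what lets earlier similarity rankings correct an ambiguous current detection.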

Original language: English
Pages (from-to): 7714-7724
Number of pages: 11
Journal: IEEE Transactions on Mobile Computing
Volume: 23
Issue number: 7
DOIs
State: Published - 1 Jul 2024

Bibliographical note

Publisher Copyright:
© 2002-2012 IEEE.

Keywords

  • Autonomous underwater vehicle
  • Viterbi algorithm
  • feature extraction
  • multi-modal detection
  • optical-sonar
  • simultaneous localization and mapping
  • sonar detection

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering
