Target detection using features for sonar images

Peter Tueller, Ryan Kastner, Roee Diamant

Research output: Contribution to journal › Article › peer-review

Abstract

Robust object detection in sonar images is an important task for underwater exploration, navigation and mapping. Current methods make assumptions about the shape, highlight or shadow of an object, which may be invalid for some environments or targets. We focus on feature extraction-based detection, which does not rely on information about the target's shape, with the aim of a robust detection framework for a variety of seabed structures and target types. The proposed framework first estimates the seabed type from the spatial distribution of features to determine the set of optimal parameters, and then obtains a set of features that are filtered according to intensity and distribution to yield a detection decision. The proposed method also provides a means to determine the seabed type, and a machine-learning-based methodology to choose the feature detectors' parameters to match the evaluated seabed type. We report the performance of a variety of feature detectors for a simulated environment and of one feature detector for real sonar images. Results show the importance of choosing the parameters of the feature extractors based on the current environmental conditions, and that the proposed method obtains a favourable tradeoff between detection and false alarm rates.
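The abstract describes a pipeline of feature extraction followed by intensity and spatial-distribution filtering. The sketch below is an illustrative stand-in only, not the paper's actual method: it uses a simple percentile-intensity feature extractor and a neighbour-count cluster test on a synthetic speckle image, with all thresholds (`intensity_pct`, `radius`, `min_cluster`) chosen arbitrarily for the example.

```python
import numpy as np

def detect_target(img, intensity_pct=99.0, radius=3.0, min_cluster=5):
    """Flag a target when high-intensity feature points cluster spatially.

    A simplified, hypothetical analogue of the described pipeline:
    1. extract candidate feature points (pixels above an intensity percentile),
    2. filter by intensity (the percentile threshold itself),
    3. declare a detection only if some point has a dense spatial neighbourhood.
    """
    thresh = np.percentile(img, intensity_pct)
    ys, xs = np.where(img >= thresh)
    pts = np.stack([ys, xs], axis=1).astype(float)
    if len(pts) == 0:
        return False, pts
    # pairwise distances between candidate points
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    # neighbours within `radius`, excluding the point itself
    neighbour_counts = (d <= radius).sum(axis=1) - 1
    return bool((neighbour_counts >= min_cluster).any()), pts

# synthetic "sonar" frame: Rayleigh speckle background plus a bright compact patch
rng = np.random.default_rng(0)
frame = rng.rayleigh(scale=0.3, size=(64, 64))
frame[30:34, 40:44] += 3.0  # simulated target highlight

hit, pts = detect_target(frame)
print(hit)  # the 4x4 patch forms a dense high-intensity cluster -> True
```

In the paper, the analogous parameters are not fixed but selected per seabed type; this sketch keeps them static for brevity.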

Original language: English
Pages (from-to): 1888-1896
Number of pages: 9
Journal: IET Radar, Sonar and Navigation
Volume: 14
Issue number: 12
State: Published - 1 Dec 2020

Bibliographical note

Funding Information:
This work was sponsored in part by the NATO Science for Peace and Security Programme under grant G5293.

Publisher Copyright:
© The Institution of Engineering and Technology 2020.

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

