Abstract
Autonomous underwater vehicles (AUVs) are typically programmed to follow routes based on predefined waypoints and depth profiles. However, in complex and unpredictable environments, such as coral reefs, offshore structures, or shipwrecks, AUVs can encounter unexpected obstacles that pose risks to both the vehicle and its surroundings. To navigate around and avoid unexpected obstacles, AUVs operating in such environments are often equipped with forward-looking sonars (FLS). However, standard FLS sensors are typically limited in resolution and provide only 2D bearing and range information, restricting their effectiveness for navigation in complex environments. Vision cameras, on the other hand, offer high-resolution data with bearing and elevation information, but a single-camera setup cannot reliably provide distance information. This study introduces a comprehensive framework for the fusion of forward-looking camera (FLC) and FLS data, using a projection of FLS data into the FLC frame and incorporating data from a trained self-supervised network.
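The core geometric step the abstract describes is projecting 2D sonar returns (range, bearing) into the camera image. Below is a minimal sketch of that projection, not the authors' implementation: it assumes a pinhole camera model, known sonar-to-camera extrinsics, and an externally supplied elevation angle per return (e.g., from a learned network). All function names, frame conventions, and parameter values are illustrative assumptions.

```python
import numpy as np

def fls_polar_to_cartesian(range_m: float, bearing_rad: float,
                           elevation_rad: float) -> np.ndarray:
    """Convert a sonar return (range, bearing, assumed elevation) to a 3D
    point in the sonar frame (x forward, y starboard, z down)."""
    x = range_m * np.cos(elevation_rad) * np.cos(bearing_rad)
    y = range_m * np.cos(elevation_rad) * np.sin(bearing_rad)
    z = range_m * np.sin(elevation_rad)
    return np.array([x, y, z])

def project_to_camera(p_sonar: np.ndarray, R_cam_sonar: np.ndarray,
                      t_cam_sonar: np.ndarray, K: np.ndarray):
    """Transform a sonar-frame point into the camera frame and project it
    with a pinhole model. Returns pixel (u, v), or None if the point lies
    behind the image plane."""
    p_cam = R_cam_sonar @ p_sonar + t_cam_sonar
    if p_cam[2] <= 0:  # behind the camera's optical axis
        return None
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# Example: a return at 5 m range, 10 degrees to starboard, with a
# (hypothetical) network-predicted elevation of -3 degrees. The intrinsics
# and axis-remapping extrinsics below are placeholder values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.array([[0.0, 1.0, 0.0],   # sonar y -> camera x
              [0.0, 0.0, 1.0],   # sonar z -> camera y
              [1.0, 0.0, 0.0]])  # sonar x -> camera z (optical axis)
t = np.zeros(3)
p = fls_polar_to_cartesian(5.0, np.deg2rad(10.0), np.deg2rad(-3.0))
print(project_to_camera(p, R, t, K))  # pixel coordinates of the return
```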
Original language | English |
---|---|
Title of host publication | OCEANS 2024 - Halifax, OCEANS 2024 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9798331540081 |
State | Published - 2024 |
Event | OCEANS 2024 - Halifax, OCEANS 2024 - Halifax, Canada. Duration: 23 Sep 2024 → 26 Sep 2024 |
Publication series
Name | Oceans Conference Record (IEEE) |
---|---|
ISSN (Print) | 0197-7385 |
Conference
Conference | OCEANS 2024 - Halifax, OCEANS 2024 |
---|---|
Country/Territory | Canada |
City | Halifax |
Period | 23/09/24 → 26/09/24 |
Bibliographical note
Publisher Copyright: © 2024 IEEE.
Keywords
- Autonomous underwater vehicles (AUVs)
- forward-looking sonar
- obstacle avoidance
- sensor fusion
- underwater image processing
- underwater navigation
ASJC Scopus subject areas
- Oceanography
- Ocean Engineering