Developing autonomous systems for the maritime industry requires substantial research. A broad range of new and existing technologies needs to be integrated into one system, among them sensor systems, laser scanners, object recognition, ship control systems, manoeuvre and trajectory planning, collision avoidance, localization, SLAM, mapping and HMI.
We are experienced autonomous vehicle developers and want to put the same technologies to work on the water. But how do these systems perform aboard a ship, and how does the maritime environment influence sensor data?
Project Sensing focuses on these questions. As part of it, we are developing a prototype of an on-board sensor and data-acquisition system. With it, we can test automotive-grade sensors and object-recognition software, and gain insight into the alterations and algorithm development required for later use in autonomous systems.
This project is supported by Samenwerkingsverband Noord Nederland (SNN) and the European Regional Development Fund (Europees Fonds voor Regionale Ontwikkeling, EFRO).
This page shows the various sensor setups we use on a daily basis, deployed on a wide variety of ships and in a wide variety of environments.
The video shows raw data from video and lidar sensors mounted on a sailing vessel. The lidar sensors detect the shore, ships and obstacles in the water surprisingly well.
Machine learning: basic real-time video segmentation using TensorFlow shows good results.
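As an illustration of what per-frame video segmentation in TensorFlow involves, here is a minimal sketch. The model, class list (water, vessel, background) and input size are assumptions for the example, not our actual network; a real pipeline would use a pretrained backbone and camera frames instead of random data.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 3  # illustrative classes: water, vessel, background

def build_segmenter(input_shape=(128, 128, 3)):
    # Tiny encoder-decoder: downsample twice, upsample twice,
    # then predict a class probability for every pixel.
    inputs = tf.keras.Input(shape=input_shape)
    x = layers.Conv2D(16, 3, strides=2, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu")(x)
    outputs = layers.Conv2D(NUM_CLASSES, 1, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)

model = build_segmenter()

# One dummy frame standing in for a captured video frame.
frame = np.random.rand(1, 128, 128, 3).astype("float32")
probs = model.predict(frame, verbose=0)  # per-pixel class probabilities
mask = np.argmax(probs, axis=-1)         # per-pixel class labels
print(mask.shape)  # one label map per input frame
```

In a real-time setting this inference step runs on each incoming camera frame, and the resulting label map feeds into object detection and situational-awareness modules.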
Each sensor technology has its pros and cons. Since video alone is not sufficient for detecting all objects (especially at night) and lidar is challenging in fog, we also use infrared cameras in various maritime situations and environments.
To gain an overview of the current situation, we use 360-degree cameras. With live remote access, we can build situational awareness around the unmanned ship from a distant location.