In ocean research, interpreting and tagging the images generated by echo sounders in a fish survey is a tedious, time-consuming task. Now Norwegian marine scientists are turning it over to artificial intelligence (AI).
During surveys, scientists follow set routes and “count” fish using echo sounders. The echo sounders emit sound waves into the water and listen for their echoes.
Each species of fish has a unique echo signature, or frequency response, which enables scientists to determine which fish they are seeing. This, in turn, allows them to estimate the quantity of fish in the ocean and provide fisheries advice on how much can be harvested sustainably.
However, interpreting and tagging the images produced by the echo sounders is monotonous and time-consuming. Researchers at the Institute of Marine Research (IMR) and the Norwegian Computing Center have therefore tried passing the job on to an AI. Their results have been published in a new article.
The researchers at the IMR and Norwegian Computing Center trained the AI for the IMR’s annual survey of sandeels (Ammodytes marinus, also known as sand lances) in the North Sea.
The AI works by processing the echo sounder data in several stages, converting it into shapes and colours. With each stage, this process gives it more abstract information about what it sees in the image.
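This staged processing is the idea behind a convolutional neural network, the technique used in the published study. The toy sketch below is illustrative only, not the authors' trained network: a single hand-made filter slides over a synthetic "echogram" and turns raw intensities into a more abstract feature map, exactly the kind of step a real deep network repeats many times with learned filters.

```python
import numpy as np

def conv2d(echogram, kernel):
    """Valid 2-D convolution of one filter over a single-channel echogram."""
    kh, kw = kernel.shape
    h, w = echogram.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(echogram[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Keep positive responses, zero out the rest (a standard nonlinearity)."""
    return np.maximum(x, 0.0)

# Synthetic 8x8 "echogram" with a bright vertical streak (a fish school).
echogram = np.zeros((8, 8))
echogram[:, 3] = 1.0

# A vertical-edge filter: it responds strongly where the streak is.
edge_filter = np.array([[-1.0, 2.0, -1.0]] * 3)
features = relu(conv2d(echogram, edge_filter))  # lights up along column 2
```

A trained network stacks many such filter-plus-nonlinearity stages, which is why the information it extracts becomes progressively more abstract.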
“We have trained the AI using echo sounder images from five of the 11 years that we have been doing our annual sandeel survey. Our human colleague Ronald Pedersen had previously identified the fish and tagged all the images,” explains marine scientist Nils Olav Handegard of the department of marine ecosystem acoustics at the IMR.
“We then let the AI tag the remaining six years of echo sounder images as a ‘graduation examination’.”
On a scale of 0-10, the AI scored 8.7 for its ability to distinguish between the categories sandeel, empty water, and fish other than sandeel. This score was based on how closely the AI matched Ronald’s interpretation, which was considered the right answer.
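The score can be read as a measure of agreement with the human tags. A minimal sketch, assuming a simple per-region match count scaled to 0-10 (the published study reports a more formal classification metric; the labels here are made up):

```python
CATEGORIES = ("sandeel", "other_fish", "empty_water")

def agreement_score(human_labels, ai_labels):
    """Fraction of regions where the AI matches the human, scaled to 0-10."""
    assert len(human_labels) == len(ai_labels) > 0
    matches = sum(h == a for h, a in zip(human_labels, ai_labels))
    return 10.0 * matches / len(human_labels)

# Illustrative labels: the human tagger's answers vs. the AI's answers.
human = ["sandeel", "empty_water", "sandeel", "other_fish", "empty_water"]
ai    = ["sandeel", "empty_water", "sandeel", "sandeel",    "empty_water"]
score = agreement_score(human, ai)  # 4 of 5 regions match -> 8.0
```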
“The data from the sandeel surveys provided a great starting point for this. They have been categorised carefully by a single person, who has been looking for a specific species,” explains Handegard.
“By training the artificial intelligence to do this task, we have in a sense coded the technician’s knowledge digitally,” says Handegard.
On 23 April 2020, the IMR marine scientists set out on another acoustic survey of sandeels in the North Sea. This year, the AI will analyse the echo sounder data in parallel with Ronald.
“The aim isn’t to replace all humans, but to develop an artificial intelligence that is almost as good and that can share the load, particularly as we scale up our monitoring of resources in the future using autonomous vessels.”
The IMR has also been testing autonomous sailing drones on the sandeel survey.
“This is undoubtedly part of the future of data collection. It is cheap, environmentally friendly and scalable,” Handegard says.
“The raw data are too big to send back via satellite. But if the vessels are equipped with artificial intelligence to interpret what they’re seeing, they can send their results back in real time,” he says.
Marine scientists will continue to improve the artificial sandeel researcher, and they are working on expanding its use to other species and surveys.
“We also hope to start using so-called unsupervised learning. This involves not defining the correct answers that the artificial intelligence has to match, but rather giving it loads of data that it has to make sense of by itself. Then it can become even better than us. But that is for the future,” concludes Handegard.
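As a rough illustration of the unsupervised idea (not the institute's planned method), a clustering algorithm can group echo measurements by similarity without being given any correct answers. Here a minimal k-means finds two groups in made-up one-dimensional echo strengths:

```python
import random

def kmeans_1d(values, k, iterations=20, seed=0):
    """Cluster 1-D values into k groups by iteratively refining centres."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iterations):
        # Assign each value to its nearest centre...
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # ...then move each centre to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two unlabeled groups of echo strengths; k-means finds their centres
# on its own, without human-provided answers.
echoes = [1.0, 1.2, 0.9, 5.1, 4.8, 5.3]
centers = kmeans_1d(echoes, k=2)  # roughly [1.03, 5.07]
```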
Brautaset, Olav, Anders Ueland Waldeland, Espen Johnsen, Ketil Malde, Line Eikvil, Arnt-Børre Salberg and Nils Olav Handegard. "Acoustic classification in multifrequency echosounder data using deep convolutional neural networks." ICES Journal of Marine Science (2020). URL: https://doi.org/10.1093/icesjms/fsz235
The original version of this article was published on the Institute of Marine Research website on 11 May 2020. Author: Erlend A. Lorentzen.