This new AI can detect the calls of animals swimming in an ocean of noise

The ocean is swimming in sound, and a new artificial intelligence tool could help scientists sift through all that noise to track and study marine mammals.

The tool is called DeepSqueak, not because it measures dolphin calls in the ocean underworld, but because it is based on a deep learning algorithm that was first used to categorize the different ultrasonic squeals of mice.

Now, researchers are applying the technology to vast datasets of marine bioacoustics.

Given that much of the ocean is out of our physical reach, underwater sound could help us understand where marine mammals swim, their density and abundance, and how they interact with one another.

Already, recordings of whale songs have helped identify an unknown population of blue whales in the Indian Ocean and a never-before-heard species of beaked whale.

But listening to recordings of the ocean, and trying to pick out animal noise from hours of waves, wind, and boat engines is slow and painstaking work.

That’s where DeepSqueak comes in. The technology was recently presented at the 182nd Meeting of the Acoustical Society of America, and is designed to classify underwater acoustic signals faster and more accurately than any other method to date.

DeepSqueak combs through ocean sound data and creates what look like heat maps, based on where certain acoustic signals are heard and at what frequency.

Those signals are then sourced to a specific animal. 
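The "heat map" described above is essentially a spectrogram: the recording is cut into short windows, and the energy at each frequency is computed for each window, so signals show up as bright bands at their characteristic frequencies. This is a minimal, hypothetical sketch of that idea using a naive short-time DFT (DeepSqueak's actual pipeline is a deep neural network, not shown here):

```python
import math
import cmath

def spectrogram(samples, window_size=256, hop=128):
    """Naive short-time DFT: returns one magnitude spectrum per window.
    Stacked together, these spectra form the time-frequency 'heat map'."""
    frames = []
    for start in range(0, len(samples) - window_size + 1, hop):
        window = samples[start:start + window_size]
        # magnitude of each of the first window_size//2 frequency bins
        spectrum = [
            abs(sum(x * cmath.exp(-2j * math.pi * k * n / window_size)
                    for n, x in enumerate(window)))
            for k in range(window_size // 2)
        ]
        frames.append(spectrum)
    return frames

# Synthetic example: a 1 kHz tone sampled at 8 kHz stands in for a whale call.
rate = 8000
tone = [math.sin(2 * math.pi * 1000 * n / rate) for n in range(1024)]
heatmap = spectrogram(tone)

# The brightest frequency bin in the first window should sit at ~1 kHz.
peak_bin = max(range(len(heatmap[0])), key=lambda k: heatmap[0][k])
peak_hz = peak_bin * rate / 256
print(peak_hz)  # → 1000.0
```

A real detector would then feed maps like this to a classifier that matches the bright patterns to known call types; in practice a fast FFT library would replace the O(N²) loop above.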

“Although we used DeepSqueak to detect underwater sounds, this user-friendly, open source tool would be useful for a variety of terrestrial species,” says Elizabeth Ferguson, the CEO and founder of Ocean Science Analytics, who presented the research.

“The capabilities of call detection extend to frequencies below the ultrasonic sounds it was originally intended for. Due to this and the capability of DeepSqueak to detect variable call types, development of neural networks is possible for many species of interest.”

Marine acoustic noise has never been easier to collect, but as hours of ocean soundscapes pile up in databases around the world, scientists need to figure out how to use that information most effectively.

DeepSqueak could be a possible alternative to the human ear, allowing researchers around the world to classify and study sounds with incredible efficiency.

The fully automated tool has consistently been able to detect the calls of specific marine mammals, like humpback whales, delphinids and fin whales, during tests.

It can also pick out these animals’ calls amongst background noise, which is important given that anthropogenic sound is turning up the volume in the ocean.

DeepSqueak was first introduced in 2019 as a way to analyze the rich repertoire of ultrasonic vocalizations employed by rats and mice.

Sifting through a series of squeaky recordings, the tool was able to identify a wide range of syllabic sounds, and these short mouse calls appear to be arranged in different ways depending on the context in which they are used.

The results could help scientists study how certain syllables and syntax may communicate unique information in the mouse world. For instance, the sounds a mouse makes in some situations could be used to convey fear, anxiety, or depression.

By reliably linking contextual information to certain vocal signals, DeepSqueak could allow scientists to better study the nuances between animal vocalizations and behavior – even in remote ocean underworlds where some of the planet’s most elusive animals swim.