If you have ever dived on a coral reef, you will know the distinctive clicking and popping sounds made by the marine creatures that live there.

This noise is a unique feature in the acoustic landscape of coral reefs, and can help us monitor the health of these endangered marine habitats.

Scientists tend to associate healthy coral reefs with their visual splendor: the vibrant array of colors and shapes that inhabit these beautiful underwater ecosystems.

In a new study to be published in Ecological Indicators this July, scientists used machine learning to train an algorithm to recognize subtle acoustic differences between healthy, vibrant reefs and degraded sites, a contrast so faint that people cannot detect it.

Scientists tend to associate healthy corals with their visual splendor (Getty Images)

"Coral reefs face multiple threats, including climate change, so monitoring their health and the success of conservation projects is vital," says marine biologist Ben Williams of the University of Exeter in the UK in a university press release.

"One of the main difficulties is that visual and acoustic reef surveys usually rely on labour-intensive methods. Visual surveys are also limited by the fact that many reef organisms hide themselves during the day, or are active at night, while the complexity of reef sounds has made it difficult to determine reef health using individual recordings."

Coral reef song

Coral reefs have a complex acoustic landscape, and even experts have to perform careful analysis to measure reef health based on acoustic recordings.

The new research shows that AI can track the health of coral reefs by learning the 'coral song'.

In the new study, University of Exeter scientists trained a computer algorithm using multiple recordings of the sounds of healthy and deteriorating corals, allowing the machine to tell the difference.

The artificial intelligence method offers great opportunities to improve coral reef monitoring (Pixabay)

The recordings used in the study were taken as part of a project working to restore severely damaged coral reefs in Indonesia.

Compared to other labor-intensive and time-consuming processes of monitoring the health of coral reefs, the new tool could offer significant advantages, the team suggests.

Moreover, many reef organisms hide themselves or only emerge at night, further complicating any visual survey.

Co-author Timothy Lamont, a marine biologist at Lancaster University in the UK, says the AI method offers significant opportunities to improve coral reef monitoring. "This is a really exciting development," he says.

"Sound recorders and AI could be used around the world to monitor the health of coral reefs, and discover whether attempts to protect and restore them are working," he adds.

"Our results show that a computer can pick up patterns that cannot be detected by the human ear," marine biologist Ben Williams said.

"It can also tell us more quickly and accurately how the reef is doing," he added.

Scientists trained an algorithm to recognize the sounds of healthy and degraded coral reefs (Ecological Indicators)

To capture reef acoustics, Williams and fellow researchers made recordings at seven different locations in the Spermonde Archipelago, off the southwest coast of Sulawesi in Indonesia. A computer then analyzed a set of new recordings, successfully identifying the health of the reefs.

The team used this technique to track the progress of coral reef restoration projects.

To "automate" the process, the team trained a machine-learning algorithm to distinguish between different types of coral recordings, and subsequent tests showed that the AI tool could determine the health of coral reefs from audio recordings with 92% accuracy.
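The article does not describe the study's actual pipeline, so purely as an illustration, here is a minimal toy sketch of the general idea: extract simple acoustic features from recordings and train a classifier to separate "healthy" from "degraded" labels. Everything here is an assumption for demonstration purposes: the audio is synthetic, the two features (loudness and spectral centroid) are hand-picked stand-ins, and the classifier is a tiny logistic regression rather than whatever model the researchers used.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz; illustrative, not the study's setting

def synth_reef(healthy, n=SR):
    """Synthesize a 1-second mock recording: 'healthy' reefs get many
    more short snapping transients on top of quiet background noise."""
    x = 0.05 * rng.standard_normal(n)            # background water noise
    n_snaps = rng.integers(40, 80) if healthy else rng.integers(0, 10)
    for _ in range(n_snaps):
        t = rng.integers(0, n - 50)
        x[t:t + 50] += rng.standard_normal(50)   # short click/pop burst
    return x

def features(x):
    """Two simple acoustic features: RMS loudness and spectral centroid."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / SR)
    centroid = (freqs * spec).sum() / spec.sum()
    return np.array([np.sqrt(np.mean(x ** 2)), centroid / (SR / 2)])

# Build a labelled dataset: 1 = healthy, 0 = degraded
X = np.array([features(synth_reef(h)) for h in [True] * 60 + [False] * 60])
y = np.array([1] * 60 + [0] * 60)

# Tiny logistic-regression classifier trained by gradient descent
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))           # predicted probabilities
    g = p - y                                    # gradient of log-loss
    w -= 0.5 * (X.T @ g) / len(y)
    b -= 0.5 * g.mean()

pred = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print(f"training accuracy: {(pred == y).mean():.2f}")
```

In a real setting, the features would come from actual hydrophone recordings and a far richer feature set (or a learned representation) would be used; the sketch only shows the overall shape of such a classification pipeline.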

Big advantages of the new method

Fish and other creatures that live on coral reefs make a wide range of sounds, and the meaning of many of these calls is still unknown. But the new AI method can distinguish between the aggregate sounds of healthy and unhealthy reefs.

The recordings covered four distinct types of coral reef habitat (healthy, degraded, mature restored and newly restored), each with a different amount of coral cover and therefore a different character of noise from the aquatic organisms that live and feed there.

According to the researchers, the algorithm's results depend on a range of underwater acoustic factors, including the abundance and diversity of fish sounds, sounds made by invertebrates, and even faint sounds believed to be made by algae, along with contributions from abiotic sources, such as subtle differences in how waves and wind sound across different types of coral habitat.

While the human ear may not be able to pick out such faint and hidden sounds, machines can apparently detect the differences reliably. The researchers acknowledge that the method can still be improved, and larger sound samples in the future are expected to provide a "more accurate approach to ecological classification".