Coral reef acoustics are so intricate that even specialists must carry out time-consuming analysis of sound recordings to estimate a reef's condition.
In the current study, researchers from the University of Exeter trained a computer system on many recordings of healthy and degraded reefs, enabling it to distinguish between the two.
The system then analysed a large number of new recordings and identified reef health correctly 92 per cent of the time.
The researchers used this approach to track the progress of reef restoration projects.
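The basic workflow described here, extracting an acoustic feature from each labelled recording, learning what values correspond to healthy and degraded reefs, and then classifying new recordings, can be sketched in miniature. The feature below (mean sample-to-sample variation, a crude stand-in for the ecoacoustic indices used in the study) and the nearest-centroid classifier are illustrative assumptions, not the paper's actual method.

```python
def acoustic_index(samples):
    """Toy ecoacoustic index: mean absolute change between successive
    samples. A busy, sound-rich reef yields a higher value than a
    quiet, degraded one. (Stand-in for real indices, not the study's.)"""
    return sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)

def train(labelled_recordings):
    """Average the index over each class to get one centroid per label.

    labelled_recordings: iterable of (samples, label) pairs,
    e.g. label = "healthy" or "degraded".
    """
    sums, counts = {}, {}
    for samples, label in labelled_recordings:
        x = acoustic_index(samples)
        sums[label] = sums.get(label, 0.0) + x
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(centroids, samples):
    """Assign the label whose centroid lies nearest the recording's index."""
    x = acoustic_index(samples)
    return min(centroids, key=lambda label: abs(centroids[label] - x))
```

The real study pools many indices and a trained machine-learning model rather than a single feature and threshold, but the train-then-predict structure is the same.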
Ben Williams, the study’s lead author, says climate change is one of many threats facing coral reefs, making it vital to monitor their health and the effectiveness of conservation efforts.
According to Ben, a major difficulty is that most visual and acoustic reef surveys rely on labour-intensive methods.
Visual surveys are limited because many reef creatures conceal themselves or are active at night, while the sheer diversity of a reef's acoustic environment makes it hard to judge its health by listening to individual recordings.
Coral reef ‘songs’
To address this, the team used machine learning to explore whether a computer could learn to recognise the song of a reef.
The findings show that a computer can pick up patterns the human ear cannot, potentially providing faster and more accurate information on the state of a reef.
Coral reefs are home to a wide range of creatures, including fish, each of which has its own distinct sound.
Although the sources of most of these calls are unknown, the newly developed AI algorithm can distinguish the overall sound of a healthy reef from that of an unhealthy one.
The recordings used in the study were made as part of the Mars Coral Reef Restoration Project in Indonesia, which aims to rehabilitate badly damaged reefs.
Dr. Tim Lamont of Lancaster University, a co-author of the study, says artificial intelligence offers tremendous potential for improving coral reef monitoring.
According to Dr. Lamont, sound recorders and AI could be used to monitor the health of reefs and to evaluate whether efforts to protect and restore them are succeeding.
Installing an underwater hydrophone on a reef and leaving it in place is often easier and cheaper than sending expert divers to survey the coral repeatedly, especially in more isolated locations.
The Natural Environment Research Council and the Swiss National Science Foundation contributed to the study’s funding.
Story Source: Original press release by University of Exeter. Note: Content may be edited for style and length by Scible News.
Ben Williams, Timothy A.C. Lamont, Lucille Chapuis, Harry R. Harding, Eleanor B. May, Mochyudho E. Prasetya, Marie J. Seraphim, Jamaluddin Jompa, David J. Smith, Noel Janetski, Andrew N. Radford, Stephen D. Simpson. Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning. Ecological Indicators, 2022; 140: 108986 DOI: 10.1016/j.ecolind.2022.108986