Google has introduced a new AI tool called SurfPerch, developed by Google Research and Google DeepMind to help scientists better understand coral reef ecosystems and their health. The model has been trained on thousands of hours of reef audio recordings, allowing researchers to “hear reef health from the inside” and track reef activity even in deep or murky waters.
The project began by engaging the public in listening to reef sounds through Google’s Calling in Our Corals website. Over the past year, visitors listened to more than 400 hours of reef audio from sites worldwide and flagged fish sounds, contributing to a valuable “bioacoustic” data set focused on reef health. This crowdsourced library of new fish sounds was used to fine-tune SurfPerch, enabling it to detect new reef sounds quickly.
“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation efforts,” stated a Google blog post co-authored by marine biologists Steve Simpson and Ben Williams.
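The post doesn’t spell out the mechanics, but the claim about skipping GPU training is characteristic of an embed-then-classify workflow common in bioacoustics: a large pretrained model is used as a frozen feature extractor, and only a small linear classifier is trained on top of its embeddings, which takes seconds on an ordinary CPU. The sketch below illustrates that pattern with scikit-learn; the embedding function, dimensions, and data are stand-ins for illustration, not SurfPerch’s actual API.

```python
# Sketch of an embed-then-classify workflow: a frozen pretrained model
# produces embeddings once, and only a lightweight linear classifier is
# trained on top -- cheap enough that no GPU is needed.
import numpy as np
from sklearn.linear_model import LogisticRegression

def embed_clips(clips: np.ndarray) -> np.ndarray:
    """Stand-in for a frozen pretrained embedding model.

    In a real pipeline this would be a forward pass through a pretrained
    audio network (no gradient updates, hence no GPU training). Here we
    fake a 128-dimensional embedding per clip for illustration.
    """
    rng = np.random.default_rng(seed=0)
    projection = rng.normal(size=(clips.shape[1], 128))
    return clips @ projection

# Toy data: 200 short "clips" as flat waveforms, with binary labels
# standing in for crowdsourced annotations (1 = fish call, 0 = none).
rng = np.random.default_rng(seed=1)
waveforms = rng.normal(size=(200, 16000))
labels = rng.integers(0, 2, size=200)

# Embeddings are computed once; only this linear probe is "trained".
features = embed_clips(waveforms)
classifier = LogisticRegression(max_iter=1000).fit(features, labels)
print("train accuracy:", classifier.score(features, labels))
```

Because the expensive model is never updated, adding a new sound class only means fitting another small classifier on existing embeddings, which is what makes analyzing fresh datasets so cheap.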
Researchers also found that incorporating bird recordings improved SurfPerch’s performance, and the model has since been used to uncover differences between protected and unprotected reefs in the Philippines and to track restoration outcomes in Indonesia. The project continues to evolve as new audio is added to the Calling in Our Corals website, further improving the model and advancing coral reef conservation efforts.
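The post doesn’t say how the bird recordings were used. One plausible reading, sketched below on purely synthetic embeddings, is that abundant out-of-domain labels can supplement a small in-domain training set when the two domains share acoustic structure; every name, dimension, and domain shift here is invented for illustration.

```python
# Toy experiment: does adding plentiful out-of-domain (bird) examples to a
# small in-domain (reef) training set help a classifier? All "embeddings"
# are synthetic; in practice they would come from a pretrained audio model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=42)

def make_embeddings(n: int, shift: float):
    """Two classes separated along one axis; `shift` offsets the domain."""
    x = rng.normal(size=(n, 32)) + shift
    y = rng.integers(0, 2, size=n)
    x[:, 0] += 2.0 * y  # class signal shared across both domains
    return x, y

reef_x, reef_y = make_embeddings(40, shift=0.0)   # few labeled reef clips
bird_x, bird_y = make_embeddings(400, shift=0.5)  # plentiful bird clips
test_x, test_y = make_embeddings(500, shift=0.0)  # held-out reef clips

reef_only = LogisticRegression(max_iter=1000).fit(reef_x, reef_y)
combined = LogisticRegression(max_iter=1000).fit(
    np.vstack([reef_x, bird_x]), np.concatenate([reef_y, bird_y])
)
print("reef-only test accuracy:", reef_only.score(test_x, test_y))
print("reef+bird test accuracy:", combined.score(test_x, test_y))
```

Whatever the exact mechanism inside SurfPerch, the reported gains from bird audio fit this broader pattern: labels from a data-rich domain can compensate for scarce labels in a data-poor one.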