Google DeepMind has released an updated version of Perch, an AI model designed to help conservationists analyze bioacoustic data from endangered species and ecosystems. The new model features improved bird species predictions, better adaptation to underwater environments like coral reefs, and training on nearly twice as much data covering mammals, amphibians, and anthropogenic noise.
What you should know: The updated Perch model significantly expands beyond its original bird-focused capabilities to analyze a broader range of wildlife sounds.
- The model can now process complex acoustic scenes across thousands or even millions of hours of audio captured by microphones and hydrophones (underwater listening devices).
- It’s trained on expanded datasets from public sources like Xeno-Canto and iNaturalist, covering mammals, amphibians, and human-made noise in addition to birds.
- Google DeepMind is open-sourcing the model and making it available on Kaggle to help scientists protect ecosystems around the planet.
The big picture: Perch addresses a critical bottleneck in conservation efforts where scientists collect vast amounts of bioacoustic data but struggle to analyze it efficiently.
- Wildlife recordings can reveal which animals are present in an area and provide insights into ecosystem health, but manual analysis remains extremely time-consuming.
- The AI model can answer diverse conservation questions, from “how many babies are being born” to “how many individual animals are present in a given area.”
Proven impact: Since launching in 2023, the original Perch has been downloaded more than 250,000 times and integrated into widely used conservation tools.
- Perch’s vector search library is now part of the Cornell Lab of Ornithology’s BirdNET Analyzer, a popular tool among working biologists.
- The model helped discover a new population of the elusive Plains-wanderer in Australia through collaboration with BirdLife Australia and the Australian Acoustic Observatory.
- Biologists at the University of Hawaiʻi’s LOHE Bioacoustics Lab used Perch to find honeycreeper sounds nearly 50x faster than traditional methods, enabling monitoring of endangered species threatened by avian malaria.
How it works: Perch combines species prediction with “agile modeling” tools that allow scientists to build custom classifiers quickly.
- Vector search functionality surfaces the most similar sounds in a dataset from just a single example.
- Local experts can mark search results as relevant or irrelevant to train new classifiers in under an hour.
- This approach works across different environments, from bird habitats to coral reefs, even with scarce training data.
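The agile-modeling loop above can be illustrated with a minimal sketch. All names here are hypothetical: the source does not document Perch's actual API, so a random matrix stands in for the model's audio embeddings, and scikit-learn's logistic regression stands in for the lightweight classifier an expert would train on their marked search results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-in for Perch embeddings: in practice the model maps
# each short audio window to a fixed-size vector. Simulate 10,000 clips.
EMBED_DIM = 128
database = rng.normal(size=(10_000, EMBED_DIM))
database /= np.linalg.norm(database, axis=1, keepdims=True)  # unit-normalize

def vector_search(query, db, top_k=20):
    """Return indices of the top_k clips most similar to the query embedding."""
    q = query / np.linalg.norm(query)
    scores = db @ q  # cosine similarity via dot product on unit vectors
    return np.argsort(scores)[::-1][:top_k]

# Step 1: a biologist supplies a single example call; find look-alike clips.
query = database[42]  # pretend this clip is the expert's one labeled example
candidates = vector_search(query, database)

# Step 2: the expert marks each candidate relevant (1) or irrelevant (0).
# Placeholder labels here; in reality a human reviews each clip.
labels = np.array([1] * 10 + [0] * 10)

# Step 3: train a lightweight classifier on the labeled embeddings and
# sweep it over the whole dataset, with no retraining of the large model.
clf = LogisticRegression().fit(database[candidates], labels)
detections = clf.predict_proba(database)[:, 1] > 0.9
print(f"{detections.sum()} clips flagged for expert review")
```

The key design point this illustrates is that only the tiny classifier is trained per species or sound type; the expensive embedding model runs once over the archive, which is why new classifiers can be built in under an hour even with scarce training data.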
What they’re saying: Conservation experts highlight the transformative potential of AI-powered acoustic monitoring.
- “This is an incredible discovery – acoustic monitoring like this will help shape the future of many endangered bird species,” said Paul Roe, Dean of Research at James Cook University, Australia.