Using AI to monitor and measure biodiversity and wildlife

2 February 2024

In the landscape of conservation and ecology, Artificial Intelligence (AI) is emerging as a powerful ally for researchers and professionals tasked with monitoring and measuring biodiversity and wildlife. Aneesh Chauhan, senior scientist in computer vision and robotics within Wageningen University & Research’s Vision + Robotics programme, sheds light on the transformative role of AI, specifically computer vision and deep learning, in wildlife monitoring techniques.

One remarkable application of AI, and of its computer vision sub-field in particular, is turning indicators such as population sizes and species composition into meaningful information. In cooperation with researchers from Wageningen Marine Research, the Vision + Robotics programme showcased the success of deep learning in monitoring biodiversity on the sea floor, for example. This approach addresses the limits of what human analysts can process when reviewing the intensive video footage collected during seabed surveys.

“By annotating the video footage from underwater surveys, we can train deep-learning models to identify species accurately at scale, addressing the possible gaps in human observations,” notes Chauhan. These models not only prove more efficient, they also reveal details that human experts may overlook.
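The article does not describe the researchers’ actual pipeline, but the general recipe can be sketched: frames are extracted from the survey footage, labelled by species, and used to fine-tune a pretrained image model. In the minimal sketch below, the folder layout, model choice and hyperparameters are illustrative assumptions, not the team’s real setup.

```python
# Minimal sketch: fine-tune an image classifier on annotated survey frames.
# Assumes frames extracted from the underwater videos are sorted into one
# folder per species (an assumption for illustration only).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("frames/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the final layer
# with one output per annotated species.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```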

AI measures effects of offshore wind farms and under-sea power lines

The benefits extend beyond the sea floor to diverse environments, including the examination of the implications of wind farms at sea. AI enables the measurement and monitoring of species affected by the installation of wind farm infrastructure, and contributes to sustainable practices by assessing the impact of these wind farms on megafauna (seabirds and marine mammals). A similar approach is useful for evaluating the success of mussel bank relocation, where sections of the seabed are lifted and placed elsewhere to make room for laying power cables.

The integration of deep learning into aerial seal counting from drone imagery is another example. Beyond counting, the next challenge is detecting individual seals and using this information to better understand seal conservation efforts, as well as the effects of seals on the coastal ecosystem. The same approach can measure the impact of conservation efforts, such as the re-introduction of sea turtles and their grazing behaviour on seagrass meadows. The researchers’ deep learning tool has been released here.

Turtle detection in aerial images using deep learning.
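Neither the seal-counting workflow nor the turtle detector is spelled out in the article, but the mechanics of detection-based counting in aerial imagery can be sketched roughly as below. The off-the-shelf, COCO-pretrained torchvision detector is only a stand-in; in practice a model fine-tuned on seal or turtle annotations would be used, and the file name and confidence threshold are illustrative assumptions.

```python
# Minimal sketch of detection-based counting on one drone image tile.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.detection import (fasterrcnn_resnet50_fpn,
                                           FasterRCNN_ResNet50_FPN_Weights)

# Generic pretrained detector as a placeholder; a real survey would use a
# detector fine-tuned on the target species.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

image = Image.open("drone_tile.jpg").convert("RGB")
tensor = transforms.ToTensor()(image)

with torch.no_grad():
    output = model([tensor])[0]

# Count detections above a confidence threshold (0.7 is illustrative);
# summing counts over all tiles of a survey yields the population estimate.
count = int((output["scores"] > 0.7).sum())
print(f"Detections in this tile: {count}")
```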

Turning sounds into images: acoustic monitoring of birds

AI’s potential spans different environments, from underwater surveys to aerial surveillance. The technology has also proven successful in initiatives such as measuring and detecting underground insect biodiversity using sensitive cameras, and eco-acoustic monitoring for pest control.

A nature-positive food production system requires a balanced interplay between pests and their natural predators. But quantifying the impact of a biodiverse ecosystem on pest management and farm production is a major challenge in ecological farming systems. One of the main issues is the measurement of biodiversity itself: it is often too time-consuming, too expensive, or both. In the last decade, automated eco-acoustic surveying has emerged as a relevant technology for large-scale monitoring of natural as well as urban habitats. Machine learning, including deep learning, is increasingly applied to acoustic data to automatically identify a range of sounds, from different bird species to amphibians, grasshoppers, and humans.

The recently finalized wildcard project “Eco-acoustics: a Biodiversity yardstick as a facilitating tool for nature-positive food production”, carried out in collaboration with the Wageningen Biodiversity Initiative and the Federal University of Vicosa, Brazil, introduced a groundbreaking approach to acoustic monitoring. Chauhan explains, “We turned sounds into images, creating spectrograms that train the AI models to detect bird species by their unique sounds.” This method allows continuous monitoring over longer periods of time, providing valuable insights into the presence and behaviour patterns of bird species.
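As a rough illustration of the “sounds into images” step, the sketch below converts a field recording into a mel spectrogram using the librosa library; the file name and signal-processing parameters are assumptions rather than the project’s actual settings.

```python
# Minimal sketch: turn a field recording into a mel spectrogram image that
# an image-based classifier can learn from.
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

# Load the recording at its native sample rate (file name is illustrative).
y, sr = librosa.load("field_recording.wav", sr=None)

# Mel spectrogram: time on the x-axis, mel frequency bands on the y-axis.
S = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=2048,
                                   hop_length=512, n_mels=128)
S_db = librosa.power_to_db(S, ref=np.max)

# Save the spectrogram as an image; collections of such images become the
# training data for a convolutional bird-species classifier.
fig, ax = plt.subplots(figsize=(6, 3))
librosa.display.specshow(S_db, sr=sr, hop_length=512,
                         x_axis="time", y_axis="mel", ax=ax)
fig.savefig("field_recording_spectrogram.png", dpi=150)
```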

Open-source models for collaborative biodiversity monitoring

While AI in acoustic monitoring is still in its early stages, it holds great potential. BirdNET, for instance, is a state-of-the-art model that detects birds from their sounds and returns its detections with probability estimates. Researchers can use these to analyse the relative frequency of bird species and their behaviour patterns over time.
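To illustrate that last step, the sketch below assumes a table of detections (timestamp, species, confidence) such as a BirdNET-style analyzer can export, and aggregates it into daily relative frequencies per species; the column names and the confidence threshold are assumptions, not a prescribed format.

```python
# Minimal sketch: aggregate per-detection probabilities into relative
# detection frequencies over time.
import pandas as pd

# Hypothetical export with columns: timestamp, species, confidence.
detections = pd.read_csv("detections.csv", parse_dates=["timestamp"])

# Keep only confident detections; the 0.5 threshold is illustrative.
confident = detections[detections["confidence"] >= 0.5]

# Count detections per species per day ...
daily_counts = (confident
                .groupby([pd.Grouper(key="timestamp", freq="D"), "species"])
                .size()
                .rename("count")
                .reset_index())

# ... and normalise to the relative frequency of each species on that day.
daily_counts["relative_frequency"] = (
    daily_counts["count"]
    / daily_counts.groupby("timestamp")["count"].transform("sum")
)

print(daily_counts.head())
```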

“This is just the beginning of a very promising journey. Technology is evolving rapidly, and the creation of open-source models such as this is a big opportunity, because it would allow researchers and ecologists to collaboratively contribute to biodiversity monitoring,” says Chauhan.


A (Aneesh) Chauhan PhD

Expertise leader Computer Vision and Robotics | Senior Scientist
