Behind the Lens and Beyond the Microphone: Studying Wildlife with AI
Jennifer Lüdtke · 8 March 2026

Hidden in the nook of a tree, a small microphone records the sounds of the forest throughout the night. Meanwhile, a bird flaps past with a small GPS tag mounted to its back. Hundreds of kilometres away, a computer algorithm scans the recordings and identifies the calls, while scientists track the bird’s movements from behind their screens.
The tide of artificial intelligence is rising in biodiversity monitoring, where deep learning and artificial neural networks are used to process vast amounts of data, amplifying the speed and scale of scientific analysis. This is coupled with technological advances that allow biologists to collect data in novel ways and at a lower cost. Researchers can now embed relatively cheap wildlife sensors such as camera traps, Autonomous Recording Units (ARUs), and thermal cameras across the landscape, allowing them to monitor difficult-to-capture species, access hard-to-reach locations, and expand their study scope across time and space.
Machine listening refers to the ability of computers to interpret sound. Recorders capture species calls and whole soundscapes, and the recordings are passed through algorithms to answer biological and ecological questions. One example is the processing of long nighttime recordings to pick out the calls of nocturnal migrants: up to 85% of birds migrate at night, and most have species-specific nocturnal flight calls [1].
In the past, people relied on techniques such as moon watching – observing the moon through a telescope to spot the outlines of birds flying past [2]! Today, scientists and hobbyists can instead scroll through spectrograms during the day in search of potential calls [3]. Although faster than a night-long vigil, manually labelling datasets remains extremely labour-intensive. This is where machine listening is transforming the field, allowing computers to detect and identify calls – and the species that made them – in a matter of minutes.
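The core idea – scan a spectrogram for bursts of energy in a species-typical frequency band – can be sketched in a few lines. This is a minimal illustration using a synthetic tone burst rather than a real recording; the 6–8 kHz band, the 10× median threshold, and all other numbers are illustrative assumptions, not the settings of any production detector.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 22_050                                    # sample rate (Hz)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic "night recording": low-level noise plus one short
# 7 kHz tone burst standing in for a nocturnal flight call.
rng = np.random.default_rng(0)
audio = 0.01 * rng.standard_normal(t.size)
call = (t > 0.8) & (t < 0.9)
audio[call] += 0.5 * np.sin(2 * np.pi * 7_000 * t[call])

# Spectrogram: the time-frequency view a detector (or a human) scans.
f, times, Sxx = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)

# Simple energy detector: flag frames whose energy in the 6-8 kHz
# band far exceeds the median (i.e. background) level.
band = (f >= 6_000) & (f <= 8_000)
band_energy = Sxx[band].sum(axis=0)
detections = times[band_energy > 10 * np.median(band_energy)]

print(f"call detected around {detections.min():.2f}-{detections.max():.2f} s")
```

Real systems replace the hand-set band and threshold with a trained classifier, but the pipeline – record, transform to a spectrogram, scan for call-like structure – is the same.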
But these systems are not perfect. On the ground, the precision and accuracy of the machine learning models used to identify species and call types depend on how the models are trained. Ask a model trained on Hawaiian species to identify a European recording and it might return a Kiwikiu rather than the far more likely cuckoo.
However, model development is moving forward rapidly; artificial neural networks can analyse spectrograms using computer vision, and feature embeddings can capture even the slightest of differences between sounds [4]. This approach allows models to be pre-trained and then fine-tuned with local data using transfer learning – far more accessible than building a model from scratch, not to mention less computationally expensive [5].
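The transfer-learning workflow described above can be sketched as a toy example: a frozen "pre-trained" embedding model (here just a fixed random projection – a stand-in for a real birdsong embedding network like those in [4]) plus a tiny classifier fitted on a handful of locally labelled clips. The species names, the nearest-centroid classifier, and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def pretrained_embed(clip: np.ndarray) -> np.ndarray:
    """Frozen 'pre-trained' feature extractor (stand-in for a real
    birdsong embedding model): a fixed random projection to 8-D."""
    proj = np.random.default_rng(42).standard_normal((clip.size, 8))
    return clip @ proj

# Synthetic local data: each species' call is a fixed template plus noise.
templates = {"cuckoo": rng.standard_normal(256),
             "owl": rng.standard_normal(256)}

def record_clip(species: str) -> np.ndarray:
    return templates[species] + 0.05 * rng.standard_normal(256)

# "Fine-tuning" on local data: average the embeddings of a few labelled
# local clips per species, giving a nearest-centroid classifier.
centroids = {s: np.mean([pretrained_embed(record_clip(s))
                         for _ in range(8)], axis=0)
             for s in templates}

def classify(clip: np.ndarray) -> str:
    e = pretrained_embed(clip)
    return min(centroids, key=lambda s: np.linalg.norm(e - centroids[s]))

print(classify(record_clip("cuckoo")))
```

The design point is that only the small classifier is fitted locally; the embedding model stays fixed, which is why this approach needs far less data and compute than training from scratch.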
The spread of wildlife sensors raises questions beyond technology. In a recent publication titled “The Planetisation of Machine Listening”, Professor James E. K. Parker argues that the more machine listening is used as a monitoring and conservation tool, the more its use will expand [6]. Once we start – and we already have – we are inclined to scale up at a pace that political and legal frameworks are not currently equipped to deal with.
Unresolved challenges regarding carbon-friendly data storage and data security persist throughout the field, as do ethical considerations when bioacoustics begin to “actively interact” with nature, for example when loudspeakers play sounds in coral restoration zones to call in specific species [6].

Figure 1: “Camera Trap” by jurvetson is licensed under CC BY 2.0. 2009.
There are real and important questions to be answered about access to data, particularly when it comes to Indigenous data sovereignty and governance. This refers to the right of Indigenous peoples to exercise ownership over Indigenous data and decide what, why, and how these data are used [7]. The increased use of camera traps and ARUs, not to mention drones, raises privacy concerns, alters how people use natural spaces, and may contribute to the techno-securitisation of society [8].
A study on wildlife sensor use in the Corbett Tiger Reserve in Uttarakhand, India shows that the deployment of camera traps, ARUs, and drones “[exacerbated] prevailing inequalities of gender, caste and class discrimination” [9]. Women who rely on the forest for their livelihoods say that conservation surveillance technologies, as they are sometimes called, have changed the way they use the forest:
“we normally sing ‘kumaoni’ (local language) songs or talk loudly while collecting grass in the forest to keep away elephants and tigers. When we see these cameras, we remain quiet, you never know who’s listening.”
While AI and sensor advancements offer exciting potential, this new frontier also brings political, ethical, and social considerations. Scientists and professionals across disciplines will need to work together to help answer a new tide of questions: which wildlife sensors should be deployed, and where? Who should have access to the collected data? And what political and technological changes are needed to broaden access to these methods?
AI is striding forward into the standard toolbox of biodiversity monitoring and conservation techniques in many parts of the world. Whether its impact is hopeful or harmful will depend on how responsibly it is implemented – and whether sufficient ethical and socio-legal frameworks are put in place.
References:
[1] Baushev, A. N., & Sinelschikova, A. (2006). On a probabilistic model for the numerical estimation of nocturnal migration of birds. Mathematical Biosciences, 205(1), 44–58. https://doi.org/10.1016/j.mbs.2006.01.001
[2] Heisman, R. (2023, October 16). Flight paths. https://rebeccaheisman.com/flight-paths/
[3] Gillings, S. (n.d.). Nocmig. https://nocmig.com/
[4] Ghani, B., Denton, T., Kahl, S., & Klinck, H. (2023). Global birdsong embeddings enable superior transfer learning for bioacoustic classification. Scientific Reports, 13(1), 22876. https://doi.org/10.1038/s41598-023-49989-z
[5] Kershenbaum, A., Akçay, Ç., Babu‐Saheer, L., Barnhill, A., Best, P., Cauzinille, J., Clink, D., Dassow, A., Dufourq, E., Growcott, J., Markham, A., Marti‐Domken, B., Marxer, R., Muir, J., Reynolds, S., Root‐Gutteridge, H., Sadhukhan, S., Schindler, L., Smith, B. R., . . . Dunn, J. C. (2024). Automatic detection for bioacoustic research: a practical guide from and for biologists and computer scientists. Biological Reviews, 100(2), 620–646. https://doi.org/10.1111/brv.13155
[6] Parker, J. E. K. (2025). The planetization of machine listening. Critical Inquiry, 52(1), 21–47. https://doi.org/10.1086/737056
[7] Maiam Nayri Wingara. (n.d.). Maiam Nayri Wingara. https://www.maiamnayriwingara.org/
[8] Simlai, T. (2025, January 23). Digital Technologies and Conservation Surveillance. Smart Forests Atlas. https://atlas.smartforests.net/en/stories/digital-technologies-and-conservation-surveillance/
[9] WILDLABS.NET. (2021, August 6). Trishant Simlai: How does conservation tech cause harm? [Video]. YouTube. https://www.youtube.com/watch?v=lL1tUgzOWGw
Cover image: a parabolic microphone used to record wildlife sounds. Photo: Jennifer Lüdtke, July 2025.