New funding from Paul G. Allen Family Foundation to support development of technologies that will reconnect us to nature
Human beings are part of nature. But the vast majority of what is happening in this complex, interconnected natural world we inhabit is often beyond our ability to perceive - let alone comprehend. This disconnection from the world around us arguably represents one of the biggest obstacles humanity faces in tackling the biodiversity and climate crises.
Fortunately, the exponential progress we’re seeing in AI offers new ways of looking at the world - extending our ability to perceive through technology, and finding patterns in data that will help us make sense of what is happening around us.
At Earth Species Project we are focused on harnessing these new developments in AI to gain a better understanding of animal communication because we believe it offers a unique and valuable way to connect to the rest of nature.
We’re thrilled to announce that we have just received significant new funding of $1.2 million from the Paul G. Allen Family Foundation that will support these efforts.
ESP Senior Research Scientists Benjamin Hoffman and Maddie Cusimano announce the release of Voxaboxen, a new machine learning tool for research in bioacoustics. Based upon Earth Species Project's bioacoustics foundation model AVES, Voxaboxen is designed to detect and classify animal vocalizations in recorded audio, with the high level of temporal specificity that is required for studies focused on communication behavior. In this blog post, we will give an overview of why and how we designed Voxaboxen, as well as how to apply Voxaboxen to your own data. We anticipate this new tool will support and advance research by providing a straightforward way to automate annotation of large audio files.
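To give a flavor of what detection with high temporal specificity looks like in practice, here is a deliberately simple energy-based detector that reports onset and offset times at 10 ms resolution. This is a toy illustration only, not Voxaboxen’s model-based approach; the synthetic tone and all parameter choices are assumptions for the sketch.

```python
import numpy as np

# A toy illustration of detection with fine temporal specificity:
# mark onset/offset times where short-term energy exceeds a threshold.
# This is a generic energy detector, not Voxaboxen's learned model.
rng = np.random.default_rng(0)
fs = 16000
audio = rng.normal(scale=0.01, size=fs * 3)            # 3 s of background noise
audio[fs:fs + 4000] += np.sin(2 * np.pi * 2000 * np.arange(4000) / fs)  # 0.25 s "call"

frame = 160                                             # 10 ms frames
energy = (audio[: len(audio) // frame * frame]
          .reshape(-1, frame) ** 2).mean(axis=1)
active = energy > 10 * np.median(energy)

# Convert frame-level activity into (onset, offset) times in seconds
edges = np.flatnonzero(np.diff(active.astype(int)))
events = [(s * frame / fs, e * frame / fs)
          for s, e in zip(edges[::2] + 1, edges[1::2] + 1)]
print(events)                                           # one event near (1.0, 1.25)
```

Real vocalizations rarely stand out this cleanly from background noise, which is exactly why a model-based detector is needed; the sketch only shows the shape of the output an annotation tool produces.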
While scientists have been studying the behavior and communication of other species for decades, the use of AI to help accelerate these discoveries is relatively new. In partnership with FootPrint Coalition (FPC), Robert Downey Jr.’s initiative to accelerate technologies for sustainable abundance, and the Experiment Foundation, ESP is thrilled to announce the launch of a new grant program designed to catalyze new and emerging research into interspecies communication.
Sara Keen, Senior Researcher, Behavioral Ecology and AI reflects on how machine learning can help us find the signals in the noise and decipher meaning from the world of invisible information that surrounds us.
This blog provides an overview of ESP's technical roadmap - offering a path forward for our proposed and ongoing scientific work, which is focused on using AI to decode non-human animal communication.
Machine learning (ML) has proven to be a powerful tool for learning latent representations of data (LeCun et al., 2015). For example, learned representations of human languages have enabled translation between different languages without the use of dictionaries (Artetxe et al., 2017; Lample et al., 2017). More recently, ML has been able to generate realistic images based on text descriptions (DALL-E 2), as well as predict and imitate the words we speak (AudioLM).
We hypothesize that ML models can provide new insights into non-human communication, and believe that discoveries fueled by these new techniques have the potential to transform our relationship with the rest of nature. We also expect that many of the techniques we develop will be useful tools for processing data in applied conservation settings (Tuia et al., 2022).
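The dictionary-free translation results cited above (Artetxe et al., 2017; Lample et al., 2017) build on the idea that independently learned embedding spaces can be geometrically aligned. As an illustrative sketch - not the cited methods themselves, which learn the mapping without any seed pairs - here is the core orthogonal Procrustes alignment step on synthetic embeddings:

```python
import numpy as np

# Toy embeddings for two "languages": Y is X under a hidden rotation.
# In the unsupervised-translation work cited above, the mapping is
# learned without a dictionary; here we use known anchor pairs purely
# to illustrate the alignment step itself.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                        # source-language embeddings
R_true = np.linalg.qr(rng.normal(size=(8, 8)))[0]   # hidden rotation
Y = X @ R_true                                      # target-language embeddings

# Orthogonal Procrustes: find rotation W minimizing ||XW - Y||_F
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# "Translate" a source vector and find its nearest target neighbor
query = X[7] @ W
nearest = int(np.argmin(np.linalg.norm(Y - query, axis=1)))
print(nearest)  # recovers index 7
```

The same geometric intuition - that structure shared across two spaces lets us map between them - is one reason learned representations are a promising lens on non-human communication.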
Whispering whales, turtle hatchlings communicating with each other through their shells, coral larvae that can hear and gravitate toward their home reefs…
Director of Development and External Affairs Jane Lawton reflects on the insights shared by Dr. Karen Bakker, Dr. Ari Friedlaender and Aza Raskin at ESP's October 25 panel discussion on interspecies communication in San Francisco - from how sophisticated the communication systems of other species are, to the explosion of new bioacoustic technologies and AI that place us on the cusp of dramatically expanding our understanding of those systems, while simultaneously exposing how little we, as human beings, currently know.
Katie Zacarian, CEO, Earth Species Project, reflects on the ESP journey to receiving a National Geographic Explorer grant. The project we have received funding for is focused on developing machine learning models to interpret animal motion data gathered by powerful animal-borne sensors, and it represents an important step on the journey toward decoding animal communication.
Senior AI research scientist Benjamin Hoffman is working on self-supervised methods for interpreting data collected from animal-borne tags, known as bio-loggers. Using bio-loggers, scientists are able to record an animal’s motions, as well as audio and video footage from the animal’s perspective. However, these data are often difficult to interpret, and there is typically far too much to analyze by hand. One solution is to use self-supervised learning to discover repeated behavioral patterns in these data.
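A much simpler unsupervised baseline conveys the core idea of discovering repeated behavioral patterns without labels: window the motion signal, summarize each window with a few statistics, and cluster the windows. Everything below is a hypothetical sketch on synthetic accelerometer data, not the self-supervised methods described above.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic single-axis accelerometer trace alternating between two
# hypothetical behaviors: smooth low-frequency "swimming" and bursty
# high-variance "lunging". Names and parameters are illustrative only.
rng = np.random.default_rng(1)
fs = 25                                            # sample rate (Hz)
swim = np.sin(2 * np.pi * 0.5 * np.arange(fs * 60) / fs)
lunge = rng.normal(scale=3.0, size=fs * 60)
signal = np.concatenate([swim, lunge, swim, lunge])

# Window the signal and summarize each window with simple statistics.
win = fs * 5                                       # 5-second windows
windows = signal[: len(signal) // win * win].reshape(-1, win)
feats = np.c_[windows.mean(axis=1), windows.std(axis=1)]

# Cluster windows; repeats of the same behavior land in the same cluster.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
print(labels)
```

Self-supervised approaches replace the hand-picked window statistics with representations learned from the data itself, which is what makes them attractive for rich, multi-sensor bio-logger recordings.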
In December 2021, we published our first scientific paper in the peer-reviewed journal Scientific Reports; it has already been cited multiple times. The publication focused on automatic source separation, which allows researchers to more easily distinguish between animal vocalizations when more than one animal is vocalizing at the same time. The research was the outcome of a close collaboration with marine biologist Dr. Laela Sayigh, who provided a dataset of bottlenose dolphin signature whistles for the project.
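For readers new to the problem, a classical point of comparison helps show what source separation means. The sketch below uses independent component analysis (ICA) to unmix two synthetic "whistles" recorded on two hypothetical hydrophones; it is not the paper's method, which is a learned model that can separate overlapping calls even in a single-channel recording, something classical ICA cannot do.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic whistle-like chirps, mixed as if recorded on two
# hydrophones. A multi-channel ICA baseline for illustration only.
t = np.linspace(0, 1, 8000)
s1 = np.sin(2 * np.pi * (500 + 400 * t) * t)    # rising chirp
s2 = np.sin(2 * np.pi * (900 - 300 * t) * t)    # falling chirp
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # assumed mixing matrix
X = S @ A.T                                     # two-channel "recording"

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)                # estimated sources

# Each recovered component should correlate strongly with one source
corr = np.abs(np.corrcoef(np.c_[S, recovered].T)[:2, 2:])
print(corr.round(2))
```

In the single-channel setting of real field recordings there is no second microphone to exploit, which is why a learned, data-driven separation model was needed.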