If any dolphins are reading this: Hello! A team of scientists studying a community of Florida dolphins has been awarded the first $100,000 Coller Dolittle Challenge prize, established to reward research in interspecies communication algorithms. The team used non-invasive hydrophones to perform the research, which offers evidence that dolphins may be using whistles like words that are shared among members of their communities.
One type of whistle the dolphins employ is used as an alarm, according to the US-based team led by Laela Sayigh of the Woods Hole Oceanographic Institution. Another whistle the team studied is used by dolphins to respond to unexpected or unfamiliar situations. Capturing the sounds is just the beginning. Researchers will use AI to continue deciphering the whistles in an effort to find more patterns.
"The main thing stopping us cracking the code of animal communication is a lack of data. Think of the 1 trillion words needed to train a large language model like ChatGPT. We don't have anything like this for other animals," said Jonathan Birch, a professor at London School of Economics and Political Science and one of the judges for the prize.
"That's why we need programs like the Sarasota Dolphin Research Program, which has built up an extraordinary library of dolphin whistles over 40 years. The cumulative result of all that work is that Laela Sayigh and her team can now use deep learning to analyze the whistles and perhaps, one day, crack the code," Birch added.
The award was part of a ceremony that honored the work of four teams from across the world. In addition to the dolphin project, researchers studied ways in which nightingales, marmoset monkeys and cuttlefish communicate. The challenge is a collaboration between the Jeremy Coller Foundation and Tel Aviv University. Submissions for next year open in August.
Dolphin language is just the start
Researching animals and trying to learn the secrets of their communication is nothing new, but AI is speeding up the creation of larger and larger datasets. "Breakthroughs are inevitable," said Kate Zacarian, CEO and co-founder of Earth Species Project, a California-based nonprofit that also works in breaking down language barriers with the animal world.
"Just as AI has revolutionized the fields of medicine and material science, we see a similar opportunity to bring those advances to the study of animal communication and empower researchers in this space with entirely new capabilities," she said.
Zacarian applauded Sayigh's team and their win and said it will help bring broader recognition to the study of non-human animal communication. It could also bring more attention to ways that AI can change the nature of this type of research.
"The AI systems aren't just faster. They allow for entirely new types of inquiry," she said. "We're moving from decoding isolated signals to exploring communication as a rich, dynamic, and structured phenomenon, which is a task that's simply too big for our human brains, but possible for large-scale AI models."
The Earth Species Project recently released NatureLM-audio, an open-source large audio-language model for analyzing animal sounds. The organization is now working with biologists and ethologists to study species including orcas, carrion crows and jumping spiders. It plans to release some of its findings later this year, Zacarian said.