Reconstructing Animal Movements With a Deep Neural Network

A new study by researchers at Duke University and Harvard University introduces an automated tool that can capture the behaviour of freely moving animals and precisely reconstruct their three-dimensional (3D) pose from a single video camera, without markers.

Animals are constantly moving and behaving in response to instructions from the brain. But while there are advanced techniques for measuring these instructions in terms of neural activity, there is a paucity of techniques for quantifying the behaviour itself in freely moving animals. This inability to measure the key output of the brain limits our understanding of the nervous system and how it changes in disease.

The study, published in Nature Methods and led by Timothy W. Dunn, Assistant Professor at Duke University, and Jesse D. Marshall, postdoctoral researcher at Harvard University, describes a new 3D deep neural network, DANNCE (3-Dimensional Aligned Neural Network for Computational Ethology).

The study follows the team’s 2020 study in Neuron, which introduced the ground-breaking behavioural monitoring system CAPTURE (Continuous Appendicular and Postural Tracking using Retroreflector Embedding), which uses motion capture and deep learning to continuously track the 3D movements of freely behaving animals. CAPTURE yielded an unprecedentedly detailed description of how animals behave, but it required specialized hardware and markers attached to the animals, making it challenging to use.

DANNCE works across a broad range of species and is reproducible across laboratories and environments, ensuring a broad impact on behavioural studies in animals and even humans. It uses a specialized neural network tailored to 3D pose tracking from video. A key feature is that its 3D feature space is defined in physical units (metres) rather than camera pixels, which allows the tool to generalize more readily across different camera arrangements and laboratories. In contrast, previous approaches to 3D pose tracking used neural networks tailored to pose detection in two dimensions (2D), which struggled to adapt to new 3D viewpoints.
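To make the metric feature space idea concrete, here is a minimal sketch, in Python with NumPy, of how image features from a camera can be sampled into a voxel grid defined in metres around the animal. This is not DANNCE’s actual implementation; the camera matrix, grid size, and nearest-neighbour sampling are illustrative assumptions.

```python
import numpy as np

def build_voxel_grid(center_m, size_m=0.24, n=16):
    """A cubic voxel grid in metres, centred on the animal (illustrative values)."""
    half = size_m / 2.0
    axis = np.linspace(-half, half, n)
    xs, ys, zs = np.meshgrid(axis, axis, axis, indexing="ij")
    return np.stack([xs, ys, zs], axis=-1).reshape(-1, 3) + center_m  # (n**3, 3)

def project_to_pixels(points_m, P):
    """Project metric 3D points into one camera via a 3x4 projection matrix P."""
    homog = np.hstack([points_m, np.ones((len(points_m), 1))])  # (N, 4)
    uvw = homog @ P.T                                           # (N, 3)
    return uvw[:, :2] / uvw[:, 2:3]                             # pixel (u, v)

def sample_features(image, pixels):
    """Nearest-neighbour sampling of image values at the projected pixels."""
    h, w = image.shape[:2]
    u = np.clip(np.round(pixels[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(pixels[:, 1]).astype(int), 0, h - 1)
    return image[v, u]  # one feature vector per voxel

# Every camera's features land in the same metric volume, so the downstream
# 3D network reasons in metres, never in raw pixel coordinates.
K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # placeholder pinhole camera
frame = np.random.rand(480, 640, 3)                # placeholder video frame
grid = build_voxel_grid(center_m=np.array([0.0, 0.0, 2.0]))  # animal ~2 m away
volume = sample_features(frame, project_to_pixels(grid, P))  # (16**3, 3)
```

Because the grid is anchored to the animal in world coordinates, the same volume can be filled from any number of cameras in any arrangement, which is what lets such a network transfer between laboratories.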

To predict landmarks on an animal’s body, DANNCE required a large training dataset, which at the outset seemed daunting to collect. “Deep neural networks can be incredibly powerful, but they are very data hungry,” said senior author Bence Ölveczky, Professor in the Department of Organismic and Evolutionary Biology, Harvard University. “We realized that CAPTURE generates exactly the kind of rich and high-quality training data these little artificial brains need to do their magic.”

The researchers used CAPTURE to collect seven million examples of images paired with labelled 3D keypoints in rats, spanning 30 different camera views. “It worked immediately on new rats, even those not wearing the markers,” Marshall said. “We really got excited, though, when we found that it could also track mice with just a few extra examples.”

Following the discovery, the team collaborated with multiple groups at Duke University, MIT, Rockefeller University, and Columbia University to demonstrate the generality of DANNCE across various environments and species, including marmosets, chickadees, and rat pups as they grow and develop.

The study highlights applications of DANNCE that let researchers examine the microstructure of animal behaviour well beyond what is currently possible with human observation. The researchers show that DANNCE can extract individual ‘fingerprints’ describing the kinematics of the different behaviours that mice perform. These fingerprints should allow researchers to converge on standardized definitions of behaviours, improving reproducibility across laboratories. They also demonstrate the ability to trace the emergence of behaviour over time in fine detail, opening new avenues in the study of neurodevelopment.
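As a loose illustration of what such a kinematic fingerprint could look like, the sketch below (in Python with NumPy; not the method from the paper, and the feature choice, bin count, and speed range are assumptions) summarises a bout of behaviour by the distribution of joint speeds computed from tracked 3D keypoints of the kind DANNCE outputs.

```python
import numpy as np

def kinematic_fingerprint(keypoints_m, fps=50.0, n_bins=20, max_speed=1.0):
    """Summarise a behavioural bout as per-joint speed histograms.

    keypoints_m: array of shape (T, K, 3) -- T frames of K tracked
    3D keypoints in metres. Returns a (K * n_bins) feature vector.
    """
    velocity = np.diff(keypoints_m, axis=0) * fps   # (T-1, K, 3), metres/second
    speed = np.linalg.norm(velocity, axis=-1)       # (T-1, K), joint speeds
    hists = [
        np.histogram(speed[:, k], bins=n_bins, range=(0.0, max_speed), density=True)[0]
        for k in range(speed.shape[1])
    ]
    return np.concatenate(hists)

# Two bouts of the same behaviour should yield nearby fingerprints, giving a
# quantitative, laboratory-independent description of that behaviour.
bout_a = np.cumsum(np.random.randn(200, 20, 3) * 0.002, axis=0)  # placeholder
bout_b = np.cumsum(np.random.randn(200, 20, 3) * 0.002, axis=0)  # trajectories
distance = np.linalg.norm(kinematic_fingerprint(bout_a) - kinematic_fingerprint(bout_b))
```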

Measuring movement in animal models of disease is critically important for both basic and clinical research programs, and DANNCE can be readily applied to both domains, accelerating progress across the board. Partial funding for CAPTURE and DANNCE was provided by the NIH and the Simons Foundation Autism Research Initiative (SFARI), and the researchers note the value these tools hold for autism-related and motor-related studies, both in animal models and in humans.

“Because we’ve had very poor ability to quantify motion and movement rigorously in humans, this has prevented us from separating movement disorders into specialized subtypes that potentially could have different underlying mechanisms and remedies. I think any field in which people have noticed but have been unable to quantify effects across their population will see great benefits from applying this technology,” said Dunn.

The researchers open-sourced the tool, and it is already being put to use in other labs. Going forward, they plan to apply the system to multiple interacting animals. “DANNCE changes the game for studying behaviour in freely moving animals,” said Marshall. “For the first time we can track actual kinematics in 3D and learn in unprecedented detail what animals do. These approaches are going to be more and more essential in our quest to understand how the brain operates.” [APBN]