Mala Murthy and Joshua Shaevitz: AI-based motion-capture system for lab animals

Sept. 5, 2019

A new system that uses artificial intelligence to track animal movements is poised to aid a wide range of studies, from the development of drugs that affect behavior to ecological research. The approach can be used with laboratory animals such as fruit flies and mice, as well as with larger animals.

The technology, developed by Mala Murthy, Joshua Shaevitz, graduate student Talmo Pereira, and then-undergraduate Diego Aldarondo of the Class of 2018, accurately detects the location of each body part — legs, head, nose and other points — in millions of frames of video. 

First, a human experimenter records video of a moving animal. Next, the experimenter directs the system’s software to identify a small number of frames, and labels the body part positions in each of them. The system then uses these labeled frames to train a neural network that locates the same points in all subsequent frames. Pereira has since extended the method to work not only on videos of a single animal but also on footage of multiple interacting animals, keeping track of each animal’s identity over time.
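The label-a-few-frames, train, then predict workflow described above can be illustrated with a minimal sketch in PyTorch. This is not the team’s published implementation; the architecture, the names (SimplePoseNet, N_KEYPOINTS), and the random tensors standing in for video frames and human-labeled points are all hypothetical placeholders.

```python
import torch
import torch.nn as nn

N_KEYPOINTS = 6             # e.g. head, nose, and four leg tips (illustrative)
FRAME_SIZE = (1, 128, 128)  # grayscale video frames, 128x128 pixels

class SimplePoseNet(nn.Module):
    """Toy convolutional network that regresses (x, y) for each body part."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, N_KEYPOINTS * 2)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z).view(-1, N_KEYPOINTS, 2)

# Steps 1-2: a human labels body-part positions in a small number of frames.
labeled_frames = torch.rand(20, *FRAME_SIZE)     # 20 hand-labeled frames
labeled_points = torch.rand(20, N_KEYPOINTS, 2)  # (x, y) for each body part

# Step 3: train the network on the small labeled subset.
model = SimplePoseNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(labeled_frames), labeled_points)
    loss.backward()
    optimizer.step()

# Step 4: the trained network locates body parts in the remaining frames.
with torch.no_grad():
    new_frames = torch.rand(1000, *FRAME_SIZE)  # unlabeled video frames
    predicted_points = model(new_frames)        # shape (1000, N_KEYPOINTS, 2)
```

The key design point the sketch captures is that only a small hand-labeled set is needed for training, after which the network generalizes its predictions to millions of unlabeled frames.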

Team members: Talmo Pereira, graduate student; Diego Aldarondo, Class of 2018

Collaborators: Sam Wang, professor of neuroscience; David Turner, research software engineer, Office of Information Technology; Nathaniel Tabris, research software engineer, Princeton Neuroscience Institute 

Development status: Patent protection is pending. Princeton is seeking outside interest for the development of this technology.

Funding: National Science Foundation, National Institutes of Health

murthylab.princeton.edu and shaevitzlab.princeton.edu