Nat Methods. 2019 Jan;16(1):117-125. doi: 10.1038/s41592-018-0234-5. Epub 2018 Dec 20.

Fast animal pose estimation using deep neural networks.

Author information

1. Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA.
2. Program in Neuroscience, Harvard University, Cambridge, MA, USA.
3. Department of Molecular Biology, Princeton University, Princeton, NJ, USA.
4. Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA. mmurthy@princeton.edu.
5. Department of Molecular Biology, Princeton University, Princeton, NJ, USA. mmurthy@princeton.edu.
6. Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA. shaevitz@princeton.edu.
7. Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ, USA. shaevitz@princeton.edu.
8. Department of Physics, Princeton University, Princeton, NJ, USA. shaevitz@princeton.edu.

Abstract

The need for automated and efficient systems for tracking full animal pose has increased with the complexity of behavioral data and analyses. Here we introduce LEAP (LEAP estimates animal pose), a deep-learning-based method for predicting the positions of animal body parts. This framework consists of a graphical interface for labeling of body parts and training the network. LEAP offers fast prediction on new data, and training with as few as 100 frames results in 95% of peak performance. We validated LEAP using videos of freely behaving fruit flies and tracked 32 distinct points to describe the pose of the head, body, wings and legs, with an error rate of <3% of body length. We recapitulated reported findings on insect gait dynamics and demonstrated LEAP's applicability for unsupervised behavioral classification. Finally, we extended the method to more challenging imaging situations and videos of freely moving mice.
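The abstract describes LEAP as a network that predicts the positions of labeled body parts in each video frame. A common way such pose estimators represent a prediction is a per-part 2D confidence map whose peak marks the part's location. The sketch below is not the authors' code; it only illustrates that representation, using a synthetic Gaussian map in place of a real network output (the frame size, part location, and function names are illustrative assumptions).

```python
import numpy as np

def gaussian_map(shape, center, sigma=3.0):
    """Render a 2D Gaussian confidence map peaked at `center` (x, y).

    Stands in for a network's predicted confidence map for one body part.
    """
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))

def peak_coords(conf_map):
    """Read off a body part's (x, y) position as the argmax of its map."""
    y, x = np.unravel_index(np.argmax(conf_map), conf_map.shape)
    return int(x), int(y)

# Hypothetical 64x64 frame with one body part at pixel (20, 45).
conf = gaussian_map((64, 64), center=(20, 45))
print(peak_coords(conf))  # -> (20, 45)
```

With 32 tracked points, a real model would output 32 such maps per frame and the peak of each map would give one body-part coordinate.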

PMID: 30573820
DOI: 10.1038/s41592-018-0234-5
[Indexed for MEDLINE]
