Back in 2013, Sandeep Robert “Bob” Datta was working in his neurobiology lab at Harvard Medical School in Boston when he made the fateful decision to send his student Alex Wiltschko to the Best Buy up the street. Wiltschko was on a mission to purchase an Xbox Kinect camera, designed to pick up players’ body movements for video games like Just Dance and FIFA. He plunked down about $150 and walked out with it. The unassuming piece of consumer electronics would determine the lab’s direction in the coming decade and beyond.
It also placed the team within a growing scientific movement at the intersection of artificial intelligence, neuroscience, and animal behavior—a field poised to change the way researchers use other creatures to study human health conditions. By learning to track the intricate nuances of mouse movement, the Datta Lab aims to understand the basics of how the mammalian brain creates behavior, untangle the neuroscience of different health conditions, and ultimately help develop new treatments for people. This area of research relies on so-called “computer vision” to analyze video footage of animals and detect behavior patterns imperceptible to the unaided eye. Computer vision can also be used to auto-detect cell types, addressing a persistent problem for researchers who study complex tissues in, for example, cancers and gut microbiomes.
In the early 2010s, Datta’s lab was interrogating how smell—“the sense that is most important to most animals” and the one that mice can’t survive without—drives the rodents’ responses to manipulations in their environment. Traditionally, human observers track mouse behavior and record their observations: how many times a mouse freezes in fear, how often it rears up to explore its enclosure, how long it spends grooming, how many marbles it buries. Datta wanted to move beyond what the unaided eye could see and use video cameras to track and compute whether a rodent avoids…