Roian Egnor uses multi-generational groups of socially-housed mice to study the neural basis of complex vocal and social behavior.
The importance of complex behavior, or “Isn’t that awfully messy?”
We now have access to a variety of exquisitely precise tools for recording from, and even manipulating, populations of identified neurons in the intact behaving animal. The explosion in neurobiological tools has not, however, been matched by an increase in the quality and precision of behavioral analysis. At this point the bottleneck in understanding the nervous system is not tools, but behavioral assays that can take advantage of those tools.
The brain is not a tape recorder
The brain throws some things out and magnifies others. Animals live in complex and noisy environments. They extract useful information from these environments, generally cleaning up the noise, to allow fantastically beautiful and adaptive behavior. This ability poses a problem for the neurobiologist – how do you study the representation of something if you don’t know what is being represented? One method, which has taught us a lot, is to make input stimuli or behaviors extremely simple – if there is only one useful feature in the stimulus, that must be what is being represented. However, the brain did not evolve to deal with simple stimuli; it evolved to deal with noise and mess. An alternative is to ask about the neural basis of behaviors the animal evolved to perform.
In our lab we embrace messiness
Our animals are socially housed (introducing natural variation in dominance status, pregnancy, estrus state, and age) in cages that mimic some of the features of wild mouse burrows. Animals don’t need massive repetition to make sensory discriminations, perform tasks, or navigate environments – so we think it is important to unpack the neural code without averaging. Because of this, we record from unrestrained, untrained mice during natural social behaviors.
We use several tools to tame the mess
One tool is theoretical – we work on vocalizations. Vocalizations are reasonably easy to record from a freely moving animal, and, importantly, they are an acoustic stimulus that mice attend to. When an animal is vocalizing, we know that at least some portions of its nervous system are engaged in vocal production – giving us a handle on representation on the motor side. Motor production rarely proceeds without feedback, so we can also look at the representation of the vocalization in the auditory system as it is produced. In addition, a vocalization is a social signal, allowing us to record from the auditory systems of listening mice with some confidence that they are attending to the vocalizations.
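As a rough illustration of how candidate vocalizations might be pulled out of an audio recording, the sketch below thresholds spectrogram energy in the ultrasonic band where mouse calls typically fall. The file name, band edges, and threshold are assumptions made for this example, not a description of our actual pipeline.

```python
from scipy.io import wavfile
from scipy.signal import spectrogram

# Rough sketch: flag candidate ultrasonic vocalization (USV) segments by
# thresholding spectrogram energy in the ~30-110 kHz band, where mouse USVs
# typically fall. Assumes a mono recording sampled fast enough to capture
# ultrasound (e.g. 250 kHz); file name and threshold are illustrative.
rate, audio = wavfile.read("recording.wav")  # hypothetical recording
freqs, times, sxx = spectrogram(audio, fs=rate, nperseg=512, noverlap=256)

band = (freqs >= 30_000) & (freqs <= 110_000)
band_energy = sxx[band].sum(axis=0)
is_call = band_energy > band_energy.mean() + 3 * band_energy.std()

# Collapse consecutive above-threshold frames into (onset, offset) intervals.
calls, start = [], None
for t, active in zip(times, is_call):
    if active and start is None:
        start = t
    elif not active and start is not None:
        calls.append((start, t))
        start = None
if start is not None:
    calls.append((start, times[-1]))

print(f"{len(calls)} candidate USV segments")
```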
Other tools are more technological – we have two active tracking projects (one video-based, one RFID-based) that allow us to correlate vocalizations with social interactions over long time scales. Video and RFID are complementary technologies: video provides high temporal and spatial resolution, but maintaining individual identity is difficult and extracting position information from the data can be time-consuming. RFID has low spatial and temporal resolution, but identity is unambiguous and the data are recorded in an immediately biologically relevant form (mouse A and mouse B are in the same chamber).
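To make the RFID point concrete, here is a minimal sketch of how a stream of antenna reads might be reduced to co-location events of the form “mouse A and mouse B are in the same chamber”. The read format and field names are assumptions made for illustration, not the output of any particular reader.

```python
# Minimal sketch: reduce a stream of RFID reads to co-location events.
# The read format (mouse_id, chamber, timestamp in seconds) is an assumption
# made for this example, not a real reader's output.
reads = [
    ("mouse_A", "chamber_1", 0.0),
    ("mouse_B", "chamber_1", 1.5),
    ("mouse_A", "chamber_2", 4.0),
    ("mouse_B", "chamber_2", 6.0),
]

current_chamber = {}  # mouse_id -> chamber it was last detected in
events = []           # (time, chamber, set of co-located mice)

for mouse, chamber, t in reads:
    current_chamber[mouse] = chamber
    together = {m for m, c in current_chamber.items() if c == chamber}
    if len(together) > 1:
        events.append((t, chamber, frozenset(together)))

for t, chamber, mice in events:
    print(f"t={t:>5.1f}s  {chamber}: {', '.join(sorted(mice))}")
```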
motr (MOuse TRacker)
motr is a fully automated tracking application that can track multiple mice in a single enclosure over long periods of time (days) without confusing their identities. motr is an open-source (GPLv2), freely available program developed by Shay Ohayon, Pietro Perona (Caltech), Ofer Avni, Adam Taylor, and Roian Egnor (Janelia).
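As an example of the kind of downstream analysis this sort of tracking enables, the sketch below computes the distance between two tracked mice over time. The assumption that trajectories are available as per-mouse arrays of (x, y) centroids is made for illustration only and is not motr's documented output format.

```python
import numpy as np

def pairwise_distance(traj_a: np.ndarray, traj_b: np.ndarray) -> np.ndarray:
    # Euclidean distance between two mice at each frame, given (n_frames, 2)
    # arrays of (x, y) centroids (an assumed format, not motr's actual output).
    return np.linalg.norm(traj_a - traj_b, axis=1)

# Hypothetical trajectories: two mice random-walking for 1000 frames.
rng = np.random.default_rng(0)
mouse_a = np.cumsum(rng.normal(size=(1000, 2)), axis=0)
mouse_b = np.cumsum(rng.normal(size=(1000, 2)), axis=0)

dist = pairwise_distance(mouse_a, mouse_b)
close = dist < 5.0  # arbitrary proximity threshold
print(f"mice within threshold on {close.mean():.1%} of frames")
```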