49 Janelia Publications
Natural events present multiple types of sensory cues, each detected by a specialized sensory modality. Combining information from several modalities is essential for the selection of appropriate actions. Key to understanding multimodal computations is determining the structural patterns of multimodal convergence and how these patterns contribute to behaviour. Modalities could converge early, late or at multiple levels in the sensory processing hierarchy. Here we show that combining mechanosensory and nociceptive cues synergistically enhances the selection of the fastest mode of escape locomotion in Drosophila larvae. In an electron microscopy volume that spans the entire insect nervous system, we reconstructed the multisensory circuit supporting the synergy, spanning multiple levels of the sensory processing hierarchy. The wiring diagram revealed a complex multilevel multimodal convergence architecture. Using behavioural and physiological studies, we identified functionally connected circuit nodes that trigger the fastest locomotor mode, and others that facilitate it, and we provide evidence that multiple levels of multimodal integration contribute to escape mode selection. We propose that the multilevel multimodal convergence architecture may be a general feature of multisensory circuits enabling complex input–output functions and selective tuning to ecologically relevant combinations of cues.
An important role of visual systems is to detect nearby predators, prey, and potential mates [1], which may be distinguished in part by their motion. When an animal is at rest, an object moving in any direction may easily be detected by motion-sensitive visual circuits [2, 3]. During locomotion, however, this strategy is compromised because the observer must detect a moving object within the pattern of optic flow created by its own motion through the stationary background. By contrast, objects whose movement creates back-to-front (regressive) motion can be unambiguously distinguished from stationary objects, because forward locomotion creates only front-to-back (progressive) optic flow. Thus, moving animals should exhibit an enhanced sensitivity to regressively moving objects. We explicitly tested this hypothesis by constructing a simple fly-sized robot that was programmed to interact with a real fly. Our measurements indicate that whereas walking female flies freeze in response to a regressively moving object, they ignore a progressively moving one. Regressive motion salience also explains observations of behaviors exhibited by pairs of walking flies. Because the assumptions underlying the regressive motion salience hypothesis are general, we suspect that the behavior we have observed in Drosophila may be widespread among eyed, motile organisms.
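The regressive-motion rule lends itself to a compact formalization. The sketch below is ours, not the authors' code; the sign conventions and function names are illustrative assumptions. It classifies an object's retinal drift as regressive or progressive relative to forward self-motion.

```python
# Minimal sketch of the regressive-motion-salience rule (illustrative only).
# During forward walking, self-motion drags stationary objects front-to-back
# (progressive) across the retina, so back-to-front (regressive) drift can
# only come from an object that is itself moving.

def is_regressive(azimuth: float, d_azimuth_dt: float) -> bool:
    """azimuth: object bearing in radians, 0 = straight ahead,
    positive = right of heading, negative = left.
    d_azimuth_dt: rate of change of that bearing.
    Progressive (front-to-back) drift increases |azimuth|, so azimuth and
    its derivative share a sign; opposite signs mean regressive drift."""
    return azimuth * d_azimuth_dt < 0

def should_freeze(walking: bool, azimuth: float, d_azimuth_dt: float) -> bool:
    # Reported behavior: walking females freeze in response to regressive
    # motion and ignore progressive motion; at rest no asymmetry is needed.
    return walking and is_regressive(azimuth, d_azimuth_dt)
```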
Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, fly grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and the temporal structure of such behavior. To cope with large amounts of data, and to minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs, and thus present challenges to existing behavior classification software. Human observers use the speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal’s position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect the two time-scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large dataset of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms. While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
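A schematic sketch of the two ideas named above, position/orientation-invariant spatiotemporal features and two stages of supervised classification at different timescales. This is not the published ABRS implementation; the feature choice, window size, and classifier are all assumptions made for illustration.

```python
# Illustrative two-timescale behavior classifier, not the ABRS code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def motion_energy_features(frames: np.ndarray, n_bins: int = 16) -> np.ndarray:
    """Per-frame features from absolute frame differences.

    frames: (T, H, W) grayscale video. A histogram of motion-energy values
    discards position and orientation, keeping only how much and how fast
    things move -- a crude stand-in for ABRS's spatiotemporal features.
    """
    diff = np.abs(np.diff(frames.astype(np.float32), axis=0))
    feats = [np.histogram(d, bins=n_bins, range=(0, 255))[0] / d.size
             for d in diff]
    return np.asarray(feats)  # (T-1, n_bins)

def two_stage_classify(feats, frame_labels, window=30):
    """Stage 1 labels each frame; stage 2 relabels it using a window of
    stage-1 outputs, capturing the slower timescale on which individual
    movements are organized into bouts."""
    stage1 = RandomForestClassifier().fit(feats, frame_labels)
    p = stage1.predict_proba(feats)
    padded = np.pad(p, ((window, window), (0, 0)), mode="edge")
    context = np.stack([padded[i:i + 2 * window + 1].ravel()
                        for i in range(len(p))])
    stage2 = RandomForestClassifier().fit(context, frame_labels)
    return stage2.predict(context)
```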
The training of deep neural networks is a high-dimensional optimization problem with respect to the loss function of a model. Unfortunately, these functions are high-dimensional and non-convex, and hence difficult to characterize. In this paper, we empirically investigate the geometry of the loss functions for state-of-the-art networks with multiple stochastic optimization methods. We do this through several experiments that are visualized on polygons to understand how and when these stochastic optimization methods find minima.
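One way to produce the polygon visualizations described above: place the final parameter vectors found by different optimizers at the vertices of a polygon and evaluate the loss on barycentric combinations of those vertices. The sketch below is a minimal, assumed version of this idea, not the paper's code; `loss_fn` and the flattened parameter vectors are placeholders.

```python
# Minimal sketch of a polygon loss-surface probe (assumptions throughout).
import numpy as np

def polygon_loss_surface(vertices, loss_fn, n_samples=625, seed=0):
    """Sample the loss over the convex hull of `vertices`.

    vertices: list of 1-D parameter vectors (minima found by different
              optimizers), length >= 3.
    loss_fn:  maps a flattened parameter vector to a scalar loss.
    Returns (weights, losses): each row of `weights` is one barycentric
    combination; `losses` holds the corresponding loss values.
    """
    rng = np.random.default_rng(seed)
    V = np.stack(vertices)                      # (k, n_params)
    # Dirichlet samples are uniform over the simplex of mixing weights.
    W = rng.dirichlet(np.ones(len(vertices)), size=n_samples)
    losses = np.array([loss_fn(w @ V) for w in W])
    return W, losses
```

Projecting the barycentric weights into the plane of the polygon and coloring points by loss recovers the kind of polygon plot the abstract refers to.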
The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales.
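At its core, image-based tracking of the kind reviewed here reduces to separating animals from a static background and following them over frames. The toy sketch below shows only that core step; the median background model and threshold are illustrative assumptions, and real systems add identity maintenance, occlusion handling, and posture estimation.

```python
# Toy background-subtraction tracker (illustrative assumptions only).
import numpy as np

def track_centroids(frames: np.ndarray, thresh: float = 30.0):
    """frames: (T, H, W) grayscale video of animals on a static background.

    Returns one (row, col) centroid per frame, or None if nothing moved.
    A pixelwise median over time estimates the empty background, since each
    pixel shows background more often than it shows an animal.
    """
    background = np.median(frames, axis=0)
    centroids = []
    for f in frames:
        mask = np.abs(f.astype(np.float32) - background) > thresh
        ys, xs = np.nonzero(mask)
        centroids.append((ys.mean(), xs.mean()) if ys.size else None)
    return centroids
```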
Behavioral choices that ignore prior experience promote exploration and unpredictability but are seemingly at odds with the brain's tendency to use experience to optimize behavioral choice. Indeed, when faced with virtual competitors, primates resort to strategic counterprediction rather than to stochastic choice. Here, we show that rats also use history- and model-based strategies when faced with similar competitors but can switch to a "stochastic" mode when challenged with a competitor that they cannot defeat by counterprediction. In this mode, outcomes associated with an animal's actions are ignored, and normal engagement of anterior cingulate cortex (ACC) is suppressed. Using circuit perturbations in transgenic rats, we demonstrate that switching between strategic and stochastic behavioral modes is controlled by locus coeruleus input into ACC. Our findings suggest that, under conditions of uncertainty about environmental rules, changes in noradrenergic input alter ACC output and prevent erroneous beliefs from guiding decisions, thus enabling behavioral variation.
Persistent internal states are important for maintaining survival-promoting behaviors, such as aggression. In female Drosophila melanogaster, we have previously shown that individually activating either aIPg or pC1d cell types can induce aggression. Here we further investigate the individual roles of these cholinergic, sexually dimorphic cell types, and the reciprocal connections between them, in generating a persistent aggressive internal state. We find that a brief 30-second optogenetic stimulation of aIPg neurons was sufficient to promote an aggressive internal state lasting at least 10 minutes, whereas similar stimulation of pC1d neurons was not. While we previously showed that stimulation of pC1e alone does not evoke aggression, persistent behavior could be promoted through simultaneous stimulation of pC1d and pC1e, suggesting an unexpected synergy of these cell types in establishing a persistent aggressive state. Neither aIPg nor pC1d neurons show persistent activity themselves, implying that the persistent internal state is maintained by other mechanisms. Moreover, inactivation of pC1d did not significantly reduce aIPg-evoked persistent aggression, arguing that the aggressive state did not depend on pC1d-aIPg recurrent connectivity. Our results suggest the need for alternative models to explain persistent female aggression.
Aggressive social interactions are used to compete for limited resources and are regulated by complex sensory cues and the organism's internal state. While both sexes exhibit aggression, its neuronal underpinnings are understudied in females. Here, we identify a population of sexually dimorphic aIPg neurons in the adult central brain whose optogenetic activation increased, and genetic inactivation reduced, female aggression. Analysis of GAL4 lines identified in an unbiased screen for increased female chasing behavior revealed the involvement of another sexually dimorphic neuron, pC1d, and implicated aIPg and pC1d neurons as core nodes regulating female aggression. Connectomic analysis demonstrated that aIPg and pC1d neurons are interconnected and suggested that aIPg neurons may exert part of their effect by gating the flow of visual information to descending neurons. Our work reveals important regulatory components of the neuronal circuitry that underlies female aggressive social interactions and provides tools for their manipulation.