51 Publications
Vision provides animals with detailed information about their surroundings, conveying diverse features such as color, form, and movement across the visual scene. Computing these parallel spatial features requires a large and diverse network of neurons, such that in animals as distant as flies and humans, visual regions comprise half the brain’s volume. These visual brain regions often reveal remarkable structure-function relationships, with neurons organized along spatial maps with shapes that directly relate to their roles in visual processing. To unravel the stunning diversity of a complex visual system, a careful mapping of the neural architecture matched to tools for targeted exploration of that circuitry is essential. Here, we report a new connectome of the right optic lobe from a male Drosophila central nervous system FIB-SEM volume and a comprehensive inventory of the fly’s visual neurons. We developed a computational framework to quantify the anatomy of visual neurons, establishing a basis for interpreting how their shapes relate to spatial vision. By integrating this analysis with connectivity information, neurotransmitter identity, and expert curation, we classified the 53,000 neurons into 727 types, about half of which are systematically described and named for the first time. Finally, we share an extensive collection of split-GAL4 lines matched to our neuron type catalog. Together, this comprehensive set of tools and data unlocks new possibilities for systematic investigations of vision in Drosophila and provides a foundation for a deeper understanding of sensory processing.
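For readers who want to explore a neuron-type catalog like this programmatically, the sketch below shows one way to pull per-type counts with the neuprint-python client. The server address, dataset name ("optic-lobe:v1.0"), example cell type, and token placeholder are assumptions for illustration, not necessarily the identifiers used by the actual release.

```python
# Minimal sketch, assuming a neuPrint-hosted release of an optic-lobe connectome.
# The dataset name, example type, and token below are placeholders.
from neuprint import Client, NeuronCriteria as NC, fetch_neurons

client = Client(
    "neuprint.janelia.org",        # assumed server
    dataset="optic-lobe:v1.0",     # hypothetical dataset identifier
    token="YOUR_NEUPRINT_TOKEN",
)

# Fetch all neurons of one example medulla type and report basic statistics.
mi1, _roi_counts = fetch_neurons(NC(type="Mi1"), client=client)
print(len(mi1), "Mi1 neurons; mean pre/post synapse counts:",
      mi1["pre"].mean(), mi1["post"].mean())

# Build a simple type inventory: number of annotated neurons per type.
all_neurons, _ = fetch_neurons(NC(status="Traced"), client=client)
print(all_neurons["type"].value_counts().head(20))
```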
The body of an animal determines how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically-detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. With a visually guided flight task, we demonstrate a neural controller that can use the vision sensors of the body model to control and steer flight. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.
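As a rough illustration of what a whole-body model in the MuJoCo physics engine involves at the code level, here is a minimal sketch using MuJoCo's Python bindings. The model filename and the random "policy" are placeholders; the actual model, actuators, and trained controllers come from the authors' open-source project.

```python
# Minimal sketch: loading an articulated body model in MuJoCo and stepping the physics.
# "fruitfly.xml" is a placeholder path, and the random controls stand in for a learned policy.
import numpy as np
import mujoco

model = mujoco.MjModel.from_xml_path("fruitfly.xml")   # hypothetical MJCF file
data = mujoco.MjData(model)

print(f"{model.nq} position DoFs, {model.nu} actuators")

# Apply small random control signals and advance the simulation.
rng = np.random.default_rng(0)
for _ in range(1000):
    data.ctrl[:] = 0.1 * rng.standard_normal(model.nu)
    mujoco.mj_step(model, data)

# Height of the root body (assuming a floating-base model whose free joint comes first).
print("root height after 1000 steps:", data.qpos[2])
```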
The brain generates diverse neuron types that express unique homeodomain transcription factors (TFs) and assemble into precise neural circuits. Yet a mechanistic framework is lacking for how homeodomain TFs specify both neuronal fate and synaptic connectivity. We use Drosophila lamina neurons (L1-L5) to show that the homeodomain TF Brain-specific homeobox (Bsh) is initiated in lamina precursor cells (LPCs), where it specifies L4/L5 fate and suppresses the homeodomain TF Zfh1 to prevent L1/L3 fate. Subsequently, Bsh activates the homeodomain TF Apterous (Ap) in L4 in a feedforward loop to express the synapse recognition molecule DIP-β, in part through direct Bsh binding to a DIP-β intron. Thus, homeodomain TFs function hierarchically: a primary homeodomain TF (Bsh) first specifies neuronal fate and subsequently acts with a secondary homeodomain TF (Ap) to activate DIP-β, thereby generating precise synaptic connectivity. We speculate that hierarchical homeodomain TF function may represent a general principle for coordinating neuronal fate specification and circuit assembly.
Color and motion are used by many species to identify salient objects. They are processed largely independently, but color contributes to motion processing in humans, for example, enabling moving colored objects to be detected when their luminance matches the background. Here, we demonstrate an unexpected, additional contribution of color to motion vision in Drosophila. We show that behavioral ON-motion responses are more sensitive to UV than OFF-motion responses, and, using neurogenetics and calcium imaging, we identify cellular pathways connecting UV-sensitive R7 photoreceptors to ON- and OFF-motion-sensitive T4 and T5 cells. Remarkably, this contribution of color circuitry to motion vision enhances the detection of approaching UV discs, but not green discs with the same chromatic contrast, and we show how this could generalize to systems with ON- and OFF-motion pathways. Our results provide a computational and circuit basis for how color enhances motion vision to favor the detection of saliently colored objects.
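To make the logic of that last result concrete, the toy calculation below (not the authors' model; all stimulus values are arbitrary) shows how an ON pathway that pools a luminance signal with a UV signal can respond to a luminance-matched UV object that a luminance-only pathway misses, while a green object with the same magnitude of chromatic contrast remains invisible to both.

```python
# Toy illustration only: a half-wave-rectified ON pathway with an optional UV input.

def on_response(lum, uv, w_uv):
    """Rectified brightness increment; contrasts are taken relative to a background of 1.0."""
    drive = (lum - 1.0) + w_uv * (uv - 1.0)
    return max(drive, 0.0)

uv_disc    = dict(lum=1.0, uv=1.6)   # same luminance as the background, more UV
green_disc = dict(lum=1.0, uv=0.4)   # same luminance, equal-magnitude chromatic contrast, less UV

for name, stim in [("UV disc", uv_disc), ("green disc", green_disc)]:
    without_uv = on_response(stim["lum"], stim["uv"], w_uv=0.0)
    with_uv    = on_response(stim["lum"], stim["uv"], w_uv=0.5)
    print(f"{name}: ON response without UV input = {without_uv:.2f}, with UV input = {with_uv:.2f}")
```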
Flying insects exhibit remarkable navigational abilities controlled by their compact nervous systems. Optic flow, the pattern of changes in the visual scene induced by locomotion, is a crucial sensory cue for robust self-motion estimation, especially during rapid flight. Neurons that respond to specific, large-field optic flow patterns have been studied for decades, primarily in large flies such as houseflies, blowflies, and hoverflies. The best-known optic-flow-sensitive neurons are the large tangential cells of the dipteran lobula plate, whose visual-motion responses and, to a lesser extent, morphology have been explored using single-neuron neurophysiology. Most of these studies have focused on the large Horizontal and Vertical System neurons, yet the lobula plate houses a much larger set of optic-flow-sensitive neurons, many of which have been challenging to unambiguously identify or to reliably target for functional studies. Here we report the comprehensive reconstruction and identification of the lobula plate tangential (LPT) neurons in an electron microscopy (EM) volume of a whole Drosophila brain. This catalog of 58 LPT neurons (per brain hemisphere) contains many neurons that are described here for the first time and provides a basis for systematic investigation of the circuitry linking self-motion to locomotion control. Leveraging computational anatomy methods, we estimated the visual motion receptive fields of these neurons and compared their tuning to the visual consequences of body rotations and translational movements. We also matched these neurons, in most cases on a one-for-one basis, to stochastically labeled cells in genetic driver lines, to the mirror-symmetric neurons in the same EM brain volume, and to neurons in an additional EM data set. Using cell matches across data sets, we analyzed the integration of optic flow patterns by neurons downstream of the LPTs and found that most central brain neurons establish sharper selectivity for global optic flow patterns than their input neurons. Furthermore, we found that self-motion information extracted from optic flow is processed in distinct regions of the central brain, pointing to diverse foci for the generation of visual behaviors.
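The receptive-field comparison described above follows the classic matched-filter logic: construct the optic flow field expected from a given rotation or translation and ask how well a neuron's motion sensitivity aligns with it. The sketch below illustrates that computation with a synthetic receptive field; in the study the receptive fields were estimated from the reconstructed anatomy, so everything here is for illustration only.

```python
# Sketch of the matched-filter comparison between a motion receptive field and
# self-motion flow fields. The "neuron" below is synthetic.
import numpy as np

# Sample viewing directions on the sphere (azimuth, elevation in radians).
az = np.linspace(-np.pi, np.pi, 36)
el = np.linspace(-np.pi / 3, np.pi / 3, 18)
AZ, EL = np.meshgrid(az, el)
d = np.stack([np.cos(EL) * np.cos(AZ),
              np.cos(EL) * np.sin(AZ),
              np.sin(EL)], axis=-1).reshape(-1, 3)   # unit viewing directions, shape (N, 3)

def rotational_flow(axis):
    """Image motion at each viewing direction for rotation about 'axis'."""
    return np.cross(d, axis)

def translational_flow(direction):
    """Image motion for translation along 'direction' (scene points at unit distance)."""
    radial = (d @ direction)[:, None] * d
    return radial - direction          # component of -direction orthogonal to each view

def similarity(rf, flow):
    """Cosine similarity between a receptive field and a self-motion flow field."""
    return np.sum(rf * flow) / (np.linalg.norm(rf) * np.linalg.norm(flow))

# A synthetic neuron tuned to yaw rotation about the vertical axis:
rf = rotational_flow(np.array([0.0, 0.0, 1.0]))
print("yaw rotation       :", similarity(rf, rotational_flow(np.array([0.0, 0.0, 1.0]))))
print("roll rotation      :", similarity(rf, rotational_flow(np.array([1.0, 0.0, 0.0]))))
print("forward translation:", similarity(rf, translational_flow(np.array([1.0, 0.0, 0.0]))))
```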
Connections between neurons can be mapped by acquiring and analysing electron microscopic brain images. In recent years, this approach has been applied to chunks of brains to reconstruct local connectivity maps that are highly informative, but nevertheless inadequate for understanding brain function more globally. Here we present a neuronal wiring diagram of a whole brain containing 5 × 10⁷ chemical synapses between 139,255 neurons reconstructed from an adult female Drosophila melanogaster. The resource also incorporates annotations of cell classes and types, nerves, hemilineages and predictions of neurotransmitter identities. Data products are available for download, programmatic access and interactive browsing and have been made interoperable with other fly data resources. We derive a projectome (a map of projections between regions) from the connectome and report on tracing of synaptic pathways and the analysis of information flow from inputs (sensory and ascending neurons) to outputs (motor, endocrine and descending neurons) across both hemispheres and between the central brain and the optic lobes. Tracing from a subset of photoreceptors to descending motor pathways illustrates how structure can uncover putative circuit mechanisms underlying sensorimotor behaviours. The technologies and open ecosystem reported here set the stage for future large-scale connectome projects in other species.
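The projectome mentioned above is conceptually a group-by over the neuron-level connectome: synapse counts aggregated from the regions of presynaptic neurons to the regions of postsynaptic neurons. The pandas sketch below shows the idea on a made-up five-row table; the column names and region labels are assumptions, not the schema of the released data products.

```python
# Sketch: deriving a region-to-region "projectome" from a neuron-level connection table.
import pandas as pd

# Hypothetical connection table: one row per (presynaptic neuron, postsynaptic neuron).
connections = pd.DataFrame({
    "pre_root_id":  [1, 1, 2, 3, 3],
    "post_root_id": [2, 3, 3, 1, 4],
    "syn_count":    [120, 40, 310, 25, 80],
})

# Hypothetical annotation assigning each neuron to a region.
regions = pd.Series({1: "optic_lobe_R", 2: "optic_lobe_R", 3: "central_brain", 4: "central_brain"},
                    name="region")

# Aggregate synapse counts between the regions of the pre- and postsynaptic neurons.
projectome = (connections
              .assign(pre_region=lambda df: df["pre_root_id"].map(regions),
                      post_region=lambda df: df["post_root_id"].map(regions))
              .groupby(["pre_region", "post_region"])["syn_count"]
              .sum()
              .unstack(fill_value=0))
print(projectome)
```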
Many animals rely on optic flow for navigation, using differences in eye image velocity to detect deviations from their intended direction of travel. However, asymmetries in image velocity between the eyes are often overshadowed by strong, symmetric translational optic flow during navigation. Yet the brain efficiently extracts these asymmetries for course control. While optic flow-sensitive neurons have been found in many animal species, far less is known about the postsynaptic circuits that support such robust optic flow processing. In the fly Drosophila melanogaster, a group of neurons, the horizontal system (HS) cells, is involved in course control during high-speed translation. To understand how HS cells facilitate robust optic flow processing, we identified central networks that connect to HS cells using full brain electron microscopy datasets. These networks comprise three layers: convergent inputs from different optic flow-sensitive cells, a middle layer with reciprocal and lateral inhibitory interactions among different interneuron classes, and divergent outputs projecting both to the ventral nerve cord (equivalent to the vertebrate spinal cord) and to deeper regions of the fly brain. By combining two-photon optical imaging of free calcium dynamics, manipulations of GABA receptors, and modeling, we found that lateral disinhibition between brain hemispheres enhances the selectivity to rotational visual flow at the output layer of the network. Moreover, asymmetric manipulations of interneurons and their descending outputs induce drifts during high-speed walking, confirming their contribution to steering control. Together, these findings highlight the importance of competitive disinhibition as a critical circuit mechanism for robust processing of optic flow, which likely influences course control and heading perception, both critical functions supporting navigation.
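A highly reduced rate model helps illustrate why inhibitory interactions between hemispheres favor rotational over translational flow at the output layer: symmetric (translational) drive is mutually suppressed, while asymmetric (rotational) drive is not. The sketch below is a toy of that net effect, not the three-layer circuit or the specific disinhibitory motif described in the study.

```python
# Toy rate model: two hemispheric output units with mutual inhibition.

def output_responses(left_drive, right_drive, w_inh):
    """Approximate steady-state responses of mutually inhibiting left/right units."""
    r_left, r_right = max(left_drive, 0.0), max(right_drive, 0.0)
    for _ in range(50):                       # relax toward a fixed point
        r_left  = max(left_drive  - w_inh * r_right, 0.0)
        r_right = max(right_drive - w_inh * r_left, 0.0)
    return r_left, r_right

# Rotation drives the two eyes with opposite sign; translation drives them equally.
rotation    = (1.0, -1.0)
translation = (1.0,  1.0)

for w_inh in (0.0, 0.8):
    rot = output_responses(*rotation, w_inh)
    trans = output_responses(*translation, w_inh)
    selectivity = (rot[0] - trans[0]) / (rot[0] + trans[0] + 1e-9)
    print(f"inhibition w={w_inh}: rotation response={rot[0]:.2f}, "
          f"translation response={trans[0]:.2f}, rotation selectivity={selectivity:.2f}")
```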
Animals rely on visual motion for navigating the world, and research in flies has clarified how neural circuits extract information from moving visual scenes. However, the major pathways connecting these patterns of optic flow to behavior remain poorly understood. Using a high-throughput quantitative assay of visually guided behaviors and genetic neuronal silencing, we discovered a region in Drosophila’s protocerebrum critical for visual motion following. We used neuronal silencing, calcium imaging, and optogenetics to identify a single cell type, LPC1, that innervates this region, detects translational optic flow, and plays a key role in regulating forward walking. Moreover, the population of LPC1s can estimate the travelling direction, for example when gaze direction diverges from body heading. By linking specific cell types and their visual computations to specific behaviors, our findings establish a foundation for understanding how the nervous system uses vision to guide navigation.
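As a simple illustration of how a population of direction-tuned units such as LPC1 could report the direction of travel, the sketch below decodes a heading with a population vector. The cosine tuning, cell count, and noise level are assumptions made purely for illustration.

```python
# Sketch: population-vector readout of travel direction from direction-tuned units.
import numpy as np

rng = np.random.default_rng(1)

preferred = np.deg2rad(np.linspace(0, 360, 16, endpoint=False))   # preferred travel directions
true_direction = np.deg2rad(30.0)                                 # direction to decode

# Half-rectified cosine tuning plus noise (assumed, for illustration only).
responses = np.clip(np.cos(true_direction - preferred), 0, None)
responses += 0.05 * rng.standard_normal(responses.shape)

# Population vector: sum of preferred-direction unit vectors weighted by response.
x = np.sum(responses * np.cos(preferred))
y = np.sum(responses * np.sin(preferred))
estimate = np.rad2deg(np.arctan2(y, x)) % 360

print(f"true travel direction: 30.0 deg, population-vector estimate: {estimate:.1f} deg")
```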