140 Publications
Animals are often bombarded with visual information and must prioritize specific visual features based on their current needs. The neuronal circuits that detect and relay visual features have been well studied. Much less is known about how an animal adjusts its visual attention as its goals or environmental conditions change. During social behaviours, flies need to focus on nearby flies. Here we study how the flow of visual information is altered when female Drosophila enter an aggressive state. From the connectome, we identify three state-dependent circuit motifs poised to modify the response of an aggressive female to fly-sized visual objects: convergence of excitatory inputs from neurons conveying select visual features and internal state; dendritic disinhibition of select visual feature detectors; and a switch that toggles between two visual feature detectors. Using cell-type-specific genetic tools, together with behavioural and neurophysiological analyses, we show that each of these circuit motifs is used during female aggression. We reveal that features of this same switch operate in male Drosophila during courtship pursuit, suggesting that disparate social behaviours may share circuit mechanisms. Our study provides a compelling example of using the connectome to infer circuit mechanisms that underlie dynamic processing of sensory signals.
The central complex (CX) plays a key role in many higher-order functions of the insect brain including navigation and activity regulation. Genetic tools for manipulating individual cell types, and knowledge of what neurotransmitters and neuromodulators they express, will be required to gain mechanistic understanding of how these functions are implemented. We generated and characterized split-GAL4 driver lines that express in individual or small subsets of about half of CX cell types. We surveyed neuropeptide and neuropeptide receptor expression in the central brain using fluorescent in situ hybridization. About half of the neuropeptides we examined were expressed in only a few cells, while the rest were expressed in dozens to hundreds of cells. Neuropeptide receptors were expressed more broadly and at lower levels. Using our GAL4 drivers to mark individual cell types, we found that 51 of the 85 CX cell types we examined expressed at least one neuropeptide and 21 expressed multiple neuropeptides. Surprisingly, all co-expressed a small neurotransmitter. Finally, we used our driver lines to identify CX cell types whose activation affects sleep, and identified other central brain cell types that link the circadian clock to the CX. The well-characterized genetic tools and information on neuropeptide and neurotransmitter expression we provide should enhance studies of the CX.
Many animals use visual information to navigate, but how such information is encoded and integrated by the navigation system remains incompletely understood. In Drosophila melanogaster, EPG neurons in the central complex compute the heading direction by integrating visual input from ER neurons, which are part of the anterior visual pathway (AVP). Here we densely reconstruct all neurons in the AVP using electron-microscopy data. The AVP comprises four neuropils, sequentially linked by three major classes of neurons: MeTu neurons, which connect the medulla in the optic lobe to the small unit of the anterior optic tubercle (AOTUsu) in the central brain; TuBu neurons, which connect the AOTUsu to the bulb neuropil; and ER neurons, which connect the bulb to the EPG neurons. On the basis of morphologies, connectivity between neural classes and the locations of synapses, we identify distinct information channels that originate from four types of MeTu neurons, and we further divide these into ten subtypes according to the presynaptic connections in the medulla and the postsynaptic connections in the AOTUsu. Using the connectivity of the entire AVP and the dendritic fields of the MeTu neurons in the optic lobes, we infer potential visual features and the visual area from which any ER neuron receives input. We confirm some of these predictions physiologically. These results provide a strong foundation for understanding how distinct sensory features can be extracted and transformed across multiple processing stages to construct higher-order cognitive representations.
Vision provides animals with detailed information about their surroundings, conveying diverse features such as color, form, and movement across the visual scene. Computing these parallel spatial features requires a large and diverse network of neurons, such that in animals as distant as flies and humans, visual regions comprise half the brain’s volume. These visual brain regions often reveal remarkable structure-function relationships, with neurons organized along spatial maps with shapes that directly relate to their roles in visual processing. To unravel the stunning diversity of a complex visual system, a careful mapping of the neural architecture matched to tools for targeted exploration of that circuitry is essential. Here, we report a new connectome of the right optic lobe from a male Drosophila central nervous system FIB-SEM volume and a comprehensive inventory of the fly’s visual neurons. We developed a computational framework to quantify the anatomy of visual neurons, establishing a basis for interpreting how their shapes relate to spatial vision. By integrating this analysis with connectivity information, neurotransmitter identity, and expert curation, we classified the 53,000 neurons into 727 types, about half of which are systematically described and named for the first time. Finally, we share an extensive collection of split-GAL4 lines matched to our neuron type catalog. Together, this comprehensive set of tools and data unlock new possibilities for systematic investigations of vision in Drosophila, a foundation for a deeper understanding of sensory processing.
How memories of past events influence behavior is a key question in neuroscience. The major associative learning center in Drosophila, the Mushroom Body (MB), communicates to the rest of the brain through Mushroom Body Output Neurons (MBONs). While 21 MBON cell types have their dendrites confined to small compartments of the MB lobes, analysis of EM connectomes revealed the presence of an additional 14 MBON cell types that are atypical in having dendritic input both within the MB lobes and in adjacent brain regions. Genetic reagents for manipulating atypical MBONs and experimental data on their functions have been lacking. In this report we describe new cell-type-specific GAL4 drivers for many MBONs, including the majority of atypical MBONs. Using these genetic reagents, we conducted optogenetic activation screening to examine their ability to drive behaviors and learning. These reagents provide important new tools for the study of complex behaviors in Drosophila.
The mushroom body (MB) is the center for associative learning in insects. In Drosophila, intersectional split-GAL4 drivers and electron microscopy (EM) connectomes have laid the foundation for precise interrogation of the MB neural circuits. However, many cell types upstream and downstream of the MB remain to be investigated due to a lack of driver lines. Here we describe a new collection of over 800 split-GAL4 and split-LexA drivers that cover approximately 300 cell types, including sugar sensory neurons, putative nociceptive ascending neurons, olfactory and thermo-/hygro-sensory projection neurons, interneurons connected with the MB-extrinsic neurons, and various other cell types. We characterized activation phenotypes for a subset of these lines and identified the sugar sensory neuron line most suitable for reward substitution. Leveraging the thousands of confocal microscopy images associated with the collection, we analyzed neuronal morphological stereotypy and discovered that one set of mushroom body output neurons, MBON08/MBON09, exhibits striking individuality and asymmetry across animals. In conjunction with the EM connectome maps, the driver lines reported here offer a powerful resource for functional dissection of neural circuits for associative learning in adult Drosophila.
Animals rely on visual motion for navigating the world, and research in flies has clarified how neural circuits extract information from moving visual scenes. However, the major pathways connecting these patterns of optic flow to behavior remain poorly understood. Using a high-throughput quantitative assay of visually guided behaviors and genetic neuronal silencing, we discovered a region in Drosophila’s protocerebrum critical for visual motion following. We used neuronal silencing, calcium imaging, and optogenetics to identify a single cell type, LPC1, that innervates this region, detects translational optic flow, and plays a key role in regulating forward walking. Moreover, the population of LPC1s can estimate the travelling direction, such as when gaze direction diverges from body heading. By linking specific cell types and their visual computations to specific behaviors, our findings establish a foundation for understanding how the nervous system uses vision to guide navigation.
Persistent internal states are important for maintaining survival-promoting behaviors, such as aggression. In female Drosophila melanogaster, we have previously shown that individually activating either aIPg or pC1d cell types can induce aggression. Here we further investigate the individual roles of these cholinergic, sexually dimorphic cell types, and the reciprocal connections between them, in generating a persistent aggressive internal state. We find that a brief 30-second optogenetic stimulation of aIPg neurons was sufficient to promote an aggressive internal state lasting at least 10 minutes, whereas similar stimulation of pC1d neurons was not. While we previously showed that stimulation of pC1e alone does not evoke aggression, persistent behavior could be promoted through simultaneous stimulation of pC1d and pC1e, suggesting an unexpected synergy of these cell types in establishing a persistent aggressive state. Neither aIPg nor pC1d neurons show persistent activity themselves, implying that the persistent internal state is maintained by other mechanisms. Moreover, inactivation of pC1d did not significantly reduce aIPg-evoked persistent aggression, arguing that the aggressive state did not depend on pC1d-aIPg recurrent connectivity. Our results suggest the need for alternative models to explain persistent female aggression.
Animal behavior is principally expressed through neural control of muscles. Therefore, understanding how the brain controls behavior requires mapping neuronal circuits all the way to motor neurons. We have previously established technology to collect large-volume electron microscopy data sets of neural tissue and fully reconstruct the morphology of the neurons and their chemical synaptic connections throughout the volume. Using these tools, we generated a dense wiring diagram, or connectome, for a large portion of the Drosophila central brain. However, in most animals, including the fly, the majority of motor neurons are located outside the brain in a neural center closer to the body, i.e. the mammalian spinal cord or insect ventral nerve cord (VNC). In this paper, we extend our effort to map full neural circuits for behavior by generating a connectome of the VNC of a male fly.
Precise, repeatable genetic access to specific neurons via GAL4/UAS and related methods is a key advantage of Drosophila neuroscience. Neuronal targeting is typically documented using light microscopy of full GAL4 expression patterns, which generally lack the single-cell resolution required for reliable cell type identification. Here we use stochastic GAL4 labeling with the MultiColor FlpOut approach to generate cellular resolution confocal images at large scale. We are releasing aligned images of 74,000 such adult central nervous systems. An anticipated use of this resource is to bridge the gap between neurons identified by electron or light microscopy. Identifying individual neurons that make up each GAL4 expression pattern improves the prediction of split-GAL4 combinations targeting particular neurons. To this end we have made the images searchable on the NeuronBridge website. We demonstrate the potential of NeuronBridge to rapidly and effectively identify neuron matches based on morphology across imaging modalities and datasets.