2747 Janelia Publications
The interplay between two major forebrain structures - cortex and subcortical striatum - is critical for flexible, goal-directed action. Traditionally, it has been proposed that striatum is critical for selecting what type of action is initiated while the primary motor cortex is involved in the online control of movement execution. Recent data indicates that striatum may also be critical for specifying movement execution. These alternatives have been difficult to reconcile because when comparing very distinct actions, as in the vast majority of work to date, they make essentially indistinguishable predictions. Here, we develop quantitative models to reveal a somewhat paradoxical insight: only comparing neural activity during similar actions makes strongly distinguishing predictions. We thus developed a novel reach-to-pull task in which mice reliably selected between two similar, but distinct reach targets and pull forces. Simultaneous cortical and subcortical recordings were uniquely consistent with a model in which cortex and striatum jointly specify flexible parameters of action during movement execution.
Feature-selective firing allows networks to produce representations of the external and internal environments. Despite its importance, the mechanisms generating neuronal feature selectivity are incompletely understood. In many cortical microcircuits the integration of two functionally distinct inputs occurs nonlinearly through generation of active dendritic signals that drive burst firing and robust plasticity. To examine the role of this processing in feature selectivity, we recorded CA1 pyramidal neuron membrane potential and local field potential in mice running on a linear treadmill. We found that dendritic plateau potentials were produced by an interaction between properly timed input from entorhinal cortex and hippocampal CA3. These conjunctive signals positively modulated the firing of previously established place fields and rapidly induced new place field formation to produce feature selectivity in CA1 that is a function of both entorhinal cortex and CA3 input. Such selectivity could allow mixed network level representations that support context-dependent spatial maps.
Brains are optimized for processing ethologically relevant sensory signals. However, few studies have characterized the neural coding mechanisms that underlie the transformation from natural sensory information to behavior. Here, we focus on acoustic communication in Drosophila melanogaster and use computational modeling to link natural courtship song, neuronal codes, and female behavioral responses to song. We show that melanogaster females are sensitive to long timescale song structure (on the order of tens of seconds). From intracellular recordings, we generate models that recapitulate neural responses to acoustic stimuli. We link these neural codes with female behavior by generating model neural responses to natural courtship song. Using a simple decoder, we predict female behavioral responses to the same song stimuli with high accuracy. Our modeling approach reveals how long timescale song features are represented by the Drosophila brain and how neural representations can be decoded to generate behavioral selectivity for acoustic communication signals.
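The modeling approach in the preceding abstract (neural encoding of song features followed by a simple behavioral decoder) can be illustrated with a toy sketch. Everything below is a hypothetical stand-in, not the authors' code: a single linear-nonlinear model neuron driven by one made-up song feature, read out by a least-squares linear decoder of a synthetic behavioral variable.

```python
# Minimal sketch (not the authors' code): a linear-nonlinear neuron model driven by a
# song-feature time series, followed by a simple linear decoder of behavior.
# The filter shape, nonlinearity, and decoder weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical song feature (e.g., pulse rate) sampled over tens of seconds.
dt = 0.05                      # seconds per sample
t = np.arange(0, 60, dt)       # 60 s of "song"
song_feature = 0.5 + 0.5 * np.sin(2 * np.pi * t / 20)   # slow, long-timescale structure

# Linear-nonlinear model neuron: convolve stimulus with a temporal filter, then rectify.
tau = 2.0                                       # filter time constant (s), assumed
kernel = np.exp(-np.arange(0, 10, dt) / tau)
kernel /= kernel.sum()
drive = np.convolve(song_feature, kernel, mode="full")[: len(t)]
model_response = np.maximum(drive - 0.4, 0)     # rectified "neural" response

# Simple decoder: a running average of the model response predicts a behavioral
# variable (e.g., female locomotor speed), fit here by least squares to synthetic data.
window = int(5 / dt)                            # 5 s integration window, assumed
decoded_drive = np.convolve(model_response, np.ones(window) / window, mode="same")
behavior = 1.0 - 0.8 * decoded_drive + 0.05 * rng.standard_normal(len(t))  # toy target

w, b = np.polyfit(decoded_drive, behavior, 1)   # linear read-out
prediction = w * decoded_drive + b
print("decoder r =", np.corrcoef(prediction, behavior)[0, 1].round(3))
```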
Neural representations of information are shaped by local network interactions. Previous studies linking neural coding and cortical connectivity focused on stimulus selectivity in the sensory cortex [1–4]. Here we study neural activity in the motor cortex during naturalistic behavior in which mice gathered rewards with multidirectional tongue reaching. This behavior does not require training and thus allowed us to probe neural coding and connectivity in motor cortex before its activity is shaped by learning a specific task. Neurons typically responded during and after reaching movements and exhibited conjunctive tuning to target location and reward outcome. We used an all-optical method [5,4,6,7] for large-scale causal functional connectivity mapping in vivo. Mapping connectivity between > 20,000,000 excitatory neuronal pairs revealed fine-scale columnar architecture in layer 2/3 of the motor cortex. Neurons displayed local (< 100 µm) like-to-like connectivity according to target-location tuning, and inhibition over longer spatial scales. Connectivity patterns comprised a continuum, with abundant weakly connected neurons and sparse strongly connected neurons that function as network hubs. Hub neurons were weakly tuned to target-location and reward-outcome but strongly influenced neighboring neurons. This network of neurons, encoding location and outcome of movements to different motor goals, may be a general substrate for rapid learning of complex, goal-directed behaviors.
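As a rough illustration of the like-to-like analysis described above, the following sketch (synthetic data only; all numbers are assumptions, not values from the study) builds a toy population in which connection probability falls off with distance and with tuning difference, then compares connection rates between similarly and dissimilarly tuned local pairs.

```python
# Illustrative sketch (synthetic data, hypothetical parameters): test for "like-to-like"
# connectivity by asking whether connection probability between nearby neuron pairs
# increases with the similarity of their target-location tuning.
import numpy as np

rng = np.random.default_rng(7)
n = 500

positions = rng.uniform(0, 300, size=(n, 2))               # µm, toy layer-2/3 patch
tuning = rng.uniform(0, 2 * np.pi, size=n)                 # preferred target direction

# Toy ground truth: nearby, similarly tuned pairs are more likely to connect.
dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
dtuning = np.abs(np.angle(np.exp(1j * (tuning[:, None] - tuning[None, :]))))
p_connect = 0.15 * np.exp(-dist / 100) * np.exp(-dtuning / 1.0)
connected = rng.random((n, n)) < p_connect
np.fill_diagonal(connected, False)

# Analysis: connection probability for similar vs dissimilar tuning, local pairs only.
local = (dist < 100) & ~np.eye(n, dtype=bool)
similar = dtuning < np.pi / 4
print("P(connect | local, similar tuning)    =",
      connected[local & similar].mean().round(3))
print("P(connect | local, dissimilar tuning) =",
      connected[local & ~similar].mean().round(3))
```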
Recent powerful tools for reconstructing connectomes using electron microscopy (EM) have made outstanding contributions to the field of neuroscience. As a prime example, the detection of visual motion is a classic problem of neural computation, yet our understanding of the exact mechanism has been frustrated by our incomplete knowledge of the relevant neurons and synapses. Recent connectomic studies have successfully identified the concrete neuronal circuit in the fly's visual system that computes the motion signals. This identification was greatly aided by the comprehensiveness of the EM reconstruction. Compared with light microscopy, which gives estimated connections from arbor overlap, EM gives unequivocal connections with precise synaptic counts. This paper reviews recent studies of connectomics in the brain of the fruit fly Drosophila and highlights how connectomes can provide a foundation for understanding the mechanism of neuronal functions by identifying the underlying neural circuits.
The brain is a network of neurons, one that generates behaviour, and knowing the former is crucial to understanding the latter. Identifying the exact network of synaptic connections, or connectome, of the fly's central nervous system is now a major objective in Drosophila neurobiology, one that has been initiated in several laboratories, especially the Janelia Research Campus of the Howard Hughes Medical Institute. Progress is most advanced in the optic neuropiles of the visual system. The effort to derive a connectome from these and other neuropile regions is proceeding by various methods of electron microscopy, especially focused-ion beam milling scanning electron microscopy, and relies upon - but is to be carefully distinguished from - published light microscopic methods that reveal the projections of genetically labelled cell types. The latter reveal those neurons that come into close proximity and are therefore candidate synaptic partners. Synaptic partnerships are not in fact reliably revealed by such candidate pairs, anatomical connections often revealing unexpected pathways. Synaptic partnerships identified from ultrastructural features provide not only a strong heuristic basis for interpreting functional interactions between identified neurons, but also a powerful means to predict such interactions and to suggest functional pathways not readily predicted from existing experimental evidence. The analysis of circuit function may proceed cell by cell, by examining the behavioural outcome of either interrupting or restoring function to any one element in an anatomically defined circuit, but can be foiled by degeneracy in pathway elements. Circuit information can also be used to identify and analyse circuit motifs, and their role in higher-order network properties. These attempts in Drosophila anticipate parallel attempts in other systems, notably the inner plexiform layer of the vertebrate retina, and augment the one complete connectome already available to us, that available for 30 years in the nematode Caenorhabditis elegans.
The availability of both anatomical connectivity and brain-wide neural activity measurements in C. elegans makes the worm a promising system for learning detailed, mechanistic models of an entire nervous system in a data-driven way. However, one faces several challenges when constructing such a model. We often do not have direct experimental access to important modeling details such as single-neuron dynamics and the signs and strengths of the synaptic connectivity. Further, neural activity can only be measured in a subset of neurons, often indirectly via calcium imaging, and significant trial-to-trial variability has been observed. To address these challenges, we introduce a connectome-constrained latent variable model (CC-LVM) of the unobserved voltage dynamics of the entire C. elegans nervous system and the observed calcium signals. We used the framework of variational autoencoders to fit parameters of the mechanistic simulation constituting the generative model of the LVM to calcium imaging observations. A variational approximate posterior distribution over latent voltage traces for all neurons is efficiently inferred using an inference network, and constrained by a prior distribution given by the biophysical simulation of neural dynamics. We applied this model to an experimental whole-brain dataset, and found that connectomic constraints enable our LVM to predict the activity of neurons whose activity was withheld significantly better than models unconstrained by a connectome. We explored models with different degrees of biophysical detail, and found that models with realistic conductance-based synapses provide markedly better predictions than current-based synapses for this system.
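To make the structure of such a connectome-constrained latent variable model concrete, here is a minimal sketch of the generative side only (hypothetical connectome, sizes, time constants, and noise levels; the variational inference network that the abstract describes is omitted): latent voltages evolve under connectome-masked dynamics and produce calcium observations for a subset of "imaged" neurons.

```python
# Minimal sketch (assumptions throughout, not the authors' CC-LVM code): the generative
# side of a connectome-constrained latent-variable model. Latent voltages evolve under
# dynamics whose coupling is masked by a hypothetical binary connectome, and noisy
# calcium observations are produced for an observed subset of neurons.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_steps, dt = 50, 2000, 0.01

# Hypothetical connectome: sparse binary adjacency; signs/strengths are free parameters.
adjacency = (rng.random((n_neurons, n_neurons)) < 0.1).astype(float)
np.fill_diagonal(adjacency, 0.0)
weights = adjacency * rng.normal(0.0, 0.5, (n_neurons, n_neurons))   # connectome-masked

tau_v, tau_ca = 0.1, 0.5            # voltage and calcium time constants (assumed)
v = np.zeros(n_neurons)
ca = np.zeros(n_neurons)
observed = rng.choice(n_neurons, size=20, replace=False)   # only a subset is imaged
traces = np.zeros((n_steps, len(observed)))

for step in range(n_steps):
    drive = weights @ np.tanh(v) + rng.normal(0.0, 0.1, n_neurons)      # synapses + noise
    v += dt * (-v + drive) / tau_v                                      # latent voltage
    ca += dt * (np.maximum(v, 0) - ca) / tau_ca                         # calcium dynamics
    traces[step] = ca[observed] + rng.normal(0.0, 0.01, len(observed))  # fluorescence

print("simulated calcium traces:", traces.shape)
```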
We can now measure the connectivity of every neuron in a neural circuit, but we cannot measure other biological details, including the dynamical characteristics of each neuron. The degree to which measurements of connectivity alone can inform the understanding of neural computation is an open question. Here we show that with experimental measurements of only the connectivity of a biological neural network, we can predict the neural activity underlying a specified neural computation. We constructed a model neural network with the experimentally determined connectivity for 64 cell types in the motion pathways of the fruit fly optic lobe but with unknown parameters for the single-neuron and single-synapse properties. We then optimized the values of these unknown parameters using techniques from deep learning, to allow the model network to detect visual motion. Our mechanistic model makes detailed, experimentally testable predictions for each neuron in the connectome. We found that model predictions agreed with experimental measurements of neural activity across 26 studies. Our work demonstrates a strategy for generating detailed hypotheses about the mechanisms of neural circuit function from connectivity measurements. We show that this strategy is more likely to be successful when neurons are sparsely connected, a universally observed feature of biological neural networks across species and brain regions.
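The logic of fixing the wiring and fitting only unknown cellular parameters so that the circuit performs motion detection can be shown at toy scale. The sketch below is not the published model: it hard-wires a two-input delay-and-correlate (Reichardt-style) motion detector and "trains" a single unknown parameter, the delay-filter time constant, by grid search rather than by the deep-learning optimization used in the study.

```python
# Toy illustration (not the published model): wiring is fixed -- two point detectors
# feed a delay-and-correlate motion circuit -- while one unknown "neuron" parameter
# (the delay-filter time constant) is tuned so the circuit becomes direction selective.
import numpy as np

dt, duration = 0.001, 2.0
t = np.arange(0, duration, dt)

def moving_bar(direction):
    """Two point detectors separated in space; a bar hits them with a time offset."""
    offset = 0.05 * direction          # +50 ms (preferred) or -50 ms (null)
    a = np.exp(-((t - 1.0) ** 2) / (2 * 0.05 ** 2))
    b = np.exp(-((t - 1.0 - offset) ** 2) / (2 * 0.05 ** 2))
    return a, b

def low_pass(x, tau):
    """First-order low-pass filter acting as the circuit's delay element."""
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + dt * (x[i] - y[i - 1]) / tau
    return y

def direction_selectivity(tau):
    """Correlator output difference between preferred and null direction."""
    responses = []
    for direction in (+1, -1):
        a, b = moving_bar(direction)
        responses.append(np.sum(low_pass(a, tau) * b - low_pass(b, tau) * a) * dt)
    return responses[0] - responses[1]

# "Training": search the unknown time constant for maximal direction selectivity.
taus = np.linspace(0.005, 0.3, 60)
best_tau = taus[np.argmax([direction_selectivity(tau) for tau in taus])]
print(f"best delay time constant ~ {best_tau * 1000:.0f} ms")
```

Grid search over one parameter stands in here for gradient-based optimization over thousands of single-neuron and single-synapse parameters; the shared idea is that connectivity is held fixed while free parameters are tuned against a task.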
Vision provides animals with detailed information about their surroundings, conveying diverse features such as color, form, and movement across the visual scene. Computing these parallel spatial features requires a large and diverse network of neurons, such that in animals as distant as flies and humans, visual regions comprise half the brain’s volume. These visual brain regions often reveal remarkable structure-function relationships, with neurons organized along spatial maps with shapes that directly relate to their roles in visual processing. To unravel the stunning diversity of a complex visual system, a careful mapping of the neural architecture matched to tools for targeted exploration of that circuitry is essential. Here, we report a new connectome of the right optic lobe from a male Drosophila central nervous system FIB-SEM volume and a comprehensive inventory of the fly’s visual neurons. We developed a computational framework to quantify the anatomy of visual neurons, establishing a basis for interpreting how their shapes relate to spatial vision. By integrating this analysis with connectivity information, neurotransmitter identity, and expert curation, we classified the 53,000 neurons into 727 types, about half of which are systematically described and named for the first time. Finally, we share an extensive collection of split-GAL4 lines matched to our neuron type catalog. Together, this comprehensive set of tools and data unlock new possibilities for systematic investigations of vision in Drosophila, a foundation for a deeper understanding of sensory processing.
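One ingredient of neuron typing at this scale, clustering cells by their connectivity fingerprints, can be sketched on toy data. The example below is purely schematic: made-up partner profiles and counts, with k-means standing in for the study's combination of connectivity analysis, morphology, neurotransmitter identity, and expert curation.

```python
# Schematic sketch (toy data, not the actual typing pipeline): cluster neurons into
# putative types from connectivity "fingerprints" -- the fraction of each neuron's
# synapses made with every candidate partner type.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_neurons, n_partner_types, n_true_types = 600, 40, 6

# Toy ground truth: each true type has a characteristic partner profile.
type_profiles = rng.dirichlet(np.ones(n_partner_types), size=n_true_types)
labels_true = rng.integers(0, n_true_types, size=n_neurons)
counts = np.vstack([rng.multinomial(200, type_profiles[k]) for k in labels_true])
fingerprints = counts / counts.sum(axis=1, keepdims=True)   # per-neuron output fractions

clusters = KMeans(n_clusters=n_true_types, n_init=10, random_state=0).fit_predict(fingerprints)

# Crude agreement check: how pure is each cluster with respect to the true types?
purity = np.mean([np.bincount(labels_true[clusters == c]).max() / np.sum(clusters == c)
                  for c in range(n_true_types)])
print("mean cluster purity:", round(float(purity), 3))
```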
Mitochondria are an integral part of the metabolism of a neuron. EM images of fly brain volumes, taken for connectomics, contain mitochondria as well as the cells and synapses that have already been reported. Here, from the Drosophila hemibrain dataset, we extract, classify, and measure approximately 6 million mitochondria among roughly 21 thousand neurons of more than 5500 cell types. Each mitochondrion is classified by its appearance - dark and dense, light and sparse, or intermediate - and the location, orientation, and size (in voxels) are annotated. These mitochondria are added to our publicly available data portal, and each synapse is linked to its closest mitochondrion. Using this data, we show quantitative evidence that mitochondrial trafficking extends to the smallest dimensions in neurons. The most basic characteristics of mitochondria - volume, distance from synapses, and color - vary considerably between cell types, and between neurons with different neurotransmitters. We find that polyadic synapses with more post-synaptic densities (PSDs) have closer and larger mitochondria on the pre-synaptic side, but smaller and more distant mitochondria on the PSD side. We note that this relationship breaks down for synapses with only one PSD, suggesting a different role for such synapses.
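The step of linking each synapse to its closest mitochondrion is, at its core, a nearest-neighbour lookup. A minimal sketch with made-up coordinates (not the hemibrain data or pipeline) might look like this:

```python
# Minimal sketch (hypothetical arrays, not the hemibrain pipeline): link each synapse
# to its nearest mitochondrion with a k-d tree over 3-D centroid coordinates.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)

# Hypothetical centroids in nanometres within one neuron's arbor.
mito_centroids = rng.uniform(0, 50_000, size=(6_000, 3))      # mitochondria
synapse_sites = rng.uniform(0, 50_000, size=(20_000, 3))      # pre/postsynaptic sites

tree = cKDTree(mito_centroids)
distances, nearest_idx = tree.query(synapse_sites, k=1)        # one lookup per synapse

print("median synapse-to-mitochondrion distance (nm):", np.median(distances).round(1))
print("first few links (synapse -> mitochondrion index):", nearest_idx[:5])
```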