2691 Janelia Publications
Extracting a connectome from an electron microscopy (EM) dataset requires identifying neurons and determining the synapses between them. Because manual extraction of this information is very time-consuming, there has been extensive research effort to automatically segment neurons in order to guide, and eventually replace, manual tracing. Until recently, there has been comparatively little research on automatically detecting the actual synapses between neurons. This discrepancy can be attributed, in part, to several factors: obtaining neuronal shapes is a prerequisite first step in extracting a connectome, manual tracing is much more time-consuming than annotating synapses, and neuronal contact area can be used as a proxy for synapses in determining connections.
However, recent research has demonstrated that contact area alone is not a sufficient predictor of synaptic connection. Moreover, as segmentation has improved, synapse annotation has come to consume a larger fraction of the overall reconstruction time, and this ratio will only grow as segmentation improves further, limiting the overall achievable speed-up. We therefore address this problem by developing algorithms that automatically detect pre-synaptic neurons and their post-synaptic partners. In particular, pre-synaptic structures are detected using a Deep and Wide Multiscale Recursive Network, and post-synaptic partners are detected using an MLP with features conditioned on the local segmentation.
This work is novel in that it requires a minimal amount of training, leverages advances in image segmentation directly, and provides a complete solution for polyadic synapse detection. We further introduce novel metrics to evaluate our algorithm on connectomes of meaningful size. These metrics demonstrate that fully automatic prediction can correctly characterize most of the connectivity.
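The two-stage strategy described above lends itself to a compact illustration. The sketch below is not the authors' implementation: the pre-synaptic detector is reduced to a thresholded probability map standing in for a trained network, the post-synaptic partner step uses a generic scikit-learn MLP, and the segmentation-conditioned features are hypothetical stand-ins.

```python
# Hedged sketch of two-stage polyadic synapse detection (illustration only).
import numpy as np
from scipy import ndimage
from sklearn.neural_network import MLPClassifier

def detect_presynaptic_sites(prob_map, threshold=0.8):
    """Stand-in for the pre-synaptic detector: threshold a voxel-wise
    probability map (assumed to come from some trained network) and
    return the centroid of each connected component."""
    labels, n = ndimage.label(prob_map > threshold)
    return ndimage.center_of_mass(prob_map, labels, range(1, n + 1))

def partner_features(site, segment_id, segmentation, radius=5):
    """Hypothetical features for one (pre-synaptic site, candidate segment)
    pair, conditioned on the local segmentation around the site."""
    z, y, x = (int(round(c)) for c in site)
    lo = [max(0, c - radius) for c in (z, y, x)]
    hi = [c + radius + 1 for c in (z, y, x)]
    local = segmentation[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    in_seg = (local == segment_id)
    return np.array([
        in_seg.mean(),          # fraction of the local neighborhood in the segment
        float(in_seg.any()),    # does the segment reach the neighborhood at all
        len(np.unique(local)),  # how crowded the neighborhood is
    ])

# Given labeled (site, segment) pairs, training the partner classifier
# would look roughly like this:
# X = np.stack([partner_features(s, seg_id, segmentation) for s, seg_id in pairs])
# clf = MLPClassifier(hidden_layer_sizes=(32, 32)).fit(X, y)
# At inference, every segment near a detected pre-synaptic site is scored,
# and segments above a probability cutoff become predicted post-synaptic partners.
```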
This chapter reviews the application of new genetically encoded tools in the feeding circuits that regulate appetite. Rapid activation and inhibition of agouti-related peptide (AgRP) neurons conclusively established their causal role in the rapid control of food intake. Chemogenetic activation of AgRP neurons using hM3Dq avoids the invasive protocols required for ChR2 activation. ChR2 distributes into axons, and selective optogenetic activation of AgRP neuron axon projection fields in distinct brain areas was used to examine their individual contributions to feeding behavior. Some of the brain areas targeted by AgRP neuron axon projections have been examined further for cell-type-specific control of appetite. Rodents with bed nucleus of the stria terminalis (BNST) lesions show hyperphagia and obesity, indicating that reduced BNST output promotes feeding. Pro-opiomelanocortin (POMC) neurons regulate feeding over longer timescales. Parabrachial nucleus (PBN) neurons have a powerful inhibitory role on food intake, but their inhibition does not strongly elevate food intake.
Most sensory systems are organized into parallel neuronal pathways that process distinct aspects of incoming stimuli. In the insect olfactory system, second-order projection neurons target both the mushroom body, which is required for learning, and the lateral horn (LH), which has been proposed to mediate innate olfactory behavior. Mushroom body neurons form a sparse olfactory population code that is not stereotyped across animals. In contrast, odor coding in the LH remains poorly understood. We combine genetic driver lines with anatomical and functional criteria to show that the LH has ~1,400 neurons and >165 cell types. Genetically labeled LH neurons (LHNs) have stereotyped odor responses across animals and on average respond to three times more odors than single projection neurons. LHNs are better odor categorizers than projection neurons, likely owing to stereotyped pooling of related inputs. Our results reveal some of the principles by which a higher processing area can extract innate behavioral significance from sensory stimuli.
To effectively control their bodies, animals rely on feedback from proprioceptive mechanosensory neurons. In the Drosophila leg, different proprioceptor subtypes monitor joint position, movement direction, and vibration. Here, we investigate how these diverse sensory signals are integrated by central proprioceptive circuits. We find that signals for leg joint position and directional movement converge in second-order neurons, revealing pathways for local feedback control of leg posture. Distinct populations of second-order neurons integrate tibia vibration signals across pairs of legs, suggesting a role in detecting external substrate vibration. In each pathway, the flow of sensory information is dynamically gated and sculpted by inhibition. Overall, our results reveal parallel pathways for processing of internal and external mechanosensory signals, which we propose mediate feedback control of leg movement and vibration sensing, respectively. The existence of a functional connectivity map also provides a resource for interpreting connectomic reconstruction of neural circuits for leg proprioception.
The brain adaptively integrates present sensory input, past experience, and options for future action. The insect mushroom body exemplifies how a central brain structure brings about such integration. Here we use a combination of systematic single-cell labeling, connectomics, transgenic silencing, and activation experiments to study the mushroom body at single-cell resolution, focusing on the behavioral architecture of its input and output neurons (MBINs and MBONs) and of the mushroom body-intrinsic APL neuron. Our results reveal the identity and morphology of almost all of these 44 neurons in stage 3 Drosophila larvae. Following an initial screen, functional analyses focusing on the mushroom body medial lobe uncover sparse and specific functions of its dopaminergic MBINs, its MBONs, and the GABAergic APL neuron across three behavioral tasks, namely odor preference, taste preference, and associative learning between odor and taste. Our results thus provide a cellular-resolution case study of how brains organize behavior.
The active properties of dendrites can support local nonlinear operations, but previous imaging and electrophysiological measurements have produced conflicting views regarding the prevalence and selectivity of local nonlinearities in vivo. We imaged calcium signals in pyramidal cell dendrites in the motor cortex of mice performing a tactile decision task. A custom microscope allowed us to image the soma and up to 300 μm of contiguous dendrite at 15 Hz, while resolving individual spines. New analysis methods were used to estimate the frequency and spatial scales of activity in dendritic branches and spines. The majority of dendritic calcium transients were coincident with global events. However, task-associated calcium signals in dendrites and spines were compartmentalized by dendritic branching and clustered within branches over approximately 10 μm. Diverse behavior-related signals were intermingled and distributed throughout the dendritic arbor, potentially supporting a large learning capacity in individual neurons.
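One way to make the "spatial scale of compartmentalization" concrete is to ask how quickly the correlation between ROI calcium traces falls off with distance along a branch. The sketch below is only an illustration of that idea, not the study's analysis method; the input arrays, bin sizes, and synthetic example are all assumptions.

```python
# Hedged sketch: average pairwise trace correlation as a function of
# inter-ROI distance along a single dendritic branch.
import numpy as np

def correlation_vs_distance(traces, positions, bin_size_um=5, max_dist_um=50):
    """traces: (n_rois, n_frames) dF/F; positions: distance (um) of each ROI
    from the branch origin. Returns mean correlation per distance bin."""
    n = traces.shape[0]
    corr = np.corrcoef(traces)                       # n x n correlation matrix
    dist = np.abs(positions[:, None] - positions[None, :])
    i, j = np.triu_indices(n, k=1)                   # unique ROI pairs
    bins = np.arange(0, max_dist_um + bin_size_um, bin_size_um)
    which = np.digitize(dist[i, j], bins) - 1
    means = [corr[i, j][which == b].mean() if np.any(which == b) else np.nan
             for b in range(len(bins) - 1)]
    return np.array(means), bins[:-1]

# Synthetic example: ROIs within the same ~10 um block share a common signal,
# so the correlation should fall off with distance on roughly that scale.
rng = np.random.default_rng(0)
positions = np.linspace(0, 100, 40)
common = rng.normal(size=(11, 1000))
traces = np.array([common[int(p // 10)] + 0.5 * rng.normal(size=1000)
                   for p in positions])
mean_corr, bin_edges = correlation_vs_distance(traces, positions)
print(np.round(mean_corr, 2))
```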
The male-specific Fruitless proteins (Fru(M)) act to establish the potential for male courtship behavior in Drosophila melanogaster and are expressed in small groups of neurons throughout the nervous system. We screened 1,000 GAL4 lines, using assays for general courtship, male-male interactions, and male fertility to determine the phenotypes resulting from GAL4-driven inhibition of Fru(M) expression in subsets of these neurons. A battery of secondary assays showed that the phenotypic classes of GAL4 lines could be divided into subgroups based on additional neurobiological and behavioral criteria. For example, in some lines restoration of Fru(M) expression in cholinergic neurons restores fertility or reduces male-male courtship. Persistent chains of males courting each other in some lines result from males courting both sexes indiscriminately, whereas in other lines this phenotype results from apparent habituation deficits. Inhibition of ectopic Fru(M) expression in females, in populations of neurons where Fru(M) is necessary for male fertility, can rescue female infertility. To identify the neurons responsible for some of the observed behavioral alterations, we determined the overlap between the identified GAL4 lines and endogenous Fru(M) expression in lines with fertility defects. The GAL4 lines causing fertility defects generally had widespread overlap with Fru(M) expression in many regions of the nervous system, suggesting likely redundant Fru(M)-expressing neuronal pathways capable of conferring male fertility. From associations between the screened behaviors, we propose a functional model for courtship initiation.
Understanding how activity patterns in specific neural circuits coordinate an animal's behavior remains a key area of neuroscience research. Genetic tools and a brain of tractable complexity make Drosophila a premier model organism for these studies. Here, we review the wealth of reagents available to map and manipulate neuronal activity with light.
Spatial navigation is often used as a behavioral task in studies of the neuronal circuits that underlie cognition, learning and memory in rodents. The combination of in vivo microscopy with genetically encoded indicators has provided an important new tool for studying neuronal circuits, but has been technically difficult to apply during navigation. Here we describe methods for imaging the activity of neurons in the CA1 region of the hippocampus with subcellular resolution in behaving mice. Neurons that expressed the genetically encoded calcium indicator GCaMP3 were imaged through a chronic hippocampal window. Head-restrained mice performed spatial behaviors in a setup combining a virtual reality system and a custom-built two-photon microscope. We optically identified populations of place cells and determined the correlation between the location of their place fields in the virtual environment and their anatomical location in the local circuit. The combination of virtual reality and high-resolution functional imaging should allow a new generation of studies to investigate neuronal circuit dynamics during behavior.
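As a rough illustration of how place fields can be read out from this kind of data, the sketch below bins each cell's ΔF/F by position on a virtual linear track and takes the peak bin as the place-field location; those locations could then be compared with the cells' anatomical coordinates. Array names and shapes are assumptions, and this is not the study's actual analysis pipeline.

```python
# Hedged sketch of a place-field estimate from imaging data.
# dff: (n_cells, n_frames) dF/F traces; track_pos: per-frame position on a
# virtual linear track; soma_xy: (n_cells, 2) anatomical coordinates.
import numpy as np
from scipy import stats

def place_fields(dff, track_pos, n_bins=40):
    """Mean dF/F per spatial bin for each cell, plus the bin of peak activity."""
    edges = np.linspace(track_pos.min(), track_pos.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(track_pos, edges) - 1, 0, n_bins - 1)
    tuning = np.array([[dff[c, bin_idx == b].mean() for b in range(n_bins)]
                       for c in range(dff.shape[0])])
    return tuning, tuning.argmax(axis=1)

# Relating place-field location to anatomical location (one simple version):
# tuning, field_bin = place_fields(dff, track_pos)
# r, p = stats.pearsonr(field_bin, soma_xy[:, 0])
# A weak or absent correlation would indicate no coarse anatomical ordering of
# place fields within the local circuit.
```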