2785 Janelia Publications
Showing 841-850 of 2785 results

Visual motion perception is critical to many animal behaviors, and flies have emerged as a powerful model system for exploring this fundamental neural computation. Although numerous studies have suggested that fly motion vision is governed by a simple neural circuit [1-3], the implementation of this circuit has remained mysterious for decades. Connectomics and neurogenetics have produced a surge in recent progress, and several studies have shown selectivity for light increments (ON) or decrements (OFF) in key elements associated with this circuit [4-7]. However, related studies have reached disparate conclusions about where this selectivity emerges and whether it plays a major role in motion vision [8-13]. To address these questions, we examined activity in the neuropil thought to be responsible for visual motion detection, the medulla, of Drosophila melanogaster in response to a range of visual stimuli using two-photon calcium imaging. We confirmed that the input neurons of the medulla, the LMCs, are not responsible for light-on and light-off selectivity. We then examined the pan-neural response of medulla neurons and found prominent selectivity for light-on and light-off in layers of the medulla associated with two anatomically derived pathways (L1/L2 associated) [14, 15]. We next examined the activity of prominent interneurons within each pathway (Mi1 and Tm1) and found that these neurons have corresponding selectivity for light-on or light-off. These results provide direct evidence that motion is computed in parallel light-on and light-off pathways, demonstrate that this selectivity emerges in neurons immediately downstream of the LMCs, and specify where crucial elements of motion computation occur.
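The "simple neural circuit" long hypothesized for fly motion vision is the Hassenstein-Reichardt correlator, which multiplies the signal from one photoreceptor by a delayed copy of its neighbor's and subtracts the mirror-symmetric term. A minimal numpy sketch of that idea (the grating period, delay, and lag below are invented for illustration, not taken from the paper):

```python
import numpy as np

def hassenstein_reichardt(left, right, delay=3):
    """Correlate each input with a delayed copy of its neighbor and
    subtract the mirror-symmetric term; the sign reports direction."""
    d_left = np.roll(left, delay)
    d_right = np.roll(right, delay)
    d_left[:delay] = d_right[:delay] = 0.0   # suppress wrap-around
    return d_left * right - left * d_right

t = np.arange(200)
def luminance(lag):
    # drifting sinusoidal grating sampled at one point, period 40 samples
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * (t - lag) / 40.0))

# the right-hand photoreceptor sees the pattern 5 samples later
out_rightward = hassenstein_reichardt(luminance(0), luminance(5))
out_leftward = hassenstein_reichardt(luminance(5), luminance(0))
print(out_rightward.mean() > 0, out_leftward.mean() < 0)  # -> True True
```

The mean output is positive for motion in the preferred direction and negative for the opposite direction, which is the direction selectivity whose ON/OFF implementation the study localizes.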
PIEZOs are mechanosensitive ion channels that convert force into chemoelectric signals and have essential roles in diverse physiological settings. In vitro studies have proposed that PIEZO channels transduce mechanical force through the deformation of extensive blades of transmembrane domains emanating from a central ion-conducting pore. However, little is known about how these channels interact with their native environment and which molecular movements underlie activation. Here we directly observe the conformational dynamics of the blades of individual PIEZO1 molecules in a cell using nanoscopic fluorescence imaging. Compared with previous structural models of PIEZO1, we show that the blades are significantly expanded at rest by the bending stress exerted by the plasma membrane. The degree of expansion varies dramatically along the length of the blade, where decreased binding strength between subdomains can explain increased flexibility of the distal blade. Using chemical and mechanical modulators of PIEZO1, we show that blade expansion and channel activation are correlated. Our findings begin to uncover how PIEZO1 is activated in a native environment. More generally, as we reliably detect conformational shifts of single nanometres from populations of channels, we expect that this approach will serve as a framework for the structural analysis of membrane proteins through nanoscopic imaging.
In traditional zonal wavefront sensing for adaptive optics, after local wavefront gradients are obtained, the entire wavefront can be calculated by assuming that the wavefront is a continuous surface. Such an approach leads to sub-optimal performance in reconstructing wavefronts that are either discontinuous or undersampled by the zonal wavefront sensor. Here, we report a new method to reconstruct the wavefront by directly measuring local wavefront phases in parallel using a multidither coherent optical adaptive technique. This method determines the relative phase of each pupil segment independently, and thus produces an accurate wavefront even for discontinuous wavefronts. We implemented this method in an adaptive optical two-photon fluorescence microscope and demonstrated its superior performance in correcting large or discontinuous aberrations.
Uncovering the direct regulatory targets of doublesex (dsx) and fruitless (fru) is crucial for understanding how they regulate sexual development, morphogenesis, differentiation and adult functions (including behavior) in Drosophila melanogaster. Using a modified DamID approach, we identified 650 DSX-binding regions in the genome, from which we then extracted an optimal palindromic 13 bp DSX-binding sequence. This sequence is functional in vivo, and the base identity at each position is important for DSX binding in vitro. In addition, this sequence is enriched in the genome of D. melanogaster (58 copies versus the approximately three expected by chance) and in the 11 other sequenced Drosophila species, as well as in some other dipterans. Twenty-three genes are associated with both an in vivo peak in DSX binding and an optimal DSX-binding sequence, and thus are almost certainly direct DSX targets. The association of these 23 genes with optimal DSX-binding sites was used to examine the evolutionary changes occurring in DSX and its targets in insects.
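The enrichment arithmetic can be checked with a back-of-envelope model: under an i.i.d. uniform-base assumption, a specific 13-mer is expected about (genome length) x 0.25^13 times per strand, around two in a ~1.4 x 10^8 bp fly genome, which is the right order of magnitude for the "approximately three" quoted above. A sketch (the motif handling is generic; no actual DSX site sequence is used here):

```python
def revcomp(s):
    """Reverse complement of a DNA string."""
    return s.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def count_motif(genome, motif):
    # overlapping exact matches on one strand
    return sum(genome.startswith(motif, i)
               for i in range(len(genome) - len(motif) + 1))

def expected_count(genome_len, motif_len):
    # uniform-base null model: each window matches with prob 0.25**motif_len
    return (genome_len - motif_len + 1) * 0.25 ** motif_len

# ~1.4e8 bp genome, 13 bp motif: about 2.1 matches expected per strand
expected = expected_count(140_000_000, 13)
```

A palindromic motif reads the same on both strands, so a single-strand count of the kind above already captures every genomic occurrence.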
We advance two-photon microscopy for near-diffraction-limited imaging up to 850 µm below the pia in awake mice. Our approach combines direct wavefront sensing of light from a guidestar (formed by descanned fluorescence from Cy5.5-conjugated dextran in brain microvessels) with adaptive optics to compensate for tissue-induced aberrations in the wavefront. We achieve high signal-to-noise ratios in recordings of glutamate release from thalamocortical axons and calcium transients in spines of layer 5b basal dendrites during active tactile sensing.
Adaptive optics by direct imaging of the wavefront distortions of a laser-induced guide star has long been used in astronomy, and more recently in microscopy to compensate for aberrations in transparent specimens. Here we extend this approach to tissues that strongly scatter visible light by exploiting the reduced scattering of near-infrared guide stars. The method enables in vivo two-photon morphological and functional imaging down to 700 μm inside the mouse brain.
Serotonin plays a central role in cognition and is the target of most pharmaceuticals for psychiatric disorders. Existing drugs have limited efficacy; creation of improved versions will require better understanding of serotonergic circuitry, which has been hampered by our inability to monitor serotonin release and transport with high spatial and temporal resolution. We developed and applied a binding-pocket redesign strategy, guided by machine learning, to create a high-performance, soluble, fluorescent serotonin sensor (iSeroSnFR), enabling optical detection of millisecond-scale serotonin transients. We demonstrate that iSeroSnFR can be used to detect serotonin release in freely behaving mice during fear conditioning, social interaction, and sleep/wake transitions. We also developed a robust assay of serotonin transporter function and modulation by drugs. We expect that both machine-learning-guided binding-pocket redesign and iSeroSnFR will have broad utility for the development of other sensors and in vitro and in vivo serotonin detection, respectively.
Cataloging the neuronal cell types that make up the circuitry of individual brain regions is a major goal of modern neuroscience and the BRAIN initiative. Single-cell RNA sequencing can now be used to measure the gene expression profiles of individual neurons and to categorize neurons based on their gene expression profiles. While single-cell techniques are extremely powerful and hold great promise, they are currently still labor intensive, have a high cost per cell, and, most importantly, do not provide information on the spatial distribution of cell types in specific regions of the brain. We propose a complementary approach that uses computational methods to infer the cell types and their gene expression profiles through analysis of brain-wide single-cell resolution in situ hybridization (ISH) imagery contained in the Allen Brain Atlas (ABA). We measure the spatial distribution of neurons labeled in the ISH image for each gene and model it as a spatial point process mixture, whose mixture weights are given by the cell types which express that gene. By fitting a point process mixture model jointly to the ISH images, we infer both the spatial point process distribution for each cell type and their gene expression profiles. We validate our predictions of cell type-specific gene expression profiles using single-cell RNA sequencing data, recently published for the mouse somatosensory cortex. Jointly with the gene expression profiles, cell features such as size, orientation, intensity, and local density are inferred for each cell type.
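The decomposition at the heart of this approach — per-gene spatial densities expressed as cell-type spatial distributions weighted by expression profiles — can be caricatured as a nonnegative matrix factorization D ≈ E S, with D the gene-by-voxel density maps, E the gene-by-type expression profiles, and S the type-by-voxel spatial maps. The sketch below uses synthetic data and plain Lee-Seung multiplicative updates, a simplified stand-in for the paper's point-process estimator, not their method:

```python
import numpy as np

rng = np.random.default_rng(0)
genes, voxels, types = 20, 50, 3
# synthetic ground truth: densities are exactly low-rank and nonnegative
D = rng.random((genes, types)) @ rng.random((types, voxels))

def nmf(D, k, iters=2000, seed=1):
    """Lee-Seung multiplicative updates for D ~= E @ S, all nonnegative."""
    rng = np.random.default_rng(seed)
    E = rng.random((D.shape[0], k)) + 0.1
    S = rng.random((k, D.shape[1])) + 0.1
    for _ in range(iters):
        E *= (D @ S.T) / (E @ S @ S.T + 1e-12)
        S *= (E.T @ D) / (E.T @ E @ S + 1e-12)
    return E, S

E, S = nmf(D, types)
rel_err = np.linalg.norm(D - E @ S) / np.linalg.norm(D)
```

The factorization is only identifiable up to scaling and permutation of the cell types; the point-process likelihood and the extra per-cell features in the paper are what pin the solution down further.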
Symbolic models play a key role in cognitive science, expressing computationally precise hypotheses about how the brain implements a cognitive process. Identifying an appropriate model typically requires a great deal of effort and ingenuity on the part of a human scientist. Here, we adapt FunSearch (Romera-Paredes et al., 2024), a recently developed tool that uses Large Language Models (LLMs) in an evolutionary algorithm, to automatically discover symbolic cognitive models that accurately capture human and animal behavior. We consider datasets from three species performing a classic reward-learning task that has been the focus of substantial modeling effort, and find that the discovered programs outperform state-of-the-art cognitive models for each. The discovered programs can readily be interpreted as hypotheses about human and animal cognition, instantiating interpretable symbolic learning and decision-making algorithms. Broadly, these results demonstrate the viability of using LLM-powered program synthesis to propose novel scientific hypotheses regarding mechanisms of human and animal cognition.
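The evolutionary loop can be caricatured in a few lines: keep a scored population of candidate models, retain the best, and mutate them. In FunSearch proper the mutation operator is an LLM rewriting program text; the toy below merely perturbs the lone learning-rate parameter of a Rescorla-Wagner update rule fit to a synthetic "observed" trajectory (every task detail here is invented for illustration):

```python
import random

def run_agent(alpha, trials=100, seed=0):
    """Rescorla-Wagner rule V <- V + alpha*(r - V) on a fixed reward
    sequence; returns the value trajectory."""
    rng = random.Random(seed)
    rewards = [1.0 if rng.random() < 0.7 else 0.0 for _ in range(trials)]
    V, traj = 0.0, []
    for r in rewards:
        V += alpha * (r - V)
        traj.append(V)
    return traj

TARGET = run_agent(0.3)          # stand-in for observed behavior

def score(alpha):
    # negative squared error between candidate and observed trajectories
    return -sum((a - b) ** 2 for a, b in zip(run_agent(alpha), TARGET))

def evolve(pop_size=12, generations=40, seed=1):
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=score, reverse=True)
        survivors = population[: pop_size // 2]
        # a real FunSearch asks an LLM to rewrite the program here;
        # this toy just perturbs the single parameter
        children = [min(0.999, max(0.001, a + rng.gauss(0.0, 0.05)))
                    for a in survivors]
        population = survivors + children
    return max(population, key=score)

best = evolve()                  # converges toward the generating alpha, 0.3
```

Because the best candidate always survives selection, the top score is monotone across generations; what makes the real system powerful is that its search space is program text, so whole new model structures, not just parameters, can be proposed.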
A single nervous system can generate many distinct motor patterns. Identifying which neurons and circuits control which behaviors has been a laborious piecemeal process, usually for one observer-defined behavior at a time. We present a fundamentally different approach to neuron-behavior mapping. We optogenetically activated 1,054 identified neuron lines in Drosophila larvae and tracked the behavioral responses from 37,780 animals. Applying multiscale unsupervised structure learning methods to the behavioral data identified 29 discrete, statistically distinguishable, and observer-unbiased behavioral phenotypes. Mapping the neural lines to the behavior(s) they evoke provides a behavioral reference atlas for neuron subsets covering a large fraction of larval neurons. This atlas is a starting point for connectivity- and activity-mapping studies to further investigate the mechanisms by which neurons mediate diverse behaviors.
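The unsupervised phenotype-discovery step can be caricatured with a generic clustering sketch: featurize each animal's tracked behavior and cluster the feature vectors. The paper uses multiscale unsupervised structure learning; the k-means toy below, on synthetic two-dimensional "features", is only a stand-in for that idea:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic per-animal "behavior features" drawn from three underlying
# phenotypes (both feature axes are invented for illustration)
centers_true = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
membership = rng.integers(0, 3, size=300)
X = centers_true[membership] + rng.normal(0.0, 0.4, size=(300, 2))

def kmeans(X, k, iters=50, seed=1):
    """Plain Lloyd's algorithm: assign to nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
        assign = d2.argmin(axis=1)
        C = np.array([X[assign == j].mean(axis=0) if np.any(assign == j)
                      else C[j] for j in range(k)])   # guard empty clusters
    return assign, C

assign, C = kmeans(X, 3)
```

Grouping the animals of each neuron line by their cluster labels would then give the line-to-phenotype mapping that the atlas formalizes, though the real pipeline also has to choose the number of phenotypes, which k-means takes as given.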
