2827 Janelia Publications
Insect neural systems are a promising source of inspiration for new navigation algorithms, especially on low size, weight, and power platforms. There have been unprecedented recent neuroscience breakthroughs with Drosophila in behavioral and neural imaging experiments as well as the mapping of detailed connectivity of neural structures. General mechanisms for learning orientation in the central complex (CX) of Drosophila have been investigated previously; however, it is unclear how these underlying mechanisms extend to cases where there is translation through an environment (beyond only rotation), which is critical for navigation in robotic systems. Here, we develop a CX neural connectivity-constrained model that performs sensor fusion, as well as unsupervised learning of visual features for path integration; we demonstrate the viability of this circuit for use in robotic systems in simulated and physical environments. Furthermore, we propose a theoretical understanding of how distributed online unsupervised network weight modification can be leveraged for learning in a trajectory through an environment by minimizing orientation estimation error. Overall, our results may enable a new class of CX-derived low power robotic navigation algorithms and lead to testable predictions to inform future neuroscience experiments.
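A minimal sketch of the underlying idea, not the authors' model: a ring of heading-tuned units (as in the CX) is decoded to a heading estimate, which is combined with speed to accumulate an allocentric position. The number of units, the von Mises bump, and the example trajectory are illustrative assumptions only.

```python
import numpy as np

N = 16                                                # heading-tuned units (assumed)
pref = np.linspace(0, 2 * np.pi, N, endpoint=False)  # preferred headings

def heading_bump(theta, kappa=4.0):
    """Ring-attractor-like population activity for heading theta (von Mises bump)."""
    return np.exp(kappa * np.cos(pref - theta))

def decode_heading(activity):
    """Population-vector decode of heading from ring activity."""
    return np.arctan2(activity @ np.sin(pref), activity @ np.cos(pref))

def integrate_path(headings, speeds, dt=0.1):
    """Accumulate x, y position from heading and speed signals (path integration)."""
    pos = np.zeros(2)
    for theta, v in zip(headings, speeds):
        est = decode_heading(heading_bump(theta))      # decoded heading estimate
        pos += v * dt * np.array([np.cos(est), np.sin(est)])
    return pos

# Example: a quarter-circle trajectory at constant speed.
T = 100
headings = np.linspace(0, np.pi / 2, T)
speeds = np.full(T, 0.2)
print(integrate_path(headings, speeds))
```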
A group leader decided that his lab would share the fluorescent dyes they create, for free and without authorship requirements. Nearly 12,000 aliquots later, he reveals what has happened since.
New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience.
Laboratory behavioural tasks are an essential research tool. As questions asked of behaviour and brain activity become more sophisticated, the ability to specify and run richly structured tasks becomes more important. An increasing focus on reproducibility also necessitates accurate communication of task logic to other researchers. To these ends, we developed pyControl, a system of open-source hardware and software for controlling behavioural experiments comprising a simple yet flexible Python-based syntax for specifying tasks as extended state machines, hardware modules for building behavioural setups, and a graphical user interface designed for efficiently running high-throughput experiments on many setups in parallel, all with extensive online documentation. These tools make it quicker, easier, and cheaper to implement rich behavioural tasks at scale. As important, pyControl facilitates communication and reproducibility of behavioural experiments through a highly readable task definition syntax and self-documenting features. Here, we outline the system's design and rationale, present validation experiments characterising system performance, and demonstrate example applications in freely moving and head-fixed mouse behaviour.
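To make the "extended state machine" idea concrete, here is a schematic task written as plain Python rather than in pyControl's own task-definition syntax: states respond to named events, and an extended-state variable (the reward count) carries information between states. The state names, events, and block length are illustrative assumptions, not a published task.

```python
class LickTask:
    """Toy lick-for-reward task expressed as an extended state machine."""

    def __init__(self, rewards_per_block=10):
        self.state = 'trial_start'
        self.rewards = 0                      # extended-state variable
        self.rewards_per_block = rewards_per_block

    def goto(self, new_state):
        print(f'-> {new_state}')
        self.state = new_state

    def event(self, name):
        if self.state == 'trial_start':
            if name == 'cue_on':
                self.goto('response_window')
        elif self.state == 'response_window':
            if name == 'lick':
                self.rewards += 1             # deliver and count a reward
                done = self.rewards >= self.rewards_per_block
                self.goto('block_end' if done else 'trial_start')
            elif name == 'timeout':
                self.goto('trial_start')      # missed trial, no reward

task = LickTask(rewards_per_block=2)
for ev in ['cue_on', 'lick', 'cue_on', 'lick']:
    task.event(ev)
print(task.state, task.rewards)
```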
To smooth the academic-to-industry transition, one institution is experimenting with offering biomedical researchers pre-commercial open access to new optical imaging systems still under development. The approach, the authors of this case study suggest, can be a win on both sides.
For goal-directed behaviour it is critical that we can both select the appropriate action and learn to modify the underlying movements (for example, the pitch of a note or velocity of a reach) to improve outcomes. The basal ganglia are a critical nexus where circuits necessary for the production of behaviour, such as the neocortex and thalamus, are integrated with reward signalling to reinforce successful, purposive actions. The dorsal striatum, a major input structure of basal ganglia, is composed of two opponent pathways, direct and indirect, thought to select actions that elicit positive outcomes and suppress actions that do not, respectively. Activity-dependent plasticity modulated by reward is thought to be sufficient for selecting actions in the striatum. Although perturbations of basal ganglia function produce profound changes in movement, it remains unknown whether activity-dependent plasticity is sufficient to produce learned changes in movement kinematics, such as velocity. Here we use cell-type-specific stimulation in mice delivered in closed loop during movement to demonstrate that activity in either the direct or indirect pathway is sufficient to produce specific and sustained increases or decreases in velocity, without affecting action selection or motivation. These behavioural changes were a form of learning that accumulated over trials, persisted after the cessation of stimulation, and were abolished in the presence of dopamine antagonists. Our results reveal that the direct and indirect pathways can each bidirectionally control movement velocity, demonstrating unprecedented specificity and flexibility in the control of volition by the basal ganglia.
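The closed-loop element can be sketched as a simple data-driven trigger: stimulate when the current movement's peak velocity exceeds a percentile of recently observed movements. The percentile rule, buffer size, and simulated velocities below are assumptions for illustration, not the study's exact protocol.

```python
from collections import deque
import random

recent = deque(maxlen=50)          # rolling buffer of recent peak velocities

def should_stimulate(peak_velocity, percentile=0.67):
    """Trigger stimulation when this movement is faster than most recent ones."""
    if len(recent) < recent.maxlen:            # still building a baseline
        recent.append(peak_velocity)
        return False
    threshold = sorted(recent)[int(percentile * len(recent))]
    recent.append(peak_velocity)
    return peak_velocity > threshold

random.seed(0)
stims = sum(should_stimulate(random.gauss(10, 2)) for _ in range(200))
print(f'stimulated on {stims} of 200 movements')
```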
Deep learning describes a class of machine learning algorithms that are capable of combining raw inputs into layers of intermediate features. These algorithms have recently shown impressive results across a variety of domains. Biology and medicine are data-rich disciplines, but the data are complex and often ill-understood. Hence, deep learning techniques may be particularly well suited to solve problems of these fields. We examine applications of deep learning to a variety of biomedical problems—patient classification, fundamental biological processes and treatment of patients—and discuss whether deep learning will be able to transform these tasks or if the biomedical sphere poses unique challenges. Following from an extensive literature review, we find that deep learning has yet to revolutionize biomedicine or definitively resolve any of the most pressing challenges in the field, but promising advances have been made on the prior state of the art. Even though improvements over previous baselines have been modest in general, the recent progress indicates that deep learning methods will provide valuable means for speeding up or aiding human investigation. Though progress has been made linking a specific neural network's prediction to input features, understanding how users should interpret these models to make testable hypotheses about the system under study remains an open challenge. Furthermore, the limited amount of labelled data for training presents problems in some domains, as do legal and privacy constraints on work with sensitive health records. Nonetheless, we foresee deep learning enabling changes at both bench and bedside with the potential to transform several areas of biology and medicine.
Major resources are now available to develop tools and technologies aimed at dissecting the circuitry and computations underlying behavior, unraveling the underpinnings of brain disorders, and understanding the neural substrates of cognition. Scientists from around the world shared their views around new tools and technologies to drive advances in neuroscience.
Neural stem cells show age-dependent developmental potentials, as evidenced by their production of distinct neuron types at different developmental times. Drosophila neuroblasts produce long, stereotyped lineages of neurons. We searched for factors that could regulate neural temporal fate by RNA-sequencing lineage-specific neuroblasts at various developmental times. We found that two RNA-binding proteins, IGF-II mRNA-binding protein (Imp) and Syncrip (Syp), display opposing high-to-low and low-to-high temporal gradients with lineage-specific temporal dynamics. Imp and Syp promote early and late fates, respectively, in both a slowly progressing and a rapidly changing lineage. Imp and Syp control neuronal fates in the mushroom body lineages by regulating the temporal transcription factor Chinmo translation. Together, the opposing Imp/Syp gradients encode stem cell age, specifying multiple cell fates within a lineage.
Two-photon excitation fluorescence microscopy has revolutionized our understanding of brain structure and function through the high resolution and large penetration depth it offers. Investigating neural structures in vivo requires gaining optical access to the brain, which is typically achieved by replacing a part of the skull with one or several layers of cover glass windows. To compensate for the spherical aberrations caused by the presence of these layers of glass, collar-correction objectives are typically used. However, the efficiency of this correction has been shown to depend significantly on the tilt angle between the glass window surface and the optical axis of the imaging system. Here, we first expand these observations and characterize the effect of the tilt angle on the collected fluorescence signal with thicker windows (double cover glass) and compare these results with an objective without collar correction. Second, we present a simple optical alignment device designed to rapidly minimize the tilt angle in vivo and align the optical axis of the microscope perpendicular to the glass window, to an angle below 0.25°, thereby significantly improving imaging quality. Finally, we describe a tilt-correction procedure for users in an in vivo setting, enabling accurate alignment with a resolution of <0.2° in only a few iterations.
