2638 Janelia Publications
In natural environments, animals must efficiently allocate their choices across multiple concurrently available resources when foraging, a complex decision-making process not fully captured by existing models. To understand how rodents learn to navigate this challenge, we developed a novel paradigm in which untrained, water-restricted mice were free to sample from six options rewarded at a range of deterministic intervals and positioned around the walls of a large (~2 m) arena. Mice exhibited rapid learning, matching their choices to integrated reward ratios across the six options within the first session. A reinforcement learning model with separate states for staying at or leaving an option and a dynamic, global learning rate accurately reproduced mouse learning and decision-making. Fiber photometry recordings revealed that dopamine in the nucleus accumbens core (NAcC), but not the dorsomedial striatum (DMS), more closely reflected the global learning rate than local error-based updating. Altogether, our results provide insight into the neural substrate of a learning algorithm that allows mice to rapidly exploit multiple options when foraging in large spatial environments.
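The value-learning scheme the abstract describes can be sketched in a few lines. This is a hedged simplification, not the authors' model: it omits the separate stay/leave states and uses invented reward intervals and an invented annealing rule for the global learning rate, purely to illustrate how a single shared, decaying rate drives value updates across six deterministically refilling options.

```python
import numpy as np

# Hedged sketch (not the authors' model): value learning over six options
# with one global learning rate that itself decays as learning proceeds.
# Reward intervals and the annealing schedule are invented for illustration.
rng = np.random.default_rng(0)

n_options = 6
reward_intervals = np.array([10, 20, 30, 60, 90, 120])  # hypothetical refill times (s)
values = np.ones(n_options)        # estimated value of each option
alpha = 0.5                        # global learning rate (shared across options)
last_visit = np.zeros(n_options)   # time each option was last sampled

t = 0.0
for step in range(2000):
    # softmax choice over current value estimates
    p = np.exp(values) / np.exp(values).sum()
    choice = rng.choice(n_options, p=p)
    t += 5.0  # assume ~5 s of travel/handling per choice
    # deterministic-interval reward: available only if enough time elapsed
    reward = 1.0 if (t - last_visit[choice]) >= reward_intervals[choice] else 0.0
    last_visit[choice] = t
    # value update driven by the single global learning rate
    values[choice] += alpha * (reward - values[choice])
    alpha = max(0.05, alpha * 0.999)  # global rate anneals over learning

print(np.round(p, 3))  # choice probabilities late in learning
```

Under matching behaviour, the late-session choice fractions should track the relative reward rates of the options rather than being uniform.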
Considerable attention has recently been paid to improving replicability and reproducibility in life science research. This has resulted in commendable efforts to standardize a variety of reagents, assays, cell lines, and other resources. However, given that microscopy is a dominant tool for biologists, comparatively little discussion has addressed how microscopy-relevant details should be properly reported and documented. Image processing is a critical step of almost any microscopy-based experiment, yet improper or incomplete reporting of its use in the literature is pervasive. The chosen details of an image processing workflow can dramatically determine the outcome of subsequent analyses and, indeed, the overall conclusions of a study. This Review aims to illustrate how proper reporting of image processing methodology improves scientific reproducibility and strengthens the biological conclusions derived from the results.
Olshausen and Field (OF) proposed that neural computations in the primary visual cortex (V1) can be partially modelled by sparse dictionary learning. By minimizing the regularized representation error they derived an online algorithm, which learns Gabor-filter receptive fields from a natural image ensemble in agreement with physiological experiments. Whereas the OF algorithm can be mapped onto the dynamics and synaptic plasticity of a single-layer neural network, the derived learning rule is nonlocal (the synaptic weight update depends on the activity of neurons other than just the pre- and postsynaptic ones) and hence biologically implausible. Here, to overcome this problem, we derive sparse dictionary learning from a novel cost function: a regularized error of the symmetric factorization of the input's similarity matrix. Our algorithm maps onto a neural network with the same architecture as OF but using only biologically plausible local learning rules. When trained on natural images, our network learns Gabor-filter receptive fields and reproduces the correlation among synaptic weights hard-wired into the OF network. Therefore, online symmetric matrix factorization may serve as an algorithmic theory of neural computation.
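The objective that OF-style models minimize can be made concrete with a small numerical sketch. This is a hedged illustration, not the paper's neural dynamics or its similarity-matching cost: it solves the standard sparse coding problem ||x − Da||²/2 + λ||a||₁ with ISTA (a textbook proximal-gradient method) on a synthetic dictionary and input.

```python
import numpy as np

# Hedged sketch of the sparse coding objective, ||x - D a||^2 / 2 + lam*||a||_1,
# solved with ISTA. D and x are synthetic; this is NOT the papers' algorithm.
rng = np.random.default_rng(0)
d, k = 64, 128
D = rng.standard_normal((d, k))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary columns
s_true = np.zeros(k)
s_true[rng.choice(k, 6, replace=False)] = rng.standard_normal(6)
x = D @ s_true                            # input with a sparse ground truth

lam = 0.1
step = 1.0 / np.linalg.norm(D, 2) ** 2    # 1/L, L = largest eigenvalue of D^T D
a = np.zeros(k)
for _ in range(500):
    z = a - step * (D.T @ (D @ a - x))    # gradient step on the quadratic term
    a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print(np.count_nonzero(a), round(float(np.linalg.norm(x - D @ a)), 3))
```

Because ISTA monotonically decreases the objective, the reconstruction error falls well below the norm of the input while the code stays sparse; the papers' contribution is a network that reaches a comparable solution with purely local plasticity rules.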
A large number of degrees of freedom is required to produce a high-quality focus through random scattering media. Previous demonstrations based on spatial phase modulation suffer from either slow speed or a small number of degrees of freedom. In this work, a high-speed wavefront determination technique based on spatial frequency domain wavefront modulation is proposed and experimentally demonstrated, providing both high operation speed and a large number of degrees of freedom. The technique was employed to focus light through a strongly scattering medium; the entire wavefront was determined in 400 milliseconds, three orders of magnitude faster than previously reported.
We demonstrate a high-throughput, large-compensation-range, single-prism femtosecond pulse compressor using one prism and two roof mirrors. The compressor has zero angular dispersion, zero spatial dispersion, zero pulse-front tilt, and unity magnification. The high efficiency is achieved by adopting two roof mirrors as the retroreflectors. We experimentally achieved ~ -14,500 fs² of group delay dispersion (GDD) with 30 cm of prism-tip-to-roof-mirror separation, and ~90.7% system throughput with the current implementation. With better components, the throughput can be even higher.
To provide a temporal framework for the genoarchitecture of brain development, we generated in situ hybridization data for embryonic and postnatal mouse brain at seven developmental stages for ∼2,100 genes, which were processed with an automated informatics pipeline and manually annotated. This resource comprises 434,946 images, seven reference atlases, an ontogenetic ontology, and tools to explore coexpression of genes across neurodevelopment. Gene sets coinciding with developmental phenomena were identified. A temporal shift in the principles governing the molecular organization of the brain was detected, with transient neuromeric, plate-based organization of the brain present at E11.5 and E13.5. Finally, these data provided a transcription factor code that discriminates brain structures and identifies the developmental age of a tissue, providing a foundation for eventual genetic manipulation or tracking of specific brain structures over development. The resource is available as the Allen Developing Mouse Brain Atlas (http://developingmouse.brain-map.org).
Assays that measure morphology, proliferation, motility, deformability, and migration are used to study the invasiveness of cancer cells. However, native invasive potential of cells may be hidden from these contextual metrics because they depend on culture conditions. We created a micropatterned chip that mimics the native environmental conditions, quantifies the invasive potential of tumor cells, and improves our understanding of the malignancy signatures. Unlike conventional assays, which rely on indirect measurements of metastatic potential, our method uses three-dimensional microchannels to measure the basal native invasiveness without chemoattractants or microfluidics. No change in cell death or proliferation is observed on our chips. Using six cancer cell lines, we show that our system is more sensitive than other motility-based assays, measures of nuclear deformability, or cell morphometrics. In addition to quantifying metastatic potential, our platform can distinguish between motility and invasiveness, help study molecular mechanisms of invasion, and screen for targeted therapeutics.
Although most experimentally characterized proteins with similar sequences assume the same folds and perform similar functions, an increasing number of exceptions are emerging. One class of exceptions comprises sequence-similar fold switchers, whose secondary structures shift between α-helix and β-sheet through a small number of mutations, a sequence insertion, or a deletion. Predictive methods for identifying sequence-similar fold switchers are desirable because some are associated with disease and/or can perform different functions in cells. Here, we use homology-based secondary structure predictions to identify sequence-similar fold switchers from their amino acid sequences alone. To do this, we predicted the secondary structures of sequence-similar fold switchers using three homology-based secondary structure predictors: PSIPRED, JPred4, and SPIDER3. We found that α-helix/β-strand prediction discrepancies from JPred4 discriminated between the different conformations of sequence-similar fold switchers with high statistical significance (P < 1.8*10 ). Thus, we used these discrepancies as a classifier and found that they can often robustly discriminate between sequence-similar fold switchers and sequence-similar proteins that maintain the same folds (Matthews correlation coefficient of 0.82). We found that JPred4 is a more robust predictor of sequence-similar fold switchers because of (a) the curated sequence database it uses to produce multiple sequence alignments and (b) its use of sequence profiles based on hidden Markov models. Our results indicate that inconsistencies between JPred4 secondary structure predictions can be used to identify some sequence-similar fold switchers from their sequences alone. Thus, the negative information from inconsistent secondary structure predictions can potentially be leveraged to identify sequence-similar fold switchers from the broad base of genomic sequences.
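The evaluation metric the abstract reports can be illustrated with a short sketch. This is a hedged illustration of the general approach, not the study's code: the discrepancy threshold and all counts below are invented, and serve only to show how a discrepancy-based binary classifier is scored with the Matthews correlation coefficient (MCC).

```python
import math

# Hedged illustration: score a binary classifier with the Matthews
# correlation coefficient (MCC), as the study does for its
# helix<->strand discrepancy classifier. Counts and the threshold
# below are invented, not the study's data.
def mcc(tp, tn, fp, fn):
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def is_putative_fold_switcher(n_discrepant_residues, threshold=5):
    # hypothetical rule: flag a protein pair when enough residues flip
    # between helix and strand across the two homologs' predictions
    return n_discrepant_residues >= threshold

# perfect classification gives MCC = 1; chance level gives MCC ~ 0
print(round(mcc(45, 48, 5, 2), 2))   # -> 0.86
```

MCC is preferred over raw accuracy here because it stays near zero for a classifier that ignores the minority class, which matters when true fold switchers are rare.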
Animal sensory systems are optimally adapted to the features typically encountered in natural surrounds, allowing neurons with limited bandwidth to encode challengingly large input ranges. Natural scenes are not random, and peripheral visual systems in vertebrates and insects have evolved to respond efficiently to their typical spatial statistics. The mammalian visual cortex is also tuned to natural spatial statistics, but less is known about coding in higher-order neurons in insects. To redress this, we record intracellularly from a higher-order visual neuron in the hoverfly. We show that the cSIFE neuron, which is inhibited by stationary images, is maximally inhibited when the slope constant of the amplitude spectrum is close to the mean found in natural scenes. The behavioural optomotor response is also strongest to images with naturalistic image statistics. Our results thus reveal a close coupling between the inherent statistics of natural scenes and higher-order visual processing in insects.
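The image statistic at the heart of this abstract, the slope constant of the amplitude spectrum, can be estimated numerically. This is a hedged sketch, not the authors' stimulus code: natural scenes show amplitude ∝ 1/f^α with α near 1, and here a synthetic 1/f image stands in for a natural scene so the fit can be checked against a known ground truth.

```python
import numpy as np

# Hedged sketch: estimate the slope constant alpha of an image's
# amplitude spectrum (amplitude ~ 1/f^alpha; alpha is near 1 for
# natural scenes). A synthetic 1/f image stands in for a real scene.
rng = np.random.default_rng(1)
n = 256
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
f = np.sqrt(fx**2 + fy**2)
f[0, 0] = 1.0                        # dummy value: avoid division by zero at DC

# synthesize an image with ~1/f amplitude spectrum and random phases
spectrum = np.exp(2j * np.pi * rng.random((n, n))) / f
img = np.real(np.fft.ifft2(spectrum))

# estimate alpha: fit a line to log(amplitude) vs log(frequency)
amp = np.abs(np.fft.fft2(img))
band = (f > 1.0 / n) & (f < 0.5)     # exclude DC and the highest frequencies
slope, _ = np.polyfit(np.log(f[band]), np.log(amp[band]), 1)
alpha = -slope
print(round(alpha, 2))               # should land near 1 for 1/f statistics
```

Varying α of such synthetic images around the natural-scene mean is the standard way to probe how sharply a neuron's response is tuned to naturalistic spectral statistics.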
Imaging is used to map activity across populations of neurons. Microscopes with cellular resolution have small (<1 mm) fields of view and cannot simultaneously image activity distributed across multiple brain areas. Typical large-field-of-view microscopes do not resolve single cells, especially in the axial dimension. We developed a 2-photon random access mesoscope (2p-RAM) that allows high-resolution imaging anywhere within a volume spanning multiple brain areas (∅ 5 mm × 1 mm cylinder). 2p-RAM resolution is near diffraction-limited (lateral 0.66 μm, axial 4.09 μm at the center; excitation wavelength = 970 nm; numerical aperture = 0.6) over a large range of excitation wavelengths. A fast three-dimensional scanning system allows efficient sampling of neural activity in arbitrary regions of interest across the entire imaging volume. We illustrate the use of the 2p-RAM by imaging neural activity in multiple, non-contiguous brain areas in transgenic mice expressing protein calcium sensors.