High-resolution microscopic imaging of biological samples often produces multiple 3D image tiles to cover a large field of view of the specimen. Each tile is typically large, in the range of hundreds of megabytes to several gigabytes. For many of our image data sets, existing software tools are unable to stitch these 3D tiles into a panoramic view, which impedes further data analysis. We propose a simple yet accurate, robust, and automatic method to stitch a group of image tiles without a priori knowledge of their adjacency. We first use a multiscale strategy to register a pair of 3D image tiles rapidly, achieving roughly 8-10 times faster speed and 10 times lower memory requirements than previous methods. We then design a minimum-spanning-tree-based method to determine the optimal adjacency of the tiles. We have successfully stitched large image stacks of model animals, including C. elegans, fruit fly, dragonfly, and mouse, that could not be stitched by several existing methods.
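The adjacency step can be illustrated with a minimal sketch: score every pair of tiles by how well they register, keep only the most reliable pairings via a minimum spanning tree, and place tiles by walking that tree. This is an assumption-laden illustration, not the authors' implementation; `register_pair` and `stitch_offsets` are hypothetical names.

```python
# Hedged sketch: MST over pairwise registration scores to decide tile adjacency.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def stitch_offsets(tiles, register_pair):
    """tiles: list of 3D arrays; register_pair(a, b) -> (offset_xyz, score in [0, 1])."""
    n = len(tiles)
    cost = np.zeros((n, n))
    offsets = {}
    for i in range(n):
        for j in range(i + 1, n):
            off, score = register_pair(tiles[i], tiles[j])  # e.g. multiscale registration
            offsets[(i, j)] = np.asarray(off)
            cost[i, j] = 1.0 - score          # low cost = reliable overlap
    # The MST keeps only the n-1 most reliable pairings, which defines adjacency.
    tree = minimum_spanning_tree(csr_matrix(cost)).toarray()
    # Place tiles by walking the tree from tile 0 and accumulating offsets.
    placed, stack = {0: np.zeros(3)}, [0]
    while stack:
        i = stack.pop()
        for j in range(n):
            if (tree[i, j] or tree[j, i]) and j not in placed:
                off = offsets[(min(i, j), max(i, j))]
                placed[j] = placed[i] + (off if i < j else -off)
                stack.append(j)
    return placed  # tile index -> global offset
```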
The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but it requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.
A quantitative description of animal social behaviour is informative for behavioural biologists and clinicians developing drugs to treat social disorders. Social interaction in a group of animals has been difficult to measure because behaviour develops over long periods of time and requires tedious manual scoring, which is subjective and often non-reproducible. Computer-vision systems with the ability to measure complex social behaviour automatically would have a transformative impact on biology. Here, we present a method for tracking group-housed mice individually as they freely interact over multiple days. Each mouse is bleach-marked with a unique fur pattern. The patterns are automatically learned by the tracking software and used to infer identities. Trajectories are analysed to measure behaviour as it develops over days, beyond the range of acute experiments. We demonstrate how our system may be used to study the development of place preferences, associations and social relationships by tracking four mice continuously for five days. Our system enables accurate and reproducible characterisation of wild-type mouse social behaviour and paves the way for high-throughput long-term observation of the effects of genetic, pharmacological and environmental manipulations.
Reconstruction of neural circuitry at single-synapse resolution is an attractive target for improving understanding of the nervous system in health and disease. Serial section transmission electron microscopy (ssTEM) is among the most prolific imaging methods employed in pursuit of such reconstructions. We demonstrate how Flood-Filling Networks (FFNs) can be used to computationally segment a forty-teravoxel whole-brain Drosophila ssTEM volume. To compensate for data irregularities and imperfect global alignment, FFNs were combined with procedures that locally re-align serial sections and dynamically adjust image content. The proposed approach produced a largely merger-free segmentation of the entire ssTEM Drosophila brain, which we make freely available. As compared to manual tracing using an efficient skeletonization strategy, the segmentation enabled circuit reconstruction and analysis workflows that were an order of magnitude faster.
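Conceptually, flood filling grows one object at a time from a seed: a local model repeatedly predicts which voxels in its field of view belong to the seeded object, the mask is updated, and the field of view is re-centered on confident new voxels. The sketch below illustrates that loop under stated assumptions; `predict_mask` is a hypothetical stand-in for the trained network, and this is not the published FFN code.

```python
# Hedged sketch of the flood-filling idea: grow one object mask from a seed.
import numpy as np
from collections import deque

def flood_fill_segment(volume, seed, predict_mask, fov=33, threshold=0.9):
    """volume: 3D ndarray; seed: (z, y, x); predict_mask(patch, mask_patch) -> probabilities."""
    mask = np.zeros(volume.shape, dtype=np.float32)
    mask[seed] = 0.95                      # seed the object
    queue, visited = deque([seed]), {seed}
    r = fov // 2
    while queue:
        z, y, x = queue.popleft()
        sl = np.s_[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1]
        patch, mask_patch = volume[sl], mask[sl]
        if patch.shape != (fov, fov, fov):
            continue                       # skip positions too close to the border
        probs = predict_mask(patch, mask_patch)   # local object-probability map
        mask[sl] = np.maximum(mask[sl], probs)
        # Re-center the field of view on confident voxels at its boundary.
        for dz, dy, dx in zip(*np.nonzero(probs > threshold)):
            if max(abs(dz - r), abs(dy - r), abs(dx - r)) == r:
                nxt = (z + dz - r, y + dy - r, x + dx - r)
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append(nxt)
    return mask > 0.5
```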
Digital reconstruction of neurons from microscope images is an important and challenging problem in neuroscience. In this paper, we propose a model-based method to tackle this problem. We first formulate a model structure and then develop an algorithm for computing it, carefully taking into account the morphological characteristics of neurons as well as the image properties under typical imaging protocols. The method has been tested on the data sets used in the DIADEM competition and produced promising results for four out of the five data sets.
We present a method to automatically identify and track nuclei in time-lapse microscopy recordings of entire developing embryos. The method combines deep learning and global optimization. On a mouse dataset, it reconstructs 75.8% of cell lineages spanning 1 h, as compared to 31.8% for the competing method. Our approach improves understanding of where and when cell fate decisions are made in developing embryos, tissues, and organs.
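As one concrete (and much simplified) instance of combining per-frame detection with global optimization, nuclei detected in consecutive frames can be linked by solving an assignment problem over centroid distances. The function below is a hedged sketch, not the published pipeline; the deep-learning detection step is assumed to have already produced the centroid arrays.

```python
# Hedged sketch: link nuclei between consecutive frames by global assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def link_frames(prev_centroids, next_centroids, max_dist=10.0):
    """Return (i, j) links between detections in consecutive frames."""
    cost = cdist(prev_centroids, next_centroids)      # Euclidean distances
    rows, cols = linear_sum_assignment(cost)          # globally optimal matching
    # Drop implausible links so nuclei can appear, disappear, or divide.
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```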
The centrosome is a dynamic structure in animal cells that serves as a microtubule organizing center during mitosis and also regulates cell-cycle progression and sets polarity cues. Automated and reliable tracking of centrosomes is essential for genetic screens that study the process of centrosome assembly and maturation in the nematode Caenorhabditis elegans.
We have developed software for fully automated tracking of vibrissae (whiskers) in high-speed videos (>500 Hz) of head-fixed, behaving rodents trimmed to a single row of whiskers. Performance was assessed against a manually curated dataset consisting of 1.32 million video frames comprising 4.5 million whisker traces. The current implementation detects whiskers with a recall of 99.998% and identifies individual whiskers with 99.997% accuracy. The average processing rate for these images was 8 Mpx/s/cpu (2.6 GHz Intel Core2, 2 GB RAM). This translates to 35 processed frames per second for a 640 px × 352 px video of 4 whiskers. The speed and accuracy achieved enable quantitative behavioral studies in which the analysis of millions of video frames is required. We used the software to analyze the evolving whisking strategies as mice learned a whisker-based detection task over the course of 6 days (8148 trials, 25 million frames) and to measure the forces at the sensory follicle that most underlie haptic perception.
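The quoted throughput is internally consistent: at 8 Mpx/s per core, a 640 × 352 frame (225,280 pixels) takes about 28 ms, or roughly 35 frames per second. A quick check:

```python
# Quick arithmetic check of the reported processing rate.
pixels_per_frame = 640 * 352            # 225,280 px per frame
throughput = 8_000_000                  # px/s on one core, as reported
print(throughput / pixels_per_frame)    # ~35.5 frames per second
```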
Motivation: Digital reconstruction, or tracing, of 3D neuron structures is critical for reverse engineering the wiring and functions of a brain. However, despite a number of existing studies, this task is still challenging, especially when a 3D microscopic image has a low signal-to-noise ratio (SNR) and fragmented neuron segments. Published work can handle these hard situations only by introducing global prior information, such as where a neurite segment starts and terminates. However, manual incorporation of such global information can be very time consuming, so a completely automatic approach for these hard situations is highly desirable. Results: We have developed an automatic graph algorithm, called all-path pruning (APP), to trace the 3D structure of a neuron. To avoid mis-tracing parts of a neuron, APP first produces an initial over-reconstruction by tracing the optimal geodesic shortest path from the seed location to every possible destination voxel/pixel location in the image. Since this initial reconstruction contains all possible paths and thus may include redundant structural components, we simplify the entire reconstruction without compromising its connectedness by pruning the redundant components, using a new maximal-covering minimal-redundant (MCMR) subgraph algorithm. We show that MCMR has linear computational complexity and is guaranteed to converge. We examined the performance of our method using challenging 3D neuronal image datasets of model organisms (e.g. fruit fly).
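A toy version of this over-reconstruct-then-prune strategy can be sketched as follows: compute geodesic shortest paths from the seed to every bright voxel over an intensity-derived cost image, then greedily keep only paths that add sufficient new coverage. The greedy coverage test is a stand-in for the MCMR subgraph criterion; the function names and thresholds are illustrative assumptions, not the published algorithm.

```python
# Hedged sketch: over-reconstruct with geodesic shortest paths, then prune.
import numpy as np
from skimage.graph import MCP

def trace_neuron(image, seed, signal_threshold=0.5):
    """image: 2D/3D float array scaled to [0, 1]; seed: index tuple inside the neuron."""
    cost = 1.0 / (image + 1e-3)                  # bright voxels are cheap to traverse
    mcp = MCP(cost, fully_connected=True)
    cumulative, _ = mcp.find_costs([seed])       # geodesic distance from the seed
    # Candidate terminals: bright voxels, farthest from the seed first.
    terminals = np.argwhere(image > signal_threshold)
    terminals = terminals[np.argsort(-cumulative[tuple(terminals.T)])]
    covered = np.zeros(image.shape, dtype=bool)
    kept_paths = []
    for t in map(tuple, terminals):
        path = mcp.traceback(t)                  # shortest path back to the seed
        new = [p for p in path if not covered[p]]
        if len(new) > 0.2 * len(path):           # keep paths that add enough new coverage
            kept_paths.append(path)
            for p in path:
                covered[p] = True
    return kept_paths                            # the pruned reconstruction
```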