2768 Janelia Publications
We present ilastik, an easy-to-use interactive tool that brings machine-learning-based (bio)image analysis to end users without substantial computational expertise. It contains pre-defined workflows for image segmentation, object classification, counting and tracking. Users adapt the workflows to the problem at hand by interactively providing sparse training annotations for a nonlinear classifier. ilastik can process data in up to five dimensions (3D, time and number of channels). Its computational back end runs operations on-demand wherever possible, allowing for interactive prediction on data larger than RAM. Once the classifiers are trained, ilastik workflows can be applied to new data from the command line without further user interaction. We describe all ilastik workflows in detail, including three case studies and a discussion on the expected performance.
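The workflow the abstract describes (per-pixel features, a nonlinear classifier trained on sparse annotations, dense prediction over the whole image) can be sketched in a few lines. This is an illustrative stand-in, not ilastik's API: the box-blur feature bank and the nearest-centroid classifier, substituting for ilastik's filter features and random forest, are simplifications.

```python
import numpy as np

def box_blur(img, r):
    """Crude smoothing feature, standing in for ilastik's Gaussian filters."""
    out = np.zeros(img.shape, dtype=float)
    count = 0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            count += 1
    return out / count

def pixel_features(img, scales=(1, 2)):
    """Per-pixel feature stack: raw intensity plus smoothed copies."""
    return np.stack([img.astype(float)] + [box_blur(img, s) for s in scales],
                    axis=-1)

def train_and_predict(img, labels):
    """labels: 0 = unlabeled, 1..K = sparse user scribbles.
    Nearest-centroid classification in feature space (ilastik uses a
    random forest; this is a deliberate simplification)."""
    feats = pixel_features(img)
    classes = np.unique(labels[labels > 0])
    centroids = np.stack([feats[labels == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(feats[:, :, None, :] - centroids, axis=-1)
    return classes[np.argmin(dists, axis=-1)]

# Toy example: a bright square, one scribble per class.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
labels = np.zeros((32, 32), dtype=int)
labels[2, 2] = 1        # background scribble
labels[16, 16] = 2      # object scribble
seg = train_and_predict(img, labels)
```

Two single-pixel scribbles suffice to segment the toy image, which is the interactive premise: the user annotates sparsely, the classifier predicts densely.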
BACKGROUND: Kidney epithelial cells perform complex vectorial fluid and solute transport at high volumes and rapid rates. Their structural organization both reflects and enables these sophisticated physiological functions. However, our understanding of the nanoscale spatial organization and intracellular ultrastructure that underlies these crucial cellular functions remains limited. METHODS: To address this knowledge gap, we generated and reconstructed an extensive electron microscopic dataset of renal proximal tubule (PT) epithelial cells at isotropic resolutions down to 4 nm. We employed artificial intelligence-based segmentation tools to identify, trace, and measure all major subcellular components. We complemented this analysis with immunofluorescence microscopy to connect subcellular architecture to biochemical function. RESULTS: Our ultrastructural analysis revealed complex organization of membrane-bound compartments in proximal tubule cells. The apical endocytic system featured deep invaginations connected to an anastomosing meshwork of dense apical tubules, rather than discrete structures. The endoplasmic reticulum displayed distinct structural domains: fenestrated sheets in the basolateral region and smaller, disconnected clusters in the subapical region. We identified, quantified, and visualized membrane contact sites between endoplasmic reticulum, plasma membrane, mitochondria, and apical endocytic compartments. Immunofluorescence microscopy demonstrated distinct localization patterns for endoplasmic reticulum resident proteins at mitochondrial and plasma membrane interfaces. CONCLUSIONS: This study provides novel insights into proximal tubule cell organization, revealing specialized compartmentalization and unexpected connections between membrane-bound organelles. We identified previously uncharacterized structures, including mitochondria-plasma membrane bridges and an interconnected endocytic meshwork, suggesting mechanisms for efficient energy distribution, cargo processing, and structural support. Morphological differences between the 4 nm and 8 nm datasets indicate subsegment-specific specializations within the proximal tubule. This comprehensive open-source dataset provides a foundation for understanding how subcellular architecture supports specialized epithelial function in health and disease.
Context plays a foundational role in determining how to interpret potentially fear-producing stimuli, yet the precise neurobiological substrates of context are poorly understood. In this issue of Cell, Xu et al. elegantly show that parallel neuronal circuits are necessary for two distinct roles of context in fear conditioning.
The olfactory system encodes information about molecules by spatiotemporal patterns of activity across distributed populations of neurons and extracts information from these patterns to control specific behaviors. Recent studies used in vivo recordings, optogenetics, and other methods to analyze the mechanisms by which odor information is encoded and processed in the olfactory system, the functional connectivity within and between olfactory brain areas, and the impact of spatiotemporal patterning of neuronal activity on higher-order neurons and behavioral outputs. The results give rise to a multifaceted picture of olfactory processing and provide insights into fundamental mechanisms underlying neuronal computations. This review focuses on some of this work presented in a Mini-Symposium at the Annual Meeting of the Society for Neuroscience in 2012.
Fluorescence image co-localization analysis is widely utilized to suggest biomolecular interaction. However, there exists some confusion as to its correct implementation and interpretation. In reality, co-localization analysis consists of at least two distinct sets of methods, termed co-occurrence and correlation. Each approach has inherent and often contrasting strengths and weaknesses. Yet, neither one can be considered to always be preferable for any given application. Rather, each method is most appropriate for answering different types of biological question. This Review discusses the main factors affecting multicolor image co-occurrence and correlation analysis, while giving insight into the types of biological behavior that are better suited to one approach or the other. Further, the limits of pixel-based co-localization analysis are discussed in the context of increasingly popular super-resolution imaging techniques.
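The two families of methods the Review distinguishes are easy to state concretely. Below is a minimal sketch of a Manders-style co-occurrence coefficient and pixel-wise Pearson correlation for a two-channel image; note that the thresholds are supplied by hand here, whereas real analyses usually derive them in a principled way, which is exactly the kind of implementation detail the Review flags.

```python
import numpy as np

def manders(ch1, ch2, t1, t2):
    """Manders co-occurrence: the fraction of each channel's total
    intensity located where the OTHER channel exceeds its threshold."""
    m1 = ch1[ch2 > t2].sum() / ch1.sum()
    m2 = ch2[ch1 > t1].sum() / ch2.sum()
    return m1, m2

def pearson(ch1, ch2):
    """Pixel-wise Pearson correlation between two channels."""
    a = ch1 - ch1.mean()
    b = ch2 - ch2.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

# Two perfectly co-localized channels that differ only in gain.
ch1 = np.arange(1.0, 17.0).reshape(4, 4)
ch2 = 2.0 * ch1
m1, m2 = manders(ch1, ch2, t1=0.0, t2=0.0)
r = pearson(ch1, ch2)
```

The contrast between the two is visible already here: co-occurrence asks only "is signal present in both places", so it is insensitive to the gain difference, while correlation additionally measures whether intensities co-vary.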
Light sheet microscopy is a powerful technique for high-speed three-dimensional imaging of subcellular dynamics and large biological specimens. However, it often generates datasets ranging from hundreds of gigabytes to petabytes in size for a single experiment. Conventional computational tools process such images far slower than the time to acquire them and often fail outright due to memory limitations. To address these challenges, we present PetaKit5D, a scalable software solution for efficient petabyte-scale light sheet image processing. This software incorporates a suite of commonly used processing tools that are optimized for memory and performance. Notable advancements include rapid image readers and writers, fast and memory-efficient geometric transformations, high-performance Richardson-Lucy deconvolution and scalable Zarr-based stitching. These features outperform state-of-the-art methods by over one order of magnitude, enabling the processing of petabyte-scale image data at the full teravoxel rates of modern imaging cameras. The software opens new avenues for biological discoveries through large-scale imaging experiments.
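Richardson-Lucy deconvolution, one of the operations PetaKit5D optimizes, has a compact textbook form. The sketch below shows only the multiplicative update rule on a toy image; it says nothing about PetaKit5D's actual memory- and performance-optimized implementation.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
    """Textbook Richardson-Lucy iteration via FFTs (circular boundary).
    The PSF is expected centered in the array and normalized to sum 1."""
    H = np.fft.fft2(np.fft.ifftshift(psf))   # transfer function
    est = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        blurred = np.real(np.fft.ifft2(np.fft.fft2(est) * H))
        ratio = observed / (blurred + eps)
        # Multiply by the adjoint blur (conjugate transfer function).
        est = est * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(H)))
    return est

# Toy deconvolution: two point sources blurred by a Gaussian PSF.
n = 32
yy, xx = np.mgrid[:n, :n]
psf = np.exp(-((yy - n // 2) ** 2 + (xx - n // 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
truth = np.zeros((n, n))
truth[10, 10] = 1.0
truth[20, 20] = 0.5
H = np.fft.fft2(np.fft.ifftshift(psf))
observed = np.real(np.fft.ifft2(np.fft.fft2(truth) * H))
est = richardson_lucy(observed, psf, n_iter=50)
```

The iteration conserves total flux and progressively sharpens the blurred point sources; at petabyte scale the challenge is doing exactly this without ever holding the full volume in memory.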
MOTIVATION: Serial section microscopy is an established method for detailed anatomical reconstruction of biological specimens. During the last decade, high-resolution electron microscopy (EM) of serial sections has become the de facto standard for reconstruction of neural connectivity at ever increasing scales (EM connectomics). In serial section microscopy, the axial dimension of the volume is sampled by physically removing thin sections from the embedded specimen and subsequently imaging either the block-face or the section series. This process has limited precision, leading to inhomogeneous, non-planar sampling of the axial dimension of the volume, which, in turn, results in distorted image volumes. In addition, section series may be collected and imaged in unknown order. RESULTS: We developed methods to identify and correct these distortions through image-based signal analysis without any additional physical apparatus or measurements. We demonstrate the efficacy of our methods in proof-of-principle experiments and applications to real-world problems. AVAILABILITY AND IMPLEMENTATION: We made our work available as libraries for the ImageJ distribution Fiji and for deployment in a high-performance parallel computing environment. Our sources are open and available at http://github.com/saalfeldlab/section-sort, http://github.com/saalfeldlab/z-spacing, and http://github.com/saalfeldlab/z-spacing-spark. CONTACT: [email protected]. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
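The section-ordering problem can be illustrated with a toy: if image similarity decays monotonically with axial distance, a shuffled series can be re-ordered from pairwise similarities alone. The greedy sketch below captures only this core intuition; the published method is substantially more robust than nearest-neighbor chaining.

```python
import numpy as np

def similarity_matrix(sections):
    """Pairwise similarity (negative Euclidean distance) between sections."""
    n = len(sections)
    s = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            s[i, j] = -np.linalg.norm(sections[i] - sections[j])
    return s

def recover_order(sections):
    """Greedy chaining: assuming similarity decays with axial distance,
    each section's strongest match is its true neighbor."""
    s = similarity_matrix(sections)
    # An endpoint is, on average, least similar to all other sections.
    start = int(np.argmin(s.sum(axis=1)))
    np.fill_diagonal(s, -np.inf)
    order, used = [start], {start}
    while len(order) < len(sections):
        ranked = np.argsort(s[order[-1]])[::-1]   # most similar first
        nxt = next(int(c) for c in ranked if int(c) not in used)
        order.append(nxt)
        used.add(nxt)
    return order

# Synthetic series: Gaussian bumps drifting along x, collected shuffled.
x = np.linspace(-4.0, 11.0, 128)
perm = [3, 0, 6, 1, 7, 2, 5, 4]            # unknown collection order
sections = [np.exp(-(x - p) ** 2 / 8.0) for p in perm]
order = recover_order(sections)
recovered = [perm[i] for i in order]        # axial positions, re-ordered
```

Because the signal is symmetric, the series can only be recovered up to reversal, which mirrors the real setting: image content alone cannot distinguish a stack from its flip.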
Genome-wide CRISPR screens have transformed our ability to systematically interrogate human gene function, but are currently limited to a subset of cellular phenotypes. We report a novel pooled screening approach for a wider range of cellular and subtle subcellular phenotypes. Machine learning and convolutional neural network models are trained on the subcellular phenotype to be queried. Genome-wide screening then utilizes cells stably expressing dCas9-KRAB (CRISPRi), photoactivatable fluorescent protein (PA-mCherry), and a lentiviral guide RNA (gRNA) pool. Cells are screened by using microscopy and classified by artificial intelligence (AI) algorithms, which precisely identify the genetically altered phenotype. Cells with the phenotype of interest are photoactivated and isolated via flow cytometry, and the gRNAs are identified by sequencing. A proof-of-concept screen accurately identified PINK1 as essential for Parkin recruitment to mitochondria. A genome-wide screen identified factors mediating TFEB relocation from the nucleus to the cytosol upon prolonged starvation. Twenty-one of the 64 hits called by the neural network model were independently validated, revealing new effectors of TFEB subcellular localization. This approach, AI-photoswitchable screening (AI-PS), offers a novel screening platform capable of classifying a broad range of mammalian subcellular morphologies, an approach largely unattainable with current methodologies at genome-wide scale.
We present STIM, an imaging-based computational framework for exploring, visualizing, and processing high-throughput spatial sequencing datasets. STIM is built on the powerful ImgLib2, N5 and BigDataViewer (BDV) frameworks enabling transfer of computer vision techniques to datasets with irregular measurement-spacing and arbitrary spatial resolution, such as spatial transcriptomics data generated by multiplexed targeted hybridization or spatial sequencing technologies. We illustrate STIM’s capabilities by representing, visualizing, and automatically registering publicly available spatial sequencing data from 14 serial sections of mouse brain tissue.
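The kind of image-based rendering STIM applies to irregularly spaced measurements can be illustrated by Gaussian splatting of scattered values onto a pixel grid. This is a generic sketch, not STIM's implementation, which builds on ImgLib2/N5/BDV and renders lazily at arbitrary resolution.

```python
import numpy as np

def render(coords, values, shape, sigma=1.0):
    """Splat irregularly spaced measurements onto a regular pixel grid,
    weighting each measurement by a Gaussian of its distance to the pixel."""
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    num = np.zeros(shape)
    den = np.zeros(shape)
    for (cy, cx), v in zip(coords, values):
        w = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * sigma ** 2))
        num += w * v
        den += w
    return num / np.maximum(den, 1e-12)

# Two scattered expression measurements rendered onto a 10x10 grid.
coords = [(2.0, 2.0), (7.0, 7.0)]
values = [1.0, 3.0]
img = render(coords, values, (10, 10))
```

Once the irregular measurements live on a regular grid like this, standard computer-vision tooling (filtering, registration, visualization) applies directly, which is the premise of the framework.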
