2827 Janelia Publications

Showing 351-360 of 2827 results
Funke Lab
02/23/26 | An investigation of unsupervised cell tracking and interactive fine-tuning
Lalit M, Funke J
2025 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW). 2026 Feb 23. doi: 10.1109/ICCVW69036.2025.00610

Most existing deep learning-based cell tracking methods rely on supervised learning, requiring large-scale annotated datasets that are often unavailable in real-world scenarios. Moreover, many approaches lack tools and methods for correcting mispredicted links or incorporating corrections through fine-tuning. These limitations contribute to the limited adoption of deep learning-based tracking methods in the life sciences, where manual tracking remains the predominant approach. To reduce the annotation burden and enable model training without extensive labeled data, we introduce a loss function for unsupervised training. Our method leverages the predictable dynamics inherent in many biological processes, providing an initialization that does not require an annotated dataset. We further investigate how minimal user-provided annotations can refine tracking accuracy. To this end, we propose an active learning framework that selectively identifies uncertain decisions within the tracking graph, allowing for efficient annotation of the most informative data points. We evaluate our approach on two microscopy datasets, demonstrating the effectiveness of both our unsupervised training strategy and active learning scheme in improving tracking performance. Our implementation and reproducible experiments are available at github.com/funkelab/attrackt and github.com/funkelab/attrackt_experiments, respectively.
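The active-learning scheme described above selects the most uncertain linking decisions in the tracking graph for annotation. A minimal sketch of that uncertainty-sampling idea (a hypothetical illustration only, not the authors' implementation; the real code is in the linked attrackt repositories):

```python
# Hypothetical sketch of uncertainty-based selection of tracking-graph
# links for annotation (illustrative, not the attrackt implementation).
def select_edges_for_annotation(edge_scores, budget):
    """edge_scores: dict mapping a candidate link (u, v) -> the model's
    predicted probability that the link is correct.
    Returns the `budget` links whose predictions are least certain,
    i.e. whose probabilities lie closest to 0.5."""
    ranked = sorted(edge_scores, key=lambda e: abs(edge_scores[e] - 0.5))
    return ranked[:budget]

# Toy example: three candidate links between detections in frames a and b.
scores = {("a1", "b1"): 0.97, ("a1", "b2"): 0.52, ("a2", "b2"): 0.49}
print(select_edges_for_annotation(scores, 2))  # the two near-0.5 links
```

Annotating only these ambiguous links, rather than a random sample, is what makes the annotation budget efficient: confident predictions (here the 0.97 link) are left untouched.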

View Publication Page
07/26/12 | An olfactory subsystem that mediates high-sensitivity detection of volatile amines.
Pacifico R, Dewan A, Cawley D, Guo C, Bozza T
Cell Rep. 2012 Jul 26;2(1):76-88. doi: 10.1016/j.celrep.2012.06.006

Olfactory stimuli are detected by over 1,000 odorant receptors in mice, with each receptor being mapped to specific glomeruli in the olfactory bulb. The trace amine-associated receptors (TAARs) are a small family of evolutionarily conserved olfactory receptors whose contribution to olfaction remains enigmatic. Here, we show that a majority of the TAARs are mapped to a discrete subset of glomeruli in the dorsal olfactory bulb of the mouse. This TAAR projection is distinct from the previously described class I and class II domains, and is formed by a sensory neuron population that is restricted to express TAAR genes prior to choice. We also show that the dorsal TAAR glomeruli are selectively activated by amines at low concentrations. Our data uncover a hard-wired, parallel input stream in the main olfactory pathway that is specialized for the detection of volatile amines.

View Publication Page
11/01/21 | An open-access volume electron microscopy atlas of whole cells and tissues.
Xu CS, Pang S, Shtengel G, Müller A, Ritter AT, Hoffman HK, Takemura S, Lu Z, Pasolli HA, Iyer N, Chung J, Bennett D, Weigel AV, Freeman M, Van Engelenburg SB, Walther TC, Farese RV, Lippincott-Schwartz J, Mellman I, Solimena M, Hess HF
Nature. 2021 Nov 1;599(7883):147-51. doi: 10.1038/s41586-021-03992-4

Understanding cellular architecture is essential for understanding biology. Electron microscopy (EM) uniquely visualizes cellular structures with nanometre resolution. However, traditional methods, such as thin-section EM or EM tomography, have limitations in that they visualize only a single slice or a relatively small volume of the cell, respectively. Focused ion beam-scanning electron microscopy (FIB-SEM) has demonstrated the ability to image small volumes of cellular samples with 4-nm isotropic voxels. Owing to advances in the precision and stability of FIB milling, together with enhanced signal detection and faster SEM scanning, we have increased the volume that can be imaged with 4-nm voxels by two orders of magnitude. Here we present a volume EM atlas at such resolution comprising ten three-dimensional datasets for whole cells and tissues, including cancer cells, immune cells, mouse pancreatic islets and Drosophila neural tissues. These open access data (via OpenOrganelle) represent the foundation of a field of high-resolution whole-cell volume EM and subsequent analyses, and we invite researchers to explore this atlas and pose questions.

View Publication Page
05/13/21 | An open-source semi-automated robotics pipeline for embryo immunohistochemistry.
Fuqua T, Jordan J, Halavatyi A, Tischer C, Richter K, Crocker J
Scientific Reports. 2021 May 13;11(1):10314. doi: 10.1038/s41598-021-89676-5

A significant challenge for developmental systems biology is balancing throughput with controlled conditions that minimize experimental artifacts. Large-scale developmental screens such as unbiased mutagenesis surveys have been limited in their applicability to embryonic systems, as the technologies for quantifying precise expression patterns in whole animals have not kept pace with other sequencing-based technologies. Here, we outline an open-source semi-automated pipeline to chemically fixate, stain, and 3D-image Drosophila embryos. Central to this pipeline is a liquid handling robot, Flyspresso, which automates the steps of classical embryo fixation and staining. We provide the schematics and an overview of the technology for an engineer or someone equivalently trained to reproduce and further improve upon Flyspresso, and highlight the Drosophila embryo fixation and colorimetric or antibody staining protocols. Additionally, we provide a detailed overview and stepwise protocol for our adaptive-feedback pipeline for automated embryo imaging on confocal microscopes. We demonstrate the efficiency of this pipeline compared to classical techniques, and how it can be repurposed or scaled to other protocols and biological systems. We hope our pipeline will serve as a platform for future research, allowing a broader community of users to build, execute, and share similar experiments.

View Publication Page
04/19/15 | An open-source VAA3D plugin for real-time 3D visualization of terabyte-sized volumetric images.
Bria A, Iannello G, Peng H
IEEE 12th International Symposium on Biomedical Imaging. 2015 Apr 19. doi: 10.1109/ISBI.2015.7163925

Modern high-throughput bioimaging techniques pose the unprecedented challenge of exploring and analyzing the produced Terabyte-scale volumetric images directly in their 3D space. Without expensive virtual reality devices and/or parallel computing infrastructures, this becomes even more demanding and calls for new, more scalable tools that help explore these very large 3D data even on common laptops and graphics hardware. To this end, we developed a plugin for the open-source, cross-platform Vaa3D system to extend its powerful 3D visualization and analysis capabilities to images of potentially unlimited size. When used with large volumetric images up to 2.5 Terabyte in size, Vaa3D-TeraFly exhibited real-time (subsecond) performance that remained constant with respect to image size. The tool has been implemented in C++ with Qt and OpenGL, and it is freely and publicly available both as open source and as a binary package along with the main Vaa3D distribution.
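Performance that stays constant as image size grows is typically achieved with a multiresolution pyramid: the viewer loads only the level whose downsampled voxel count fits the on-screen budget, regardless of the full volume's size. A minimal sketch of that level-selection idea (an illustration of the general technique under assumed halving per level, not Vaa3D's actual C++ code):

```python
def choose_pyramid_level(extent_voxels, max_display_voxels):
    """Pick the finest pyramid level (0 = full resolution; each level
    halves every axis) whose total voxel count fits the display budget.
    The loop runs once per level, so runtime grows with the number of
    levels (logarithmic in volume size), not with the volume itself."""
    level = 0
    nx, ny, nz = extent_voxels
    while nx * ny * nz > max_display_voxels:
        nx, ny, nz = max(1, nx // 2), max(1, ny // 2), max(1, nz // 2)
        level += 1
    return level

# A 16384^3 volume (~4 TB at one byte per voxel) viewed with a 512^3
# display budget needs five halvings before it fits.
print(choose_pyramid_level((16384, 16384, 16384), 512**3))  # → 5
```

Only the tiles of the chosen level that intersect the current viewport need to be fetched, which is why a laptop can browse a multi-Terabyte volume interactively.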

View Publication Page
Looger Lab, Svoboda Lab, Leonardo Lab, Schreiter Lab, GENIE
02/01/13 | An optimized fluorescent probe for visualizing glutamate neurotransmission.
Marvin JS, Borghuis BG, Tian L, Cichon J, Harnett MT, Akerboom J, Gordus A, Renninger SL, Chen T, Bargmann CI, Orger MB, Schreiter ER, Demb JB, Gan W, Hires SA, Looger LL
Nature Methods. 2013 Feb;10:162-70. doi: 10.1038/nmeth.2333

We describe an intensity-based glutamate-sensing fluorescent reporter (iGluSnFR) with signal-to-noise ratio and kinetics appropriate for in vivo imaging. We engineered iGluSnFR in vitro to maximize its fluorescence change, and we validated its utility for visualizing glutamate release by neurons and astrocytes in increasingly intact neurological systems. In hippocampal culture, iGluSnFR detected single field stimulus-evoked glutamate release events. In pyramidal neurons in acute brain slices, glutamate uncaging at single spines showed that iGluSnFR responds robustly and specifically to glutamate in situ, and responses correlate with voltage changes. In mouse retina, iGluSnFR-expressing neurons showed intact light-evoked excitatory currents, and the sensor revealed tonic glutamate signaling in response to light stimuli. In worms, glutamate signals preceded and predicted postsynaptic calcium transients. In zebrafish, iGluSnFR revealed spatial organization of direction-selective synaptic activity in the optic tectum. Finally, in mouse forelimb motor cortex, iGluSnFR expression in layer V pyramidal neurons revealed task-dependent single-spine activity during running.

View Publication Page
Svoboda Lab, Druckmann Lab, Scientific Computing
01/15/19 | An orderly single-trial organization of population dynamics in premotor cortex predicts behavioral variability.
Wei Z, Inagaki H, Li N, Svoboda K, Druckmann S
Nature Communications. 2019 Jan 15;10(1):216. doi: 10.1038/s41467-018-08141-6

Animals are not simple input-output machines. Their responses to even very similar stimuli are variable. A key, long-standing question in neuroscience is to understand the neural correlates of such behavioral variability. To reveal these correlates, behavior and neural population activity must be related to one another on single trials. Such analysis is challenging due to the dynamical nature of brain function (e.g., in decision making), heterogeneity across neurons and limited sampling of the relevant neural population. By analyzing population recordings from mouse frontal cortex in perceptual decision-making tasks, we show that an analysis approach tailored to the coarse grain features of the dynamics is able to reveal previously unrecognized structure in the organization of population activity. This structure is similar on error and correct trials, suggesting dynamics that may be constrained by the underlying circuitry, is able to predict multiple aspects of behavioral variability and reveals long time-scale modulation of population activity.

View Publication Page
07/25/18 | An unbiased template of the Drosophila brain and ventral nerve cord.
Bogovic JA, Otsuna H, Heinrich L, Ito M, Jeter J, Meissner GW, Nern A, Colonell J, Malkesman O, Saalfeld S
bioRxiv. 2018 Jul 25. doi: 10.1101/376384

The fruit fly Drosophila melanogaster is an important model organism for neuroscience with a wide array of genetic tools that enable the mapping of individual neurons and neural subtypes. Brain templates are essential for comparative biological studies because they enable analyzing many individuals in a common reference space. Several central brain templates exist for Drosophila, but every one is either biased, uses sub-optimal tissue preparation, is imaged at low resolution, or does not account for artifacts. No publicly available Drosophila ventral nerve cord template currently exists. In this work, we created high-resolution templates of the Drosophila brain and ventral nerve cord using the best-available technologies for imaging, artifact correction, stitching, and template construction using groupwise registration. We evaluated our central brain template against the four most competitive, publicly available brain templates and demonstrate that ours enables more accurate registration with fewer local deformations in shorter time.

View Publication Page
12/31/20 | An unbiased template of the Drosophila brain and ventral nerve cord.
Bogovic JA, Otsuna H, Heinrich L, Ito M, Jeter J, Meissner G, Nern A, Colonell J, Malkesman O, Ito K, Saalfeld S
PLoS One. 2020 Dec 31;15(12):e0236495. doi: 10.1371/journal.pone.0236495

The fruit fly Drosophila melanogaster is an important model organism for neuroscience with a wide array of genetic tools that enable the mapping of individual neurons and neural subtypes. Brain templates are essential for comparative biological studies because they enable analyzing many individuals in a common reference space. Several central brain templates exist for Drosophila, but every one is either biased, uses sub-optimal tissue preparation, is imaged at low resolution, or does not account for artifacts. No publicly available Drosophila ventral nerve cord template currently exists. In this work, we created high-resolution templates of the Drosophila brain and ventral nerve cord using the best-available technologies for imaging, artifact correction, stitching, and template construction using groupwise registration. We evaluated our central brain template against the four most competitive, publicly available brain templates and demonstrate that ours enables more accurate registration with fewer local deformations in shorter time.

View Publication Page
Stern Lab
02/16/17 | An unsupervised method for quantifying the behavior of interacting individuals.
Klibaite U, Berman GJ, Cande J, Stern DL
Physical Biology. 2017 Feb 16;14(1):1609.09345. doi: 10.1088/1478-3975/aa5c50

Behaviors involving the interaction of multiple individuals are complex and frequently crucial for an animal's survival. These interactions, ranging across sensory modalities, length scales, and time scales, are often subtle and difficult to characterize. Contextual effects on the frequency of behaviors become even more difficult to quantify when physical interaction between animals interferes with conventional data analysis, e.g. due to visual occlusion. We introduce a method for quantifying behavior in fruit fly interaction that combines high-throughput video acquisition and tracking of individuals with recent unsupervised methods for capturing an animal's entire behavioral repertoire. We find behavioral differences between solitary flies and those paired with an individual of the opposite sex, identifying specific behaviors that are affected by social and spatial context. Our pipeline allows for a comprehensive description of the interaction between two individuals using unsupervised machine learning methods, and will be used to answer questions about the depth of complexity and variance in fruit fly courtship.
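Once the unsupervised method has assigned a behavior label to every video frame, the contextual comparison described above reduces to comparing per-behavior frequencies between conditions. A toy sketch of that comparison step (a hypothetical illustration with made-up labels, not the authors' pipeline):

```python
from collections import Counter

def behavior_frequencies(labels):
    """Fraction of frames spent in each unsupervised behavior label."""
    counts = Counter(labels)
    total = len(labels)
    return {b: c / total for b, c in counts.items()}

def frequency_shift(solitary, paired):
    """Per-behavior change in frequency between conditions (paired minus
    solitary); a behavior absent in one condition counts as 0 there."""
    fs, fp = behavior_frequencies(solitary), behavior_frequencies(paired)
    return {b: fp.get(b, 0.0) - fs.get(b, 0.0) for b in set(fs) | set(fp)}

# Made-up frame labels for one solitary fly and one opposite-sex pairing.
shift = frequency_shift(["walk", "groom", "walk", "rest"],
                        ["walk", "court", "court", "walk"])
print(shift)
```

Behaviors with a large positive or negative shift are the ones affected by social context; in practice such shifts would be assessed across many flies with appropriate statistics rather than read off a single pair of recordings.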

View Publication Page