Cell lineage defines the mitotic connections between the cells that make up an organism. Mapping these connections in relation to cell identity offers extraordinary insight into the mechanisms underlying normal and pathological development. Analyzing the molecular determinants involved in the acquisition of cell identity requires gaining experimental access to precise parts of cell lineages. Recently, we developed CaSSA and CLADES, a new CRISPR-based technology that allows specific lineage branches to be targeted and labeled. Here we discuss how best to exploit this technology for lineage studies in Drosophila, with an emphasis on neuronal specification.
Object tracking is essential for a multitude of biomedical research projects. Automated methods are desirable because the amount of manual tracking effort required would otherwise be prohibitive. However, automatically found solutions are not free of errors, and these errors in turn have to be identified and resolved manually. We propose six innovative ways to semi-automatically curate automatically found tracking solutions. The respective user interactions are six simple operations: inclusion and exclusion of objects and tracking decisions, specification of the number of objects, and one-click altering of object segmentations. We show how all proposed interactions can be elegantly incorporated into "assignment models" [1,2,3,4,5,6], an increasingly popular tracking paradigm. Given some user interaction, the tracking engine can efficiently compute the respective globally optimal tracking solution, even benefiting from warm-start capabilities. We show that after interactively pointing at a single mistake, multiple segmentation and tracking errors can be fixed automatically in a single re-evaluation, provably leading to the new, feedback-conscious global optimum.
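A minimal sketch of the assignment-model idea described above, assuming toy frame-to-frame linking costs and using SciPy's Hungarian solver in place of the paper's richer ILP engine: a user "exclusion" becomes a hard constraint on one link, after which the global optimum is simply re-solved.

```python
# Sketch only: per-frame tracking as a bipartite assignment, with a user
# interaction (excluding one link) incorporated as a hard constraint.
import numpy as np
from scipy.optimize import linear_sum_assignment

def solve(cost, forbidden=()):
    """Optimal frame-to-frame assignment, honoring user exclusions."""
    c = cost.copy()
    for i, j in forbidden:        # user clicked "this link is wrong"
        c[i, j] = 1e9             # effectively forbid the link
    rows, cols = linear_sum_assignment(c)
    return list(zip(rows, cols))

# Hypothetical linking costs between 3 detections in frame t and frame t+1
# (e.g. squared centroid distances).
cost = np.array([[1.0, 4.0, 9.0],
                 [4.0, 1.5, 4.0],
                 [9.0, 4.0, 1.2]])

print(solve(cost))                       # automatic solution
print(solve(cost, forbidden=[(1, 1)]))   # re-solved after one user exclusion
```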
Tracking all nuclei of an embryo in noisy and dense fluorescence microscopy data is a challenging task. We build upon a recent method for nuclei tracking that combines weakly supervised learning from a small set of nuclei center point annotations with an integer linear program (ILP) for optimal cell lineage extraction. Our work specifically addresses two challenging properties of C. elegans embryo recordings: (1) many cell divisions compared to benchmark recordings of other organisms, and (2) the presence of polar bodies that are easily mistaken for cell nuclei. To cope with (1), we devise and incorporate a learnt cell division detector; to cope with (2), we employ a learnt polar body detector. We further propose automated tuning of the ILP weights via a structured SVM, alleviating the need for a tedious manual grid search.
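A toy sketch of the kind of ILP at the core of such lineage extraction, under strong simplifications: hand-picked link costs stand in for the learned, structured-SVM-tuned weights, and the only constraints are that each nucleus has exactly one parent and each parent at most two children (a division).

```python
# Toy lineage ILP: binary variable per candidate parent->child link,
# solved to global optimality with SciPy's MILP interface.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Candidate links (parent_in_frame_t, child_in_frame_t+1) with toy costs.
links = [(0, 0), (0, 1), (1, 1), (1, 2)]
cost = np.array([0.2, 0.4, 0.3, 0.1])   # lower = more plausible link

n_parents, n_children = 2, 3
A_child = np.array([[1.0 if l[1] == c else 0.0 for l in links]
                    for c in range(n_children)])
A_parent = np.array([[1.0 if l[0] == p else 0.0 for l in links]
                     for p in range(n_parents)])

res = milp(c=cost,
           integrality=np.ones(len(links)),          # all-binary variables
           bounds=Bounds(0, 1),
           constraints=[LinearConstraint(A_child, lb=1, ub=1),   # 1 parent each
                        LinearConstraint(A_parent, ub=2)])       # <=2 children

selected = [links[i] for i, x in enumerate(res.x) if x > 0.5]
print(selected)   # parent 1 keeps two children, i.e. a division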
Summary: HYlight is a genetically encoded fluorescent biosensor that ratiometrically monitors fructose 1,6-bisphosphate (FBP), a key glycolytic metabolite. Given the role of glucose in liver cancer metabolism, we expressed HYlight in human liver cancer cells and primary mouse hepatocytes. Through in vitro, in silico, and in cellulo experiments, we showed HYlight’s ability to monitor FBP changes linked to glycolysis, but not gluconeogenesis. HYlight’s affinity for FBP was ∼1 μM and stable within the physiological pH range. HYlight bound dihydroxyacetone phosphate only weakly, and its ratiometric response was influenced by both ionic strength and phosphate. Therefore, simulating cytosolic conditions in vitro was necessary to establish a reliable correlation between HYlight’s cellular responses and FBP concentrations. FBP concentrations were found to be in the lower micromolar range, far below previous millimolar estimates. Altogether, this biosensor approach offers real-time monitoring of FBP concentrations at single-cell resolution, making it an invaluable tool for understanding cancer metabolism.
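For intuition, here is the standard single-site binding inversion that such a ratiometric readout implies. The Kd comes from the abstract (∼1 μM); R_MIN and R_MAX are hypothetical placeholders that would in practice be calibrated under cytosol-like ionic strength and phosphate conditions, as the summary stresses.

```python
# Illustrative single-site binding model for converting a ratiometric readout
# into an FBP concentration. R_MIN/R_MAX are hypothetical placeholders.
KD_UM = 1.0    # HYlight's FBP affinity, ~1 uM (from the abstract)
R_MIN = 0.4    # hypothetical ratio of the FBP-free sensor
R_MAX = 2.8    # hypothetical ratio of the FBP-saturated sensor

def fbp_concentration_um(ratio):
    """Invert R = (R_MIN + R_MAX * L/KD) / (1 + L/KD) for L, in uM."""
    if not R_MIN < ratio < R_MAX:
        raise ValueError("measured ratio outside calibrated range")
    return KD_UM * (ratio - R_MIN) / (R_MAX - ratio)

print(f"{fbp_concentration_um(1.2):.2f} uM")  # e.g. one single-cell readout
```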
We present a particle filtering algorithm for robustly tracking the contours of multiple deformable objects through severe occlusions. Our algorithm combines a multiple-blob tracker with a contour tracker in a manner that keeps the required number of samples small. This is a natural combination because the two algorithms have complementary strengths: the multiple-blob tracker uses a natural multi-target model and searches a smaller, simpler space, while contour tracking gives more fine-tuned results and relies on cues that remain available during severe occlusions. Combining the two in this way accentuates the advantages of each. We demonstrate good performance on challenging video of three identical mice containing multiple instances of severe occlusion.
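To make the underlying machinery concrete, here is a minimal particle-filter loop (predict, weight, resample) on a 2-D position state. This is a single-target toy, not the paper's combined multiple-blob and contour tracker.

```python
# Minimal single-target particle filter: predict / weight / resample.
import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = rng.normal([0.0, 0.0], 1.0, size=(N, 2))   # initial guesses

def step(particles, observation, motion_std=0.5, obs_std=1.0):
    # Predict: diffuse particles under a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Weight: Gaussian likelihood of the observation under each particle.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / obs_std**2)
    w /= w.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

for obs in [(0.5, 0.2), (1.0, 0.5), (1.4, 1.1)]:       # synthetic detections
    particles = step(particles, np.array(obs))
print(particles.mean(axis=0))   # posterior-mean estimate of target position
```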
We are interested in establishing the correspondence between neuron activity and body curvature during various movements of C. elegans worms. Given long sequences of images, recorded such that neurons glow when active, all identifiable neurons must be tracked in each frame. The characteristics of the neuron data, e.g., the uninformative nature of neuron appearance and the sequential ordering of neurons, render standard single- and multi-object tracking methods either ineffective or unnecessary for our task. In this paper, we propose a multi-target tracking algorithm that correctly assigns each neuron to one of several candidate locations in the next frame while preserving a shape constraint. The results demonstrate that the proposed method robustly tracks more neurons than several existing methods in long image sequences.
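The sequential ordering of neurons suggests an order-preserving matching. The following dynamic-programming sketch assigns neurons to candidates monotonically along the body axis, with toy distance costs standing in for the paper's shape-constrained model.

```python
# Order-preserving assignment of n neurons to m >= n sorted candidates.
import numpy as np

def ordered_assignment(cost):
    """cost[i, j]: cost of assigning neuron i to candidate j (candidates
    sorted along the body axis). Returns a monotone one-to-one assignment."""
    n, m = cost.shape
    INF = float("inf")
    dp = np.full((n + 1, m + 1), INF)
    dp[0, :] = 0.0
    take = np.zeros((n + 1, m + 1), dtype=int)
    for i in range(1, n + 1):
        for j in range(i, m + 1):
            skip = dp[i, j - 1]                        # leave candidate j unused
            match = dp[i - 1, j - 1] + cost[i - 1, j - 1]
            dp[i, j] = min(skip, match)
            take[i, j] = match <= skip                 # 1 if neuron i -> cand j
    out, j = [], m                                     # backtrack
    for i in range(n, 0, -1):
        while not take[i, j]:
            j -= 1
        out.append((i - 1, j - 1))
        j -= 1
    return out[::-1]

cost = np.array([[0.1, 0.8, 0.9],
                 [0.7, 0.2, 0.6],
                 [0.9, 0.8, 0.1]])
print(ordered_assignment(cost))   # -> [(0, 0), (1, 1), (2, 2)]
```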
Organisms that use vocal signals to communicate often modulate their vocalizations to avoid being masked by other sounds in the environment. Although some environmental noise is continuous, both biotic and abiotic noise can be intermittent, or even periodic. Interference from intermittent noise can be avoided if calls are timed to coincide with periods of silence, a capacity that is unambiguously present in insects, amphibians, birds, and humans. Surprisingly, we know virtually nothing about this fundamental capacity in nonhuman primates. Here we show that a New World monkey, the cotton-top tamarin (Saguinus oedipus), can restrict calls to periodic silent intervals in loud white noise. In addition, calls produced during these silent intervals were significantly louder than calls recorded in silent baseline sessions. Finally, average call duration dropped across sessions, indicating that experience with temporally patterned noise caused tamarins to compress their calls. Taken together, these results show that in the presence of a predictable, intermittent environmental noise, cotton-top tamarins are able to modify the duration, timing, and amplitude of their calls to avoid acoustic interference.
Using a combination of metabolically labeled glycans, a bioorthogonal copper(I)-catalyzed azide-alkyne cycloaddition, and controlled bleaching of fluorescent probes conjugated to azide- or alkyne-tagged glycans, we achieved a sufficiently low spatial density of dye-labeled glycans to enable dynamic single-molecule tracking and super-resolution imaging of N-linked sialic acids and O-linked N-acetylgalactosamine (GalNAc) on the membrane of live cells. Analysis of the trajectories of these dye-labeled glycans in mammary cancer cells revealed constrained diffusion of both N- and O-linked glycans, which we interpret as reflecting the mobility of the glycans themselves rather than transient immobilization caused by spatial inhomogeneities on the plasma membrane. Stochastic optical reconstruction microscopy (STORM) imaging revealed the structure of dynamic membrane nanotubes.
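A sketch of the kind of trajectory analysis behind a "constrained diffusion" call: compute the mean squared displacement (MSD) of a track and fit its log-log slope; an anomalous exponent well below 1 indicates confinement. The trajectory below is synthetic, not real single-molecule data.

```python
# MSD analysis of a single track, with a crude synthetic confinement.
import numpy as np

def msd(track):
    """MSD over time lags for a (T, 2) array of x, y positions."""
    T = len(track)
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                     for lag in range(1, T // 4)])

rng = np.random.default_rng(1)
steps = rng.normal(0, 0.05, size=(400, 2))
track = np.cumsum(steps, axis=0)
track = np.clip(track, -0.3, 0.3)        # crude confinement to a small domain

m = msd(track)
lags = np.arange(1, len(m) + 1)
alpha = np.polyfit(np.log(lags), np.log(m), 1)[0]   # slope of log-log fit
print(f"anomalous exponent alpha ~ {alpha:.2f}")    # < 1 suggests confinement
```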
Summary: State-of-the-art light and electron microscopes are capable of acquiring large image datasets, but quantitatively evaluating the data often involves manually annotating structures of interest. This process is time-consuming and often a major bottleneck in the evaluation pipeline. To overcome this problem, we have introduced the Trainable Weka Segmentation (TWS), a machine learning tool that leverages a limited number of manual annotations in order to train a classifier and segment the remaining data automatically. In addition, TWS can provide unsupervised segmentation learning schemes (clustering) and can be customized to employ user-designed image features or classifiers. Availability and Implementation: TWS is distributed as open-source software as part of the Fiji image processing distribution of ImageJ at http://imagej.net/Trainable_Weka_Segmentation. Contact: ignacio.arganda@ehu.eus. Supplementary information: Supplementary data are available at Bioinformatics online.
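The pixel-classification idea behind TWS can be sketched outside Fiji. The example below swaps in scikit-learn's random forest for the actual Weka/Fiji API, on synthetic data: simple per-pixel features, a handful of annotated pixels, then automatic classification of the rest.

```python
# TWS-style pixel classification, sketched with scikit-learn (not the
# Weka/Fiji API): features -> sparse labels -> classifier -> segmentation.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
image = rng.random((64, 64))
image[20:40, 20:40] += 1.0               # a bright "structure of interest"

# Per-pixel features: raw intensity plus Gaussian blurs at two scales.
feats = np.stack([image,
                  ndimage.gaussian_filter(image, 1),
                  ndimage.gaussian_filter(image, 4)], axis=-1).reshape(-1, 3)

# Sparse manual annotations: a few foreground (1) and background (0) pixels.
labels = np.full(image.size, -1)
labels[np.ravel_multi_index(([25, 30, 35], [25, 30, 35]), image.shape)] = 1
labels[np.ravel_multi_index(([5, 10, 55], [5, 55, 10]), image.shape)] = 0

annotated = labels >= 0
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(feats[annotated], labels[annotated])
segmentation = clf.predict(feats).reshape(image.shape)  # remaining pixels
print(segmentation.sum(), "pixels classified as foreground")
```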
A key challenge in neuroscience is the expeditious reconstruction of neuronal circuits. For model systems such as Drosophila and C. elegans, the limiting step is no longer the acquisition of imagery but the extraction of the circuit from images. For this purpose, we designed a software application, TrakEM2, that addresses the systematic reconstruction of neuronal circuits from large electron microscopy and optical image volumes. We address the challenges of composing image volumes from individual, deformed images; of reconstructing neuronal arbors and annotating synapses with fast manual and semi-automatic methods; and of managing large collections of both images and annotations. The output is a neural circuit of 3D arbors and synapses, encoded in NeuroML and other formats, ready for analysis.