Fluorescence microscopy images should not be treated as perfect representations of biology. Many factors within the biospecimen itself can drastically affect quantitative microscopy data. Whereas some sample-specific considerations, such as photobleaching and autofluorescence, are more commonly discussed, a holistic discussion of sample-related issues (which includes less-routine topics such as quenching, scattering and biological anisotropy) is required to appropriately guide life scientists through the subtleties inherent to bioimaging. Here, we consider how the interplay between light and a sample can cause common experimental pitfalls and unanticipated errors when drawing biological conclusions. Although some of these discrepancies can be minimized or controlled for, others require more pragmatic considerations when interpreting image data. Ultimately, the power lies in the hands of the experimenter. The goal of this Review is therefore to survey how biological samples can skew quantification and interpretation of microscopy data. Furthermore, we offer a perspective on how to manage many of these potential pitfalls.
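To illustrate how one of these sample-related effects is often handled computationally, here is a minimal sketch of photobleaching correction, assuming NumPy, a registered time-lapse stack, and a mono-exponential decay model. The function name and the model are illustrative only; real data frequently call for double-exponential fits or prior background subtraction.

```python
import numpy as np

def bleach_correct(stack):
    """Normalize a time-lapse stack (T, Y, X) by a fitted
    mono-exponential photobleaching curve.

    Minimal sketch: assumes decay follows I(t) = I0 * exp(-k * t)
    and that mean frame intensity tracks bleaching, not biology.
    """
    t = np.arange(stack.shape[0])
    mean_i = stack.reshape(stack.shape[0], -1).mean(axis=1)
    # Linear fit in log space: log I(t) = log I0 - k * t
    slope, intercept = np.polyfit(t, np.log(mean_i), 1)
    decay = np.exp(intercept + slope * t)
    return stack / decay[:, None, None]
```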
The activities of a handful of transcription factors, such as mammalian NF-κB, Drosophila melanogaster Cubitus interruptus and yeast Spt23 and Mga2, are regulated through partial protein degradation by the proteasome. New data now show that the proteasome activates membrane-bound Spt23 and Mga2 by initiating their proteolysis at an internal site and then degrading the proteins bidirectionally toward both ends of the polypeptide chain, modifying our ideas about how the proteasome degrades targeted substrates.
Eyes may be 'the window to the soul' in humans, but whiskers provide a better path to the inner lives of rodents. The brain has remarkable abilities to focus its limited resources on information that matters while ignoring a cacophony of distractions. While inspecting a visual scene, primates foveate to multiple salient locations, for example mouths and eyes in images of people, and ignore the rest. Similar processes have now been observed and studied in rodents in the context of whisker-based tactile sensation. Rodents use their mechanosensitive whiskers for a diverse range of tactile behaviors, such as navigation, object recognition and social interactions. These animals move their whiskers in a purposive manner to locations of interest. The shapes of whiskers, as well as their movements, are exquisitely adapted for tactile exploration in the dark, tight burrows where many rodents live. By studying whisker movements during tactile behaviors, we can learn about the tactile information available to rodents through their whiskers and how rodents direct their attention. In this primer, we focus on how the whisker movements of rats and mice are providing clues about the logic of active sensation and the underlying neural mechanisms.
Automated reconstruction of neural connectivity graphs from electron microscopy image stacks is an essential step towards large-scale neural circuit mapping. While significant progress has recently been made in automated segmentation of neurons and detection of synapses, the problem of synaptic partner assignment for polyadic (one-to-many) synapses, prevalent in the Drosophila brain, remains unsolved. In this contribution, we propose a method that automatically assigns presynaptic and postsynaptic roles to neurites adjacent to a synaptic site. The method constructs a probabilistic graphical model over potential synaptic partner pairs, which includes factors to account for a high rate of one-to-many connections, as well as the possibility that the same neuron is presynaptic in one synapse and postsynaptic in another. The algorithm has been validated on a publicly available stack of ssTEM images of Drosophila neural tissue and has been shown to reconstruct most of the synaptic relations correctly.
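To make the assignment problem concrete, the following toy sketch scores candidate pairings at a single polyadic site by exhaustive enumeration. The `pair_score` callable and the `lambda_polyadic` bonus are hypothetical stand-ins for the learned factors in the paper's probabilistic graphical model, which is solved jointly across sites rather than by enumeration.

```python
import itertools
import numpy as np

def score_assignments(candidates, pair_score, lambda_polyadic=0.5):
    """Toy sketch of polyadic synaptic partner assignment.

    candidates: neurite ids adjacent to one synaptic site.
    pair_score(pre, post): local evidence that `pre` is presynaptic
    to `post` (hypothetical callable, e.g. a classifier output).
    """
    best, best_score = None, -np.inf
    for pre in candidates:
        posts = [c for c in candidates if c != pre]
        # One-to-many: a single presynaptic neurite, several partners.
        for r in range(1, len(posts) + 1):
            for subset in itertools.combinations(posts, r):
                s = sum(pair_score(pre, p) for p in subset)
                s += lambda_polyadic * (r - 1)  # favor polyadic fan-out
                if s > best_score:
                    best, best_score = (pre, subset), s
    return best, best_score
```

For the handful of neurites adjacent to a typical site, this enumeration is tractable; joint inference over a full graphical model is what lets the published method scale to whole image stacks.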
Imaging fast cellular dynamics across large specimens requires high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To meet these requirements, we developed isotropic multiview (IsoView) light-sheet microscopy, which rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. Combining these four views by means of high-throughput multiview deconvolution yields images with high resolution in all three dimensions. We demonstrate whole-animal functional imaging of Drosophila larvae at a spatial resolution of 1.1-2.5 μm and temporal resolution of 2 Hz for several hours. We also present spatially isotropic whole-brain functional imaging in Danio rerio larvae and spatially isotropic multicolor imaging of fast cellular dynamics across gastrulating Drosophila embryos. Compared with conventional light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.
We developed isotropic multiview (IsoView) light-sheet microscopy in order to image fast cellular dynamics, such as cell movements in an entire developing embryo or neuronal activity throughout an entire brain or nervous system, with high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To achieve high temporal resolution and high spatial resolution at the same time, IsoView microscopy rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. In a post-processing step, these four views are then combined by means of high-throughput multiview deconvolution to yield images with a system resolution of ≤450 nm in all three dimensions. Using IsoView microscopy, we performed whole-animal functional imaging of Drosophila embryos and larvae at a spatial resolution of 1.1-2.5 μm and at a temporal resolution of 2 Hz for up to 9 hours. We also performed whole-brain functional imaging in larval zebrafish and multicolor imaging of fast cellular dynamics across entire, gastrulating Drosophila embryos with isotropic, sub-cellular resolution. Compared with conventional (spatially anisotropic) light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, such as lattice light-sheet microscopy or diSPIM, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.
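The computational core of this pipeline, multiview deconvolution, can be illustrated with a minimal Richardson-Lucy sketch, assuming NumPy and SciPy. Gaussian blurs stand in for the measured per-view point spread functions, and the plain loop below is only a conceptual stand-in for the paper's high-throughput implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiview_rl(views, sigmas, n_iter=20):
    """Minimal multiview Richardson-Lucy deconvolution sketch.

    views: registered 3D acquisitions of the same volume.
    sigmas: per-view Gaussian PSF widths, an assumed stand-in
    for real, measured system PSFs.
    """
    est = np.mean(views, axis=0)
    for _ in range(n_iter):
        for view, sigma in zip(views, sigmas):
            blurred = gaussian_filter(est, sigma)
            ratio = view / np.maximum(blurred, 1e-12)
            # A Gaussian PSF is symmetric, so the adjoint blur
            # equals the forward blur in this simplified model.
            est = est * gaussian_filter(ratio, sigma)
    return est
```

Because each view is sharp along a different axis, alternating the multiplicative updates across views is what recovers near-isotropic resolution from individually anisotropic acquisitions.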
The body of an animal influences how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven, end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. We demonstrate the use of visual sensors and the re-use of a pre-trained general-purpose flight controller by training the model to perform visually guided flight tasks. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.
The body of an animal influences how its nervous system generates behavior [1]. Accurately modeling the neural control of sensorimotor behavior requires an anatomically detailed biomechanical representation of the body. Here, we introduce a whole-body model of the fruit fly Drosophila melanogaster in a physics simulator. Designed as a general-purpose framework, our model enables the simulation of diverse fly behaviors, including both terrestrial and aerial locomotion. We validate its versatility by replicating realistic walking and flight behaviors. To support these behaviors, we develop new phenomenological models for fluid and adhesion forces. Using data-driven, end-to-end reinforcement learning, we train neural network controllers capable of generating naturalistic locomotion along complex trajectories in response to high-level steering commands. Additionally, we show the use of visual sensors and hierarchical motor control, training a high-level controller to reuse a pre-trained low-level flight controller to perform visually guided flight tasks. Our model serves as an open-source platform for studying the neural control of sensorimotor behavior in an embodied context. Preprint: www.biorxiv.org/content/early/2024/03/14/2024.03.11.584515
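As an illustration of how such a body model is driven in MuJoCo, the sketch below loads an MJCF model and steps the physics using the official Python bindings. The file name `fruitfly.xml` and the meaning of the control vector are assumptions for illustration, not the project's documented interface.

```python
import mujoco  # official MuJoCo Python bindings (pip install mujoco)

# Hypothetical path: the released model ships as an MJCF file,
# but this filename is illustrative, not the project's actual layout.
model = mujoco.MjModel.from_xml_path("fruitfly.xml")
data = mujoco.MjData(model)

# Zero all actuator controls; in the published work, trained neural
# network policies would write steering-dependent values here instead.
data.ctrl[:] = 0.0

for _ in range(1000):
    mujoco.mj_step(model, data)  # advance the physics simulation

print("final root position:", data.qpos[:3])
```

Holding the controls at zero steps only the passive dynamics; swapping in a trained policy that maps observations to `data.ctrl` at each step is what produces the naturalistic walking and flight described above.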