Filter
Associated Lab
- Aguilera Castrejon Lab (16)
- Ahrens Lab (64)
- Aso Lab (40)
- Baker Lab (38)
- Betzig Lab (112)
- Beyene Lab (13)
- Bock Lab (17)
- Branson Lab (52)
- Card Lab (40)
- Cardona Lab (63)
- Chklovskii Lab (13)
- Clapham Lab (14)
- Cui Lab (19)
- Darshan Lab (12)
- Dennis Lab (1)
- Dickson Lab (46)
- Druckmann Lab (25)
- Dudman Lab (50)
- Eddy/Rivas Lab (30)
- Egnor Lab (11)
- Espinosa Medina Lab (19)
- Feliciano Lab (7)
- Fetter Lab (41)
- Fitzgerald Lab (29)
- Freeman Lab (15)
- Funke Lab (38)
- Gonen Lab (91)
- Grigorieff Lab (62)
- Harris Lab (60)
- Heberlein Lab (94)
- Hermundstad Lab (26)
- Hess Lab (76)
- Ilanges Lab (2)
- Jayaraman Lab (46)
- Ji Lab (33)
- Johnson Lab (6)
- Kainmueller Lab (19)
- Karpova Lab (14)
- Keleman Lab (13)
- Keller Lab (76)
- Koay Lab (18)
- Lavis Lab (148)
- Lee (Albert) Lab (34)
- Leonardo Lab (23)
- Li Lab (27)
- Lippincott-Schwartz Lab (167)
- Liu (Yin) Lab (6)
- Liu (Zhe) Lab (61)
- Looger Lab (138)
- Magee Lab (49)
- Menon Lab (18)
- Murphy Lab (13)
- O'Shea Lab (6)
- Otopalik Lab (13)
- Pachitariu Lab (46)
- Pastalkova Lab (18)
- Pavlopoulos Lab (19)
- Pedram Lab (15)
- Podgorski Lab (16)
- Reiser Lab (51)
- Riddiford Lab (44)
- Romani Lab (43)
- Rubin Lab (143)
- Saalfeld Lab (62)
- Satou Lab (16)
- Scheffer Lab (36)
- Schreiter Lab (67)
- Sgro Lab (20)
- Shroff Lab (29)
- Simpson Lab (23)
- Singer Lab (80)
- Spruston Lab (93)
- Stern Lab (156)
- Sternson Lab (54)
- Stringer Lab (33)
- Svoboda Lab (135)
- Tebo Lab (33)
- Tervo Lab (9)
- Tillberg Lab (21)
- Tjian Lab (64)
- Truman Lab (88)
- Turaga Lab (49)
- Turner Lab (37)
- Vale Lab (7)
- Voigts Lab (3)
- Wang (Meng) Lab (17)
- Wang (Shaohe) Lab (25)
- Wu Lab (9)
- Zlatic Lab (28)
- Zuker Lab (25)
Associated Project Team
- CellMap (12)
- COSEM (3)
- FIB-SEM Technology (2)
- Fly Descending Interneuron (10)
- Fly Functional Connectome (14)
- Fly Olympiad (5)
- FlyEM (53)
- FlyLight (49)
- GENIE (45)
- Integrative Imaging (2)
- Larval Olympiad (2)
- MouseLight (18)
- NeuroSeq (1)
- ThalamoSeq (1)
- Tool Translation Team (T3) (26)
- Transcription Imaging (49)
Publication Date
- 2025 (72)
- 2024 (223)
- 2023 (163)
- 2022 (193)
- 2021 (194)
- 2020 (196)
- 2019 (202)
- 2018 (232)
- 2017 (217)
- 2016 (209)
- 2015 (252)
- 2014 (236)
- 2013 (194)
- 2012 (190)
- 2011 (190)
- 2010 (161)
- 2009 (158)
- 2008 (140)
- 2007 (106)
- 2006 (92)
- 2005 (67)
- 2004 (57)
- 2003 (58)
- 2002 (39)
- 2001 (28)
- 2000 (29)
- 1999 (14)
- 1998 (18)
- 1997 (16)
- 1996 (10)
- 1995 (18)
- 1994 (12)
- 1993 (10)
- 1992 (6)
- 1991 (11)
- 1990 (11)
- 1989 (6)
- 1988 (1)
- 1987 (7)
- 1986 (4)
- 1985 (5)
- 1984 (2)
- 1983 (2)
- 1982 (3)
- 1981 (3)
- 1980 (1)
- 1979 (1)
- 1976 (2)
- 1973 (1)
- 1970 (1)
- 1967 (1)
4064 Publications
Showing 3981-3990 of 4064 results

Chemical neurotransmission constitutes one of the fundamental modalities of communication between neurons. Monitoring release of these chemicals has traditionally been difficult to carry out at spatial and temporal scales relevant to neuron function. To understand chemical neurotransmission more fully, we need to improve the spatial and temporal resolution of measurements of neurotransmitter release. To address this, we engineered a chemi-sensitive, two-dimensional nanofilm that facilitates subcellular visualization of the release and diffusion of the neurochemical dopamine with synaptic resolution and quantal sensitivity, simultaneously from hundreds of release sites. Using this technology, we were able to monitor the spatiotemporal dynamics of dopamine release in dendritic processes, a poorly understood phenomenon. We found that dopamine release is broadcast from a subset of dendritic processes as hotspots that have a mean spatial spread of ≈3.2 µm (full width at half maximum) and are observed with a mean spatial frequency of 1 hotspot per ≈7.5 µm of dendritic length. Major dendrites of dopamine neurons and fine dendritic processes, as well as dendritic arbors and dendrites with no apparent varicose morphology, participated in dopamine release. Remarkably, these release hotspots colocalized with Bassoon, suggesting that Bassoon may contribute to organizing active zones in dendrites, similar to its role in axon terminals.
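The hotspot spread above is reported as a full width at half maximum (FWHM). As a purely illustrative sketch (not the paper's analysis code; the profile and sampling grid are made up), FWHM can be estimated from a sampled intensity profile by linearly interpolating the half-maximum crossings; for a Gaussian profile, FWHM ≈ 2.355σ.

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x),
    estimated by linear interpolation at the two half-max crossings."""
    half = y.max() / 2.0
    above = np.flatnonzero(y >= half)
    i0, i1 = above[0], above[-1]
    # left crossing: y rises through half between i0-1 and i0
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    # right crossing: y falls through half; flip so xp is increasing
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# hypothetical Gaussian hotspot profile with sigma chosen for ~3.2 µm FWHM
x = np.linspace(-10.0, 10.0, 2001)          # position along dendrite, µm
sigma = 3.2 / 2.355
y = np.exp(-x**2 / (2 * sigma**2))
print(round(fwhm(x, y), 2))                  # ≈ 3.2
```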
The importance of mechanical force in biology is evident across diverse length scales, ranging from tissue morphogenesis during embryo development to mechanotransduction across single adhesion proteins at the cell surface. Consequently, many force measurement techniques rely on optical microscopy to measure forces being applied by cells on their environment, to visualize specimen deformations due to external forces, or even to directly apply a physical perturbation to the sample via photoablation or optogenetic tools. Recent developments in advanced microscopy offer improved approaches to enhance spatiotemporal resolution, imaging depth, and sample viability. These advances can be coupled with already existing force measurement methods to improve sensitivity, duration and speed, amongst other parameters. However, gaining access to advanced microscopy instrumentation and the expertise necessary to extract meaningful insights from these techniques is an unavoidable hurdle. In this Live Cell Imaging special issue Review, we survey common microscopy-based force measurement techniques and examine how they can be bolstered by emerging microscopy methods. We further explore challenges related to the accompanying data analysis in biomechanical studies and discuss the various resources available to tackle the global issue of technology dissemination, an important avenue for biologists to gain access to pre-commercial instruments that can be leveraged for biomechanical studies.
The assembly of sequence-specific enhancer-binding transcription factors (TFs) at cis-regulatory elements in the genome has long been regarded as the fundamental mechanism driving cell type-specific gene expression. However, despite extensive biochemical, genetic, and genomic studies in the past three decades, our understanding of molecular mechanisms underlying enhancer-mediated gene regulation remains incomplete. Recent advances in imaging technologies now enable direct visualization of TF-driven regulatory events and transcriptional activities at the single-cell, single-molecule level. The ability to observe the remarkably dynamic behavior of individual TFs in live cells at high spatiotemporal resolution has begun to provide novel mechanistic insights and promises new advances in deciphering causal-functional relationships of TF targeting, genome organization, and gene activation. Here, we review current transcription imaging techniques and summarize converging results from various lines of research that may instigate a revision of models to describe key features of eukaryotic gene regulation.
The nature of nervous system function and development is inherently global, since all components eventually influence one another. Networks communicate through dense synaptic, electric, and modulatory connections and develop through concurrent growth and interlinking of their neurons, processes, glia, and blood vessels. These factors drive the development of techniques capable of imaging neural signaling, anatomy, and developmental processes at ever-larger scales. Here, we discuss the nature of questions benefitting from large-scale imaging techniques and introduce recent applications. We focus on emerging light-sheet microscopy approaches, which are well suited for live imaging of large systems with high spatiotemporal resolution and over long periods of time. We also discuss computational methods suitable for extracting biological information from the resulting system-level image data sets. Together with new tools for reporting and manipulating neuronal activity and gene expression, these techniques promise new insights into the large-scale function and development of neural systems.
Studying the intertwined roles of sensation, experience, and directed action in navigation has been facilitated by the development of virtual reality (VR) environments for head-fixed animals, allowing for quantitative measurements of behavior in well-controlled conditions. VR has long featured in studies of Drosophila melanogaster, but these experiments have typically allowed the fly to change only its heading in a visual scene and not its position. Here we explore how flies move in two dimensions (2D) using a visual VR environment that more closely captures an animal's experience during free behavior. We show that flies' 2D interaction with landmarks cannot be automatically derived from their orienting behavior under simpler one-dimensional (1D) conditions. Using novel paradigms, we then demonstrate that flies in 2D VR adapt their behavior in response to optogenetically delivered appetitive and aversive stimuli. Much like free-walking flies after encounters with food, head-fixed flies exploring a 2D VR respond to optogenetic activation of sugar-sensing neurons by initiating a local search, which appears not to rely on visual landmarks. Visual landmarks can, however, help flies to avoid areas in VR where they experience an aversive, optogenetically generated heat stimulus. By coupling aversive virtual heat to the flies' presence near visual landmarks of specific shapes, we elicit selective learned avoidance of those landmarks. Thus, we demonstrate that head-fixed flies adaptively navigate in 2D virtual environments, but their reliance on visual landmarks is context dependent. These behavioral paradigms set the stage for interrogation of the fly brain circuitry underlying flexible navigation in complex multisensory environments.
A key feature of reactive behaviors is the ability to spatially localize a salient stimulus and act accordingly. Such sensory-motor transformations must be particularly fast and well tuned in escape behaviors, in which both the speed and accuracy of the evasive response determine whether an animal successfully avoids predation [1]. We studied the escape behavior of the fruit fly, Drosophila, and found that flies can use visual information to plan a jump directly away from a looming threat. This is surprising, given the architecture of the pathway thought to mediate escape [2, 3]. Using high-speed videography, we found that approximately 200 ms before takeoff, flies begin a series of postural adjustments that determine the direction of their escape. These movements position their center of mass so that leg extension will push them away from the expanding visual stimulus. These preflight movements are not the result of a simple feed-forward motor program because their magnitude and direction depend on the flies’ initial postural state. Furthermore, flies plan a takeoff direction even in instances when they choose not to jump. This sophisticated motor program is evidence for a form of rapid, visually mediated motor planning in a genetically accessible model organism.
For sensory signals to control an animal’s behavior, they must first be transformed into a format appropriate for use by its motor systems. This fundamental problem is faced by all animals, including humans. Beyond simple reflexes, little is known about how such sensorimotor transformations take place. Here we describe how the outputs of a well-characterized population of fly visual interneurons, lobula plate tangential cells (LPTCs), are used by the animal’s gaze-stabilizing neck motor system. The LPTCs respond to visual input arising from both self-rotations and translations of the fly. The neck motor system, however, is involved in gaze stabilization and thus mainly controls compensatory head rotations. We investigated how the neck motor system is able to selectively extract rotation information from the mixed responses of the LPTCs. We recorded extracellularly from fly neck motor neurons (NMNs) and mapped the directional preferences across their extended visual receptive fields. Our results suggest that, like the tangential cells, NMNs are tuned to panoramic retinal image shifts, or optic flow fields, which occur when the fly rotates about particular body axes. In many cases, tangential cells and motor neurons appear to be tuned to similar axes of rotation, resulting in a correlation between the coordinate systems the two neural populations employ. However, in contrast to the primarily monocular receptive fields of the tangential cells, most NMNs are sensitive to visual motion presented to either eye. This results in the NMNs being more selective for rotation than the LPTCs. Thus, the neck motor system increases its rotation selectivity by a comparatively simple mechanism: the integration of binocular visual motion information.
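Why binocular integration boosts rotation selectivity can be illustrated with the standard first-order self-motion flow model: for a rotation, the two laterally facing eyes see oppositely signed world-frame flow, while for a sideways translation they see the same flow, so a binocular comparison cancels translation but not rotation. The sketch below is a toy demonstration under that textbook model (the viewing directions, function names, and the subtraction rule are illustrative assumptions, not the paper's circuit).

```python
import numpy as np

def flow(d, omega=None, t=None):
    """First-order optic-flow vector at unit viewing direction d
    for self-rotation omega and/or self-translation t (unit depth)."""
    d = np.asarray(d, float)
    v = np.zeros(3)
    if omega is not None:
        v += -np.cross(np.asarray(omega, float), d)   # rotational component
    if t is not None:
        t = np.asarray(t, float)
        v += np.dot(t, d) * d - t                     # translational component
    return v

left, right = np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
yaw = np.array([0.0, 0.0, 1.0])      # rotation about the vertical axis
slip = np.array([0.0, 1.0, 0.0])     # sideways translation

def binoc_response(omega=None, t=None):
    """Toy binocular 'NMN': difference of the two eyes' horizontal flow."""
    return flow(right, omega, t)[1] - flow(left, omega, t)[1]

print(binoc_response(omega=yaw))  # nonzero: rotation drives the unit
print(binoc_response(t=slip))     # zero: translation cancels binocularly
```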
The hippocampus is critical for recollecting and imagining experiences. This is believed to involve voluntarily drawing from hippocampal memory representations of people, events, and places, including maplike representations of familiar environments. However, whether representations in such "cognitive maps" can be volitionally accessed is unknown. We developed a brain-machine interface to test whether rats can do so by controlling their hippocampal activity in a flexible, goal-directed, and model-based manner. We found that rats can efficiently navigate or direct objects to arbitrary goal locations within a virtual reality arena solely by activating and sustaining appropriate hippocampal representations of remote places. This provides insight into the mechanisms underlying episodic memory recall, mental simulation and planning, and imagination and opens up possibilities for high-level neural prosthetics that use hippocampal representations.
Voltage imaging enables monitoring neural activity at sub-millisecond and sub-cellular scale, unlocking the study of subthreshold activity, synchrony, and network dynamics with unprecedented spatio-temporal resolution. However, high data rates (>800 MB/s) and low signal-to-noise ratios create bottlenecks for analyzing such datasets. Here we present VolPy, an automated and scalable pipeline to pre-process voltage imaging datasets. VolPy features motion correction, memory mapping, automated segmentation, denoising and spike extraction, all built on a highly parallelizable, modular, and extensible framework optimized for memory and speed. To aid automated segmentation, we introduce a corpus of 24 manually annotated datasets from different preparations, brain areas and voltage indicators. We benchmark VolPy against ground truth segmentation, simulations and electrophysiology recordings, and we compare its performance with existing algorithms in detecting spikes. Our results indicate that VolPy's performance in spike extraction and scalability are state-of-the-art.
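VolPy's actual spike-extraction algorithm is considerably more sophisticated; as a minimal, purely illustrative sketch of the underlying idea (remove a running baseline, estimate noise robustly, mark upward threshold crossings), consider the following. Every function name and parameter here is made up for illustration and is not the VolPy API.

```python
import numpy as np

def detect_spikes(trace, fps=400.0, thresh_sd=5.0, win=25):
    """Toy spike extraction from a 1-D voltage-imaging trace:
    running-median baseline removal, MAD-based noise estimate,
    then onsets of supra-threshold runs. Illustrative only."""
    pad = win // 2
    padded = np.pad(trace, pad, mode="edge")
    # crude detrending: centered running median
    baseline = np.array([np.median(padded[i:i + win])
                         for i in range(len(trace))])
    hp = trace - baseline
    # robust noise SD via median absolute deviation
    sd = np.median(np.abs(hp - np.median(hp))) / 0.6745
    above = hp > thresh_sd * sd
    # spike onset = first sample of each supra-threshold run
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets / fps  # spike times in seconds

# demo on a made-up noisy trace with three 3-sample spikes
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.01, 600)
for idx in (100, 300, 500):
    trace[idx:idx + 3] += 1.0
print(detect_spikes(trace))  # expected: onsets near 0.25, 0.75, 1.25 s
```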
Neurons integrate synaptic inputs within their dendrites and produce spiking outputs, which then propagate down the axon and back into the dendrites where they contribute to plasticity. Mapping the voltage dynamics in dendritic arbors of live animals is crucial for understanding neuronal computation and plasticity rules. Here we combine patterned channelrhodopsin activation with dual-plane structured illumination voltage imaging, for simultaneous perturbation and monitoring of dendritic and somatic voltage in Layer 2/3 pyramidal neurons in anesthetized and awake mice. We examined the integration of synaptic inputs and compared the dynamics of optogenetically evoked, spontaneous, and sensory-evoked back-propagating action potentials (bAPs). Our measurements revealed a broadly shared membrane voltage throughout the dendritic arbor, and few signatures of electrical compartmentalization among synaptic inputs. However, we observed spike rate acceleration-dependent propagation of bAPs into distal dendrites. We propose that this dendritic filtering of bAPs may play a critical role in activity-dependent plasticity.