Activity in the mouse anterior lateral motor cortex (ALM) instructs directional movements, often seconds before movement initiation. It is unknown whether this preparatory activity is localized to ALM or widely distributed within motor cortex. Here we imaged activity across motor cortex while mice performed a whisker-based object localization task with a delayed, directional licking response. During tactile sensation and the delay epoch, object location was represented in motor cortex areas that are medial and posterior relative to ALM, including vibrissal motor cortex. Preparatory activity appeared first in deep layers of ALM, seconds before the behavioral response, and remained localized to ALM until the behavioral response. Later, widely distributed neurons represented the outcome of the trial. Cortical area was more predictive of neuronal selectivity than laminar location or axonal projection target. Motor cortex therefore represents sensory, motor, and outcome information in a spatially organized manner.
Complex phenotypes, such as an animal’s behavior, generally depend on an overwhelming number of processes that span a vast range of scales. While there is no reason that behavioral dynamics permit simple models, by subsuming inherent nonlinearities and memory into maximally predictive microstates, we find one for Caenorhabditis elegans foraging. The resulting “Markov worm” is effectively indistinguishable from real worm motion across a range of timescales, and we can decompose our model dynamics both to recover and reveal behavioral states. Finally, we connect postures to trajectories, illuminating how worms explore the environment in different behavioral states.

How do we capture the breadth of behavior in animal movement, from rapid body twitches to aging? Using high-resolution videos of the nematode worm Caenorhabditis elegans, we show that a single dynamics connects posture-scale fluctuations with trajectory diffusion and longer-lived behavioral states. We take short posture sequences as an instantaneous behavioral measure, fixing the sequence length for maximal prediction. Within the space of posture sequences, we construct a fine-scale, maximum entropy partition so that transitions among microstates define a high-fidelity Markov model, which we also use as a means of principled coarse-graining. We translate these dynamics into movement using resistive force theory, capturing the statistical properties of foraging trajectories. Predictive across scales, we leverage the longest-lived eigenvectors of the inferred Markov chain to perform a top–down subdivision of the worm’s foraging behavior, revealing both “runs-and-pirouettes” as well as previously uncharacterized finer-scale behaviors. We use our model to investigate the relevance of these fine-scale behaviors for foraging success, recovering a trade-off between local and global search strategies.
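To make the modeling pipeline concrete, the following is a minimal numpy sketch (not the authors' code) of the two generic steps the abstract describes: estimating a Markov transition matrix from an already-discretized microstate sequence, and splitting the states into longer-lived groups using the sign structure of a slow eigenvector. The state labels, toy sequence, and two-way split are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): estimate a microstate Markov chain
# from a discrete state sequence, then coarse-grain it using the sign of the
# second (slowest non-trivial) eigenvector. Labels and data are toy values.
import numpy as np

def transition_matrix(states, n_states):
    """Row-normalized one-step transition counts."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    row_sums = np.maximum(counts.sum(axis=1, keepdims=True), 1.0)
    return counts / row_sums

def coarse_grain_two_way(P):
    """Split microstates into two long-lived groups via a slow eigenvector."""
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)      # eigenvalue 1 first, slowest modes next
    phi2 = evecs[:, order[1]].real       # second eigenvector separates metastable sets
    return (phi2 > 0).astype(int)

# Toy usage with random microstate labels (real labels would come from the
# maximum entropy partition of posture sequences described in the abstract).
rng = np.random.default_rng(0)
seq = rng.integers(0, 4, size=10_000)
P = transition_matrix(seq, 4)
macro_labels = coarse_grain_two_way(P)
```

Finer behavioral subdivisions would use additional slow eigenvectors, in the spirit of the top-down subdivision described above.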
Early stages of sensory systems face the challenge of compressing information from numerous receptors onto a much smaller number of projection neurons, a so-called communication bottleneck. To make more efficient use of limited bandwidth, compression may be achieved using predictive coding, whereby predictable, or redundant, components of the stimulus are removed. In the case of the retina, Srinivasan et al. (1982) suggested that feedforward inhibitory connections subtracting a linear prediction generated from nearby receptors implement such compression, resulting in biphasic center-surround receptive fields. However, feedback inhibitory circuits are common in early sensory circuits, and their dynamics may furthermore be nonlinear. Can such circuits implement predictive coding as well? Here, by solving the transient dynamics of nonlinear reciprocal feedback circuits through analogy to a signal-processing algorithm called linearized Bregman iteration, we show that nonlinear predictive coding can be implemented in an inhibitory feedback circuit. In response to a step stimulus, interneuron activity over time constructs progressively less sparse but more accurate representations of the stimulus, a temporally evolving prediction. This analysis provides a powerful theoretical framework for interpreting and understanding the dynamics of early sensory processing in a variety of physiological experiments and yields novel predictions regarding the relation between activity and stimulus statistics.
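As a point of reference, the generic linearized Bregman iteration the abstract alludes to can be written in a few lines. The sketch below is the standard signal-processing form, not the paper's circuit model; the dictionary A, sparsity weight lam, step size, and iteration count are illustrative.

```python
# Minimal sketch of the generic linearized Bregman iteration for sparse coding
# (illustrative only; not the paper's circuit model). Finds a sparse u with
# A @ u ≈ x by alternating a residual-driven update and soft-thresholding.
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(A, x, lam=0.5, step=None, n_iter=200):
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative step size
    v = np.zeros(A.shape[1])                     # internal accumulated variable
    u = np.zeros(A.shape[1])                     # sparse estimate
    for _ in range(n_iter):
        v = v + step * A.T @ (x - A @ u)         # accumulate residual-driven input
        u = soft_threshold(v, lam)               # progressively less sparse estimate
    return u
```

Each iteration adds the residual-driven input to the internal variable and soft-thresholds it, so the representation becomes progressively less sparse and more accurate, mirroring the interneuron dynamics described in the abstract.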
Johnston’s organ is the largest mechanosensory organ in Drosophila; it analyzes movements of the antenna due to sound, wind, gravity, and touch. Different Johnston’s organ neurons (JONs) encode distinct stimulus features. Certain JONs respond in a sustained manner to steady displacements, and these JONs subdivide into opponent populations that prefer push or pull displacements. Here, we describe neurons in the brain (aPN3 neurons) that combine excitation and inhibition from push/pull JONs in different ratios. Consequently, different aPN3 neurons are sensitive to movement in different parts of the antenna’s range, at different frequencies, or at different amplitude modulation rates. We use a model to show how the tuning of aPN3 neurons can arise from rectification and temporal filtering in JONs, followed by mixing of JON signals in different proportions. These results illustrate how several canonical neural circuit components—rectification, opponency, and filtering—can combine to produce selectivity for complex stimulus features.
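A toy sketch of the circuit motif the abstract names (rectification, opponency, and temporal filtering) is given below. It is illustrative only; the weights, time constant, and stimulus are assumptions, not the published model.

```python
# Toy illustration (not the published model): rectified, low-pass-filtered
# push/pull receptor signals mixed in different ratios give a downstream unit
# its displacement tuning. All parameters are illustrative assumptions.
import numpy as np

def low_pass(signal, tau, dt=1e-3):
    """First-order low-pass filter (simple temporal filtering stage)."""
    out = np.zeros_like(signal)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + dt / tau * (signal[t] - out[t - 1])
    return out

def apn3_like_response(displacement, w_push=1.0, w_pull=0.4, tau=0.05):
    push = low_pass(np.maximum(displacement, 0.0), tau)    # rectified "push" channel
    pull = low_pass(np.maximum(-displacement, 0.0), tau)   # rectified "pull" channel
    return w_push * push - w_pull * pull                   # opponent mixing sets tuning

# Example: response to a 5 Hz sinusoidal antennal movement.
t = np.arange(0, 1, 1e-3)
stim = 0.5 * np.sin(2 * np.pi * 5 * t)
resp = apn3_like_response(stim)
```

Varying the push/pull weights or the filter time constant shifts which displacement range and modulation rate the model unit prefers, which is the qualitative point the abstract makes.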
Low-dose imaging procedures are key to a successful cryoEM experiment (whether by electron cryotomography, single-particle analysis, electron crystallography, or MicroED). We present a method to minimize magnetic hysteresis of the condenser lens system in the JEOL JEM-3200FSC transmission electron microscope (TEM) in order to maintain a stable optical axis for the beam path of low-dose imaging. The simple procedure involves independent voltage ramping of the CL1 and CL2 lenses immediately before switching to the focusing and exposure beam settings for data collection.
Motor behaviors are often planned long before execution but only released after specific sensory events. Planning and execution are each associated with distinct patterns of motor cortex activity. Key questions are how these dynamic activity patterns are generated and how they relate to behavior. Here, we investigate the multi-regional neural circuits that link an auditory "Go cue" and the transition from planning to execution of directional licking. Ascending glutamatergic neurons in the midbrain reticular and pedunculopontine nuclei show short latency and phasic changes in spike rate that are selective for the Go cue. This signal is transmitted via the thalamus to the motor cortex, where it triggers a rapid reorganization of motor cortex state from planning-related activity to a motor command, which in turn drives appropriate movement. Our studies show how midbrain can control cortical dynamics via the thalamus for rapid and precise motor behavior.
Genetically encoded fluorescent calcium indicators allow cellular-resolution recording of physiology. However, bright, genetically targetable indicators that can be multiplexed with existing tools in vivo are needed for simultaneous imaging of multiple signals. Here we describe WHaloCaMP, a modular chemigenetic calcium indicator built from bright dye-ligands and protein sensor domains. Fluorescence change in WHaloCaMP results from reversible quenching of the bound dye via a strategically placed tryptophan. WHaloCaMP is compatible with rhodamine dye-ligands that fluoresce from green to near-infrared, including several that efficiently label the brain in animals. When bound to a near-infrared dye-ligand, WHaloCaMP shows a 7× increase in fluorescence intensity and a 2.1-ns increase in fluorescence lifetime upon calcium binding. We use WHaloCaMP1a to image Ca2+ responses in vivo in flies and mice, to perform three-color multiplexed functional imaging of hundreds of neurons and astrocytes in zebrafish larvae, and to quantify Ca2+ concentration using fluorescence lifetime imaging microscopy (FLIM).
Flying insects exhibit stunning behavioral repertoires that are largely mediated by the visual control of flight. For this reason, presenting a controlled visual environment to tethered insects has been and continues to be a powerful tool for studying the sensory control of complex behaviors. To create an easily controlled, scalable, and customizable visual stimulus, we have designed a modular system, based on panels composed of an 8 x 8 array of individual LEDs, that may be connected together to ’tile’ an experimental environment with controllable displays. The panels have been designed to be extremely bright, with the added flexibility of individual-pixel brightness control, allowing experimentation over a broad range of behaviorally relevant conditions. Patterns to be displayed may be designed using custom software, downloaded to a controller board, and displayed on the individually addressed panels via a rapid communication interface. The panels are controlled by a microprocessor-based display controller which, for most experiments, will not require a computer in the loop, greatly reducing the experimental infrastructure. This technology allows an experimenter to build and program a visual arena with a customized geometry in a matter of hours. To demonstrate the utility of this system, we present results from experiments with tethered Drosophila melanogaster: (1) in a cylindrical arena composed of 44 panels, used to test the contrast dependence of object orientation behavior, and (2) above a 30-panel floor display, used to examine the effects of ground motion on orientation during flight.
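For a sense of how display patterns for such a panel arena might be represented in software, the sketch below builds a simple vertical-bar grating as a binary array spanning a 44-panel ring. It is a hypothetical illustration, not the project's pattern-design software; the grating period and frame count are assumptions.

```python
# Hypothetical illustration (not the arena's actual software): represent one
# frame for a ring of 8x8 LED panels as a binary array. A vertical-bar grating
# is shifted in phase across frames to simulate wide-field motion.
import numpy as np

def grating_frame(n_panels=44, bar_period=8, phase=0):
    """One frame: 8 rows by (8 * n_panels) columns, 1 = LED on."""
    width = 8 * n_panels
    on_columns = ((np.arange(width) + phase) % bar_period) < (bar_period // 2)
    return np.tile(on_columns, (8, 1)).astype(np.uint8)

# One full grating cycle of apparent motion, shifting one column per frame.
frames = [grating_frame(phase=p) for p in range(8)]
```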
The transformer (tra) gene regulates all aspects of somatic sexual differentiation in Drosophila melanogaster females and has no function in males. We have isolated the tra gene as part of a 200 kb chromosomal walk. The 25 kb region around tra contains four genetically identified complementation groups and at least six transcriptional units. Germ-line transformation experiments indicate that a fragment of 2 kb is sufficient to supply tra+ function. Mapping of cDNAs from tra and from the adjacent genes indicates that the tra+ transcription unit is 1.2 kb or less. This transcription unit gives rise to a 1.0 kb RNA that is female-specific and a 1.2 kb RNA that is present in both sexes. tra+ and the gene at the 3' side overlap slightly in the 3' ends of their RNA coding sequences. These results suggest that tra+ function is regulated at the level of production of the female-specific tra RNA. The fact that a tra transcript is found in males raises interesting possibilities for how tra expression is controlled.
Light-induced photoreceptor apoptosis occurs in many forms of inherited retinal degeneration resulting in blindness in both vertebrates and invertebrates. Though mutations in several photoreceptor signaling proteins have been implicated in triggering this process, the molecular events relating light activation of rhodopsin to photoreceptor death are yet unclear. Here, we uncover a pathway by which activation of rhodopsin in Drosophila mediates apoptosis through a G protein-independent mechanism. This process involves the formation of membrane complexes of phosphorylated, activated rhodopsin and its inhibitory protein arrestin, and subsequent clathrin-dependent endocytosis of these complexes into a cytoplasmic compartment. Together, these data define the proapoptotic molecules in Drosophila photoreceptors and indicate a novel signaling pathway for light-activated rhodopsin molecules in control of photoreceptor viability.
