4001 Publications
Showing 21-30 of 4001 results

Most methods for structure-function analysis of the brain in medical images are based on voxel-wise statistical tests performed on registered magnetic resonance (MR) images across subjects. A major drawback of such methods is their inability to accurately locate regions that manifest nonlinear associations with clinical variables. In this paper, we propose Bayesian morphological analysis methods, based on a Bayesian-network representation, for the analysis of MR brain images. First, we describe how Bayesian networks (BNs) can represent probabilistic associations among voxels and clinical (function) variables. Second, we present a model-selection framework, which generates a BN that captures structure-function relationships from MR brain images and function variables. We demonstrate our methods in the context of determining associations between regional brain atrophy (as demonstrated on MR images of the brain) and functional deficits. We employ two data sets for this evaluation: the first contains MR images of 11 subjects, where associations between regional atrophy and a functional deficit are almost linear; the second contains MR images of the ventricles of 84 subjects, where the structure-function association is nonlinear. Our methods successfully identify voxel-wise morphological changes that are associated with functional deficits in both data sets, whereas standard statistical analysis (i.e., t-test and paired t-test) fails in the nonlinear-association case.
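The abstract's key contrast — a linear, voxel-wise test missing a dependence that a probabilistic model captures — can be reproduced in a toy sketch. This illustrates only the statistical point, not the paper's Bayesian-network method; the synthetic variables and the mutual-information surrogate are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
atrophy = rng.normal(size=n)                      # stand-in for a voxel-wise measure
deficit = (np.abs(atrophy) > 1.0).astype(float)   # purely nonlinear association

# Linear statistic: Pearson correlation is ~0 despite the strong dependence,
# because the deficit is symmetric in the atrophy measure.
r = np.corrcoef(atrophy, deficit)[0, 1]

def mutual_info(x, y, bins=8):
    """Plug-in mutual information (nats) over a coarse discretization."""
    cxy, _, _ = np.histogram2d(x, y, bins=(bins, 2))
    pxy = cxy / cxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

mi = mutual_info(atrophy, deficit)
```

On this synthetic data the correlation stays near zero while the mutual information is clearly positive, mirroring why a dependence-based model can flag associations that a t-test misses.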
We address the problem of inferring the number of independently blinking fluorescent light emitters, when only their combined intensity contributions can be observed at each timepoint. This problem occurs regularly in light microscopy of objects that are smaller than the diffraction limit, where one wishes to count the number of fluorescently labelled subunits. Our proposed solution directly models the photo-physics of the system, as well as the blinking kinetics of the fluorescent emitters, as a fully differentiable hidden Markov model. Given a trace of intensity over time, our model jointly estimates the parameters of the intensity distribution per emitter, their blinking rates, as well as a posterior distribution of the total number of fluorescent emitters. We show that our model is consistently more accurate and increases the range of countable subunits by a factor of two compared to current state-of-the-art methods, which count based on autocorrelation and blinking frequency. Furthermore, we demonstrate that our model can be used to investigate the effect of blinking kinetics on counting ability, and therefore can inform experimental conditions that will maximize counting accuracy.
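As a rough sketch of the modelling idea — a hidden Markov model whose hidden state is the number of currently "on" emitters — one might write the following. This is not the authors' differentiable implementation; the binomial transition approximation, the Gaussian intensity model, and maximum-likelihood selection over the emitter count are simplifying assumptions:

```python
import numpy as np
from scipy.stats import binom, norm

def transition_matrix(n, p_on, p_off):
    """Per-step transition over the number of 'on' emitters (0..n).
    Each on emitter stays on w.p. 1 - p_off; each off emitter turns on w.p. p_on."""
    T = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        stay = binom.pmf(np.arange(k + 1), k, 1 - p_off)      # survivors among the k on
        gain = binom.pmf(np.arange(n - k + 1), n - k, p_on)   # recruits among the n-k off
        T[k, :] = np.convolve(stay, gain)[: n + 1]            # new on-count = stay + gain
    return T

def trace_loglik(intensity, n, p_on, p_off, mu, sigma):
    """Forward-algorithm log-likelihood of an intensity trace given n emitters,
    with intensity_t ~ Normal(k * mu, sigma) when k emitters are on."""
    intensity = np.asarray(intensity, dtype=float)
    T = transition_matrix(n, p_on, p_off)
    states = np.arange(n + 1)
    em = norm.pdf(intensity[:, None], states * mu, sigma)     # (timepoints, n+1)
    alpha = np.full(n + 1, 1.0 / (n + 1)) * em[0]             # uniform prior over k
    ll = 0.0
    for t in range(1, len(intensity)):
        s = alpha.sum()
        ll += np.log(s)
        alpha = (alpha / s) @ T * em[t]                       # predict, then weight
    return ll + np.log(alpha.sum())

def count_emitters(intensity, max_n, p_on, p_off, mu, sigma):
    """Pick the emitter count that maximizes the trace likelihood."""
    lls = [trace_loglik(intensity, n, p_on, p_off, mu, sigma)
           for n in range(1, max_n + 1)]
    return int(np.argmax(lls)) + 1
```

Here `count_emitters` simply scans candidate counts and keeps the one with the highest forward-algorithm likelihood, whereas the paper reports a full posterior distribution over the count.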
Medial and lateral hypothalamic loci are known to suppress and enhance appetite, respectively, but the dynamics and functional significance of their interaction have yet to be explored. Here we report that, in larval zebrafish, primarily serotonergic neurons of the ventromedial caudal hypothalamus (cH) become increasingly active during food deprivation, whereas activity in the lateral hypothalamus (LH) is reduced. Exposure to food sensory and consummatory cues reverses the activity patterns of these two nuclei, consistent with their representation of opposing internal hunger states. Baseline activity is restored as food-deprived animals return to satiety via voracious feeding. The antagonistic relationship and functional importance of cH and LH activity patterns were confirmed by targeted stimulation and ablation of cH neurons. Collectively, the data allow us to propose a model in which these hypothalamic nuclei regulate different phases of hunger and satiety and coordinate energy balance via antagonistic control of distinct behavioral outputs.
To accurately track self-location, animals need to integrate their movements through space. In amniotes, representations of self-location have been found in regions such as the hippocampus. It is unknown whether more ancient brain regions contain such representations and by which pathways they may drive locomotion. Fish displaced by water currents must prevent uncontrolled drift to potentially dangerous areas. We found that larval zebrafish track such movements and can later swim back to their earlier location. Whole-brain functional imaging revealed the circuit enabling this process of positional homeostasis. Position-encoding brainstem neurons integrate optic flow, then bias future swimming to correct for past displacements by modulating inferior olive and cerebellar activity. Manipulation of position-encoding or olivary neurons abolished positional homeostasis or evoked behavior as if animals had experienced positional shifts. These results reveal a multiregional hindbrain circuit in vertebrates for optic flow integration, memory of self-location, and its neural pathway to behavior.
Pain thresholds are, in part, set as a function of emotional and internal states by descending modulation of nociceptive transmission in the spinal cord. Neurons of the rostral ventromedial medulla (RVM) are thought to critically contribute to this process; however, the neural circuits and synaptic mechanisms by which distinct populations of RVM neurons facilitate or diminish pain remain elusive. Here we used in vivo opto/chemogenetic manipulations and trans-synaptic tracing of genetically identified dorsal horn and RVM neurons to uncover an RVM-spinal cord-primary afferent circuit controlling pain thresholds. Unexpectedly, we found that RVM GABAergic neurons facilitate mechanical pain by inhibiting dorsal horn enkephalinergic/GABAergic interneurons. We further demonstrate that these interneurons gate sensory inputs and control pain through temporally coordinated enkephalin- and GABA-mediated presynaptic inhibition of somatosensory neurons. Our results uncover a descending disynaptic inhibitory circuit that facilitates mechanical pain, is engaged during stress, and could be targeted to establish higher pain thresholds.
Photoconvertible fluorescent proteins are potential tools for investigating dynamic processes in living cells and for emerging super-resolution microscopy techniques. Unfortunately, most probes in this class are hampered by oligomerization, small photon budgets or poor photostability. Here we report an EosFP variant that functions well in a broad range of protein fusions for dynamic investigations, exhibits high photostability and preserves the approximately 10-nm localization precision of its parent.
Orange-red fluorescent proteins (FPs) are widely used in biomedical research for multiplexed epifluorescence microscopy with GFP-based probes, but their different excitation requirements make multiplexing with new advanced microscopy methods difficult. Separately, orange-red FPs are useful for deep-tissue imaging in mammals owing to the relative tissue transmissibility of orange-red light, but their dependence on illumination limits their sensitivity as reporters in deep tissues. Here we describe CyOFP1, a bright, engineered, orange-red FP that is excitable by cyan light. We show that CyOFP1 enables single-excitation multiplexed imaging with GFP-based probes in single-photon and two-photon microscopy, including time-lapse imaging in light-sheet systems. CyOFP1 also serves as an efficient acceptor for resonance energy transfer from the highly catalytic blue-emitting luciferase NanoLuc. An optimized fusion of CyOFP1 and NanoLuc, called Antares, functions as a highly sensitive bioluminescent reporter in vivo, producing substantially brighter signals from deep tissues than firefly luciferase and other bioluminescent proteins.
Metastasis depends upon cancer cell growth and survival within the metastatic niche. Tumors which remodel their glycocalyces, by overexpressing bulky glycoproteins like mucins, exhibit a higher predisposition to metastasize, but the role of mucins in oncogenesis remains poorly understood. Here we report that a bulky glycocalyx promotes the expansion of disseminated tumor cells in vivo by fostering integrin adhesion assembly to permit G1 cell cycle progression. We engineered tumor cells to display glycocalyces of various thicknesses by coating them with synthetic mucin-mimetic glycopolymers. Cells adorned with longer glycopolymers showed increased metastatic potential, enhanced cell cycle progression, and greater levels of integrin-FAK mechanosignaling and Akt signaling in a syngeneic mouse model of metastasis. These effects were mirrored by expression of the ectodomain of cancer-associated mucin MUC1. These findings functionally link mucinous proteins with tumor aggression, and offer a new view of the cancer glycocalyx as a major driver of disease progression.
The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these 'universal' statistics.
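The modelling claim — that a strongly recurrent random network can autonomously generate irregular activity — is a standard result for random rate networks, which transition to chaos once the recurrent gain exceeds one. A minimal sketch with illustrative parameters, not the authors' topographically organized circuit:

```python
import numpy as np

def simulate_rate_net(n=200, g=1.8, steps=2000, dt=0.1, seed=1):
    """Euler-integrate x' = -x + g * J @ tanh(x) for random coupling J.
    Returns the population standard deviation of the rates tanh(x) per step;
    for gain g > 1 the dynamics are chaotic and activity stays irregular."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))  # balanced random weights
    x = rng.normal(0.0, 0.5, n)                    # random initial state
    spread = np.empty(steps)
    for t in range(steps):
        x = x + dt * (-x + g * J @ np.tanh(x))
        spread[t] = np.tanh(x).std()
    return spread
```

With gain g = 1.8 the population activity keeps fluctuating over the whole simulation, while at g = 0.5 it decays to the quiescent fixed point — the qualitative regime change the abstract's "strongly recurrent" qualifier points at.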
Imaging the 4D choreography of subcellular events in living multicellular organisms at high spatiotemporal resolution could reveal life’s fundamental principles. Yet extracting these principles from petabyte-scale image data requires fusing advanced light microscopy and cutting-edge machine learning models with biological insight and expertise.