3920 Publications
Showing 2301-2310 of 3920 results

During starvation, the transcriptional activation of catabolic processes is induced by the nuclear translocation and consequent activation of transcription factor EB (TFEB), a master modulator of autophagy and lysosomal biogenesis. However, how TFEB is inactivated upon nutrient refeeding is currently unknown. Here we show that TFEB subcellular localization is dynamically controlled by its continuous shuttling between the cytosol and the nucleus, with nuclear export representing a limiting step. TFEB nuclear export is mediated by CRM1 and is modulated by nutrient availability via mTOR-dependent hierarchical multisite phosphorylation of serines S142 and S138, which lie in proximity to a nuclear export signal (NES). Our data on TFEB nucleo-cytoplasmic shuttling suggest an unexpected role for mTOR in nuclear export.
Recordings of large neuronal ensembles and neural stimulation with high spatial and temporal precision are important requisites for studying the real-time dynamics of neural networks. Multiple-shank silicon probes enable large-scale monitoring of individual neurons. Optical stimulation of genetically targeted neurons expressing light-sensitive channels or other fast (millisecond-scale) actuators offers a means for controlled perturbation of local circuits. Here we describe a method to equip the shanks of silicon probes with micron-scale light guides, allowing the two approaches to be used simultaneously. We then show illustrative examples of how these compact hybrid electrodes can be used to probe local circuits in behaving rats and mice. A key advantage of these devices is the enhanced spatial precision of stimulation, achieved by delivering light close to the recording sites of the probe. When paired with the expression of light-sensitive actuators within genetically specified neuronal populations, these devices allow relatively straightforward and interpretable manipulation of network activity.
Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time, with minimal latency, opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as 'virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
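The tracking loop described in this abstract (per-target Kalman prediction followed by nearest-neighbour data association) can be sketched in miniature. This is an illustrative one-dimensional toy under assumed noise parameters, not the paper's 11-camera implementation; all class names, parameter values, and the crude velocity correction are assumptions.

```python
# Minimal sketch: constant-velocity Kalman filtering per target plus
# greedy nearest-neighbour data association. Illustrative only.

class KalmanTrack1D:
    """Scalar constant-velocity Kalman filter (position + velocity)."""
    def __init__(self, x0, v0=0.0, q=0.01, r=0.1):
        self.x, self.v = x0, v0        # state: position, velocity
        self.p = 1.0                   # simplified scalar covariance
        self.q, self.r = q, r          # process / measurement noise

    def predict(self, dt=1.0):
        self.x += self.v * dt
        self.p += self.q
        return self.x

    def update(self, z):
        k = self.p / (self.p + self.r)     # Kalman gain
        innovation = z - self.x
        self.x += k * innovation
        self.v += 0.5 * k * innovation     # crude velocity correction
        self.p *= (1.0 - k)

def nearest_neighbour_assign(predictions, detections):
    """Greedily pair each predicted position with its closest detection."""
    assignments = {}
    free = set(range(len(detections)))
    for ti, pred in enumerate(predictions):
        if not free:
            break
        di = min(free, key=lambda j: abs(detections[j] - pred))
        assignments[ti] = di
        free.discard(di)
    return assignments

tracks = [KalmanTrack1D(0.0), KalmanTrack1D(10.0)]
detections = [9.8, 0.3]                      # unordered measurements
preds = [t.predict() for t in tracks]
for ti, di in nearest_neighbour_assign(preds, detections).items():
    tracks[ti].update(detections[di])
```

The real system works in three dimensions with camera projections as measurements, so the extended (linearized) Kalman filter is needed; the association step is the same idea in higher dimension.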
Drosophila melanogaster has served as a powerful model system for genetic studies of courtship songs. To accelerate research on the genetic and neural mechanisms underlying courtship song, we have developed a sensitive recording system to simultaneously capture the acoustic signals from 32 separate pairs of courting flies as well as software for automated segmentation of songs.
mRNA translation is a key step in decoding genetic information. Genetic decoding is surprisingly heterogeneous, as multiple distinct polypeptides can be synthesized from a single mRNA sequence. To study translational heterogeneity, we developed the MoonTag, a new fluorescence labeling system to visualize translation of single mRNAs. When combined with the orthogonal SunTag system, the MoonTag enables dual readouts of translation, greatly expanding the possibilities for interrogating complex translational heterogeneity. By placing MoonTag and SunTag sequences in different translation reading frames, each driven by distinct translation start sites, start site selection of individual ribosomes can be visualized in real time. We find that start site selection is largely stochastic, but that the probability of using a particular start site differs among mRNA molecules and can be dynamically regulated over time. Together, this study provides key insights into the heterogeneity of translation start site selection and a powerful toolbox to visualize complex translation dynamics.
Accurate tracking of the same neurons across multiple days is crucial for studying changes in neuronal activity during learning and adaptation. New advances in high-density extracellular electrophysiology recording probes, such as Neuropixels, provide a promising avenue to accomplish this goal. Identifying the same neurons in multiple recordings is, however, complicated by non-rigid movement of the tissue relative to the recording sites (drift) and loss of signal from some neurons. Here we propose a neuron tracking method that can identify the same cells independent of firing statistics, which are used by most existing methods. Our method is based on between-day non-rigid alignment of spike-sorted clusters. We verified cell identity using measured visual receptive fields. This method succeeds on datasets separated by one to 47 days, with an 86% average recovery rate.
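The idea of matching spike-sorted clusters across days after compensating for drift can be illustrated with a deliberately simplified toy. The paper's method uses non-rigid between-day alignment; this sketch instead assumes a single rigid depth offset, and the function names, median-based drift estimate, and distance threshold are all invented for illustration.

```python
# Illustrative sketch only (not the authors' method): match clusters
# across two days by estimating one rigid depth offset, then pairing
# each day-1 cluster with the closest drift-compensated day-2 cluster.

def estimate_rigid_drift(depths_day1, depths_day2):
    """Estimate probe drift as the difference of median cluster depths."""
    med = lambda xs: sorted(xs)[len(xs) // 2]
    return med(depths_day2) - med(depths_day1)

def match_clusters(depths_day1, depths_day2, max_dist=20.0):
    """Pair clusters across days after compensating for drift (microns)."""
    drift = estimate_rigid_drift(depths_day1, depths_day2)
    pairs = {}
    for i, d1 in enumerate(depths_day1):
        j = min(range(len(depths_day2)),
                key=lambda j: abs(depths_day2[j] - (d1 + drift)))
        if abs(depths_day2[j] - (d1 + drift)) <= max_dist:
            pairs[i] = j
    return pairs

day1 = [100.0, 220.0, 340.0]
day2 = [130.0, 250.0, 372.0]   # same cells after ~30 um of drift
print(match_clusters(day1, day2))  # each cell recovered: {0: 0, 1: 1, 2: 2}
```

Non-rigid alignment generalizes this by letting the offset vary along the probe, which matters when tissue compresses or stretches unevenly between sessions.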
This paper describes a target tracking system running on a Heterogeneous Sensor Network (HSN) and presents results gathered from a realistic deployment. The system fuses audio direction-of-arrival data from mote-class devices with object detection measurements from embedded PCs equipped with cameras. The acoustic sensor nodes perform beamforming and measure energy as a function of angle. The camera nodes detect moving objects and estimate their angle. The sensor detections are sent to a centralized sensor fusion node via a combination of two wireless networks. The novelty of our system lies in the unique combination of target tracking methods customized for the application at hand and their implementation on an actual HSN platform.
We present a fully automatic method for 3D segmentation of the mandibular bone from CT data. The method includes an adaptation of statistical shape models of the mandible, the skull base, and the midfacial bones, followed by a simultaneous graph-based optimization of adjacent deformable models. The adaptation of the models to the image data is performed according to a heuristic model of the typical intensity distribution in the vicinity of the bone boundary, with special focus on accurate discrimination of adjacent bones in joint regions. An evaluation of our method based on 18 CT scans shows that manual correction of the automatic segmentations is unnecessary in approximately 60% of the axial slices that contain the mandible.
For biomechanical simulations, the segmentation of multiple adjacent anatomical structures from medical image data is often required. If adjacent structures are barely distinguishable in image data, automatic segmentation methods for single structures generally do not yield sufficiently accurate results. To improve segmentation accuracy in these cases, knowledge about adjacent structures must be exploited. Optimal graph searching (graph cuts) based on deformable surface models allows for a simultaneous segmentation of multiple adjacent objects. However, this method requires a correspondence relation between vertices of adjacent surface meshes. Line segments, each containing two corresponding vertices, may then serve as shared displacement directions in the segmentation process. In this paper we propose a scheme for constructing a correspondence relation in adjacent regions of two arbitrary surfaces. This correspondence relation implies shared displacement directions that we apply for segmentation with deformable surfaces. Here, overlap of the surfaces is guaranteed not to occur. We show correspondence relations for regions on a femoral head and acetabulum and other adjacent structures, as well as an evaluation of segmentation results on 50 CT images of the hip joint.
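The notion of a correspondence relation that yields shared displacement directions can be illustrated with a toy nearest-neighbour pairing between two vertex sets. This is an assumed simplification, not the paper's construction scheme; the function name, the distance threshold, and the mock hip-joint coordinates are all invented for illustration.

```python
# Hedged sketch: pair each vertex of surface A with its nearest vertex
# on adjacent surface B (within a threshold); each pair defines a shared
# displacement direction along the connecting line segment, so moving
# both vertices along that line cannot make the surfaces overlap.

def vertex_correspondence(verts_a, verts_b, max_dist=2.0):
    """Return (i, j, direction) triples for corresponding vertex pairs."""
    def sub(p, q): return tuple(pi - qi for pi, qi in zip(p, q))
    def norm(v): return sum(c * c for c in v) ** 0.5
    pairs = []
    for i, a in enumerate(verts_a):
        j = min(range(len(verts_b)), key=lambda j: norm(sub(verts_b[j], a)))
        d = sub(verts_b[j], a)
        dist = norm(d)
        if 0 < dist <= max_dist:
            unit = tuple(c / dist for c in d)   # shared displacement direction
            pairs.append((i, j, unit))
    return pairs

# Mock vertices standing in for a femoral head and the acetabulum.
head = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
cup = [(0.5, 1.0, 0.0)]
print(vertex_correspondence(head, cup))
```

In the actual method these directions feed into the graph-based optimization, where adjacent surfaces share displacement columns and therefore cannot cross.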
Visually guided decision-making requires integration of information from distributed brain areas, necessitating a brain-wide approach to examine its neural mechanisms. New tools in Drosophila melanogaster enable circuits spanning the brain to be charted with single cell-type resolution. Here, we highlight recent advances uncovering the computations and circuits that transform and integrate visual information across the brain to make behavioral choices. Visual information flows from the optic lobes to three primary central brain regions: a sensorimotor mapping area and two 'higher' centers for memory or spatial orientation. Rapid decision-making during predator evasion emerges from the spike timing dynamics in parallel sensorimotor cascades. Goal-directed decisions may occur through memory, navigation and valence processing in the central complex and mushroom bodies.