We present an approach for the automatic reconstruction of neurons from 3D stacks of electron microscopy sections. The core of our system is a set of possible assignments, each of which proposes, at some cost, a link between neuron regions in consecutive sections. These assignments can model the continuation, branching, and termination of neurons. The costs are trainable on positive assignment samples. An optimal and consistent set of assignments is found for the whole volume at once by solving an integer linear program. This set of assignments determines both the segmentation into neuron regions and the correspondence between such regions in neighboring sections. For each selected assignment, a confidence value helps to prioritize decisions to be reviewed by a human expert. We evaluate the performance of our method on an annotated volume of neural tissue and compare it to the current state of the art [26]. Our method is superior in accuracy and can be trained using a small number of samples. The observed inference times are linear, at about 2 milliseconds per neuron and section.
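The assignment-selection step described above can be sketched as a tiny integer linear program. The regions, candidate links, and costs below are invented toy values (continuation links only, omitting the branch and end assignments the paper models), and `scipy.optimize.milp` merely stands in for whatever ILP solver the authors used:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint

# Candidate links (region in section A, region in section B, cost) --
# toy stand-ins for continuation assignments between consecutive sections.
links = [(0, 0, 0.2), (0, 1, 0.9), (1, 0, 0.8), (1, 1, 0.3)]
costs = np.array([c for _, _, c in links])

# Consistency: each region participates in exactly one chosen assignment.
A = np.zeros((4, len(links)))  # 2 regions per section -> 4 constraints
for j, (ra, rb, _) in enumerate(links):
    A[ra, j] = 1          # region ra in section A
    A[2 + rb, j] = 1      # region rb in section B
cons = LinearConstraint(A, lb=1, ub=1)

# Minimize total cost over 0/1 assignment variables.
res = milp(c=costs, integrality=np.ones(len(links)), constraints=cons)
chosen = [links[j][:2] for j in range(len(links)) if res.x[j] > 0.5]
print(chosen)  # -> [(0, 0), (1, 1)], total cost 0.5
```

Solving one global program, rather than greedily linking section pairs, is what guarantees the chosen links are mutually consistent across the whole volume.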
Light-sheet fluorescence microscopy is able to image large specimens with high resolution by capturing the samples from multiple angles. Multiview deconvolution can substantially improve the resolution and contrast of the images, but its application has been limited owing to the large size of the data sets. Here we present a Bayesian-based derivation of multiview deconvolution that drastically improves the convergence time, and we provide a fast implementation using graphics hardware.
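A minimal sketch of the underlying idea, not the paper's Bayesian derivation or GPU implementation: a plain sequential Richardson-Lucy update that combines several differently blurred views of the same 1D signal. All signals and PSFs here are synthetic toy data:

```python
import numpy as np

def conv(x, k):
    return np.convolve(x, k, mode="same")

def multiview_rl(views, psfs, n_iter=50):
    """Toy sequential Richardson-Lucy deconvolution over multiple views."""
    est = np.full_like(views[0], views[0].mean())
    for _ in range(n_iter):
        for obs, psf in zip(views, psfs):
            blurred = conv(est, psf) + 1e-12       # avoid divide-by-zero
            est = est * conv(obs / blurred, psf[::-1])  # multiplicative update
    return est

# Ground truth: two point sources; each view blurred by a different PSF.
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 0.5
psf_a = np.array([0.25, 0.5, 0.25])
psf_b = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
views = [conv(truth, psf_a), conv(truth, psf_b)]

est = multiview_rl(views, [psf_a, psf_b])
print(np.argmax(est))  # brightest recovered source sits near index 20
```

Each view constrains the estimate through its own PSF, which is why combining views sharpens the result beyond what any single view allows.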
Previously, in (Hermundstad et al., 2014), we showed that when sampling is limiting, the efficient coding principle leads to a 'variance is salience' hypothesis, and that this hypothesis accounts for visual sensitivity to binary image statistics. Here, using extensive new psychophysical data and image analysis, we show that this hypothesis accounts for visual sensitivity to a large set of grayscale image statistics at a striking level of detail, and also identify the limits of the prediction. We define a 66-dimensional space of local grayscale light-intensity correlations, and measure the relevance of each direction to natural scenes. The 'variance is salience' hypothesis predicts that two-point correlations are most salient, and predicts their relative salience. We tested these predictions in a texture-segregation task using unnatural, synthetic textures. As predicted, correlations beyond second order are not salient, and predicted thresholds for over 300 second-order correlations match psychophysical thresholds closely (median fractional error < 0.13).
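The two-point correlations at the center of the prediction above can be illustrated with a toy statistic: the mean product of pixel pairs at a fixed offset in a ±1-valued image. The images below are synthetic, and this single statistic is only one coordinate of the 66-dimensional space the paper defines:

```python
import numpy as np

def pair_correlation(img, dx=1, dy=0):
    """Two-point correlation at offset (dx, dy): mean product of pixel pairs.

    A toy version of one local-statistics coordinate; img holds +/-1 values.
    """
    a = img[: img.shape[0] - dy, : img.shape[1] - dx]
    b = img[dy:, dx:]
    return (a * b).mean()

rng = np.random.default_rng(0)
white = rng.choice([-1, 1], size=(256, 256))      # uncorrelated noise
checker = np.tile([[1.0, -1.0]], (256, 128))      # alternating columns

print(round(pair_correlation(white), 3))   # near 0: no pairwise structure
print(pair_correlation(checker))           # -1.0: neighbors perfectly anti-correlated
```

A texture with a strongly nonzero value of such a statistic, relative to white noise, is exactly the kind of signal the 'variance is salience' hypothesis predicts observers will detect.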
With recent advances in high-throughput Electron Microscopy (EM) imaging it is now possible to image an entire nervous system of organisms like Drosophila melanogaster. One of the bottlenecks to reconstruct a connectome from these large volumes (≈100 TiB) is the pixel-wise prediction of membranes. The time it would typically take to process such a volume using a convolutional neural network (CNN) with a sliding window approach is on the order of years on a current GPU. With sliding windows, however, many redundant computations are carried out. In this paper, we present an extension to the Caffe library to increase throughput by predicting many pixels at once. On a sliding window network successfully used for membrane classification, we show that our method achieves a speedup of up to 57×, maintaining identical prediction results.
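The redundancy being eliminated can be seen with a much simpler stand-in than a CNN: scoring every 5×5 patch of an image with a single linear filter. The naive loop recomputes overlapping products, while one dense correlation produces every output at once with identical results. This is only an illustration of the principle, not the paper's Caffe extension:

```python
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(1)
img = rng.standard_normal((64, 64))
w = rng.standard_normal((5, 5))   # toy linear "network": one 5x5 filter

# Naive sliding window: re-extract and re-score every patch (redundant work).
H, W = img.shape
naive = np.empty((H - 4, W - 4))
for i in range(H - 4):
    for j in range(W - 4):
        naive[i, j] = (img[i:i + 5, j:j + 5] * w).sum()

# Dense prediction: one correlation computes all outputs in a single pass.
dense = correlate2d(img, w, mode="valid")
print(np.allclose(naive, dense))  # True
```

For a real CNN the same trick requires handling pooling and strides carefully, which is where the engineering effort in such extensions goes.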
Biophysically accurate multicompartmental models of individual neurons have significantly advanced our understanding of the input-output function of single cells. These models depend on a large number of parameters that are difficult to estimate. In practice, they are often hand-tuned to match measured physiological behaviors, thus raising questions of identifiability and interpretability. We propose a statistical approach to the automatic estimation of various biologically relevant parameters, including 1) the distribution of channel densities, 2) the spatiotemporal pattern of synaptic input, and 3) axial resistances across extended dendrites. Recent experimental advances, notably in voltage-sensitive imaging, motivate us to assume access to: i) the spatiotemporal voltage signal in the dendrite and ii) an approximate description of the channel kinetics of interest. We show here that, given i and ii, parameters 1-3 can be inferred simultaneously by nonnegative linear regression; that this optimization problem possesses a unique solution and is guaranteed to converge despite the large number of parameters and their complex nonlinear interaction; and that standard optimization algorithms efficiently reach this optimum with modest computational and data requirements. We demonstrate that the method leads to accurate estimations on a wide variety of challenging model data sets that include up to about 10^4 parameters (roughly two orders of magnitude more than previously feasible) and describe how the method gives insights into the functional interaction of groups of channels.
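The nonnegative-linear-regression core of this approach can be sketched with `scipy.optimize.nnls`. The design matrix and sparse weights below are invented toy data: columns loosely play the role of per-channel basis responses, and the nonnegative weights mimic channel densities recovered from noiseless "voltage" measurements:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
A = rng.random((200, 10))          # toy basis: 200 measurements, 10 parameters
true = np.zeros(10)
true[[1, 4, 7]] = [2.0, 0.5, 1.2]  # sparse nonnegative "channel densities"
v = A @ true                       # noiseless synthetic measurements

est, resid = nnls(A, v)            # nonnegative least squares
print(np.allclose(est, true, atol=1e-6))  # True: exact recovery, no noise
```

Because the objective is convex and the feasible set (the nonnegative orthant) is convex, the problem has the unique-solution and guaranteed-convergence properties the abstract emphasizes.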
We have developed methods to achieve efficient CRISPR-Cas9–mediated gene knockout in ex vivo mouse embryonic salivary epithelial explants. Salivary epithelial explants provide a valuable model for characterizing cell signaling, differentiation, and epithelial morphogenesis, but research has been limited by a paucity of efficient gene perturbation methods. Here, we demonstrate highly efficient gene perturbation by transient transduction of guide RNA–expressing lentiviruses into Cas9-expressing salivary epithelial buds isolated from Cas9 transgenic mice. We first show that salivary epithelial explants can be cultured in low-concentration, nonsolidified Matrigel suspensions in 96-well plates, which greatly increases sample throughput compared to conventional cultures embedded in solidified Matrigel. We further show that salivary epithelial explants can grow and branch with FGF7 alone, while supplementing with insulin, transferrin, and selenium (ITS) enhances growth and branching. We then describe an efficient workflow to produce experiment-ready, high-titer lentiviruses within 1 wk after molecular cloning. To track transduced cells, we designed the lentiviral vector to coexpress a nuclear fluorescent reporter with the guide RNA. We routinely achieved 80% transduction efficiency when antibiotic selection was used. Importantly, we detected robust loss of targeted protein products when testing 9 guide RNAs for 3 different genes. Moreover, targeting the β1 integrin gene (Itgb1) inhibited branching morphogenesis, which supports the importance of cell–matrix adhesion in driving branching morphogenesis. In summary, we have established a lentivirus-based method that can efficiently perturb genes of interest in salivary epithelial explants, which will greatly facilitate studies of specific gene functions using this system.
Whole-cell recording is a key technique for investigating synaptic and cellular mechanisms underlying various brain functions. However, because of its high sensitivity to mechanical disturbances, applying the whole-cell recording method to freely moving animals has been challenging. Here, we describe a technique for obtaining such recordings in freely moving, drug-free animals with a high success rate. This technique involves three major steps: obtaining a whole-cell recording from awake head-fixed animals, reliable and efficient stabilization of the pipette with respect to the animal's head using an ultraviolet (UV)-transparent collar and UV-cured adhesive, and rapid release of the animal from head fixation without loss of the recording. This technique has been successfully applied to obtain intracellular recordings from the hippocampus of freely moving rats and mice exploring a spatial environment, and should be generally applicable to other brain areas in animals engaged in a variety of natural behaviors.
To successfully forage for food, animals must balance the energetic cost of searching for food sources with the energetic benefit of exploiting those sources. While the Marginal Value Theorem provides one normative account of this balance by specifying that a forager should leave a food patch when its energetic yield falls below the average yield of other patches in the environment, it assumes the presence of other readily reachable patches. In natural settings, however, a forager does not know whether it will encounter additional food patches, and it must balance potential energetic costs and benefits accordingly. Upon first encountering a patch of food, it faces a decision of whether and when to leave the patch in search of better options, and when to return if no better options are found. Here, we explore how a forager should structure its search for new food patches when the existence of those patches is unknown, and when searching for those patches requires energy that can only be harvested from a single known food patch. We identify conditions under which it is more favorable to explore the environment in several successive trips rather than in a single long exploration, and we show how the optimal sequence of trips depends on the forager’s beliefs about the distribution and nutritional content of food patches in the environment. This optimal strategy is well approximated by a local decision that can be implemented by a simple neural circuit architecture. Together, this work highlights how energetic constraints and prior beliefs shape optimal foraging strategies, and how such strategies can be approximated by simple neural networks that implement local decision rules.
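The baseline against which the abstract argues, the Marginal Value Theorem, is easy to compute numerically: leave the patch at the residence time that maximizes long-run intake rate, gain(t)/(t + travel_time). The saturating gain function and travel times below are invented toy values; the paper itself goes beyond this setting to patches of unknown existence:

```python
import numpy as np

def optimal_leave_time(gain, travel_time, ts):
    """Marginal Value Theorem: maximize long-run rate gain(t)/(t + travel)."""
    rates = gain(ts) / (ts + travel_time)
    return ts[np.argmax(rates)]

# Diminishing-returns patch: cumulative gain saturates at 10 units.
gain = lambda t: 10 * (1 - np.exp(-t / 5))
ts = np.linspace(0.01, 50, 5000)

# Classic MVT prediction: costlier travel between patches -> stay longer.
leave_times = [optimal_leave_time(gain, T, ts) for T in (1, 5, 20)]
print(leave_times)  # residence time grows with travel time
```

When the existence and quality of other patches are uncertain, as in the work above, the travel-time term itself becomes something the forager must learn, which is what drives the multi-trip exploration strategies the paper identifies.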
Light-sheet microscopy is a powerful method for imaging the development and function of complex biological systems at high spatiotemporal resolution and over long time scales. Such experiments typically generate terabytes of multidimensional image data, and thus they demand efficient computational solutions for data management, processing and analysis. We present protocols and software to tackle these steps, focusing on the imaging-based study of animal development. Our protocols facilitate (i) high-speed lossless data compression and content-based multiview image fusion optimized for multicore CPU architectures, reducing image data size 30–500-fold; (ii) automated large-scale cell tracking and segmentation; and (iii) visualization, editing and annotation of multiterabyte image data and cell-lineage reconstructions with tens of millions of data points. These software modules are open source. They provide high data throughput using a single computer workstation and are readily applicable to a wide spectrum of biological model systems.
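The large lossless compression ratios quoted above rest on a simple fact: light-sheet stacks are mostly dark background. A generic illustration with `zlib` on a synthetic sparse frame (this is not the paper's optimized compressor, just the principle):

```python
import zlib
import numpy as np

# Sparse synthetic frame mimicking a mostly-dark light-sheet image.
rng = np.random.default_rng(3)
frame = np.zeros((512, 512), dtype=np.uint16)
ys = rng.integers(0, 512, 500)
xs = rng.integers(0, 512, 500)
frame[ys, xs] = rng.integers(100, 4000, 500).astype(np.uint16)

raw = frame.tobytes()
packed = zlib.compress(raw, level=6)
print(f"{len(raw) / len(packed):.0f}x smaller")  # large ratio on sparse data
```

Real pipelines additionally exploit multicore CPUs and image-aware transforms to reach the 30–500-fold reductions reported, but the sparsity of the signal is what makes lossless compression viable at terabyte scale.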
