42 Publications
Automated reconstruction of neural connectivity graphs from electron microscopy image stacks is an essential step towards large-scale neural circuit mapping. While significant progress has recently been made in automated segmentation of neurons and detection of synapses, the problem of synaptic partner assignment for polyadic (one-to-many) synapses, which are prevalent in the Drosophila brain, remains unsolved. In this contribution, we propose a method that automatically assigns presynaptic and postsynaptic roles to the neurites adjacent to a synaptic site. The method constructs a probabilistic graphical model over potential synaptic partner pairs, with factors that account for the high rate of one-to-many connections as well as for the possibility that the same neuron is presynaptic in one synapse and postsynaptic in another. The algorithm was validated on a publicly available stack of ssTEM images of Drosophila neural tissue and was shown to reconstruct most synaptic relations correctly.
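The partner-assignment step can be pictured with a small toy example. The sketch below is a minimal illustration, not the authors' implementation: it exhaustively scores every choice of one presynaptic neurite and a set of postsynaptic partners at a single synaptic site, using hypothetical per-neurite unary scores and a simple factor that mildly rewards polyadic (one-to-many) configurations. The neurite names, scores, and the polyadic_bonus weight are invented for illustration.

```python
# Toy sketch of pre-/postsynaptic role assignment at one synaptic site.
# All inputs are hypothetical; this is not the paper's model or code.
import itertools
import math

def score_assignment(pre, posts, unary, polyadic_bonus=0.5):
    # Score for choosing `pre` as the presynaptic neurite and `posts`
    # as its postsynaptic partners at one synaptic site.
    score = unary[pre]["pre"] + sum(unary[p]["post"] for p in posts)
    # Factor favouring one-to-many connections: each extra partner adds a mild reward.
    score += polyadic_bonus * max(0, len(posts) - 1)
    return score

def assign_partners(candidates, unary):
    # Exhaustively evaluate every (presynaptic neurite, postsynaptic set)
    # configuration over the neurites adjacent to the site; return the best one.
    best, best_score = None, -math.inf
    for pre in candidates:
        others = [c for c in candidates if c != pre]
        for k in range(1, len(others) + 1):
            for posts in itertools.combinations(others, k):
                s = score_assignment(pre, posts, unary)
                if s > best_score:
                    best, best_score = (pre, list(posts)), s
    return best, best_score

# Hypothetical per-neurite evidence, e.g. classifier scores for how
# pre- or postsynaptic each adjacent neurite looks at this site.
unary = {
    "n1": {"pre": 2.0, "post": 0.1},
    "n2": {"pre": 0.2, "post": 1.5},
    "n3": {"pre": 0.1, "post": 1.2},
}
print(assign_partners(["n1", "n2", "n3"], unary))  # picks n1 -> {n2, n3}
```

A real system would replace this exhaustive search with inference in the graphical model and would add factors coupling decisions across synapses, so that the same neuron can be presynaptic at one site and postsynaptic at another.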
Cells contain hundreds of organelles and macromolecular assemblies. Obtaining a complete understanding of their intricate organization requires the nanometre-level, three-dimensional reconstruction of whole cells, which is only feasible with robust and scalable automatic methods. Here, to support the development of such methods, we annotated up to 35 different cellular organelle classes, ranging from endoplasmic reticulum to microtubules to ribosomes, in diverse sample volumes from multiple cell types imaged at a near-isotropic resolution of 4 nm per voxel with focused ion beam scanning electron microscopy (FIB-SEM). We trained deep learning architectures to segment these structures in 4 nm and 8 nm per voxel FIB-SEM volumes, validated their performance and showed that automatic reconstructions can be used to directly quantify previously inaccessible metrics, including spatial interactions between cellular components. We also show that such reconstructions can be used to automatically register light and electron microscopy images for correlative studies. We have created an open data and open-source web repository, 'OpenOrganelle', to share the data, computer code and trained models, which will enable scientists everywhere to query and further improve automatic reconstruction of these datasets.
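As a heavily simplified picture of the per-voxel organelle prediction described above, the sketch below runs a tiny 3D convolutional network over one isotropic image patch in PyTorch. It is not one of the architectures trained in the paper; the layer sizes, the 64-voxel patch, and the choice of 35 output classes are illustrative assumptions only.

```python
# Generic per-voxel multi-class segmentation sketch for a 3D FIB-SEM patch.
# Not the paper's architecture; shapes and channel counts are arbitrary.
import torch
import torch.nn as nn

class Tiny3DSegNet(nn.Module):
    def __init__(self, in_channels=1, num_classes=35):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(32, num_classes, kernel_size=1),  # per-voxel class logits
        )

    def forward(self, x):
        return self.net(x)

# One 64^3-voxel patch of normalized intensities: (batch, channel, z, y, x).
patch = torch.randn(1, 1, 64, 64, 64)
logits = Tiny3DSegNet()(patch)   # shape: (1, 35, 64, 64, 64)
labels = logits.argmax(dim=1)    # hard per-voxel organelle labels
print(labels.shape)
```

In practice, patches would come from the publicly shared OpenOrganelle volumes and predictions from the released trained models rather than a toy network like this one.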
