Recent advances in imaging technology have provided unprecedented views of biological events as they unfold in living organisms. Researchers routinely create movies of processes such as cells dividing and differentiating into neurons, muscle, and skin in a Petri dish, or in tiny embryos such as those of the worm C. elegans. But scientists often have difficulty collecting similar data from larger specimens, such as zebrafish or fruit flies, whose embryos are roughly 1,000-fold larger and have challenging optical properties. A new “smart” light-sheet microscope developed at the Howard Hughes Medical Institute’s Janelia Research Campus solves this problem by analyzing a specimen continuously and adjusting its own settings to optimize image quality.
“The microscope is smart in the sense that it controls the experiment itself,” explains Janelia Group Leader Philipp Keller. “It’s not the human who instructs the microscope exactly how to take images. The microscope figures out on its own what it needs to do to get a sharp image.”
This video shows a side-by-side comparison of adaptive and non-adaptive imaging in a zebrafish embryo. The left panel shows two views of a developing zebrafish embryo, with colored squares highlighting four areas of the image. These regions are enlarged in the middle panel, which contrasts the clarity of data collected adaptively (left) and non-adaptively (right). The graphs on the right show how data quality decreases over time without adaptive imaging.
Keller collaborated with scientists at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and researchers at Coleman Technologies Incorporated in Newtown Square, Pennsylvania, to create the new microscope, which is described in an article published online on October 31, 2016, in the journal Nature Biotechnology. The publication provides a detailed description of the instrument, as well as the source code for its software, which is also publicly available on GitHub.
In 2012, Keller led a team that developed the simultaneous multi-view (SiMView) light-sheet microscope, which allows scientists to image live organisms at high speed, over long periods of time, without interfering with the processes they are observing. The microscope was based on methods Keller developed with colleagues at the European Molecular Biology Laboratory in Heidelberg, Germany, to address one of the main concerns of conventional light microscopy: the damaging effects of exposing a sample to light. Keller and his colleagues reduced this damage by illuminating only a thin section of the sample at a time with a scanned sheet of laser light, while a detector recorded just the part of the sample being lit up.
The SiMView microscope was fast and could capture dynamic processes in small, transparent specimens, such as individual cells in culture or at the surface of multicellular organisms. But it struggled to produce high-resolution images of larger, more optically challenging specimens, such as entire embryos.
This problem is directly linked to a key requirement in light-sheet microscopy: that the light sheet illuminating the specimen and the focal plane of the detection system are perfectly co-planar. “They should match precisely,” explains Keller. “If they deviate in any way geometrically, it’s like taking an image out of focus.”
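To make the geometry concrete, the toy sketch below (in Python, not drawn from the paper) treats an offset between the light sheet and the detection focal plane as a simple defocus blur and scores the resulting image with a gradient-based sharpness measure. The linear blur-versus-offset relation, the synthetic image, and the numbers are illustrative assumptions only.

```python
# Toy illustration (not the AutoPilot method): model a light-sheet/focal-plane
# mismatch as defocus blur and watch a simple sharpness score fall as the
# offset grows. The blur model is an assumption made purely for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
specimen = rng.random((256, 256))      # stand-in for a fluorescence image plane

def sharpness(image: np.ndarray) -> float:
    """Gradient-energy focus measure: higher means sharper."""
    gy, gx = np.gradient(image)
    return float(np.mean(gx**2 + gy**2))

for offset_um in (0.0, 0.5, 1.0, 2.0, 4.0):
    blur_sigma = 0.8 * offset_um       # assumed defocus-vs-offset relation
    blurred = gaussian_filter(specimen, sigma=blur_sigma) if blur_sigma > 0 else specimen
    print(f"plane offset {offset_um:3.1f} um -> sharpness {sharpness(blurred):.4f}")
```

Running the sketch simply shows the sharpness score dropping monotonically as the assumed offset increases, which is the intuition behind Keller’s “out of focus” analogy.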
Because living organisms aren’t physically or chemically uniform, the laser’s path can be perturbed as it travels through the specimen, making it difficult to keep the light sheet and the detection focal plane properly aligned. The bigger the sample, the worse the effects.
Keller’s solution to this problem was to make the microscope smarter, so that it could adjust itself to the changing specimen. “We were working on a lot of large multicellular organisms, and we needed a way to make the microscope adapt itself to the sample and conduct the experiment itself,” says Keller. “It was purely out of necessity that we began the project.”
It took Keller and his collaborators about three years to perfect the technology, which they call the AutoPilot framework. Janelia research scientist Raghav Chhetri focused on the hardware, adding more degrees of freedom to the microscope, such as the ability to rotate the light sheet in space, while Loïc Royer, a postdoctoral fellow in Gene Myers’s lab at the Max Planck Institute of Molecular Cell Biology and Genetics, led the software development.
The software works by analyzing images from the microscope in real time, determining how to improve their quality, and then adjusting the appropriate parameters on the microscope. “The system understands the relationship between all the different variables, and when it observes that something is off, it can figure out which knob to turn to correct things,” explains Royer. Because light-sheet microscopes collect a three-dimensional image of the specimen only once every few hundred milliseconds to a few seconds, the calculations, analysis, and adjustments are done during the microscope’s brief downtime between acquisitions.
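The published AutoPilot software is available on GitHub; the sketch below is only a minimal Python illustration of the kind of measure-and-correct loop described above. The scope object, its set_lightsheet_offset and acquire_plane methods, and the gradient-energy focus metric are hypothetical placeholders, not the framework’s actual API or image-quality measure.

```python
# Minimal sketch of an adaptive feedback loop in the spirit described above
# (not the published AutoPilot code). Between volume acquisitions, the loop
# probes a few candidate light-sheet offsets, scores each test image, and
# applies the sharpest one. `scope` and its methods are hypothetical.
import numpy as np

def focus_score(image: np.ndarray) -> float:
    """Gradient-energy sharpness metric (one of many possible choices)."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def adapt_lightsheet(scope, current_offset_um: float, step_um: float = 0.5) -> float:
    """Run during the microscope's idle time: probe nearby offsets, keep the sharpest."""
    candidates = [current_offset_um - step_um, current_offset_um, current_offset_um + step_um]
    scores = []
    for offset in candidates:
        scope.set_lightsheet_offset(offset)          # hypothetical hardware call
        scores.append(focus_score(scope.acquire_plane()))  # hypothetical test image
    best = candidates[int(np.argmax(scores))]
    scope.set_lightsheet_offset(best)
    return best

# Usage (schematic): after each 3D stack, adapt before the next acquisition.
# offset = adapt_lightsheet(scope, offset)
```

A real instrument would juggle many such parameters at once, but the basic pattern, measure image quality, decide which knob to turn, correct during downtime, is what the quote above describes.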
Janelia scientists Bill Lemon and Yinan Wan tested the AutoPilot framework on a variety of model systems, including Drosophila and zebrafish. These experiments demonstrated improvements in spatial resolution and signal strength by a factor of 2 to 5.
Not only does the framework provide views of cellular, and even sub-cellular, structures that are otherwise unobtainable with conventional methods, it is also very user-friendly. AutoPilot’s high level of instrument automation ensures that any microscope user, even one without a technical background or training in light-sheet microscopy, can consistently and effortlessly produce optimal image data.
“We wanted to make the microscope as powerful as possible but also as easy to use as possible,” says Keller. “The framework does this by helping the user ensure that the experiment is set up correctly, and also making sure the user can effortlessly and reproducibly produce optimal quality images in every single experiment.”