Ahrens Lab / Publications

57 Publications

Showing 31-40 of 57 results
01/01/06 | Large-scale biophysical parameter estimation in single neurons via constrained linear regression.
Ahrens M, Huys Q, Paninski L
Neural Information Processing Systems. 2006;18.

Our understanding of the input-output function of single cells has been substantially advanced by biophysically accurate multi-compartmental models. The large number of parameters needing hand tuning in these models has, however, somewhat hampered their applicability and interpretability. Here we propose a simple and well-founded method for automatic estimation of many of these key parameters: 1) the spatial distribution of channel densities on the cell’s membrane; 2) the spatiotemporal pattern of synaptic input; 3) the channels’ reversal potentials; 4) the intercompartmental conductances; and 5) the noise level in each compartment. We assume experimental access to: a) the spatiotemporal voltage signal in the dendrite (or some contiguous subpart thereof, e.g. via voltage sensitive imaging techniques), b) an approximate kinetic description of the channels and synapses present in each compartment, and c) the morphology of the part of the neuron under investigation. The key observation is that, given data a)-c), all of the parameters 1)-4) may be simultaneously inferred by a version of constrained linear regression; this regression, in turn, is efficiently solved using standard algorithms, without any “local minima” problems despite the large number of parameters and complex dynamics. The noise level 5) may also be estimated by standard techniques. We demonstrate the method’s accuracy on several model datasets, and describe techniques for quantifying the uncertainty in our estimates.
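
The key step is that the unknown conductances enter the compartmental voltage equation linearly, so they can be recovered from the measured voltage and known channel kinetics by constrained (non-negative) least squares. The sketch below illustrates that structure for a single compartment with two channel types and one neighbouring compartment; all variable names and values are illustrative placeholders, not taken from the paper, and the "measurements" are synthetic so the recovery can be checked.

```python
# Illustrative sketch of constrained linear regression for conductance estimation.
# The synthetic data stand in for items a)-c) in the abstract; the true parameters
# are known here only so that the fit can be verified.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
T = 5000                                   # number of time samples
v = rng.normal(-65.0, 5.0, T)              # voltage in this compartment (mV)
v_nb = rng.normal(-65.0, 5.0, T)           # voltage in the neighbouring compartment
gates = rng.uniform(0.0, 1.0, (T, 2))      # known channel open fractions f_c(V, t)
e_rev = np.array([50.0, -77.0])            # known reversal potentials (mV)

# Design matrix: each column multiplies one unknown, non-negative conductance.
X = np.column_stack([gates * (e_rev[None, :] - v[:, None]),  # g_c * f_c * (E_c - V_i)
                     v_nb - v])                              # g_couple * (V_j - V_i)

g_true = np.array([1.2, 3.6, 0.8])           # hidden parameters to recover
dvdt = X @ g_true + rng.normal(0.0, 0.5, T)  # stands in for the measured C dV/dt

# Constrained linear regression: conductances cannot be negative.
fit = lsq_linear(X, dvdt, bounds=(0.0, np.inf))
print("estimated [g_chan1, g_chan2, g_couple]:", np.round(fit.x, 2))
```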

06/01/15 | Large-scale imaging in small brains.
Ahrens MB, Engert F
Current Opinion in Neurobiology. 2015 Jun 1;32C:78-86. doi: 10.1016/j.conb.2015.01.007

The dense connectivity in the brain means that one neuron's activity can influence many others. To observe this interconnected system comprehensively, an aspiration within neuroscience is to record from as many neurons as possible at the same time. There are two useful routes toward this goal: one is to expand the spatial extent of functional imaging techniques, and the second is to use animals with small brains. Here we review recent progress toward imaging many neurons and complete populations of identified neurons in small vertebrates and invertebrates.

Ahrens Lab, Looger Lab, Keller Lab, Freeman Lab
07/27/14 | Light-sheet functional imaging in fictively behaving zebrafish.
Vladimirov N, Mu Y, Kawashima T, Bennett DV, Yang C, Looger LL, Keller PJ, Freeman J, Ahrens MB
Nature Methods. 2014 Jul 27;11(9):883-4. doi: 10.1038/nmeth.3040

The processing of sensory input and the generation of behavior involves large networks of neurons, which necessitates new technology for recording from many neurons in behaving animals. In the larval zebrafish, light-sheet microscopy can be used to record the activity of almost all neurons in the brain simultaneously at single-cell resolution. Existing implementations, however, cannot be combined with visually driven behavior because the light sheet scans over the eye, interfering with presentation of controlled visual stimuli. Here we describe a system that overcomes the confounding eye stimulation through the use of two light sheets and combines whole-brain light-sheet imaging with virtual reality for fictively behaving larval zebrafish.

12/30/14 | Light-sheet imaging for systems neuroscience.
Keller PJ, Ahrens MB, Freeman J
Nature Methods. 2014 Dec 30;12(1):27-9. doi: 10.1038/nmeth.3214

Developments in electrical and optical recording technology are scaling up the size of neuronal populations that can be monitored simultaneously. Light-sheet imaging is rapidly gaining traction as a method for optically interrogating activity in large networks and presents both opportunities and challenges for understanding circuit function.

Looger Lab, Ahrens Lab, Freeman Lab, Svoboda Lab
07/27/14 | Mapping brain activity at scale with cluster computing.
Freeman J, Vladimirov N, Kawashima T, Mu Y, Sofroniew NJ, Bennett DV, Rosen J, Yang C, Looger LL, Ahrens MB
Nature Methods. 2014 Jul 27;11(9):941-950. doi: 10.1038/nmeth.3041

Understanding brain function requires monitoring and interpreting the activity of large networks of neurons during behavior. Advances in recording technology are greatly increasing the size and complexity of neural data. Analyzing such data will pose a fundamental bottleneck for neuroscience. We present a library of analytical tools called Thunder, built on the open-source Apache Spark platform for large-scale distributed computing. The library implements a variety of univariate and multivariate analyses with a modular, extendable structure well-suited to interactive exploration and analysis development. We demonstrate how these analyses find structure in large-scale neural data, including whole-brain light-sheet imaging data from fictively behaving larval zebrafish, and two-photon imaging data from behaving mice. The analyses relate neuronal responses to sensory input and behavior, run in minutes or less, and can be used on a private cluster or in the cloud. Our open-source framework thus holds promise for turning brain activity mapping efforts into biological insights.
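
As a rough illustration of the kind of analysis this enables, and not of Thunder's actual API, the sketch below uses plain PySpark to distribute a simple univariate statistic (the correlation of each voxel's time series with a shared behavioral regressor) across a cluster. The data, record layout, and parameter values are synthetic placeholders.

```python
# Generic PySpark sketch of a per-voxel analysis of the sort Thunder parallelizes.
# This is not Thunder code; it assumes a Spark master is configured (e.g. via spark-submit).
import numpy as np
from pyspark import SparkContext

sc = SparkContext(appName="per-voxel-correlation")

n_voxels, n_time = 10_000, 500
stimulus = np.random.randn(n_time)            # shared regressor, e.g. a swim signal

# In practice these records would be loaded from imaging data on distributed storage.
records = [(i, np.random.randn(n_time)) for i in range(n_voxels)]
rdd = sc.parallelize(records, numSlices=64)

def corr_with_stimulus(record):
    voxel_id, trace = record
    return voxel_id, float(np.corrcoef(trace, stimulus)[0, 1])

# Voxels are independent, so the map scales with the number of cluster cores.
correlations = rdd.map(corr_with_stimulus).collectAsMap()
best = max(correlations.items(), key=lambda kv: abs(kv[1]))
print("most stimulus-correlated voxel:", best)
sc.stop()
```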

03/20/24 | Motor neurons generate pose-targeted movements via proprioceptive sculpting.
Gorko B, Siwanowicz I, Close K, Christoforou C, Hibbard KL, Kabra M, Lee A, Park J, Li SY, Chen AB, Namiki S, Chen C, Tuthill JC, Bock DD, Rouault H, Branson K, Ihrke G, Huston SJ
Nature. 2024 Mar 20. doi: 10.1038/s41586-024-07222-5

Motor neurons are the final common pathway through which the brain controls movement of the body, forming the basic elements from which all movement is composed. Yet how a single motor neuron contributes to control during natural movement remains unclear. Here we anatomically and functionally characterize the individual roles of the motor neurons that control head movement in the fly, Drosophila melanogaster. Counterintuitively, we find that activity in a single motor neuron rotates the head in different directions, depending on the starting posture of the head, such that the head converges towards a pose determined by the identity of the stimulated motor neuron. A feedback model predicts that this convergent behaviour results from motor neuron drive interacting with proprioceptive feedback. We identify and genetically suppress a single class of proprioceptive neuron that changes the motor neuron-induced convergence as predicted by the feedback model. These data suggest a framework for how the brain controls movements: instead of directly generating movement in a given direction by activating a fixed set of motor neurons, the brain controls movements by adding bias to a continuing proprioceptive-motor loop.
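
A toy first-order feedback loop, not the paper's model, is enough to show how a fixed motor neuron drive combined with posture-dependent proprioceptive feedback produces convergence to a single pose from any starting angle; the gain and drive values below are arbitrary.

```python
# Toy simulation: constant motor drive opposed by proprioceptive feedback that
# grows with head deflection gives a stable fixed point, i.e. a target pose.
def simulate_head_angle(theta0, drive, k_fb=1.0, dt=0.01, steps=3000):
    theta = theta0
    for _ in range(steps):
        dtheta = drive - k_fb * theta      # net command = drive minus feedback
        theta += dt * dtheta
    return theta

for start in (-30.0, 0.0, 30.0):           # starting postures in degrees (illustrative)
    final = simulate_head_angle(start, drive=10.0)
    print(f"start {start:+6.1f} deg  ->  final {final:6.2f} deg")
# Every start converges to drive / k_fb = 10 deg: the identity of the driven motor
# neuron sets the target pose, while the direction of rotation depends on where
# the head began.
```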

08/03/17 | Multi-scale approaches for high-speed imaging and analysis of large neural populations.
Friedrich J, Yang W, Soudry D, Mu Y, Ahrens MB, Yuste R, Peterka DS, Paninski L
PLoS Computational Biology. 2017 Aug 03;13(8):e1005685. doi: 10.1371/journal.pcbi.1005685

Progress in modern neuroscience critically depends on our ability to observe the activity of large neuronal populations with cellular spatial and high temporal resolution. However, two bottlenecks constrain efforts towards fast imaging of large populations. First, the resulting large video data is challenging to analyze. Second, there is an explicit tradeoff between imaging speed, signal-to-noise, and field of view: with current recording technology we cannot image very large neuronal populations with simultaneously high spatial and temporal resolution. Here we describe multi-scale approaches for alleviating both of these bottlenecks. First, we show that spatial and temporal decimation techniques based on simple local averaging provide order-of-magnitude speedups in spatiotemporally demixing calcium video data into estimates of single-cell neural activity. Second, once the shapes of individual neurons have been identified at fine scale (e.g., after an initial phase of conventional imaging with standard temporal and spatial resolution), we find that the spatial/temporal resolution tradeoff shifts dramatically: after demixing we can accurately recover denoised fluorescence traces and deconvolved neural activity of each individual neuron from coarse scale data that has been spatially decimated by an order of magnitude. This offers a cheap method for compressing this large video data, and also implies that it is possible to either speed up imaging significantly, or to "zoom out" by a corresponding factor to image order-of-magnitude larger neuronal populations with minimal loss in accuracy or temporal resolution.
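
The first ingredient, decimation by simple local averaging, amounts to block-averaging the calcium movie in time and space before demixing. A minimal sketch, with illustrative decimation factors and array shapes:

```python
# Spatial and temporal decimation of a (time, y, x) calcium movie by block averaging.
import numpy as np

def decimate(movie, t_factor=4, s_factor=2):
    """Average non-overlapping blocks of a (time, y, x) movie."""
    t, y, x = movie.shape
    t, y, x = t - t % t_factor, y - y % s_factor, x - x % s_factor
    m = movie[:t, :y, :x]
    m = m.reshape(t // t_factor, t_factor,
                  y // s_factor, s_factor,
                  x // s_factor, s_factor)
    # Averaging over the block axes shrinks the data by t_factor * s_factor**2.
    return m.mean(axis=(1, 3, 5))

movie = np.random.rand(1000, 128, 128).astype(np.float32)
small = decimate(movie)
print(movie.shape, "->", small.shape)   # (1000, 128, 128) -> (250, 64, 64)
```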

01/01/10 | Multilinear models of single cell responses in the medial nucleus of the trapezoid body.
Englitz B, Ahrens M, Tolnai S, Rübsamen R, Sahani M, Jost J
Network. 2010;21(1-2):91-124. doi: 10.3109/09548981003801996

The representation of acoustic stimuli in the brainstem forms the basis for higher auditory processing. While some characteristics of this representation (e.g. tuning curve) are widely accepted, it remains a challenge to predict the firing rate at high temporal resolution in response to complex stimuli. In this study we explore models for in vivo, single cell responses in the medial nucleus of the trapezoid body (MNTB) under complex sound stimulation. We estimate a family of models, the multilinear models, encompassing the classical spectrotemporal receptive field and allowing arbitrary input-nonlinearities and certain multiplicative interactions between sound energy and its short-term auditory context. We compare these to models of more traditional type, and also evaluate their performance under various stimulus representations. Using the context model, 75% of the explainable variance could be predicted based on a cochlear-like, gamma-tone stimulus representation. The presence of multiplicative contextual interactions strongly reduces certain inhibitory/suppressive regions of the linear kernels, suggesting an underlying nonlinear mechanism, e.g. cochlear or synaptic suppression, as the source of the suppression in MNTB neuronal responses. In conclusion, the context model provides a rich and still interpretable extension over many previous phenomenological models for modeling responses in the auditory brainstem at submillisecond resolution.

02/03/16 | Neural circuits underlying visually evoked escapes in larval zebrafish.
Dunn TW, Gebhardt C, Naumann EA, Riegler C, Ahrens MB, Engert F, Del Bene F
Neuron. 2016 Feb 3;89(3):613-628. doi: 10.1016/j.neuron.2015.12.021

Escape behaviors deliver organisms away from imminent catastrophe. Here, we characterize behavioral responses of freely swimming larval zebrafish to looming visual stimuli simulating predators. We report that the visual system alone can recruit lateralized, rapid escape motor programs, similar to those elicited by mechanosensory modalities. Two-photon calcium imaging of retino-recipient midbrain regions isolated the optic tectum as an important center processing looming stimuli, with ensemble activity encoding the critical image size determining escape latency. Furthermore, we describe activity in retinal ganglion cell terminals and superficial inhibitory interneurons in the tectum during looming and propose a model for how temporal dynamics in tectal periventricular neurons might arise from computations between these two fundamental constituents. Finally, laser ablations of hindbrain circuitry confirmed that visual and mechanosensory modalities share the same premotor output network. We establish a circuit for the processing of aversive stimuli in the context of an innate visual behavior.

02/20/08 | Nonlinearities and contextual influences in auditory cortical responses modeled with multilinear spectrotemporal methods.
Ahrens MB, Linden JF, Sahani M
The Journal of Neuroscience. 2008 Feb 20;28(8):1929-42. doi: 10.1523/JNEUROSCI.3377-07.2008

The relationship between a sound and its neural representation in the auditory cortex remains elusive. Simple measures such as the frequency response area or frequency tuning curve provide little insight into the function of the auditory cortex in complex sound environments. Spectrotemporal receptive field (STRF) models, despite their descriptive potential, perform poorly when used to predict auditory cortical responses, showing that nonlinear features of cortical response functions, which are not captured by STRFs, are functionally important. We introduce a new approach to the description of auditory cortical responses, using multilinear modeling methods. These descriptions simultaneously account for several nonlinearities in the stimulus-response functions of auditory cortical neurons, including adaptation, spectral interactions, and nonlinear sensitivity to sound level. The models reveal multiple inseparabilities in cortical processing of time lag, frequency, and sound level, and suggest functional mechanisms by which auditory cortical neurons are sensitive to stimulus context. By explicitly modeling these contextual influences, the models are able to predict auditory cortical responses more accurately than are STRF models. In addition, they can explain some forms of stimulus dependence in STRFs that were previously poorly understood.
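
Schematically, a multilinear model predicts the firing rate from factors that are each linear on their own, for example a spectrotemporal weight applied to the recent stimulus multiplied by a learned function of sound level, so the combined stimulus-response map is nonlinear. The sketch below shows only such a forward prediction with random placeholder weights; it illustrates the multilinear form and is not the paper's model or fitting procedure.

```python
# Schematic forward pass of a multilinear (time-frequency x sound-level) model.
# Weights are random placeholders; in the paper they would be fit to data.
import numpy as np

n_lag, n_freq, n_levels = 20, 32, 8
w_tf = np.random.randn(n_lag, n_freq)      # spectrotemporal weights
w_level = np.random.rand(n_levels)         # learned input nonlinearity of sound level

def predict_rate(stim_patch, level_bins):
    """stim_patch: (n_lag, n_freq) recent stimulus energy (0/1);
    level_bins: (n_lag, n_freq) integer sound-level bin of each entry."""
    # Each time-frequency element contributes weight * level-gain; multiplying the
    # two factors is what makes the overall response function nonlinear.
    return float(np.sum(w_tf * w_level[level_bins] * stim_patch))

patch = (np.random.rand(n_lag, n_freq) > 0.5).astype(float)
levels = np.random.randint(0, n_levels, size=(n_lag, n_freq))
print("predicted rate (arbitrary units):", round(predict_rate(patch, levels), 3))
```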
