Search Results


59 Results

Showing 1-10 of 59 results
Your Criteria:
    Publications
    04/03/18 | A deep (learning) dive into a cell.
    Branson K
    Nature Methods. 2018 Apr 03;15(4):253-4. doi: 10.1038/nmeth.4658
    Publications
    04/20/15 | A multilevel multimodal circuit enhances action selection in Drosophila.
    Ohyama T, Schneider-Mizell CM, Fetter RD, Aleman JV, Franconville R, Rivera-Alba M, Mensh BD, Branson KM, Simpson JH, Truman JW, Cardona A, Zlatic M
    Nature. 2015 Apr 20;520(7549):633-9. doi: 10.1038/nature14297

    Natural events present multiple types of sensory cues, each detected by a specialized sensory modality. Combining information from several modalities is essential for the selection of appropriate actions. Key to understanding multimodal computations is determining the structural patterns of multimodal convergence and how these patterns contribute to behaviour. Modalities could converge early, late or at multiple levels in the sensory processing hierarchy. Here we show that combining mechanosensory and nociceptive cues synergistically enhances the selection of the fastest mode of escape locomotion in Drosophila larvae. In an electron microscopy volume that spans the entire insect nervous system, we reconstructed the multisensory circuit supporting the synergy, spanning multiple levels of the sensory processing hierarchy. The wiring diagram revealed a complex multilevel multimodal convergence architecture. Using behavioural and physiological studies, we identified functionally connected circuit nodes that trigger the fastest locomotor mode, and others that facilitate it, and we provide evidence that multiple levels of multimodal integration contribute to escape mode selection. We propose that the multilevel multimodal convergence architecture may be a general feature of multisensory circuits enabling complex input–output functions and selective tuning to ecologically relevant combinations of cues.

    Publications
    06/20/12 | A simple strategy for detecting moving objects during locomotion revealed by animal-robot interactions.
    Zabala F, Polidoro P, Robie AA, Branson K, Perona P, Dickinson MH
    Current Biology. 2012 Jun 20;22(14):1344-50. doi: 10.1016/j.cub.2012.05.024

    An important role of visual systems is to detect nearby predators, prey, and potential mates [1], which may be distinguished in part by their motion. When an animal is at rest, an object moving in any direction may easily be detected by motion-sensitive visual circuits [2, 3]. During locomotion, however, this strategy is compromised because the observer must detect a moving object within the pattern of optic flow created by its own motion through the stationary background. However, objects that move so as to create back-to-front (regressive) motion may be unambiguously distinguished from stationary objects because forward locomotion creates only front-to-back (progressive) optic flow. Thus, moving animals should exhibit an enhanced sensitivity to regressively moving objects. We explicitly tested this hypothesis by constructing a simple fly-sized robot that was programmed to interact with a real fly. Our measurements indicate that whereas walking female flies freeze in response to a regressively moving object, they ignore a progressively moving one. Regressive motion salience also explains observations of behaviors exhibited by pairs of walking flies. Because the assumptions underlying the regressive motion salience hypothesis are general, we suspect that the behavior we have observed in Drosophila may be widespread among eyed, motile organisms.
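
The regressive-motion argument above reduces to a simple decision rule. The sketch below is an illustrative toy (the function name and string encoding are mine, not the authors' model): a walking observer's self-motion generates only progressive optic flow, so regressive image motion is the only unambiguous signature of an independently moving object.

```python
def responds_to_object(observer_walking: bool, image_motion: str) -> bool:
    """Toy decision rule for regressive motion salience (illustration only).

    image_motion: direction of the object's motion across the retina,
    "front_to_back" (progressive) or "back_to_front" (regressive).
    Returns True if the object is unambiguously detectable as moving.
    """
    if not observer_walking:
        # At rest there is no self-generated optic flow, so motion in
        # any direction reveals a moving object.
        return image_motion in ("front_to_back", "back_to_front")
    # During forward walking, self-motion already fills the visual field
    # with progressive flow; only regressive motion must come from an
    # independently moving object.
    return image_motion == "back_to_front"
```

Consistent with the behavior reported in the abstract, this rule predicts that a walking fly freezes to a regressively moving robot but ignores a progressively moving one.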

    Publications
    01/06/25 | A split-GAL4 driver line resource for Drosophila neuron types.
    Meissner GW, Vannan A, Jeter J, Close K, Depasquale GM, Dorman Z, Forster K, Beringer JA, Gibney TV, Hausenfluck JH, He Y, Henderson K, Johnson L, Johnston RM, Ihrke G, Iyer N, Lazarus R, Lee K, Li H, Liaw H, Melton B, Miller S, Motaher R, Novak A, Ogundeyi O, Petruncio A, Price J, Protopapas S, Tae S, Taylor J, Vorimo R, Yarbrough B, Zeng KX, Zugates CT, Dionne H, Angstadt C, Ashley K, Cavallaro A, Dang T, Gonzalez GA, Hibbard KL, Huang C, Kao J, Laverty T, Mercer M, Perez B, Pitts S, Ruiz D, Vallanadu V, Zheng GZ, Goina C, Otsuna H, Rokicki K, Svirskas RR, Cheong HS, Dolan M, Ehrhardt E, Feng K, El Galfi B, Goldammer J, Huston SJ, Hu N, Ito M, McKellar C, Minegishi R, Namiki S, Nern A, Schretter CE, Sterne GR, Venkatasubramanian L, Wang K, Wolff T, Wu M, George R, Malkesman O, Aso Y, Card GM, Dickson BJ, Korff W, Ito K, Truman JW, Zlatic M, Rubin GM
    People
    Alice Robie
    Senior Scientist
    Publications
    08/12/19 | An automatic behavior recognition system classifies animal behaviors using movements and their temporal context.
    Ravbar P, Branson K, Simpson JH
    Journal of Neuroscience Methods. 2019 Aug 12;326:108352. doi: 10.1016/j.jneumeth.2019.108352

    Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, fly grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and temporal structure of such behavior. To cope with large amounts of data, and minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs and thus present challenges to existing behavior classification software. Human observers use speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal's position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect two time-scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms. While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
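
The core idea — motion features weighted toward temporal dynamics and invariant to the animal's position and orientation — can be illustrated with a minimal numpy sketch. This is not the published ABRS pipeline; the window size and the particular summary statistics below are arbitrary choices for illustration.

```python
import numpy as np

def spatiotemporal_features(frames: np.ndarray, window: int = 5) -> np.ndarray:
    """Position/orientation-invariant motion features (illustrative sketch).

    frames: (T, H, W) grayscale video as floats.
    Returns one feature row per sliding temporal window: summary statistics
    of frame-to-frame motion energy, which do not depend on where in the
    scene the animal is or which way it faces.
    """
    # Motion energy: absolute per-pixel change between consecutive frames.
    diffs = np.abs(np.diff(frames, axis=0))       # (T-1, H, W)
    energy = diffs.reshape(diffs.shape[0], -1)    # flatten space away
    feats = []
    for t in range(energy.shape[0] - window + 1):
        win = energy[t:t + window]
        feats.append([
            win.mean(),                  # overall movement-speed proxy
            win.max(),                   # peak movement in the window
            win.std(),                   # burstiness of the movement
            (win > win.mean()).mean(),   # fraction of high-motion pixels
        ])
    return np.asarray(feats)
```

Each feature row could then feed a supervised classifier; the paper's two-step classification applies the same idea at two timescales.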

    Publications
    12/13/16 | An empirical analysis of deep network loss surfaces.
    Im DJ, Tao M, Branson K
    arXiv. 2016 Dec 13:arXiv:1612.04010

    The training of deep neural networks is a high-dimensional optimization problem with respect to the loss function of a model. Unfortunately, these functions are high-dimensional and non-convex, and hence difficult to characterize. In this paper, we empirically investigate the geometry of the loss functions for state-of-the-art networks with multiple stochastic optimization methods. We do this through several experiments that are visualized on polygons to understand how and when these stochastic optimization methods find minima.
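
The polygon visualizations referenced in this abstract generalize a simpler one-dimensional trick: evaluating the loss along the straight line between two solutions (e.g., found by different optimizers), with polygons extending this to barycentric interpolation among several solutions. Below is a self-contained numpy sketch using a synthetic non-convex loss; the function and its minima are invented for illustration, not taken from the paper.

```python
import numpy as np

def loss_along_path(loss_fn, theta_a, theta_b, n_points=51):
    """Evaluate loss_fn along the line segment between two parameter
    vectors theta_a and theta_b. Returns the interpolation weights and
    the loss at each interpolated point.
    """
    alphas = np.linspace(0.0, 1.0, n_points)
    losses = np.array([
        loss_fn((1 - a) * theta_a + a * theta_b) for a in alphas
    ])
    return alphas, losses

# Toy non-convex loss with minima where every coordinate is -1 or +1.
def toy_loss(theta):
    return float(np.sum((theta ** 2 - 1.0) ** 2))

alphas, losses = loss_along_path(toy_loss,
                                 np.array([-1.0, -1.0]),
                                 np.array([1.0, 1.0]))
# Both endpoints are minima (loss 0); the midpoint theta = (0, 0)
# sits on a loss barrier between them.
```

Plotting `losses` against `alphas` shows whether two solutions lie in one connected low-loss region or are separated by a barrier, which is the kind of question the paper's experiments probe.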

    People
    Aniket Ravan
    Machine Learning Researcher
    Publications
    07/29/14 | Automated image-based tracking and its application in ecology.
    Dell AI, Bender JA, Branson K, Couzin ID, de Polavieja GG, Noldus LP, Pérez-Escudero A, Perona P, Straw AD, Wikelski M, Brose U
    Trends in Ecology and Evolution. 2014 Jul;29(7):417-428. doi: 10.1016/j.tree.2014.05.004

    The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offers opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.

    People
    Barry Dickson
    Visiting Scientist