Date of Award
2017
Degree Type
Thesis
Degree Name
Master of Science (MS)
Department
Computer Science
Abstract
The early auditory and visual systems appear distinct not only to outside observers, but to many trained neuroscientists as well. To a computational neuroscientist, however, both sensory systems represent an efficient neural coding of information. In fact, at a computational level the brain appears to use the same processing strategy for both senses: the same algorithm, just with different inputs. Insights like this can greatly simplify our understanding of the brain, but they require a significant computational background to fully appreciate. How can such illuminating results from computational neuroscience be made more accessible to the entire neuroscience community?
We built an Android mobile app that simulates the neural coding process in the early visual and auditory systems. The app demonstrates the type of visual or auditory code that would develop depending on the images or sounds an evolving species is exposed to over evolutionary time. It does this by displaying the image and sound filters derived from an optimal encoding of that information, and comparing them to visual representations of neural receptive fields in the brain.
Image patches (or, equivalently, sound clips) are efficiently encoded using Independent Component Analysis (ICA) as a proxy for the coding objective of the early visual system. As has been observed for the past two decades, the code that results from natural images resembles the 2D Gabor filter receptive fields measured from neurons in primary visual cortex (V1). Similarly, applying the same efficient encoding to a mixture of "natural sounds" yields linear filters resembling the gammatone filters associated with the spiral ganglion cells of the cochlea.
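For readers curious how such a code is computed in practice, the following is a minimal Python sketch of the standard patch-based ICA pipeline using scikit-learn's FastICA. It is an illustration, not the app's own code: the image source, patch size, and component count here are placeholder choices, and the same procedure applies to sound by replacing image patches with short windowed sound clips.

```python
# Minimal sketch of patch-based ICA (illustrative; not the app's code).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Stand-in for a natural image; in practice, load grayscale photographs.
image = rng.standard_normal((512, 512))

# Sample small square patches -- the "inputs over evolutionary time".
patch, n_patches = 16, 20000
rows = rng.integers(0, image.shape[0] - patch, n_patches)
cols = rng.integers(0, image.shape[1] - patch, n_patches)
X = np.stack([image[r:r + patch, c:c + patch].ravel()
              for r, c in zip(rows, cols)])

# Remove each patch's mean (a common preprocessing step).
X -= X.mean(axis=1, keepdims=True)

# FastICA whitens internally; each row of ica.components_ is one
# learned linear filter. On real natural images these resemble
# 2D Gabor filters, like V1 receptive fields.
ica = FastICA(n_components=64, whiten="unit-variance", random_state=0)
ica.fit(X)
filters = ica.components_.reshape(-1, patch, patch)
print(filters.shape)  # (64, 16, 16): 64 Gabor-like filters
```

On the random-noise stand-in above the filters carry no structure; the Gabor-like result emerges only when the patches come from natural images, which is precisely the point the app makes visually.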
The app demonstrates the relationship between efficient codes of images and sounds and the related sensory neural coding in an intuitive, accessible way. This lets budding neuroscientists, and even the general public, appreciate how computational tools (such as ICA or sparse coding) can bridge research across seemingly distinct areas of the brain. The result is a more parsimonious view of how the brain processes information, one that may encourage neuroscientists early in their training to strengthen their computational skills.
Recommended Citation
Zhao, Xiaolu, "A Mobile App Illustrating Sensory Neural Coding Through an Efficient Coding of Collected Images and Sounds" (2017). Master's Theses. 3715.
https://ecommons.luc.edu/luc_theses/3715
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 License.
Copyright Statement
Copyright © 2017 Xiaolu Zhao