JoshuaGalla / VisSimData

Neural analysis of visual stimulation data
MIT License


Repository for VisSimData (in progress).

Project Overview:

The goal of this project is to perform a comprehensive analysis of how simulated stimulus parameters/characteristics influence neural response strength. The analysis generates visualizations for both individual simulated neurons and population averages, including neural traces, parameter-specific combo plots, sampling space heatmaps, and more.

Workflow:

An initial analysis of the simulated data can be performed by running the following command in a terminal:

python run_interpolate.py

This will generate a simulated dataset (described in the Data section below) and display a non-interpolated tuning curve for the indicated neuron (which can be changed via the neuron: field in parameters/parameters.yaml).
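The full layout of parameters/parameters.yaml is not reproduced here; a minimal sketch of what the neuron: entry might look like (the comment and value are illustrative assumptions, not the repository's actual file):

```yaml
# parameters/parameters.yaml (hypothetical sketch)
neuron: 3   # index of the simulated neuron whose tuning curve is plotted
```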

The tuning curve displays a neuron's response to a given simulated stimulus with theoretical characteristics. This simulated plot aims to replicate visualization techniques that hold more power in an experimental context, where tuning curves (and the other analyses explained in depth in the sim_data_analysis.ipynb notebook) can show how stimulus characteristics such as shape, color, or speed influence neural responses and functional response mapping in behaviors such as motor movement or visual processing.
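A minimal sketch of the tuning-curve computation itself (all data and variable names here are illustrative, not taken from the repository): group stimulus presentations by stimulus value and average the responses within each group.

```python
import numpy as np

# Illustrative data: one scalar response per stimulus presentation.
rng = np.random.default_rng(0)
stim_values = rng.choice([0, 45, 90, 135], size=200)   # e.g. orientation (degrees)
responses = np.cos(np.deg2rad(stim_values)) ** 2 + rng.normal(0, 0.1, size=200)

# Tuning curve: mean response at each unique stimulus value.
unique_vals = np.unique(stim_values)
tuning = np.array([responses[stim_values == v].mean() for v in unique_vals])
```

Plotting tuning against unique_vals gives the non-sampled (non-interpolated) curve the script displays.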

Future versions of this script will implement Bayesian optimization to produce an interpolated tuning curve, filling in non-sampled data and giving a complete picture of neural responses across all possible sampling combinations.
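The interpolation step could be sketched as the posterior mean of a Gaussian-process (RBF-kernel) model, the surrogate typically used in Bayesian optimization. The function and sample data below are an illustrative sketch under that assumption, not the repository's planned implementation:

```python
import numpy as np

def rbf_interpolate(X_sampled, y_sampled, X_query, length_scale=0.3):
    """Fill in unsampled stimulus-parameter combinations with an RBF-kernel
    interpolant (the posterior mean of a noise-free Gaussian process)."""
    def kernel(A, B):
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq_dists / (2 * length_scale ** 2))
    # Small jitter on the diagonal keeps the solve numerically stable.
    K = kernel(X_sampled, X_sampled) + 1e-8 * np.eye(len(X_sampled))
    weights = np.linalg.solve(K, y_sampled)
    return kernel(X_query, X_sampled) @ weights

# Illustrative use: 25 sampled (dim1, dim2) combinations on a coarse grid,
# interpolated onto a dense 50x50 grid of candidate stimuli.
g5 = np.linspace(0, 1, 5)
X_sampled = np.array(np.meshgrid(g5, g5)).reshape(2, -1).T
y_sampled = np.sin(3 * X_sampled[:, 0]) * np.cos(3 * X_sampled[:, 1])

g50 = np.linspace(0, 1, 50)
X_grid = np.array(np.meshgrid(g50, g50)).reshape(2, -1).T
surface = rbf_interpolate(X_sampled, y_sampled, X_grid)  # interpolated responses
```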

Data:

Supplementary analysis can be viewed in the analysis/sim_data_analysis.ipynb notebook. The following text files (generated by running python run_interpolate.py as described in the section above) serve as input for this notebook:

1) data/stim_responses.txt: contains an n×d array of values, where each row corresponds to a different simulated neuron and each column is that neuron's response over time.

2) data/stim_data.txt: contains three columns of data: the frame/time at which a simulated visual stimulus would appear, the property of the first dimension of the stimulus shown, and the property of its second dimension.
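Assuming both files are plain whitespace-delimited text (as np.savetxt produces), they can be loaded with NumPy as follows; the tiny arrays written here are illustrative stand-ins for the files generated by run_interpolate.py:

```python
import numpy as np
import os, tempfile

# Write tiny illustrative files in the documented layout (the real files are
# produced by run_interpolate.py).
tmp = tempfile.mkdtemp()
resp_path = os.path.join(tmp, "stim_responses.txt")
stim_path = os.path.join(tmp, "stim_data.txt")
np.savetxt(resp_path, np.arange(12).reshape(3, 4))       # 3 neurons x 4 timepoints
np.savetxt(stim_path, [[10, 0.2, 0.7], [30, 0.5, 0.1]])  # frame, dim1, dim2

# Load: rows of stim_responses.txt are neurons, columns are time.
responses = np.loadtxt(resp_path)             # shape (n_neurons, n_timepoints)
frames, dim1, dim2 = np.loadtxt(stim_path).T  # unpack the three columns

pop_avg = responses.mean(axis=0)              # population-average response trace
```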

This analysis can help identify the combination of dimension 1 and dimension 2 properties of a simulated visual stimulus that elicits the greatest response across the population of neurons. In an experimental context, this could help characterize the stimuli that drive behavioral responses such as motor movement, or illuminate the pathways involved in visual processing.
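That search reduces to an argmax over the population-averaged response for each sampled (dimension 1, dimension 2) combination; a minimal sketch with invented numbers:

```python
import numpy as np

# Illustrative: population-mean response for each sampled (dim1, dim2) pair.
dim1 = np.array([0.1, 0.1, 0.9, 0.9])
dim2 = np.array([0.2, 0.8, 0.2, 0.8])
pop_response = np.array([0.3, 1.4, 0.7, 0.9])

best = np.argmax(pop_response)
best_combo = (dim1[best], dim2[best])  # stimulus eliciting the strongest response
```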

Other Notes/References:

A more complex (and eventually real-time) version of this analysis is included as supplementary work in the BayesOpt repository of the Draelos Lab at the University of Michigan. We apply statistical techniques including machine learning and Bayesian Optimization to high-dimensional visual stimulation experiments to estimate neural responses that inform us of behavioral dynamics and outputs. For more information on how this work can be applied to experimental data rather than simulated data, please visit the Draelos Lab repository linked above or our lab website.