openworm / CElegansNeuroML

NeuroML based C elegans model, contained in a neuroConstruct project, as well as c302
http://opensourcebrain.org/projects/celegans

Make it easier to get at simulation results #56

Closed slarson closed 6 years ago

slarson commented 8 years ago

There have been a few calls to make it easier to see the results of simulations run from here (and c302).

Maybe we could have some discussion / thoughts on how we'd like to see that and how to accomplish it?

(cc: @aexbrown @aidanrocke @theideasmith)

theideasmith commented 8 years ago

Maybe a Geppetto module could store simulation results for arbitrary components of the simulation. These results could then be saved to a file and shared as a git repo.

ghost commented 8 years ago

I really haven't given this any serious consideration yet. But, I'll look into this problem in more detail next week and should have a response by Feb 13.

aexbrown commented 8 years ago

One possibility that might be a useful quick visualisation would be to get the coordinates from Saul Kato’s paper and to project simulation results into that space. I would consider it a nice success if the trajectories looked non-random in that space. A complementary check would be to estimate the dimensionality of the simulation results. Even if it’s not initially like the real worm, finding a dimensionality much less than 302 would at least suggest some coordination between the model neurons.
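
For concreteness, a rough sketch of both checks (illustrative names only, not existing project code; it assumes the real and simulated traces share the same neuron ordering):

    # Sketch: (1) project simulated traces into a PC space fitted to the real data,
    # (2) estimate the effective dimensionality of the simulated traces.
    import numpy as np
    from sklearn.decomposition import PCA

    def project_onto_reference(reference_traces, simulated_traces, n_components=3):
        """Both arrays: (n_timepoints, n_neurons), same neuron ordering assumed."""
        pca = PCA(n_components=n_components).fit(reference_traces)
        return pca.transform(simulated_traces)     # trajectory in the reference PC space

    def effective_dimensionality(traces, variance_threshold=0.9):
        """Number of components needed to explain the given fraction of variance."""
        pca = PCA().fit(traces)
        cumulative = np.cumsum(pca.explained_variance_ratio_)
        return int(np.searchsorted(cumulative, variance_threshold) + 1)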

theideasmith commented 8 years ago

I got the data from Kato's paper and just sat down to run PCA on it.

I'd be interested in taking care of this visualization. Would I have to collect my own data or can someone on the team share a download link for it?

ghost commented 8 years ago

@slarson

OK. The first thing I tried to do was figure out what kind of results we can see so far, so I tried to follow the instructions here.

I also downloaded the NEURON simulator, but I encountered some difficulties:

1. Visualization within neuroConstruct doesn't work. I can't see anything even after waiting for several minutes.
2. Changing path: it's not clear to me what is meant by the following instructions:

Install the NEURON simulation environment and set the path to NEURON's bin directory containing nrniv within neuroConstruct's menu (Settings->General Preferences and Project Defaults).

What am I supposed to do exactly?

I attached images of what exactly I can see in terms of neuroConstruct visualization and 'changing path':

[attached screenshots: change_path, visualization]

Note: I followed the installation instructions, which are the following:

  • Linux/Mac

    Mac pre-requisites: Xcode command line tools downloaded and installed (http://g.ua/WAAB); Git command line tool downloaded and installed (http://git-scm.com/download/mac, https://help.github.com/articles/set-up-git)

    Quick install:

    git clone git://github.com/NeuralEnsemble/neuroConstruct.git
    cd neuroConstruct
    ./updatenC.sh

    To get the Open Source Brain models:

    cd osb
    ./checkOsb.sh -u

    To build & run (using Apache Ant):

    ant run

    To build & run (without Ant):

    << open nC.sh and change NC_HOME >>
    ./nC.sh -make
    ./nC.sh

ghost commented 8 years ago

@slarson

I would be very interested in seeing the output of the opensource brain 'CelegansNeuromechanicalGaitModulation' model. This is apparently based on the paper by Netta Cohen.

Before starting new projects (i.e. visualizations of Saul Kato's results), I think I would first have a close look at the output of existing models. I'm not saying that this wouldn't be an interesting addition to the existing collection, but I think it would be a good idea to look at the existing output of available models and see whether it could be presented better and whether the simulations are relatively uniform in quality.

And before I make any silly assumptions, I thought I'd ask: what's the motivation for making it easier to see the results of simulations? As it is, the process seems pretty straightforward: take 10-20 minutes to install NEURON and load some examples... provided you don't get stuck as I have. I think this is a very small price to pay for OpenWorm members, who are generally quite able (technically speaking).

Are we concerned with making it easier for the wider OpenWorm audience to see simulation results?

pgleeson commented 8 years ago

See comments here about the current status of the neuroConstruct project and c302:

https://groups.google.com/forum/#!searchin/openworm-discuss/gleeson|sort:date/openworm-discuss/Qxz8LY6uKGQ/AYsTBqQBBAAJ

In short, the neuroConstruct project has not been updated recently, and c302 represents the latest ideas for getting a biophysical model of the neuronal network driving worm behaviour. The problem with viewing in nC might require downgrading to Java 7 (see here).

The updated model from c302 will be brought back into nC eventually, but that's a long way off.

c302 does produce output: install pyNeuroML, get a copy of c302 and run:

pynml examples/LEMS_c302_C_Muscles.xml

and you get data like:

[image: selection_045]

However, this data is currently pretty meaningless... The cells, synapses and connectivity still need to be tuned to produce some patterns of activation which would drive the muscles to have moving waves of contraction...

ghost commented 8 years ago

@pgleeson

Thank you for this clarification. But I can't get the pynml command to work despite following the installation instructions for pyNeuroML over here. For example, I can't run the following:

pynml-channelanalysis NaConductance.channel.nml

One thing I would suggest is that the simulation results should not appear as several separate windows. Ideally we would also have a sub-figure of the simulated worm. I haven't done any GUI programming, so I don't know how difficult this would be.

slarson commented 8 years ago

First of all, thanks for everyone who is chiming in on this!

This problem is one of the key use cases for Docker. If you have a Docker environment installed, you can do the following to run the simulation and view the results:

docker run -ti openworm/openworm_test

(then, within shell)

cd CElegansNeuroML/CElegans/pythonScripts/c302/
export DISPLAY=[correct DISPLAY address for your system]
pynml examples/LEMS_c302_C_Muscles.xml

And this will render the graphs as Padraig showed -- no further messing with libraries / versions of the JRE.

(note, if you are running on an Ubuntu desktop, it may be better to do

docker run -ti -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix openworm/openworm_test

and then you don't need to export the DISPLAY variable as on the second line.)

Also, just to clarify to everyone else what Padraig's statement on "meaningless" means -- it means that there's no guarantee that what comes out is accurate dynamics biologically speaking, but from the perspective of being the output of the best iteration of the model we have assembled so far, it isn't like the output is random noise or anything :) Padraig has a high standard for when results are "meaningful" (and he is right in that it is important to understand these caveats when interpreting what is coming out of this simulation).

On @AidanRocke's question about the purpose of making results easier to see: this comes from a recent request from a few folks who would like to run c302, and as you can see from the stumbling points involved in even seeing these graphs, we would like to reduce the barrier to entry.

On @aexbrown's point about trying to see some correspondence between model output and real data, I believe that @theideasmith has been taking a crack at reproducing the results from the Kato et al. paper you are talking about, and I am hoping he can share his code so we can feed c302 output into it and answer that exact question!

@theideasmith -- I think at this point you do have to collect your own data output using the methods we are describing here -- there are no cached sim runs yet available.

ghost commented 8 years ago

@slarson

Thank you for these instructions. I will try out Docker this weekend. As for the ultimate solution to this problem, wouldn't it be web-based visualizations of simulation results? But that's part of the Geppetto project.

Just out of curiosity, what are the current go-to tools used by OpenWorm for large scale data visualization? I have heard good things about Vispy and I think that may be used for Kato's Brain Dynamics model.

slarson commented 8 years ago

@AidanRocke Doing this in Geppetto would be awesome, and in fact we've proposed a Google Summer of Code project to do exactly that (see 11.1 here).

There isn't a single thing used for visualization -- lots of different projects do different things.

Thanks for pointing to Vispy -- hadn't been on my radar!

pgleeson commented 8 years ago

@AidanRocke Can you give some more details on the error getting pyNeuroML installed? What OS are you using? Can you import neuroml or pyneuroml in Python? Is there any console output when you have the error?

"And, ideally we would also have a sub-figure of the simulated worm. " There is no worm (yet) in the c302 simulations, just a subset of the neurons and perhaps some of the muscles. Early days in these simulations...

ghost commented 8 years ago

@pgleeson

beaureynsiphone:libNeuroML cyrilrocke$ pynml-channelanalysis NaConductance.channel.nml
-bash: pynml-channelanalysis: command not found

Also, when I try pynml for anything in general I get 'command not found'. Here's more info:

  1. I'm using Mac OS X version 10.7.5.
  2. I can't import pyneuroml, but I can import neuroml from Python.

lukeczapla commented 8 years ago

Greetings everyone. I've had good success running c302 with pynml and dumping data to help theideasmith with the data from Kato et al. 2015. The XML seems to lay out exactly what is being dumped.

Geppetto, on the other hand, I have been having some major problems with. I installed the older Java 1.6 to fulfill the compatibility requirements, and it seems to load most of the code in the kernel and the Apache Tomcat server, but org.geppetto.frontend fails along with some other bundles, so I can't access the localhost site and play around with it to learn more. As for trying to look at the code instead of the production alpha release: I have Eclipse and some other Java dev tools installed, but it seems hard to find a single git repository containing all the main code (unless I have to grab each module individually). Those are all issues that I've been having personally.

So we are trying to align simulation data with experimental data, and I seem to have a reasonably good grasp of the NeuroML XML files and the definitions, but I was wondering which of the example simulations might be a good starting point for making the comparison to experiment.

Thanks!

lukeczapla commented 8 years ago

Hi, we have been going through c302 and were wondering what the best route is to learn more about each of the simulations, how they are parameterized, and what we can do with c302. We have the experimental data from Kato et al. 2015 and want to project simulation data onto the components from the experiment. We have a few tools to read the ".dat" file outputs from c302, but some of the models we ran with c302 + pynml seem to be simplistic, so we are trying to track down documentation and suggestions for running c302. We have increased the simulation length and other parameters in the XML to get started.

Is the C "full" model a good choice? The neurons we are analyzing are specific ones in that paper.

slarson commented 8 years ago

This readme is still the main docs:

https://github.com/openworm/CElegansNeuroML/tree/master/CElegans/pythonScripts/c302

note in particular the command line interface with arguments to select specific cells:

https://github.com/openworm/CElegansNeuroML/tree/master/CElegans/pythonScripts/c302#command-line-interface

There is a start on parameters_C.py for less trivial models here:

https://github.com/openworm/CElegansNeuroML/blob/master/CElegans/pythonScripts/c302/parameters_C.py

and there is a C1 here:

https://github.com/openworm/CElegansNeuroML/blob/master/CElegans/pythonScripts/c302/parameters_C1.py

My suggestion is that we look at pushing those further forward now.

BenjiJack commented 8 years ago

Hi everyone, I'm new to the group. I've been poking around the past few weeks and spent this morning getting neuroConstruct and c302 up and running and working through some of the example data. I was excited to see the discussion here about Kato's very interesting paper. @theideasmith and @lukeczapla, it sounds like you have made some progress on reproducing and visualizing Kato's results -- how can I help?

theideasmith commented 8 years ago

@BenjiJack we have made some progress on visualizing some of Kato's results, though our visualizations aren't as refined as Kato's; they could use some work. It's awesome that other people are interested in what we are doing. I'm in 10th grade, conducting research with Dr. Czapla (@lukeczapla), and I'm also new to the group. From one newcomer to another, welcome!

[image: sample fluorescence heatmap for 107 neurons, Zimmer-scaled]

Separately, we ran PCA on the time derivatives as Kato did in the paper. I've been fairly busy lately, so I have mostly been thinking about how to view the Kato data in the context of the C. elegans connectome, i.e. how to connect this dynamic data to structure. I haven't had time to produce figures as elegant as those in the paper. Nevertheless, a lot of the procedures can be performed with simple scipy routines. The tough part is drawing conclusions supported by the data, which requires more domain-specific expertise.

[images: PCA on derivatives / derivative manifolds]

We also have some code to deal with the Kato data, which you can access at the link provided, in the folder wbdata. In that directory, if you import transform.py as tf, you can access a dictionary containing all of Kato's data with tf.wormData. The dictionary contains calcium imaging data for five worms and is keyed by the original MATLAB file names as they were sent by Kato himself :). Each worm's data contains a time vector ('tv'), neural activity traces uncorrected ('deltaFOverF') and corrected for bleaching ('deltaFOverF_bc'), as well as derivatives ('deltaFOverF_deriv'). The identities of the neurons are in the same order in the cell array 'NeuronIds'.
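
For reference, a minimal sketch of accessing that dictionary (the import path is an assumption based on the description above; adjust it to the actual repository layout):

    # Minimal sketch of loading the Kato calcium-imaging data via wbdata/transform.py.
    from wbdata import transform as tf      # assumed import path

    data = tf.wormData                      # dict keyed by the original MATLAB file names
    worm = data[sorted(data.keys())[0]]     # pick one of the five worms
    tv = worm['tv']                         # time vector
    traces = worm['deltaFOverF_bc']         # bleach-corrected activity traces
    derivs = worm['deltaFOverF_deriv']      # de-noised time derivatives
    neuron_ids = worm['NeuronIds']          # neuron labels, same order as the traces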

What you can do depends on what you are interested in doing. The grander OpenWorm project has lots of places for contribution - you'll have to probe more senior project members for those opportunities as I can't give a comprehensive answer.

If you specifically are interested in working with the Kato data, which certainly is very interesting and pertinent, there is lots to be done that I can think of.

I'd like to more rigorously look at the Kato data - eventually at c302 simulation data - as a dynamical system. We can think about how it changes/learns as the network searches for equilibrium (see this paper). Separately, we're wondering what kind of insight can be gleaned from looking at all 5 Kato datasets together and drawing inferences from specific differences in their dynamics. There are so many questions to ask and so many perspectives through which to answer them. I'm still figuring everything out as I go. Maybe you have some ideas?

BenjiJack commented 8 years ago

@theideasmith thank you so much for the warm welcome and congratulations on the interesting work you and @lukeczapla have done so far. Please give me a few weeks to do a deep dive into the information you sent, and I will come back to you with questions and further thoughts. Looking forward to it!

pgleeson commented 8 years ago

@AidanRocke sorry for the delay in getting back to you. Are you sure the setup.py for pyNeuroML didn't report any errors? Have you installed using sudo? It looks like you've got libNeuroML installed fine, but not any of pyNeuroML...

@lukeczapla @theideasmith @BenjiJack It's great to see so much enthusiasm about running analyses on c302 similar to those on the experimental data. I do have to point out again, though, the stage the framework is at: very preliminary. Yes, it does produce simulations of the full connectome and produces output for the membrane potential for all cells (and, in the case of params C, the calcium traces), but almost none of the parameters going into the network setup are properly constrained (apart from the connections, which are still based on the older connectome). If there is any resemblance between the simulated traces and the experimental data, it will be entirely coincidental...

My main point is that almost no part of the model can be considered a "black box"; it's important to get a feel for what each of the params in the model is and the effect of adjusting them on the overall behaviour. Note also that there are fundamental limitations in the current model regarding a "biophysically realistic" simulation. Params A and B are integrate-and-fire cells, so the membrane potential will be nothing like the real cells (though high-level network properties can be investigated). Param C also has the limitation that the cells are spiking and the synapses are event-based. I plan to update this with analogue synapses in the next 2 weeks, which will improve things. Even then there is the added complication that the data is from fluorescent markers rather than the actual [Ca2+].

Hope this clarifies current status. It would be great to get some more tools for analysing/visualising the dat files produced, and if anyone gets a set of parameters which produce interesting behaviour, please share.

ghost commented 8 years ago

@slarson I couldn't get the Docker software to run on my Mac, as it needs OS X 10.8 or newer and I have OS X 10.7.5. But I'll look into VM alternatives; I can probably use VMware for this just as well.

@pgleeson

Trying

sudo python setup.py install

does actually make a difference, and I managed to get a window for the Na conductance after running:

pynml-channelanalysis NaConductance.channel.nml

But, I'm now getting other errors:

2016-03-17 01:29:04.867 python[1273:60b] setCanCycle: is deprecated.  Please use setCollectionBehavior instead
Exception in Tkinter callback
Traceback (most recent call last):
  File "/Users/cyrilrocke/anaconda/lib/python3.4/site-packages/matplotlib/backends/tkagg.py", line 22, in blit
    id(data), colormode, id(bbox_array))
_tkinter.TclError: invalid command name "PyAggImagePhoto"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/cyrilrocke/anaconda/lib/python3.4/tkinter/__init__.py", line 1538, in __call__
    return self.func(*args)
  File "/Users/cyrilrocke/anaconda/lib/python3.4/site-packages/matplotlib/backends/backend_tkagg.py", line 283, in resize
    self.show()
  File "/Users/cyrilrocke/anaconda/lib/python3.4/site-packages/matplotlib/backends/backend_tkagg.py", line 355, in draw
    tkagg.blit(self._tkphoto, self.renderer._renderer, colormode=2)
  File "/Users/cyrilrocke/anaconda/lib/python3.4/site-packages/matplotlib/backends/tkagg.py", line 30, in blit
    id(data), colormode, id(bbox_array))
_tkinter.TclError

It looks like there's a problem with Tkinter somehow? I'll look more into this tomorrow.

ghost commented 8 years ago

@pgleeson I just realized that this problem is on my side of things, not with pyneuroml. Anyway, it looks like everything was installed properly this time. Thank you for the tips.

slarson commented 8 years ago

@theideasmith @lukeczapla Great progress. The pictures say 1000 words!

Despite @pgleeson 's generally sobering perspective on the state of the simulation today, I want to remind everyone that the important progress for us to make now does not depend on a fully tuned and perfect neuronal simulation. In fact, we can make progress even with a neuronal simulation that is rather incomplete.

The key progress to make NOW is to build the tools for comparison between real data and simulations, regardless of how accurate the simulations currently are. Without such tools, we will be completely in the dark about how various improvements to the structure of the simulation actually impact the overall neuronal activity patterns. The matter of making the simulation more accurate can be completely decoupled from the matter of building the comparison tools. Even an inaccurate simulation is perfectly serviceable if we carefully track our assumptions, and if we realize that the comparison tools are extremely important. We can consider this creating "mock data" as is often done to demonstrate some particular function works before you are ready to give it the "real data". So let's not conflate those two.

So, in terms of next steps from where we are now, I would say that reusing the analysis pipeline you have built for the Zimmer data to perform the same analysis on results coming from the simulation would demonstrate that we had a version zero of that comparison capability.

To demonstrate we have reused it, we would want to have a run of the same / similar 170 neurons out of c302 for a comparable amount of time and show that we can generate some kind of picture using this code. Probably we are going to get a really boring picture out, because I expect that the dynamics will quickly decay to not much. From there, I would suggest that we "play" with output results, injecting current at various times along the run of the simulation, and seeing how that affects the picture. The goal here is not biological accuracy, but exposing the capability to see the effect that permuting a test system has on the kind of dynamical systems graphs that were produced in that paper.

This will serve us well and provide greater impetus for everyone in the community to get excited about making the effort to produce the real improvements needed to bring c302 to a higher level of accuracy.

lukeczapla commented 8 years ago

@theideasmith and I are able to reproduce the PCA of the data with n=3 components as shown in Figure 1 of the paper (you can see it in one of the PC1/PC2/PC3 phase plots; as in the paper, these principal components were fitted to the deltaF/F0 derivative data), and there are 4 other datasets. I'm not sure of all the conditions involved in each of the measurements, but they are documented, and we have been able to extract the neuron IDs along with the data, although they are not sorted as they are in the paper.

I read your comments on the state of the simulation and the goal of the research; we agree entirely that developing models and simulation parameters guided by the experimental data is an important and essential task in this work. Doing the same analysis on the data coming out of c302 is within our reach at the moment: we have made some progress reading the output files and have been spending time learning more about PCA, interpreting the components themselves, and projecting the reconstructed dataset onto 3 components. That methodology is all implemented using sklearn.
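
As a rough illustration of that step (a sketch only; the placeholder array stands in for the real derivative data):

    # Sketch: fit PCA to dF/F0 derivative data and project onto the first 3 components.
    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder standing in for an (n_timepoints, n_neurons) array of derivatives.
    derivs = np.random.randn(3000, 107)

    pca = PCA(n_components=3)
    trajectory = pca.fit_transform(derivs)    # (n_timepoints, 3) path in PC1/PC2/PC3 space
    print(pca.explained_variance_ratio_)      # fraction of variance captured by each PC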

We're open to suggestions; currently, mapping the experimental results to the simulation output is one of our goals. We can develop a pipeline that would be easy for c302 developers to modify, to achieve the goal of improving the models based on iterative comparison between analyses of experiment and simulation. I understand some of the limitations that @pgleeson mentioned and have been going through the literature to see how to better compare [Ca2+] data to these fluorescence measurements, which have several shortcomings but give a realistic picture of what's going on.

BenjiJack commented 8 years ago

@theideasmith and @lukeczapla Well done with importing Kato's data and reproducing the PCA visualizations. @theideasmith, I was able to download and run your code, including accessing the dictionary of Kato's data as you suggested.

I have a few thoughts and questions to build on the comments by @pgleeson @slarson and others above. (I know I'm a newbie, but I figure no better way to get acquainted than to jump right in! Thanks for welcoming me.)

Generating Kato-like visualizations from c302 output

I agree with @slarson about the need for tools to compare the output of simulations vs. real-world data. It sounds like @lukeczapla and @theideasmith are already within reach of taking output from c302 and generating Kato-like visualizations. This will be exciting to see.

As a step toward that goal, building on @theideasmith's work importing Kato's data, I tried to recreate Kato's visualizations myself (see iPython notebook below), but from one step upstream: I tried to use the fluorescence bleach cancelled timeseries to naively compute the derivative myself, rather than use the fluorescence derivative timeseries directly. After all, if we are to take a timeseries output from c302 and compare it to Kato's results, we will have to reproduce the entire pipeline of analysis that Kato used. One could debate whether this is worth the effort, but it seems that Kato is (at least for the moment) our richest resource for neuron-by-neuron behavior in the real world, and worth comparing against.

I had trouble reproducing Kato's results from the upstream data. Although my approach was naive, it's not immediately clear to me how to fix it or how he cleaned the data so thoroughly. He mentions in the supplemental discussion on p. 9 that "total-variation regularization (Chartrand, 2011) was used to compute de-noised time derivatives...". Maybe someone else can shed more light on this or I can go back and read further on this technique.
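
For concreteness, here is the kind of naive derivative computation I mean (a rough stand-in only; this is not the total-variation regularization from the paper, and the names are illustrative):

    # Naive de-noised derivative: Gaussian smoothing followed by a finite difference.
    # NOT the total-variation regularization of Chartrand (2011) used by Kato et al.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def smoothed_derivative(traces, dt, sigma=2.0):
        """traces: (n_timepoints, n_neurons) bleach-corrected dF/F0; dt: sampling interval."""
        smoothed = gaussian_filter1d(traces, sigma=sigma, axis=0)
        return np.gradient(smoothed, dt, axis=0)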

Here's my attempt: [image: output_3_1]

iPython notebook with code and images: kato_visualization.pdf

Could we train c302 using Kato's data?

Another idea I was thinking about is to use Kato's data to train c302. Could we train the neural network using the output of real-world neurons? @theideasmith, you seemed to be alluding to this with the paper you referenced previously. I am sure something like this has been debated in the group before, and I would be interested to hear your comments on whether this is feasible and how we might do it.

Making c302 more accessible

As a newbie, I am confused about where c302 fits into OpenWorm as a whole and how to get started with it. I was able to get neuroConstruct up and running thanks to the excellent quick start tutorial, but c302 is still a black box to me. How do I get started so I can lend a hand in implementing some of the ideas discussed?

slarson commented 8 years ago

Hi all -- as some of the information on this issue now has outgrown the original topic, I've gone ahead and created a separate repo to capture work on analyzing the dynamics of both real and simulated neurons: https://github.com/openworm/neuronal-analysis

I have created a few issues from questions raised here so we can take them up separately in turn. Let's continue the conversation on those specific topics in the relevant issues.

pgleeson commented 8 years ago

Apologies for the lack of input; I've been trying to find some time to get back into c302 development. At the moment it is still pretty experimental code, so if there's no clear way it should be used, it's probably because I haven't figured it out myself yet... To make it easier to see the current activity of cells in the generated networks, I've made a script which can be used to run the different configurations of c302 and display activity:

python runAndPlot.py 
python runAndPlot.py -socialB # see bottom of file for more options

All of these require the latest libNeuroML (development branch) and pyNeuroML, and some require NEURON. Activity of the social behaviour network with parameters B:

[image: selection_075]

Plenty more work to do and all subject to change. I've also started adding analog synapses; the network below has 302 cells, 94 muscles, conductance-based cells with internal calcium, gap junctions, and only analog synapses (v slow to load):

[image: selection_074]

lukeczapla commented 8 years ago

Thank you for that work! What you are showing in the bottom figures with the analog synapses looks like something that we could work with. I ran one of the old simulations (parameters C before this new development you've started) and it did not look good - activity did not seem to have the long events seen in fluorescence data. I suppose that is really an extreme case of "apples to oranges" comparison.

The experimental data doesn't have the resolution to say anything on the scale of milliseconds (we're talking time intervals of 344 milliseconds), so I had been increasing the simulation time. The graphs on the bottom have potential: the parameters might be adjustable towards reproducing some of the large-scale patterns and correlations seen. I'm curious whether the guessed parameters reproduce anything we see if the time is extended... I'd be happy to try out the new analog representation and parameters over here, and we've been analyzing all the imaging data.

lukeczapla commented 8 years ago

@pgleeson I came so close to being able to run the program; here's what ultimately made it fail (it ran for about 10 minutes before getting to this):

/0/GenericCell/caConc', 'MDR08/0/GenericCell/v', 'VD9/0/GenericCell/caConc', 'ADFL/0/GenericCell/v', 'DA7/0/GenericCell/v', 'VA6/0/GenericCell/v', 'MVL16/0/GenericCell/v', 'URYVR/0/GenericCell/v', 'MVL18/0/GenericCell/v', 'LUAL/0/GenericCell/v', 'VA3/0/GenericCell/caConc', 'RIVR/0/GenericCell/caConc', 'ASGL/0/GenericCell/v', 'IL1DL/0/GenericCell/caConc', 'RICL/0/GenericCell/v', 'DB7/0/GenericCell/caConc', 'VB4/0/GenericCell/v', 'RMGR/0/GenericCell/v', 'RMHL/0/GenericCell/v', 'DVB/0/GenericCell/v', 'MVR13/0/GenericCell/v', 'URYVL/0/GenericCell/v', 'PHAL/0/GenericCell/v', 'VB3/0/GenericCell/caConc', 'AFDR/0/GenericCell/caConc', 'PVQR/0/GenericCell/v', 'RIR/0/GenericCell/caConc', 'I5/0/GenericCell/caConc', 'IL1VL/0/GenericCell/caConc', 'MDR09/0/GenericCell/v', 'MDL22/0/GenericCell/caConc', 'SAAVR/0/GenericCell/caConc', 'AS7/0/GenericCell/caConc', 'ADER/0/GenericCell/caConc', 'MDL14/0/GenericCell/caConc', 'M3R/0/GenericCell/caConc', 'MVL12/0/GenericCell/v', 'AVAR/0/GenericCell/caConc', 'RMGL/0/GenericCell/caConc', 'MVL03/0/GenericCell/caConc']
Plotting neuron voltages

Traceback (most recent call last):
  File "runAndPlot.py", line 186, in <module>
    main('Full','C1','',1000,0.05,'jNeuroML_NEURON')
  File "runAndPlot.py", line 59, in main
    labels = [float(item.get_text())*dt for item in ax.get_xticklabels()]

There are a couple of similar lines elsewhere in runAndPlot.py. Do you know what would fix it? It takes forever to get to this point. By commenting out lines 59 & 60, I then got to a similar error:


Plotting neuron voltages
Opened Excel file: ../../../../CElegansNeuronTables.xls
Plotting muscle voltages
Traceback (most recent call last):
  File "runAndPlot.py", line 186, in <module>
    main('Full','C1','',1000,0.05,'jNeuroML_NEURON')
  File "runAndPlot.py", line 95, in main
    labels = [float(item.get_text())*dt for item in ax.get_xticklabels()]
ValueError: could not convert string to float:

pgleeson commented 8 years ago

@lukeczapla have you managed to run the smaller scale models, e.g. -social? The code for reloading the data is not v efficient, and could even skip some of the points (e.g. use every 10th...) for the plots. Need to do a bit of optimisation...

theideasmith commented 8 years ago

In terms of plotting, maybe we can move to vispy? It's meant for high performance visualizations and can be GPU accelerated.

theideasmith commented 8 years ago

Additionally, numpy has a binning routine (np.digitize) which assigns samples to bins; averaging within each bin could reduce the quantity of data passed to the visualization while staying truer to the actual dataset than simply skipping points (see the sketch below).
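
A sketch of what I mean (illustrative names only):

    # Down-sample a trace by averaging within fixed-width time bins, using
    # np.digitize to assign each sample to a bin.
    import numpy as np

    def bin_average(t, y, bin_width):
        """t: 1-D time vector; y: 1-D signal sampled at t; bin_width: in the units of t."""
        edges = np.arange(t.min(), t.max() + bin_width, bin_width)
        idx = np.digitize(t, edges[1:-1])        # 0-based bin index for every sample
        centres = 0.5 * (edges[:-1] + edges[1:])
        means = np.array([y[idx == i].mean() if np.any(idx == i) else np.nan
                          for i in range(len(centres))])
        return centres, means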

lukeczapla commented 8 years ago

@pgleeson

Yes, I ran the Social model too and got the same error. But the smaller model at least makes it easier to work on this error.


Opened Excel file: ../../../CElegansNeuronTables.xls
Loaded morphology file from: ../../generatedNeuroML2/ASHR.cell.nml, with id: ASHR, location: (-9.8, -265.625, 41.0)
Loaded morphology file from: ../../generatedNeuroML2/ASKR.cell.nml, with id: ASKR, location: (-8.0, -268.025, 46.399998)
Loaded morphology file from: ../../generatedNeuroML2/AWBR.cell.nml, with id: AWBR, location: (-9.75, -266.225, 43.1)
Loaded morphology file from: ../../generatedNeuroML2/IL2R.cell.nml, with id: IL2R, location: (-11.0, -285.0, 49.35)
Loaded morphology file from: ../../generatedNeuroML2/RMGR.cell.nml, with id: RMGR, location: (-12.5, -238.29999, 32.7)
Loaded morphology file from: ../../generatedNeuroML2/RMHR.cell.nml, with id: RMHR, location: (-5.2999997, -265.9, 35.7)
Loaded morphology file from: ../../generatedNeuroML2/URXR.cell.nml, with id: URXR, location: (-7.35, -269.875, 48.275)
Finished loading 7 cells
Opened Excel file: ../../../CElegansNeuronTables.xls
Written network file to: examples/c302_C_Social.nml
Written LEMS file to: examples/LEMS_c302_C_Social.xml
Validating examples/c302_C_Social.nml against /Users/luke/code/new/CElegansNeuroML/pyNeuroML/src/libneuroml/neuroml/nml/NeuroML_v2beta4.xsd
It's valid!
(Re)written network file to: examples/c302_C_Social.nml
pyNeuroML >>> Loading LEMS file: LEMS_c302_C_Social.xml and running with jNeuroML_NEURON
pyNeuroML >>> Executing: (java -Xmx400M  -Djava.awt.headless=true -jar  '/Library/Python/2.7/site-packages/pyNeuroML-0.1.7-py2.7.egg/pyneuroml/lib/jNeuroML-0.7.4-jar-with-dependencies.jar'  LEMS_c302_C_Social.xml  -neuron -run -nogui) in dir: .
Reloaded data: ['IL2R/0/GenericCell/caConc', 'URXR/0/GenericCell/caConc', 'URXR/0/GenericCell/v', 'RMGR/0/GenericCell/caConc', 'RMHR/0/GenericCell/caConc', 'RMGR/0/GenericCell/v', 'ASKR/0/GenericCell/caConc', 'AWBR/0/GenericCell/v', 't', 'ASKR/0/GenericCell/v', 'RMHR/0/GenericCell/v', 'ASHR/0/GenericCell/v', 'IL2R/0/GenericCell/v', 'AWBR/0/GenericCell/caConc', 'ASHR/0/GenericCell/caConc']
Plotting neuron voltages
Traceback (most recent call last):
  File "runAndPlot.py", line 210, in <module>
    main('Social','C','',2500,0.05,'jNeuroML_NEURON')
  File "runAndPlot.py", line 59, in main
    labels = [float(item.get_text())*dt for item in ax.get_xticklabels()]
ValueError: could not convert string to float: 

It's fine though: by commenting out the problem lines I can see the graphs, just without the correct time labels on the x axis, but there are only 7 cells. All the labels look like they could be converted to floating-point numbers, so it's unusual that this error occurs.

By the way, is there any way to accelerate simulations? Which code actually performs the calculations in this work?

pgleeson commented 8 years ago

This looks like a problem parsing/converting the entries from ax.get_xticklabels(). There were some modifications to the Matplotlib methods related to axes recently as far as I can see, and this code works with the latest version, so perhaps try updating to the latest matplotlib?
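
A possible workaround (a sketch only, not the actual runAndPlot.py code): tick label strings can be empty until the figure has been drawn, so rescaling the numeric positions from get_xticks() avoids parsing the label text:

    import matplotlib.pyplot as plt

    def rescale_xticks(ax, dt):
        ticks = ax.get_xticks()                  # numeric positions, always available
        ax.set_xticks(ticks)                     # pin the current tick positions
        ax.set_xticklabels(["%g" % (tick * dt) for tick in ticks])

    fig, ax = plt.subplots()
    ax.plot(range(1000))                         # stand-in for a voltage trace
    rescale_xticks(ax, dt=0.05)                  # relabel the x axis as simulation time
    plt.show()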

The real bottleneck here is the reloading/plotting as opposed to the simulation. NEURON is what's running the actual simulation, and it will run as fast as anything currently supported. You could try increasing dt (e.g. to 0.1) if you really want to speed up the sims, but that would give slightly less accurate results.

pgleeson commented 8 years ago

@lukeczapla The latest version of runAndPlot.py is more efficient with data loading, so bigger networks should load quicker.

See also: https://github.com/openworm/CElegansNeuroML/blob/master/CElegans/pythonScripts/c302/examples/summary/README.md generated from:

python runAndPlot.py -all

pgleeson commented 6 years ago

There have been significant improvements in how the data is generated/presented with runAndPlot.py, and in how it is post-processed when the sims are run through sibernetic_c302.py, along with Docker support.

Closing this for now and specific improvement requests can be opened as separate issues.

pgleeson commented 6 years ago

Latest committed nets for c302 include colours for different cell types, which propagate through to the connectivity widgets:

[image: selection_590]

pgleeson commented 6 years ago

Sorry, meant above comment for #57