Closed OlehKSS closed 6 years ago
Are there any patterns in the way MNE or other EEG/MEG data is typically stored?
Check out the simplified ps.py gist I linked to before:
https://gist.github.com/larsoner/25fa656c6d6e0b02b56c40a571bfb77c
These are the standard data structures that MNE uses (and that your code will presumably need to use).
Should I create a BrainMesh/Surface class for storing mesh data (e.g. vertices and faces)? Will it be useful later?
No let's stick with simple NumPy arrays for now, similar to how it's done in the gist.
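For reference, "simple NumPy arrays" here means something like the following (toy tetrahedron; the array names are illustrative, not taken from the gist):

```python
import numpy as np

# A triangular mesh as plain NumPy arrays, no wrapper class:
# vertices: (n_vertices, 3) float array of x, y, z positions
# faces:    (n_faces, 3) int array, each row indexing three vertices
vertices = np.array([[0., 0., 0.],
                     [1., 0., 0.],
                     [0., 1., 0.],
                     [0., 0., 1.]])
faces = np.array([[0, 1, 2],
                  [0, 1, 3],
                  [0, 2, 3],
                  [1, 2, 3]])

# Basic sanity checks any consumer of these arrays can rely on
assert vertices.ndim == 2 and vertices.shape[1] == 3
assert faces.ndim == 2 and faces.shape[1] == 3
assert faces.max() < len(vertices)
```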
> What do you think about adding a GitHub project to this repository, and me as a collaborator?
I've given you admin permissions so you can tweak these things.
@larsoner, after analyzing the gist that you've shared, I think that I will need to add these:

- ipyvolume (we should have it; I am checking whether camera, light, and shading options are available in ipyvolume)

Do you agree with the list above? Have I missed something?
Moreover, I can't find explicit documentation for the mesh.set_gl_state function. It takes a dictionary of options, but I can't find which options are available. Could you suggest anything?
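For what it's worth, set_gl_state appears to come from vispy's gloo layer rather than ipyvolume itself, and the options are OpenGL state flags. A sketch of the kind of dictionary involved (keys based on vispy's gloo.set_state; treat the exact set of accepted keys as an assumption, not verified documentation):

```python
# Assumed vispy-style GL state options (see vispy.gloo.set_state);
# the exact keys accepted are an assumption on my part.
gl_state = {
    'depth_test': True,   # hide fragments that lie behind the mesh surface
    'cull_face': False,   # draw both front- and back-facing triangles
    'blend': True,        # enable alpha blending for translucency
}
# Usage would be roughly: mesh.set_gl_state('translucent', **gl_state)
```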
I haven't found anything that can be put into the read_brain_mesh function from the viz.py module.
As far as I understand, curvature data correspond to the color of the mesh, so they will be used in #2. I also don't think I need to manipulate the data for now.
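As a concrete illustration of the curvature-as-color idea (relevant to #2): a common approach, e.g. in PySurfer, is to binarize curvature into two gray levels. A sketch with made-up curvature values and gray levels:

```python
import numpy as np

# Hypothetical per-vertex curvature values; sign convention assumed
# to follow FreeSurfer (positive = sulcus, negative = gyrus)
curv = np.array([-0.3, 0.1, -0.05, 0.4])

gray = np.where(curv > 0, 0.33, 0.66)          # darker gray in sulci
colors = np.tile(gray[:, np.newaxis], (1, 3))  # (n_vertices, 3) RGB
```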
Well, there are no options available in ipyvolume for camera scaling factor, light direction and color, or shading. Azimuth can be changed for the whole scene (or figure), but not for separate meshes.
> Well, there are no options available in ipyvolume for camera scaling factor, light direction and color, or shading.
That's okay, we will probably need to add these later. For now we can just live with the defaults, whatever they are.
> Azimuth can be changed for the whole scene (or figure), but not for separate meshes.
That's okay, I think we want them to move together. Eventually we will want e.g. left and right hemi plots separately, but for now plotting just left, just right, or left and right together (moving together) is a good start.
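Making the two hemispheres move together falls out naturally if their meshes are concatenated into a single vertices/faces pair before plotting; a minimal NumPy sketch (the function name combine_hemis is made up here):

```python
import numpy as np

def combine_hemis(verts_lh, faces_lh, verts_rh, faces_rh):
    """Merge two meshes into one so they move as a single scene object.

    Face indices of the second mesh are offset by the number of
    vertices in the first mesh.
    """
    vertices = np.concatenate([verts_lh, verts_rh])
    faces = np.concatenate([faces_lh, faces_rh + len(verts_lh)])
    return vertices, faces

# Toy example: two single-triangle "hemispheres"
v = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
f = np.array([[0, 1, 2]])
verts, faces = combine_hemis(v, f, v + [2., 0., 0.], f)
# verts.shape == (6, 3); faces == [[0, 1, 2], [3, 4, 5]]
```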
> I haven't found anything that can be put into the read_brain_mesh function from the viz.py module. As far as I understand, curvature data correspond to the color of the mesh, so they will be used in #2.
What is the plan for the read_brain_mesh function? What are the inputs and outputs supposed to be? It seems like everything you need to read the mesh components is already in the gist, ready to be translated to a script / function.
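On the inputs/outputs question, one plausible contract for read_brain_mesh is: file path in, (vertices, faces) arrays out. The .npz storage below is only a stand-in so the sketch is self-contained; the real function would presumably read FreeSurfer surface files as the gist does:

```python
import os
import tempfile

import numpy as np

def read_brain_mesh(path):
    """Read a brain surface, returning plain NumPy arrays.

    Returns
    -------
    vertices : (n_vertices, 3) float array
    faces : (n_faces, 3) int array
    """
    data = np.load(path)  # stand-in storage; real code would read FreeSurfer files
    return data['vertices'], data['faces']

# Round-trip demo with a toy tetrahedron
verts = np.zeros((4, 3))
tris = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
path = os.path.join(tempfile.mkdtemp(), 'mesh.npz')
np.savez(path, vertices=verts, faces=tris)
v, f = read_brain_mesh(path)
assert v.shape == (4, 3) and np.array_equal(f, tris)
```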
I have updated the code according to the examples provided. You can see the brain plot using the notebook in the examples folder. What do you think?
Writing down the steps I used with Ubuntu system Python for reference:

- install OlehKSS/olehkss/1 however you want
- install maartenbreddels/ipyvolume if you don't have it already, according to the dev install docs
- `jupyter nbextension enable --py --sys-prefix widgetsnbextension`

And now I get this in jupyter notebook: [screenshot]
So once we clean up the code a bit and make it PEP8-compliant, +1 for merge
Looks good, let's keep going @OlehKSS !
I have several questions regarding this task:

- the viz.read_brain_mesh function
- Another, more general one: what do you think about adding a GitHub project to this repository, and me as a collaborator, so I will be able to assign issues I am working on and move them according to their status (for example, 'To Do', 'In Progress', 'Done')?
I will add unit testing later.
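When the unit tests do land, simple invariant checks on the mesh arrays are a natural starting point; a pytest-style sketch (the arrays are placeholders for what viz.read_brain_mesh would return):

```python
import numpy as np

def test_mesh_arrays_are_consistent():
    # Placeholder arrays; a real test would call viz.read_brain_mesh
    vertices = np.random.RandomState(0).rand(10, 3)
    faces = np.array([[0, 1, 2], [2, 3, 4]])

    # Shape invariants
    assert vertices.ndim == 2 and vertices.shape[1] == 3
    assert faces.ndim == 2 and faces.shape[1] == 3
    # Every face must index an existing vertex
    assert faces.min() >= 0
    assert faces.max() < len(vertices)

test_mesh_arrays_are_consistent()
```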