cherab / solps


Add standalone demo. #39

Closed jacklovell closed 3 years ago

jacklovell commented 3 years ago

This demo uses the raw files from Andrey Pshenov's PhD thesis simulations, converted to SOLPS 5 format by Vlad Neverov. The demo contains similar functionality to the existing AUG/raw, MDSplus and balance.nc demos. Additionally, it demonstrates how to add emission models to the plasma and image the plasma with a visible camera, providing a complete workflow example which should be useful to new Cherab users and familiar to experienced users new to the SOLPS package.

N.B. The simulation output files are rather large, at 31 MB, so they're version controlled in a compressed zip file which is much smaller at 5.5 MB. The user will need to extract the files before running the demo: if the files have not already been extracted, the demo prints instructions for how to do this.

Discussion on this PR can also be used to decide whether we want to modify the other demos. For example, I've replaced the nested for loops for sampling with Cherab's own sampling functions in this PR, which should be considered best practice. Similarly, the other demos don't actually use any emission models: they reproduce the functionality of b2plot but don't exploit Cherab's unique capabilities, so perhaps we should add some imaging to them too.
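The loop-versus-sampler point can be sketched with plain numpy; the emissivity field below is hypothetical, and in the actual demo the grid evaluation is done by Cherab's own sampling helpers (e.g. sample2d, which returns the sample points along each axis plus a 2D value array) rather than meshgrid:

```python
import numpy as np

def emissivity(x, y):
    # hypothetical scalar field standing in for a SOLPS plasma quantity
    return np.exp(-(x**2 + y**2))

xs = np.linspace(-2.0, 2.0, 50)
ys = np.linspace(-1.0, 1.0, 40)

# Nested-loop sampling (the pattern being replaced):
loop_vals = np.empty((xs.size, ys.size))
for i, x in enumerate(xs):
    for j, y in enumerate(ys):
        loop_vals[i, j] = emissivity(x, y)

# Vectorised sampling over the same grid gives identical values,
# with less code and no explicit index bookkeeping:
xx, yy = np.meshgrid(xs, ys, indexing="ij")
grid_vals = emissivity(xx, yy)
assert np.allclose(loop_vals, grid_vals)
```

Besides brevity, the sampler form makes the grid definition (extent and resolution) explicit in one place, which is the main reason to prefer it in the demos.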

jacklovell commented 3 years ago

@vsnever Github won't let me request a review from you on this, but I'd be interested in your input too.

vsnever commented 3 years ago

This is a good demonstration. However, I experienced problems with matplotlib's interactive mode. Whether I run this demo with python or ipython, the figures hang at input('Press Return to continue...'). Maybe the problem is specific to my setup, but giving matplotlib time to render the figures solves the issue:

plt.pause(0.01)
input('Press Return to continue...')

The final image required more time to render for some reason:

plt.pause(0.1)
input('Press Return to finish...')

Also, I usually run scripts from code editors like Sublime Text, and I think other users do too. The request for user input in the middle of the script requires me to run this demo from a console or to install an additional plugin. The rest of the script adds only a single figure, and I think closing the other figures before the rendering starts is unnecessary. So, replacing

input('Press Return to continue...')
plt.close('all')

with just a pause

plt.pause(0.01)

would help. Also in this case, the user can explore the figures while the final image is rendering.

Below are some other potential improvements.

vsnever commented 3 years ago

I almost forgot to add that the actual version of the fort.44 file in the generomak demo is 20081111. I manually changed it to 20130210 so that the fort44_2013 parser could read it. However, the total radiation of neutral atoms is read incorrectly (it's not used in this demo). I'll try to create a more universal fort44 parser soon, and then we will need to change the fort.44 file's version back to 20081111.

Mateasek commented 3 years ago

Hi @jacklovell, thanks for the demo. I would suggest the following changes. I would move the zip archive into a folder named, for example, "data". In this folder we could add a "utility.py" containing methods that handle extraction of the zip archive, as @vsnever suggests. This method could be called from setup.py if we want. I think this would make for a cleaner structure and open up space for more demo simulations in future. It would also let us remove the file-existence checks from every demo that uses the simulations.

I would also split this into two separate demos: one for the camera and one for the profile plotting.

The structure could look like this (names of folders and files are optional):

-demos/
    -plot_solps_profiles.py
    -visible_camera.py
    -data/
        -utility.py
        -generomak/
            *solps data files*
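The extraction helper @Mateasek describes could be a thin wrapper around the standard library's zipfile module. This is a minimal sketch only: the ensure_extracted name and the data/generomak layout are illustrative, not part of the actual PR:

```python
import zipfile
from pathlib import Path

def ensure_extracted(archive, target):
    """Extract archive into target unless target already contains files.

    Safe to call at the top of every demo: extraction happens at most once,
    so repeated runs are no-ops.
    """
    archive = Path(archive)
    target = Path(target)
    if not target.is_dir() or not any(target.iterdir()):
        target.mkdir(parents=True, exist_ok=True)
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(target)
    return target
```

Each demo could then start with something like ensure_extracted('data/generomak.zip', 'data/generomak'), replacing the per-demo existence checks.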
vsnever commented 3 years ago

Just one more comment. I think this demo will make more sense from a physical point of view if RadiancePipeline2D is used instead of the default RGBPipeline2D. Also it might be worth setting camera.spectral_bins = 1 for better performance.

jacklovell commented 3 years ago

> This is a good demonstration. However, I experienced problems with matplotlib's interactive mode. Whether I run this demo with python or ipython, the figures hang at input('Press Return to continue...'). Maybe the problem is specific to my setup, but giving matplotlib time to render the figures solves the issue:
>
> plt.pause(0.01)
> input('Press Return to continue...')
>
> The final image required more time to render for some reason:
>
> plt.pause(0.1)
> input('Press Return to finish...')
>
> Also, I usually run scripts from code editors like Sublime Text, and I think other users do too. The request for user input in the middle of the script requires me to run this demo from a console or to install an additional plugin. The rest of the script adds only a single figure, and I think closing the other figures before the rendering starts is unnecessary. So, replacing
>
> input('Press Return to continue...')
> plt.close('all')
>
> with just a pause
>
> plt.pause(0.01)
>
> would help. Also in this case, the user can explore the figures while the final image is rendering.
>
> Below are some other potential improvements.

This is strange. I haven't noticed this behaviour myself, and am wary of spraying plt.pause commands everywhere as it then breaks Jupyter notebooks. But I agree with your other suggestion to split this into two demos and remove the "Press return to continue" steps, which should hopefully solve this anyway.

vsnever commented 3 years ago

> This is strange. I haven't noticed this behaviour myself, and am wary of spraying plt.pause commands everywhere as it then breaks Jupyter notebooks. But I agree with your other suggestion to split this into two demos and remove the "Press return to continue" steps, which should hopefully solve this anyway.

The behaviour of matplotlib's interactive mode is kind of buggy. There are several threads on Stack Overflow about this, some of them very recent:

https://stackoverflow.com/questions/50490426/matplotlib-fails-and-hangs-when-plotting-in-interactive-mode
https://stackoverflow.com/questions/64043973/matplotlib-interactive-mode-wont-work-no-matter-what-i-do

However, I don't insist on this change: the user can always turn off interactive mode if it doesn't work.

jacklovell commented 3 years ago

> I almost forgot to add that the actual version of the fort.44 file in the generomak demo is 20081111. I manually changed it to 20130210 so that the fort44_2013 parser could read it. However, the total radiation of neutral atoms is read incorrectly (it's not used in this demo). I'll try to create a more universal fort44 parser soon, and then we will need to change the fort.44 file's version back to 20081111.

Ah, yes, I've tried plotting neutrad and got some highly suspicious 1e23 W/m3 emissivity. Thanks for pointing this out. Eneutrad will stay out of the demo until the fort.44 parser can handle it.

jacklovell commented 3 years ago

> Just one more comment. I think this demo will make more sense from a physical point of view if RadiancePipeline2D is used instead of the default RGBPipeline2D. Also it might be worth setting camera.spectral_bins = 1 for better performance.

I quite like the colour: it's more engaging than a black and white plot. I've just tried adding a PowerPipeline2D to the camera's pipelines, and the resulting images are surprisingly similar (other than the colour). I presume this is because most of the higher charge states are at very low densities in the edge region, and you're not getting the large volumetric VUV emission from the core as it's not present in the simulation. These edge simulations are where visible light tends to be dominant, so an RGB camera does a pretty good job of capturing most of the plasma emission.

I could leave both pipelines in: you'd see a visible image and a total emission that way. It makes no difference to the render time, which is dominated by evaluating the emission models.

vsnever commented 3 years ago

> I quite like the colour: it's more engaging than a black and white plot. I've just tried adding a PowerPipeline2D to the camera's pipelines, and the resulting images are surprisingly similar (other than the colour). I presume this is because most of the higher charge states are at very low densities in the edge region, and you're not getting the large volumetric VUV emission from the core as it's not present in the simulation. These edge simulations are where visible light tends to be dominant, so an RGB camera does a pretty good job of capturing most of the plasma emission.
>
> I could leave both pipelines in: you'd see a visible image and a total emission that way. It makes no difference to the render time, which is dominated by evaluating the emission models.

But the total radiated power is not resolved over wavelength, or am I wrong?

jacklovell commented 3 years ago

Ah yes, you're correct. It's spread uniformly over the spectrum. Fair point, I'll remove RGB.

jacklovell commented 3 years ago

I've made changes based on reviewers' suggestions, and pushed a new version.

I'd like to add fort.44 2008 compatibility to the Eirene data reader and change the version number back in the Generomak data file, so that I can include the total radiated power in the profile plots. Otherwise I hope this is ready to merge.

jacklovell commented 3 years ago

I've changed the fort.44 file's version header, so it now records that it's the 20081111 format. Since #40 was merged we can at least read all the relevant data for this demo correctly.

I've also made a small change to the plots: using SymLogNorm rather than LogNorm as the colour normalisation means we can show values of 0 in the plots (previously they were treated as NaN and not shown, even though the data did exist in the sampled arrays being plotted). To keep things tidy, this meant refactoring the plotting code out into a separate function that is called once per plot.
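The difference between the two normalisations can be seen in a minimal matplotlib sketch (the values and linthresh here are illustrative, not the demo's actual colour limits):

```python
import numpy as np
from matplotlib.colors import LogNorm, SymLogNorm

emiss = np.array([0.0, 1e2, 1e4, 1e6])  # sampled emissivity values, W/m3

# LogNorm masks non-positive values, so cells that are exactly 0
# disappear from the plot even though the data exist in the array:
log_norm = LogNorm(vmin=1e2, vmax=1e6)
assert np.ma.is_masked(log_norm(emiss)[0])

# SymLogNorm is linear below linthresh, so 0 maps to a valid colour
# and zero-emissivity cells remain visible:
sym_norm = SymLogNorm(linthresh=1e2, vmin=0.0, vmax=1e6)
assert np.isfinite(sym_norm(emiss)[0])
```

The choice of linthresh sets where the scale transitions from linear to logarithmic, so it should sit near the smallest emissivity worth distinguishing from zero.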