tejinc opened this issue 2 years ago
Hi Tejin - I just saw this issue (sorry for the delay!)
We'll look into this. We're not using Lmod, but that might be the right approach - create a JupyterLab extension that executes a user-specified preamble script. I think NERSC has something like this in limited use: https://github.com/NERSC/jupyterhub-entrypoint
@tejinc you might want to look at what @dladams put together, I believe it's a shell script that will source the environment for you from CVMFS. This was tested on EAF: https://github.com/dladams/dunerun
I think @dladams' script works for the CLI, not for notebooks. Tackling this issue is next on my list once we make the switch to cvmfsexec.
My package dunerun provides a command dune-run that makes it easy to start a shell with dunesw on the command line, and it provides a Python class DuneRun that opens a shell with a dunesw environment and enables the user to run commands in that shell. It doesn't (yet) provide the capability to return objects from ROOT (TH1, TTree, ...) or Python (DataFrame, ...) that one might visualize from a notebook, although one could easily retrieve objects written to files.
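The general pattern is just to wrap each command in a bash subshell that sources the environment first. A minimal, untested sketch of that pattern (this is not the actual dunerun code; the setup path, version, and qualifiers in the example are placeholders):

```python
# Sketch of the wrap-commands-in-a-sourced-subshell pattern (not dunerun itself).
import subprocess

class EnvShell:
    """Run commands in a bash shell that has a CVMFS product environment sourced."""

    def __init__(self, setup_script, setup_command=None):
        self.setup_script = setup_script    # placeholder: the CVMFS setup script
        self.setup_command = setup_command  # placeholder: e.g. a UPS "setup dunesw ..." line

    def run(self, command):
        """Source the environment, run the command, and return the completed process."""
        parts = [f"source {self.setup_script}"]
        if self.setup_command:
            parts.append(self.setup_command)
        parts.append(command)
        return subprocess.run(["bash", "-c", " && ".join(parts)],
                              capture_output=True, text=True)

# Example usage (all arguments are placeholders):
# sh = EnvShell("/cvmfs/.../setup_dune.sh", "setup dunesw <version> -q <quals>")
# print(sh.run("which lar").stdout)
```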
It might be interesting to build a kernel for one or every dunesw release. There are a lot of DUNE classes available at the Python command line via ROOT dictionaries.
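To make that concrete, one way to do it would be a per-release kernelspec whose argv wraps the kernel launch in a shell that sources the environment first. An untested sketch, assuming the usual ipykernel launch and a UPS-style setup (the setup script path, release, and qualifiers are placeholders):

```python
# Untested sketch: install a per-release Jupyter kernelspec that sources the dunesw
# environment before starting ipykernel.  Paths, versions, and qualifiers are placeholders.
import json
import os

def install_dunesw_kernel(release, setup_script, quals="e20:prof"):
    # Shell command the kernel will run.  Jupyter substitutes {connection_file};
    # the last literal is deliberately not an f-string so the braces survive.
    launch = (f"source {setup_script} && "
              f"setup dunesw {release} -q {quals} && "
              "exec python -m ipykernel_launcher -f {connection_file}")
    spec = {
        "argv": ["bash", "-c", launch],
        "display_name": f"Python (dunesw {release})",
        "language": "python",
    }
    kdir = os.path.expanduser(f"~/.local/share/jupyter/kernels/dunesw-{release}")
    os.makedirs(kdir, exist_ok=True)
    with open(os.path.join(kdir, "kernel.json"), "w") as f:
        json.dump(spec, f, indent=2)
    return kdir

# install_dunesw_kernel("<release>", "/cvmfs/.../setup_dune.sh")
```

Each release would then show up as its own entry in the JupyterLab launcher.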
It would be helpful to have some specific use cases to understand what approach is best.
FWIW, I started dunerun with the idea that notebooks would be a good way to share analysis snippets, e.g. to show someone how to create and view the DQM plots for a particular event, but I am now leaning toward running dune commands in a terminal and then using the Jupyter file browser to look at the image files. Of course one could also use a python class for that.
I know it's possible to source CVMFS products from the terminal environments inside JupyterLab: just do it the usual way, i.e. source /cvmfs~/setup_dune.sh. The whole thing behaves just like any other virtual machine terminal.
However, dynamically adding CVMFS products inside a notebook environment is much harder.
I understand that once a single-user JupyterLab server is launched, the default system environment is basically fixed in place. However, there seems to be a way to set up a product before launching a notebook inside the server:
see https://github.com/cmd-ntrf/jupyter-lmod
Since (if I'm correct) all the product-sourcing business really boils down to getting the paths into the kernel, it might be possible to either inject the paths on the fly or dynamically create a kernel before notebook launch?
Or is there an easier way?
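To make the "inject the paths on the fly" idea concrete, here is a rough, untested sketch of what I mean from inside a notebook cell. The setup script path is a placeholder, and one caveat: changing LD_LIBRARY_PATH this way won't affect shared libraries the already-running kernel process has loaded, which is why creating the kernel before launch may still be the cleaner route.

```python
# Untested sketch: source the setup script in a subshell, capture the resulting
# environment, and copy it into the running kernel.  The script path is a placeholder.
import os
import subprocess
import sys

def inject_env(setup_script):
    # Dump the environment after sourcing, NUL-separated to survive odd values.
    out = subprocess.run(
        ["bash", "-c", f"source {setup_script} > /dev/null && env -0"],
        capture_output=True, text=True, check=True).stdout
    for entry in filter(None, out.split("\0")):
        key, _, value = entry.partition("=")
        os.environ[key] = value
    # Make any newly added PYTHONPATH entries importable in this kernel.
    for p in os.environ.get("PYTHONPATH", "").split(os.pathsep):
        if p and p not in sys.path:
            sys.path.append(p)

# inject_env("/cvmfs/.../setup_dune.sh")
```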