dandi / dandi-hub

Infrastructure and code for the dandihub
https://hub.dandiarchive.org

Environment not persistent between sessions? #202

Open rcpeene opened 1 month ago

rcpeene commented 1 month ago

I think ever since the update where you select your environment upon launch, my environment no longer persists between sessions of Dandihub. This is a bit inconvenient since I have to reinstall the OpenScope Databook environment every time. Is there a way to manually save my environment?

satra commented 1 month ago

@rcpeene - let's use this opportunity to create a specific image for openscope. @asmacdo - could you please point carter to instructions for adding an image? i will let you also address carter's issue. perhaps some initiation process is overwriting something.

asmacdo commented 1 month ago

@rcpeene each user has persistent storage under /home/jovyan, so you'll just need to use a virtual environment somewhere in there. (The default conda env is in /opt, which is not persistent.)

For example:

$ python -m venv ~/venvs/my_venv
$ source ~/venvs/my_venv/bin/activate
(my_venv) $ 
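
To make a venv like that show up as a selectable kernel in the Jupyter launcher, a common pattern is to register it with ipykernel (a sketch; the venv path and kernel name below are just examples, not anything Dandihub requires):

```shell
# Assumes the venv from above exists under persistent storage (~/venvs).
source ~/venvs/my_venv/bin/activate

# ipykernel lets Jupyter discover the venv as a kernel; --user writes
# the kernelspec under ~/.local, which also lives in /home/jovyan and
# therefore persists between sessions.
pip install ipykernel
python -m ipykernel install --user --name my_venv --display-name "Python (my_venv)"
```

After that, the kernel survives server restarts because both the venv and the kernelspec live under /home/jovyan.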

For creating a new image we've got a short blurb in the docs: https://www.dandiarchive.org/handbook/50_hub/#custom-server-image. However, that has some broken links, and there's a bit more to it... Let's just develop those docs as we go and update them when you're finished (also feel free to ping me on MIT Slack).

First you'll want to add a new Dockerfile, e.g. Dockerfile.openscope: https://github.com/dandi/dandi-hub/tree/b2f344a23bc726a2ff8690d97fdff637be539e54/images
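
As a rough starting point, a minimal Dockerfile for such an image might look like this (the base image and the dependency step are placeholders, not the actual Databook setup):

```dockerfile
# Hypothetical sketch of Dockerfile.openscope; base image and
# package names are examples only.
FROM jupyter/base-notebook:latest

USER root
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*
USER ${NB_UID}

# Install the OpenScope Databook dependencies here (placeholder command)
RUN pip install --no-cache-dir jupyter-book
```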

Once that's built and pushed to dockerhub, it can be added as an image choice to the jupyterhub configuration. https://github.com/dandi/dandi-hub/blob/b2f344a23bc726a2ff8690d97fdff637be539e54/envs/shared/jupyterhub.yaml#L116-L125
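
For illustration, a new entry in that configuration might look like the following (keys follow the zero-to-jupyterhub `singleuser.profileList` schema; the display name and image tag are assumptions):

```yaml
singleuser:
  profileList:
    - display_name: "OpenScope Databook"
      description: "Image with the OpenScope Databook environment preinstalled"
      kubespawner_override:
        # Hypothetical image name; use whatever tag was pushed to Docker Hub
        image: dandiarchive/dandihub-openscope:latest
```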

We already have two choices, regular and MATLAB. You'll need to add the new image as an option to each of the profiles.

rcpeene commented 3 weeks ago

Sorry for not getting back here.

For environment reasons that we don't fully understand, it is nontrivial for us to get a working Docker container. We'll probably be working on a Docker container for the Databook over the next several months, and we'll check back then.