Closed Marie59 closed 1 year ago
Isn't it possible to replace RUN jupyter labextension install @jupyterlab/geojson-extension
by RUN conda install jupyterlab-geojson
and do the same for the following packages? Could that fix the issue? (Maybe I'm talking nonsense here, sorry if that's the case.)
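For reference, the proposed swap in the Dockerfile would look something like this (a sketch, assuming the conda-forge package name `jupyterlab-geojson` from the comment above):

```dockerfile
# Before: installs the prebuilt extension via the JupyterLab CLI
# RUN jupyter labextension install @jupyterlab/geojson-extension

# After: install the packaged extension from conda-forge instead
RUN conda install --yes -c conda-forge jupyterlab-geojson
```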
I don't know @Marie59, we need to test. Does it work if you build it locally? I'm just trying here :)
I must be doing something wrong, because I can't build any image locally (though I can run them). I am trying to fix that on my machine.
Okay @bgruening, the build of jupyter/datascience-notebook finally worked on my machine! But the build of this docker-jupyter-notebook did not and broke with the same error as here, so I'm going to test what I suggested yesterday (it seems to work on my machine).
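For anyone following along, a local build and run might look like this (image tag and port are illustrative):

```shell
# Build the image from the repository root
docker build -t docker-jupyter-notebook:test .

# Run it and open http://localhost:8888 to check that the extensions load
docker run --rm -p 8888:8888 docker-jupyter-notebook:test
```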
@Marie59 it's green :)
But we still don't know if this extension works in Jupyter so this would need to be tested locally. And we need to include the other extensions as well.
Yes! I will try to see how to test locally. Do you mean that for the other extensions I should also change to RUN conda install XXX?
Yes, if this works as we hope, we should use conda for all of them.
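If the same approach works for all of them, the three installs could be collapsed into one layer, roughly like this (the katex and fasta package names are assumed to follow the same conda-forge naming as jupyterlab-geojson; verify before pinning):

```dockerfile
# Replace the three labextension installs with conda-forge packages
RUN conda install --yes -c conda-forge \
        jupyterlab-geojson \
        jupyterlab-katex \
        jupyterlab-fasta
```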
OK, then I think one of them is missing on conda: the jupyterlab/toc-extension.
Ah no, just found it! Forget what I said:
https://github.com/jupyterlab/jupyterlab-toc
but it also looks dead? Skip it if it makes problems and feel free to add others if useful :)
So, I removed jupyterlab-toc because it is now included in JupyterLab. I tried to build and then run the image using RUN conda install for the 3 extensions, but keeping the old FROM jupyter/datascience-notebook:d990a62010ae, and everything is working. I don't know if I need to test anything else to check that the extensions are working? However, the new FROM jupyter/datascience-notebook:python-3.10 builds correctly but does not run correctly due to the migration to Notebook 7. I am looking into it, but if you have any leads on that...
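To narrow down what broke in the Notebook 7 migration, inspecting the extension state inside the container might help (standard Jupyter CLI commands; the pin at the end is only an assumed stopgap, not a fix):

```shell
# List installed server and lab extensions to see what survived the migration
jupyter server extension list
jupyter labextension list

# Possible stopgap (assumption): pin the classic notebook to sidestep
# the Notebook 7 changes while debugging
# conda install --yes "notebook<7"
```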
Mh, no I don't really know :-( Maybe that is helpful? https://jupyterlab-contrib.github.io/migrate_from_classical.html
Okay, I was able to run it. I am pretty sure the katex and geojson extensions are working fine with the migration. I am not sure about the fasta extension; I am having more difficulty finding a way to test it (I will keep searching during the day to be sure). However, I found https://github.com/jupyterlab/jupyter-renderers and I think it could replace the fasta and geojson extensions, but it's not on conda yet. Do you think I should try to make a conda recipe for it (I have never tried for a Jupyter package)?
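One low-tech way to exercise the fasta and geojson renderers is to generate tiny test files and open them in JupyterLab's file browser; a stdlib-only sketch (file names and contents are arbitrary test data):

```python
import json

# Minimal two-record FASTA content to open with the fasta viewer
fasta = ">seq1 demo record\nACGTACGTACGT\n>seq2 another record\nTTGGCCAA\n"
with open("test.fasta", "w") as f:
    f.write(fasta)

# Minimal GeoJSON point feature for the geojson renderer
geojson = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [8.55, 47.37]},
    "properties": {"name": "test point"},
}
with open("test.geojson", "w") as f:
    json.dump(geojson, f)

print("wrote test.fasta and test.geojson")
```

If both files render (sequence view and a map) instead of showing raw text, the extensions survived the migration.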
If you think this is a good package, we can create a conda one for it.
grayskull pypi jupyter-renderers
should generate a conda-forge package for us.
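For context, grayskull's CLI is typically invoked like this (and since jupyter-renderers turned out not to be on PyPI, the package name here is just illustrative):

```shell
# Install the recipe generator and produce a conda recipe (meta.yaml)
# from a PyPI package; the resulting folder can be submitted to
# conda-forge/staged-recipes
pip install grayskull
grayskull pypi jupyter-renderers
```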
It does not seem to be on PyPI. But after checking, it is referenced in the recipes of the fasta and geojson extensions, whose home is https://github.com/jupyterlab/jupyter-renderers. So maybe the packages on conda are indeed up to date with that (I am not quite sure here).
Yep, I confirm everything links back to jupyter-renderers; forget what I said, sorry!
Sorry that this was closed automatically. I think GitHub was confused because you used your master branch.
I created a new container: https://quay.io/repository/bgruening/docker-jupyter-notebook?tab=tags
23.3 should be the new one, with your changes :)
I think you can now use this new container and just add your notebooks and "done" :)
Thanks!
Change the tag of jupyter/datascience-notebook to the one with julia-1.8.5. There are even more recent releases, but I did not know whether it was relevant to take one of them. This is all quite new to me, but if I understood correctly, this is the only change that needs to be made to update the Julia version. @bgruening, tell me if I need to do anything else.
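If I read the diff right, the whole change is the base-image tag in the Dockerfile, roughly (the "before" tag is taken from earlier in this thread):

```dockerfile
# Before
# FROM jupyter/datascience-notebook:python-3.10

# After: pin the tag that ships Julia 1.8.5
FROM jupyter/datascience-notebook:julia-1.8.5
```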