Closed: hakasapl closed this issue 3 years ago
Hey @hakasapl, thank you for the detailed report. Could you run JupyterLab in debug mode so we can get the output of the conda commands?
Thanks for the log, @hakasapl.
So the error occurs on the command mamba search --json
and only on that one (I see that listing environments and packages works). The error is:
"error": "ImportError(\"cannot import name 'id' from partially initialized module 'conda._vendor.distro' (most likely due to a circular import) (/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/conda/_vendor/distro.py)\")",
"exception_name": "ImportError",
"exception_type": "<class 'ImportError'>",
"traceback": "Traceback (most recent call last):\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/conda/gateways/connection/session.py\", line 60, in __call__\n return cls._thread_local.session\nAttributeError: '_thread._local' object has no attribute 'session'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/conda/exceptions.py\", line 1079, in __call__\n return func(*args, **kwargs)\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/mamba/mamba.py\", line 882, in exception_converter\n raise e\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/mamba/mamba.py\", line 876, in exception_converter\n exit_code = _wrapped_main(*args, **kwargs)\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/mamba/mamba.py\", line 835, in _wrapped_main\n result = do_call(args, p)\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/mamba/mamba.py\", line 714, in do_call\n exit_code = getattr(module, func_name)(args, parser)\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/conda/cli/main_search.py\", line 73, in execute\n matches = sorted(SubdirData.query_all(spec, channel_urls, subdirs),\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-packages/conda/core/subdir_data.py\", line 120, in query_all\n result = tuple(concat(executor.map(subdir_query, channel_urls)))\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/concurrent/futures/_base.py\", line 600, in result_iterator\n yield fs.pop().result()\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/concurrent/futures/_base.py\", line 433, in result\n return self.__get_result()\n File 
\"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/concurrent/futures/_base.py\", line 389, in __get_result\n raise self._exception\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/concurrent/futures/thread.py\", line 52, in run\n result = self.fn(*self.args, **self.kwargs)\n File \"/modules/apps/miniconda/4.8.3/envs/jupyter-2-1/lib/python3.9/site-pack
You may want to report an issue on mamba (and/or conda). But before that, you could try opening a terminal and running:
conda search --json
mamba search --json
to see if they work. If conda works but mamba does not, please report an issue on the mamba repository. In the meantime, a workaround could be to downgrade mamba or conda to an older version.
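The manual check above can also be scripted. This is a hedged sketch (the package spec "python" and the function name check_search are arbitrary choices, not part of either tool): it reports, for each client, whether the binary is present and whether its search --json output is valid JSON.

```python
# Sketch: probe `conda search --json` and `mamba search --json` and
# classify each result as "missing", "ok", or "error".
import json
import shutil
import subprocess


def check_search(cmd: str, spec: str = "python") -> str:
    """Return 'missing', 'ok', or 'error' for `<cmd> search --json <spec>`."""
    if shutil.which(cmd) is None:
        return "missing"          # binary not on PATH
    proc = subprocess.run(
        [cmd, "search", "--json", spec],
        capture_output=True, text=True,
    )
    if proc.returncode != 0:
        return "error"            # non-zero exit, as in the report above
    try:
        json.loads(proc.stdout)   # the extension expects parseable JSON
        return "ok"
    except json.JSONDecodeError:
        return "error"


for tool in ("conda", "mamba"):
    print(tool, check_search(tool))
```

If this prints "conda ok" but "mamba error", that matches the failure mode discussed here and points the bug at mamba rather than the JupyterLab extension.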
Yes, I've confirmed mamba fails but conda succeeds. I've filed a bug report on mamba: https://github.com/mamba-org/mamba/issues/668
Thank you for the assistance!
Description
When opening the conda package manager, I get the message "an error occurred while retrieving available packages", and the package list is not populated.
Oddly, it works intermittently. I launch through JupyterHub, so occasionally a launched session will work, but it fails if I try to launch again.
Reproduce
Expected behavior
List gets populated with conda packages.
Context
I am using JupyterHub and BatchSpawner to launch JupyterLab sessions through the Slurm scheduler. I'm not sure what to make of the logs pasted below.
Command Line Output
Browser Output
Thanks for your help!