When you launch:

```shell
jupyverse --set auth.mode=user
```

then there is no token involved anymore. The authentication mechanism needs a user to be registered and then logged in. If you open the interactive documentation at http://127.0.0.1:8000/docs, you'll see a `POST /auth/register` route and a `POST /auth/login` route. There is an example of user registration/login in the tests. Note that you first need to be logged in to be able to register a new user. This can be done, for instance, by launching `jupyverse --set auth.test=true`, because this will create an "admin" user with a given name and password.
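For reference, a minimal sketch of what the register/login round trip could look like from Python. The helper names are hypothetical, and the payload shapes follow FastAPI-Users' defaults as I understand them (cookie login is form-encoded, registration is JSON):

```python
# Sketch of the register/login round trip against a local Jupyverse server.
# login_form/register_body are hypothetical helper names.

def login_form(email: str, password: str) -> dict:
    # POST /auth/login is an OAuth2 "password" flow: the email goes in the
    # "username" form field.
    return {"username": email, "password": password}

def register_body(email: str, password: str) -> dict:
    # POST /auth/register takes a JSON body; the session must already be
    # authenticated as an admin user.
    return {"email": email, "password": password}

# With a requests.Session `s`:
#   s.post("http://127.0.0.1:8000/auth/login", data=login_form("admin@jupyter.com", "<password>"))
#   s.post("http://127.0.0.1:8000/auth/register", json=register_body("new@ex.com", "<password>"))
```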
I agree that all of this needs some better UI and documentation. In particular, we should probably have a way to create/delete/modify users from the CLI.
Ok, I start to understand the general idea.
Indeed the swagger at `/docs` is where one should spend time to get a feeling of how things work.
I could `/auth/login` as admin@jupyter.com, then create a new user.
I noted the following unexpected behaviors:

- endpoint `GET /auth/user/{id}` does not work for any user, admin or not, oneself or not. It always returns the following, while endpoint `GET /auth/users` works fine:

  ```json
  {
    "detail": "Forbidden"
  }
  ```

- endpoint `POST /auth/login`: the requested `username` is in fact the email. Not a big deal, but I could not find the corresponding code. Even more elusive is the param `grant_type`: absent in the repo (unless mistaken).
- generally, the various possible permissions are not defined, and I could not find a clue in the code base besides the given users admin@jupyter.com and guest@jupyter.com.

Would you have hints/links on these points?
Re your remark about CLI, why not, but maybe simpler and more educative would be a collection of notebooks enabling the same interaction as the swagger but faster to use - like living documentation.
> endpoint `GET /auth/user/{id}` does not work for any user, admin or not, oneself or not. It always returns `{"detail": "Forbidden"}`, while endpoint `GET /auth/users` works fine.

It seems that you need to be a superuser.
> endpoint `POST /auth/login`: the requested `username` is in fact the email. Even more elusive is the param `grant_type`: absent in the repo (unless mistaken).

Yes, it is a design choice of FastAPI-Users.
> generally, the various possible permissions are not defined, and I could not find a clue in the code base besides the given users admin@jupyter.com and guest@jupyter.com.

It's basically `read`, `write` and `execute`, see the Jupyter Server documentation.
All right!
I now understand that some missing parts are to be found in FastAPI-Users and others in Jupyter Server.
Thx for the clarification. I need to read more.
Thinking about:

> I agree that all of this needs some better UI and documentation. In particular, we should probably have a way to create/delete/modify users from the CLI.

> Re your remark about CLI, why not, but maybe simpler and more educative would be a collection of notebooks enabling the same interaction as the swagger but faster to use - like living documentation.

However, the CLI should at least provide a way to get an "admin/superuser" cookie. That would be the starting point of a notebook-based interaction with the swagger: simple, easy to document/experiment/demo.
Security-wise, it seems reasonable to allow the creation of such a cookie from the terminal, right?
What do you think?
Yes, I've also been thinking about using a notebook to interact with an API. But I'm not sure it should use a client in the browser, we could do requests directly from Python. It could be used as a user management dashboard, possibly using widgets. Were you thinking about something like that?
Yes, I meant a regular "desktop" jupyter notebook whose Python kernel runs on the same machine as the jupyverse server, under the same username (`whoami`), implying the user behind it is the same person who started the jupyverse server and consequently should have all rights - I'm being very explicit to try and be clear.
It would have an easy way to get the "superuser" cookie from jupyverse as a starting point, like:

```python
from jupyverse import admin

cookie = admin.get_superuser_cookie()
# etc.
```

Such a "superuser" cookie could be sitting next to the other jupyverse files in `~/.local/share/jupyter`, with maybe the port and starting datetime in the filename.
But do you think the notebook should handle cookies? I was thinking that it could directly access the user database, or the API with credentials.
I think we are saying the same thing, but I am not saying it very clearly.
I would argue the notebook should emulate the web swagger SPA as much as possible, for a transparent learning experience, by 1/ getting a "superuser" cookie by taking advantage of full access to `~/.local/share/jupyter`, then 2/ using a requests session with this cookie to interact with the API (cf. https://requests.readthedocs.io/en/latest/api/#api-cookies).
So I suppose what I describe is the same as "access the API with credentials"?
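A sketch of what that starting point could look like; the cookie file name, location, and format are purely hypothetical (no such file exists in Jupyverse today):

```python
import json
from pathlib import Path

# Hypothetical file written by the server next to its other data; neither
# this name nor the format exists in Jupyverse today.
COOKIE_FILE = Path.home() / ".local/share/jupyter" / "jupyverse_superuser_cookie.json"

def load_cookie() -> dict:
    # Expected shape: {"name": "fastapiusersauth", "value": "..."}
    return json.loads(COOKIE_FILE.read_text())

# Then attach it to a requests session and call the API:
#   s = requests.Session()
#   c = load_cookie()
#   s.cookies.set(c["name"], c["value"])
#   s.get("http://127.0.0.1:8000/auth/users")
```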
Yes, I think we are on the same page. I will start experimenting with this Swagger Notebook and we can iterate on it.
I tried to use the notebook admin_user.ipynb, but even though it did create a user, the rest of the workflow failed as described below.
Workflow:

- delete `~/.local/share/jupyter/jupyverse_users.db`
- start jupyverse:

```shell
# cf dev install doc
hatch -e dev.jupyterlab-auth shell
jupyverse --set auth.mode=user
```

- start jupyter lab:

```shell
# cf dev install doc
hatch -e dev.jupyterlab-auth shell
jupyter lab
```

- create a user from the notebook:

```python
# permissions are empirically determined from the above conversation, namely:
# 'admin': ['read', 'write'] - https://github.com/jupyter-server/jupyverse/blob/77c85f4e4e0d4f16e0f3b413f9e6f91742cfe16e/plugins/auth/fps_auth/main.py#L37
# jupyter server permissions: https://jupyter-server.readthedocs.io/en/latest/operators/security.html#authorization
# remark: they are not consistent, the admin permission is not part of the jupyter server permissions
permissions = {
    'admin': ['read', 'write'],
    'contents': ['read', 'write'],
    'kernels': ['read', 'write', 'execute'],
    'sessions': ['read', 'write'],
    'terminals': ['read', 'write', 'execute'],
}
user = await create_user("aaa", "aaa@ex.com", "taaa", True, permissions)
```
- check on a sqlite db viewer that the user is created (ideally it would be more convenient to see that from an admin notebook)
- login from the swagger: http://127.0.0.1:8000/docs#/default/auth_cookie_login_auth_login_post
- all seems ok, but when I check swagger endpoint `/api/me`, no permissions are attached to the user
- then unsurprisingly when I visit http://127.0.0.1:8000 I get redirected to http://127.0.0.1:8000/lab but without any rights, so useless

It would be useful to describe a recommended workflow that allows a jupyverse server admin to go from "pip install" to a running jupyverse with one superuser and say 2 regular users with different rights (maybe per kernel and folder, or whatever is possible).
For now it is not clear - to me at least.
> check on a sqlite db viewer that the user is created (ideally it would be more convenient to see that from an admin notebook)

I pushed to #308 with a function to show users.
> all seems ok, but when I check swagger endpoint `/api/me`, no permissions are attached to the user

The `GET /api/me` endpoint expects a `permissions` query parameter. These are permissions to check; for instance if you pass `permissions={'sessions': ['read', 'write', 'execute']}`, then in your case you will get back `permissions={'sessions': ['read', 'write']}`. I'm not sure it is documented, even in jupyter-server. So if you don't pass any permission to check, then it's normal that you don't get back any checked permission.
The reason `permissions` doesn't appear in Swagger is that I couldn't figure out how to do it. It has to be a query parameter because a `GET` cannot have body parameters, but since it's a dictionary of lists, it's not obvious to me how it should be encoded. Right now, `permissions` is treated as a JSON string blob. It's not great, and if you have ideas on how to improve it, that would be welcome.
This means that you can only use Swagger for the `GET /api/me` to get back the identity, but not the permissions. But it doesn't mean that the created user doesn't have permissions.
> Then unsurprisingly when I visit http://127.0.0.1:8000 I get redirected to http://127.0.0.1:8000/lab but without any rights, so useless

Actually yes, your user has the right permissions. You can see that it can open a terminal, because you gave enough rights. Same for contents, since you can see them in the file browser. But since you didn't give rights to read kernel specs with `'kernelspecs': ['read']`, you won't be able to execute code.
I agree that all of that deserves to be documented.
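Putting this together, a permission set that should give a user a fully working JupyterLab session might look like the following sketch (the resource names combine those mentioned in this thread; treat any others as assumptions to verify):

```python
# Sketch: permissions for a regular user with code execution enabled.
# 'admin': ['read'] is Jupyverse-specific (it protects some FastAPI-Users
# routes); the other resource names follow jupyter-server's authorization docs.
permissions = {
    "admin": ["read"],
    "contents": ["read", "write"],
    "kernels": ["read", "write", "execute"],
    "kernelspecs": ["read"],  # without this, code execution fails
    "sessions": ["read", "write"],
    "terminals": ["read", "write", "execute"],
}
```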
> This means that you can only use Swagger for the `GET /api/me` to get back the identity, but not the permissions.
I opened #309 to improve that, see also https://github.com/jupyter-server/jupyter_server/issues/1284.
I tried your notebook (with a bit of wrapping) and it works fine and is very convenient.
Then I could login from the swagger `/auth/login`, and visit `/lab` and create/edit/run/delete notebooks and text files.
But when I tried to create a folder and get into it, I could not, despite my user having all `contents` rights. Here is the server stack trace:
Then I tried endpoint `/api/me`, but even passing permissions as a query param it did not work in the swagger - probably a bad syntax?
Then I tried endpoint `/auth/user/me`, but surprisingly it redirected me to the html login page. Is that expected?
Idem for endpoint `/auth/users`, despite my user having `is_superuser=True`. Is that expected?
So I am making some progress but there still are things I do not understand.
> But when I tried to create a folder and get into it, I could not, despite my user having all `contents` rights.
I opened #311 to fix this, thanks for reporting!
> Then I tried endpoint `/api/me`, but even passing permissions as a query param it did not work in the swagger - probably a bad syntax?

Yes, you should pass `{'sessions': ['read', 'write', 'execute']}`.
> Then I tried endpoint `/auth/user/me`, but surprisingly it redirected me to the html login page. Is that expected?

Yes, your user needs to have the permission `admin=["read"]` to be able to get this kind of information.
Ok, thx for the clarifications along the way! It does work now.
To ease the discovery for future users - and myself when I get back to it - I would recommend documenting the `permissions` query param for endpoint `/api/me`, even if only as free text with maybe a link (for a fully self-documented API it probably would have to be GraphQL instead of REST, but that's a completely different route). Both seem possible judging by the various placeholders in the official swagger petstore online demo.
Another question re permissions:
Is it possible to manage user rights per path on the server? E.g. user-a@ex.com would have RW (read-write) access to `folder-a` and RO (read-only) access to `folder-b`, the reverse for user-b@ex.com, and both users RW access to `folder-shared` (a plausible, even generic, use case).
If not, could that be a "plugin", i.e. an option to jupyverse?
I am trying to find out what is possible now, potentially in the future, or off limits with jupyverse, with a view to using a jupyverse server in a small team, with basic RBAC capabilities.
This is not exactly related to the issue but still sort of "auth" related. I can open another issue if you think it's better.
> the jupyter server authorization page is not currently correct, e.g. the "admin" permission does not exist
Actually the `admin` permission resource is only present in Jupyverse's `fps-auth` plugin, not in the "official" jupyter-server permissions. I added it to protect some routes provided by FastAPI-Users that not all users should have access to.
> Is it possible to manage user rights per path on server

The "official" `contents` resource is global, so it is currently not possible. A user either has access to the whole contents or not, regardless of the action.
But yes, a finer-grained plugin could do that, maybe by including the file system path in the resource name, e.g. `{"contents:folder-a": ["read", "write"]}` and `{"contents:folder-b": ["read"]}`.
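A rough sketch of how such a plugin could resolve path-scoped resource names, falling back to the global `contents` resource (purely hypothetical, nothing like this exists in Jupyverse yet):

```python
# Hypothetical path-scoped authorization check: look up the most specific
# resource name first ("contents:folder-a"), then the global "contents".
def is_allowed(user_permissions: dict, folder: str, action: str) -> bool:
    for resource in (f"contents:{folder}", "contents"):
        if action in user_permissions.get(resource, []):
            return True
    return False

# user-a: RW on folder-a, RO on folder-b, nothing else.
user_a = {"contents:folder-a": ["read", "write"], "contents:folder-b": ["read"]}
```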
Also, in #312 I am adding WebDAV support to Jupyverse. I'm not sure it supports this kind of permissions, but maybe it's worth exploring.
> I am trying to find out what is possible now, potentially in the future, or off limits with jupyverse, with a view to using a jupyverse server in a small team, with basic RBAC capabilities.
I think you are the first to explore the user-related capabilities of Jupyverse, so your use-case could drive future developments! Let us know about your needs and we can work on making them possible. Regarding RBAC, Fief already supports it and it's a great solution for user management. Jupyverse supports it with the fps-auth-fief plugin.
I tried to use auth-fief but could not test it, as it crashed upon start.
First as in the doc:

```
[olivier]dev/jupyverse-zone/jupyverse-fief [🐍 v3.11.3(dev.jupyterlab-auth_fief)]
❯ jupyverse --set frontend.collaborative=true
[2023-06-05 12:50:55,553 INFO] Running in development mode
[2023-06-05 12:50:55,585 INFO] Starting application
[2023-06-05 12:50:55,669 ERROR] Error during application startup
Traceback (most recent call last):
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/runner.py", line 110, in run_application
    event_loop.run_until_complete(coro)
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/asyncio/tasks.py", line 479, in wait_for
    return fut.result()
           ^^^^^^^^^^^^
  File "/home/olivier/GDrive/dev/jupyverse-zone/jupyverse/jupyverse_api/jupyverse_api/main/__init__.py", line 64, in start
    await super().start(ctx)
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/web/asgi3.py", line 121, in start
    await super().start(ctx)
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/component.py", line 112, in start
    self.add_component(alias)
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/component.py", line 101, in add_component
    component = component_types.create_object(**config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/utils.py", line 190, in create_object
    return plugin_class(**constructor_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/olivier/GDrive/dev/jupyverse-zone/jupyverse/plugins/auth_fief/fps_auth_fief/main.py", line 11, in __init__
    self.auth_fief_config = _AuthFiefConfig(**kwargs)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 3 validation errors for _AuthFiefConfig
base_url
  field required (type=value_error.missing)
client_id
  field required (type=value_error.missing)
client_secret
  field required (type=value_error.missing)
[2023-06-05 12:50:55,670 INFO] Stopping application
[2023-06-05 12:50:55,670 INFO] Application stopped
```
Then after some trial and error: running fief-server in docker (fief quickstart) and setting some params:

```
[olivier]dev/jupyverse-zone/jupyverse-fief [🐍 v3.11.3(dev.jupyterlab-auth_fief)]
❯ jupyverse \
  --set frontend.collaborative=true \
  --set auth_fief.base_url=http://localhost:8000 \
  --set auth_fief.client_id=jr1oGZiEsJYI13_Ce5d4yv1ifH8KdjnFetq1cnNBuFg \
  --set auth_fief.client_secret=z4r0K1pTjPK3hesBEboVjIOgdhW8t8j-ct0LObZLzv4
[2023-06-05 12:54:32,209 INFO] Running in development mode
[2023-06-05 12:54:32,238 INFO] Starting application
[2023-06-05 12:54:32,691 INFO] WebDAV user 224d7e6cc693452196f0f3ad95e3e3e9 has password 940d4f7d3e2544be9a4a2bf6a862e41b
2023-06-05 12:54:32,692 INFO: [asgi_webdav.server] ASGI WebDAV Server(v1.3.2) starting...
2023-06-05 12:54:32,692 INFO: [asgi_webdav.auth] Register User: 224d7e6cc693452196f0f3ad95e3e3e9, allow:[''], deny:[]
2023-06-05 12:54:32,692 INFO: [asgi_webdav.web_dav] Mapping Prefix: /webdav => file://.
2023-06-05 12:54:32,693 INFO: [asgi_webdav.server] ASGI WebDAV Server running on http://?:? (Press CTRL+C to quit)
```
```
Exception ignored in: <coroutine object Context.request_resource at 0x7fd9fea1bb40>
Traceback (most recent call last):
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/context.py", line 745, in request_resource
    await wait_event(
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/event.py", line 320, in wait_event
    async with aclosing(stream_events(signals, filter)) as events:
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/contextlib.py", line 374, in __aexit__
    await self.thing.aclose()
RuntimeError: aclose(): asynchronous generator is already running
Exception ignored in: <coroutine object Queue.get at 0x7fd9fe8dde00>
Traceback (most recent call last):
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/asyncio/queues.py", line 160, in get
    getter.cancel()  # Just in case getter is not done yet.
    ^^^^^^^^^^^^^^^
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/asyncio/base_events.py", line 761, in call_soon
    self._check_closed()
  File "/home/olivier/micromamba/envs/jupyverse-dev/lib/python3.11/asyncio/base_events.py", line 519, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
```
(the same pair of "Exception ignored" tracebacks repeats several more times)
```
Exception ignored in: <coroutine object KernelsComponent.start at 0x7fd9fe8f0940>
Traceback (most recent call last):
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/context.py", line 938, in wrapper
    await generator.aclose()
RuntimeError: aclose(): asynchronous generator is already running
Exception ignored in: <coroutine object YjsComponent.start at 0x7fd9fe8f04c0>
Traceback (most recent call last):
  File "/home/olivier/.local/share/hatch/env/virtual/jupyverse/aDgSOB3x/dev.jupyterlab-auth_fief/lib/python3.11/site-packages/asphalt/core/context.py", line 938, in wrapper
    await generator.aclose()
RuntimeError: aclose(): asynchronous generator is already running
```
What is wrong?
I put `auth_fief.base_url=http://localhost:8000`, is that correct?
How does fief know how to redirect back to the jupyterlab frontend?
Btw, should I keep extending this issue, or create another one?
For reference: Experimenting with fief I hit a wall with the docker compose setup - see this discussion https://github.com/orgs/fief-dev/discussions/210
Thanks for reporting the issue @oscar6echo, I opened #316.
I use the beta of Fief cloud, but you can also use the docker image to run the Fief server locally. You will need `auth_fief.base_url` to point to the URL of that server though, not the URL of Jupyverse.
Your Fief user must have some permissions, for instance:
And you can use Fief's roles to manage them nicely.
BTW, `frontend.collaborative=true` is not needed anymore, you just need to install `jupyter_collaboration`. I need to update the documentation.
> btw should I keep extending this issue ? or create another one ?
Maybe use discussions?
For reference, I confirm that it now works in both cases.

With a local fief server in docker:

```shell
# launch jupyverse - terminal 1
jupyverse \
  --set auth_fief.base_url=http://localhost:8001 \
  --set auth_fief.client_id=jr1oGZiEsJYI13_Ce5d4yv1ifH8KdjnFetq1cnNBuFg \
  --set auth_fief.client_secret=z4r0K1pTjPK3hesBEboVjIOgdhW8t8j-ct0LObZLzv4

# launch fief server - terminal 2
docker run \
  --name fief-server \
  --rm \
  -p 8001:8001 \
  --env-file config-docker.env \
  ghcr.io/fief-dev/fief:latest
```

with

```env
# file config-docker.env
# Reference: https://docs.fief.dev/self-hosting/environment-variables/
SECRET=nXnNdCaXygjwevjQ6vafRTaqkO-DQ_m0Eps7DGr_YamEX1wwA7bfCQE4-MXj9cMOLiMADCBVIpd1iDEEydHv7Q
FIEF_CLIENT_ID=jr1oGZiEsJYI13_Ce5d4yv1ifH8KdjnFetq1cnNBuFg
FIEF_CLIENT_SECRET=z4r0K1pTjPK3hesBEboVjIOgdhW8t8j-ct0LObZLzv4
ENCRYPTION_KEY=Nja4Jo9w7RkT3ls0HjJNTtjUvYzDxJAg27w8044pGlI=
PORT=8001
ROOT_DOMAIN=localhost:8001
FIEF_DOMAIN=localhost:8001
FIEF_MAIN_USER_EMAIL=oscar22@gmail.com
FIEF_MAIN_USER_PASSWORD=Toto*/12345
CSRF_COOKIE_SECURE=False
SESSION_DATA_COOKIE_SECURE=False
USER_LOCALE_COOKIE_SECURE=False
LOGIN_HINT_COOKIE_SECURE=False
LOGIN_SESSION_COOKIE_SECURE=False
REGISTRATION_SESSION_COOKIE_SECURE=False
SESSION_COOKIE_SECURE=False
FIEF_ADMIN_SESSION_COOKIE_SECURE=False
```

With Fief cloud:

```shell
# launch jupyverse - terminal 1
jupyverse \
  --set auth_fief.base_url=https://jupyverse-XXXXX.fief.dev \
  --set auth_fief.client_id=XXXXXXXXXXXXXXXXXXXXXXXX \
  --set auth_fief.client_secret=XXXXXXXXXXXXXXXXXXXXXX
```
In both cases with the following role - all 12 permissions are necessary for a regular jupyterlab user experience:
assigned to 2 test users:
IMPORTANT:
The fief server does NOT work with the docker compose proposed in their documentation, as I wrote in a discussion on their repo: https://github.com/orgs/fief-dev/discussions/210
Unfortunately this limitation makes it impossible to use in a corp env, as the online service is not an option, and the "docker run" solution does not allow persisting data, so it is only for testing.
As suggested I open discussion #318 .
Great to see some progress!
> IMPORTANT: The fief server does NOT work with the docker compose proposed in their documentation as I wrote in a discussion on their repo: https://github.com/orgs/fief-dev/discussions/210
> Unfortunately this limitation makes it impossible to use in a corp env as the online service is not an option, and the "docker run" solution does not allow to persist data so is only for testing.
See also this comment, for setting up a local server without Docker. I'm sure we can work with François towards a solution.
> See also this comment, for setting up a local server without Docker. I'm sure we can work with François towards a solution.
Ok that would be good! I added a note to signal interest.
Description
I try to start jupyverse with auth, but cannot connect from the browser:
In a regular install I get:
In dev mode I get:
In both cases no token is displayed in the terminal.
If I click on github to authenticate, I sign in, then get redirected to:
with the url :
https://github.com/login/oauth/authorize?client_id=&redirect_uri=http%3A%2F%2F127.0.0.1%3A8000%2Fauth%2Fgithub%2Fcallback&response_type=code&scope=user%3Aemail&state=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJmYXN0YXBpLXVzZXJzOm9hdXRoLXN0YXRlIiwiZXhwIjoxNjg0MDE4MTkyfQ.P92dWsI2hX_Adznt_leX1ebziIRmbjVVWpaNEQnYwis
Reproduce
Follow doc page install with micromamba then multi-user.
Then launch jupyverse as shown above.
Expected behavior
Not clear but probably not the first case (regular install).
Indeed the doc says:
So maybe a db must be configured? If so, probably sqlite as below - determined empirically:
But how is the person launching jupyverse supposed to have filled this db ? Particularly as it gets created on the fly if missing.
Anyway, I did not see a login screen at any point.
Maybe a minimal workflow should be described including what minimal input is necessary and what output is to be expected.
Or else maybe pointers to the code where the answer implicitly lies.
I plan to investigate jupyverse a bit, so pls do not hesitate to spell out how I could contribute to the doc for users.