goauthentik / authentik

The authentication glue you need.
https://goauthentik.io

Authentik [worker] CPU 100% and redis. #7025

Closed crisukbot closed 6 months ago

crisukbot commented 11 months ago

Describe the bug
The authentik worker pins the CPU at 100% and eventually brings down the entire system. After deleting the Redis folder, everything worked fine. (Maybe there's a problem with how authentik works with Redis?)

To Reproduce
Hard to pin down: I started authentik, and after three to five hours the server shut down.

Expected behavior
Authentik does not shut down my server.

Screenshots
(screenshot attached)

Logs
Not available anymore.

Version and Deployment (please complete the following information):

Zegorax commented 11 months ago

I have the same problem. Worker takes 800% of my CPUs and crashes everything.

sammcj commented 11 months ago

Same problem here on 2023.8.3.

I did notice that despite setting the log level to warning, it seems to run at debug level - I wonder if that's not helping?

{"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1697580575.820365, "version": "2023.8.3"}
{"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1697580575.8209903}
{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1697580575.821178, "path": "authentik.enterprise.settings"}

Honestly, every time I've come across Celery in various projects I've seen performance problems 😞

sammcj commented 11 months ago

Running strace on the process, it looks like it's constantly trying to close file descriptors that don't exist:

Trace of process 1838696 - python3.11 │ python -m manage worker

09:21:22.193794 close(951681938)        = -1 EBADF (Bad file descriptor) <0.000005>
09:21:22.193811 close(951681937)        = -1 EBADF (Bad file descriptor) <0.000005>
09:21:22.193828 close(951681936)        = -1 EBADF (Bad file descriptor) <0.000010>
09:21:22.193850 close(951681935)        = -1 EBADF (Bad file descriptor) <0.000005>
......

lsof:

lsof -p 1838696
COMMAND     PID      USER   FD      TYPE DEVICE SIZE/OFF       NODE NAME
python  1838696 authentik  cwd       DIR  0,376       34 2147513058 /
python  1838696 authentik  rtd       DIR  0,376       34 2147513058 /
python  1838696 authentik  txt       REG  0,376    14472    5389540 /usr/local/bin/python3.11
python  1838696 authentik  mem       REG  0,376    14472    5389540 /usr/local/bin/python3.11
python  1838696 authentik  mem       REG  0,376  4642880 2429934891 /usr/local/lib/python3.11/site-packages/pydantic_core/_pydantic_core.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376 31262256 2421132818 /usr/lib/x86_64-linux-gnu/libicudata.so.72.1
python  1838696 authentik  mem       REG  0,376  2078888 2421132828 /usr/lib/x86_64-linux-gnu/libicuuc.so.72.1
python  1838696 authentik  mem       REG  0,376   268576 2421132737 /usr/lib/x86_64-linux-gnu/libxslt.so.1.1.35
python  1838696 authentik  mem       REG  0,376  1750104 2421132859 /usr/lib/x86_64-linux-gnu/libxml2.so.2.9.14
python  1838696 authentik  mem       REG  0,376   484848 2421132863 /usr/lib/x86_64-linux-gnu/libxmlsec1.so.1.2.37
python  1838696 authentik  mem       REG   0,38 71727588    3356601 /geoip/GeoLite2-City.mmdb
python  1838696 authentik  mem       REG  0,376   299656 2421132861 /usr/lib/x86_64-linux-gnu/libxmlsec1-openssl.so.1.2.37
python  1838696 authentik  mem       REG  0,376   423032 2957237627 /usr/local/lib/python3.11/site-packages/xmlsec.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   526904 1621742567 /usr/local/lib/python3.11/site-packages/lxml/_elementpath.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14640  809902022 /usr/lib/x86_64-linux-gnu/librt.so.1
python  1838696 authentik  mem       REG  0,376 10158968 1621742579 /usr/local/lib/python3.11/site-packages/lxml/etree.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376  4123656 1621742876 /usr/local/lib/python3.11/site-packages/psycopg_c/_psycopg.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   529216  809901980 /usr/lib/x86_64-linux-gnu/libgmp.so.10.4.1
python  1838696 authentik  mem       REG  0,376   292856  809901986 /usr/lib/x86_64-linux-gnu/libhogweed.so.6.6
python  1838696 authentik  mem       REG  0,376   317544  809902001 /usr/lib/x86_64-linux-gnu/libnettle.so.8.6
python  1838696 authentik  mem       REG  0,376    83968  809902037 /usr/lib/x86_64-linux-gnu/libtasn1.so.6.6.3
python  1838696 authentik  mem       REG  0,376  1792040  809902046 /usr/lib/x86_64-linux-gnu/libunistring.so.2.2.0
python  1838696 authentik  mem       REG  0,376   198776  809901988 /usr/lib/x86_64-linux-gnu/libidn2.so.0.3.8
python  1838696 authentik  mem       REG  0,376  1261376  809902008 /usr/lib/x86_64-linux-gnu/libp11-kit.so.0.3.0
python  1838696 authentik  mem       REG  0,376    60328  809902021 /usr/lib/x86_64-linux-gnu/libresolv.so.2
python  1838696 authentik  mem       REG  0,376    22448 1892869753 /usr/lib/x86_64-linux-gnu/libkeyutils.so.1.10
python  1838696 authentik  mem       REG  0,376  2197240  809901982 /usr/lib/x86_64-linux-gnu/libgnutls.so.30.34.3
python  1838696 authentik  mem       REG  0,376   113392 2421132853 /usr/lib/x86_64-linux-gnu/libsasl2.so.2.0.25
python  1838696 authentik  mem       REG  0,376    63832 2421132830 /usr/lib/x86_64-linux-gnu/liblber-2.5.so.0.1.8
python  1838696 authentik  mem       REG  0,376   183048 1892869751 /usr/lib/x86_64-linux-gnu/libk5crypto.so.3.1
python  1838696 authentik  mem       REG  0,376   888080 1892869755 /usr/lib/x86_64-linux-gnu/libkrb5.so.3.3
python  1838696 authentik  mem       REG  0,376   380432 2421132832 /usr/lib/x86_64-linux-gnu/libldap-2.5.so.0.1.8
python  1838696 authentik  mem       REG  0,376   342000 2421132846 /usr/lib/x86_64-linux-gnu/libpq.so.5.15
python  1838696 authentik  mem       REG  0,376  2740136  277643806 /usr/local/lib/python3.11/site-packages/nacl/_sodium.abi3.so
python  1838696 authentik  mem       REG  0,376    52224 1892869757 /usr/lib/x86_64-linux-gnu/libkrb5support.so.0.1
python  1838696 authentik  mem       REG  0,376    18344  809901961 /usr/lib/x86_64-linux-gnu/libcom_err.so.2.1
python  1838696 authentik  mem       REG  0,376  1959696  544762015 /usr/local/lib/python3.11/site-packages/bcrypt/_bcrypt.abi3.so
python  1838696 authentik  mem       REG  0,376  1065976 2957386397 /usr/local/lib/python3.11/site-packages/_cffi_backend.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14480  809901967 /usr/lib/x86_64-linux-gnu/libdl.so.2
python  1838696 authentik  mem       REG  0,376 13449232  273345187 /usr/local/lib/python3.11/site-packages/cryptography/hazmat/bindings/_rust.abi3.so
python  1838696 authentik  mem       REG  0,376    41984 1075279911 /usr/local/lib/python3.11/lib-dynload/_multibytecodec.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   418056 2696146050 /usr/local/lib/python3.11/site-packages/charset_normalizer/md__mypyc.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   125312  809901976 /usr/lib/x86_64-linux-gnu/libgcc_s.so.1
python  1838696 authentik  mem       REG  0,376  2190440  809902033 /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30
python  1838696 authentik  mem       REG  0,376  1016256 2158917686 /usr/local/lib/python3.11/site-packages/msgpack/_cmsgpack.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   334584 1892869747 /usr/lib/x86_64-linux-gnu/libgssapi_krb5.so.2.2
python  1838696 authentik  mem       REG  0,376  1862600 1621742878 /usr/local/lib/python3.11/site-packages/psycopg_c/pq.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    28288 1075279755 /usr/local/lib/python3.11/lib-dynload/mmap.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   252208 2158892760 /usr/local/lib/python3.11/site-packages/_argon2_cffi_bindings/_ffi.abi3.so
python  1838696 authentik  mem       REG  0,376   724400 2957386396 /usr/local/lib/python3.11/site-packages/_cbor2.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    35480  810122095 /usr/local/lib/python3.11/site-packages/websockets/speedups.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    20056 1075279912 /usr/local/lib/python3.11/lib-dynload/_multiprocessing.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376  1131664 1075279765 /usr/local/lib/python3.11/lib-dynload/unicodedata.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    37344 1075279895 /usr/local/lib/python3.11/lib-dynload/_csv.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    15120 1075279753 /usr/local/lib/python3.11/lib-dynload/grp.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14480  809902020 /usr/lib/x86_64-linux-gnu/libpthread.so.0
python  1838696 authentik  mem       REG  0,376   188976 1346621769 /usr/local/lib/python3.11/site-packages/zope/interface/_zope_interface_coptimizations.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    34872  809902049 /usr/lib/x86_64-linux-gnu/libuuid.so.1.3.0
python  1838696 authentik  mem       REG  0,376    14504 1075279744 /usr/local/lib/python3.11/lib-dynload/_uuid.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    43480  809901975 /usr/lib/x86_64-linux-gnu/libffi.so.8.1.2
python  1838696 authentik  mem       REG  0,376   115072 1075279896 /usr/local/lib/python3.11/lib-dynload/_ctypes.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    36832 1075279907 /usr/local/lib/python3.11/lib-dynload/_json.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    32088 1075279747 /usr/local/lib/python3.11/lib-dynload/_zoneinfo.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    31232 1075279764 /usr/local/lib/python3.11/lib-dynload/termios.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    92776 1075279914 /usr/local/lib/python3.11/lib-dynload/_pickle.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    19568 1075279917 /usr/local/lib/python3.11/lib-dynload/_queue.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    61992 1075279883 /usr/local/lib/python3.11/lib-dynload/_asyncio.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14320 1075279893 /usr/local/lib/python3.11/lib-dynload/_contextvars.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    22800 1075279906 /usr/local/lib/python3.11/lib-dynload/_heapq.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    23080 1075279916 /usr/local/lib/python3.11/lib-dynload/_posixsubprocess.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    22952 1075279752 /usr/local/lib/python3.11/lib-dynload/fcntl.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14288 1075279935 /usr/local/lib/python3.11/lib-dynload/_typing.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    14704 1075279913 /usr/local/lib/python3.11/lib-dynload/_opcode.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    17496 2696146048 /usr/local/lib/python3.11/site-packages/charset_normalizer/md.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376  2504120 3238437676 /usr/local/lib/python3.11/site-packages/yaml/_yaml.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    19232 1075279760 /usr/local/lib/python3.11/lib-dynload/resource.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   212592 1075279902 /usr/local/lib/python3.11/lib-dynload/_decimal.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   190456  809901992 /usr/lib/x86_64-linux-gnu/liblzma.so.5.4.1
python  1838696 authentik  mem       REG  0,376    40816 1075279909 /usr/local/lib/python3.11/lib-dynload/_lzma.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    74688  809901953 /usr/lib/x86_64-linux-gnu/libbz2.so.1.0.4
python  1838696 authentik  mem       REG  0,376    27824 1075279886 /usr/local/lib/python3.11/lib-dynload/_bz2.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    41104 1075279768 /usr/local/lib/python3.11/lib-dynload/zlib.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   692256  273373124 /usr/lib/x86_64-linux-gnu/libssl.so.3
python  1838696 authentik  mem       REG  0,376   208576 1075279925 /usr/local/lib/python3.11/lib-dynload/_ssl.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    88552 1075279900 /usr/local/lib/python3.11/lib-dynload/_datetime.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    58112 1075279748 /usr/local/lib/python3.11/lib-dynload/array.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    32576 1075279761 /usr/local/lib/python3.11/lib-dynload/select.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    96040 1075279923 /usr/local/lib/python3.11/lib-dynload/_socket.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    31920 1075279922 /usr/local/lib/python3.11/lib-dynload/_sha512.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    58000 1075279754 /usr/local/lib/python3.11/lib-dynload/math.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    40368 1075279885 /usr/local/lib/python3.11/lib-dynload/_blake2.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376  4709656  273373123 /usr/lib/x86_64-linux-gnu/libcrypto.so.3
python  1838696 authentik  mem       REG  0,376   121280  809902053 /usr/lib/x86_64-linux-gnu/libz.so.1.2.13
python  1838696 authentik  mem       REG  0,376    19104 1075279918 /usr/local/lib/python3.11/lib-dynload/_random.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    55344 1075279905 /usr/local/lib/python3.11/lib-dynload/_hashlib.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    15056 1075279884 /usr/local/lib/python3.11/lib-dynload/_bisect.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    27888 1075279750 /usr/local/lib/python3.11/lib-dynload/binascii.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    50768 1075279927 /usr/local/lib/python3.11/lib-dynload/_struct.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    64552 1075279903 /usr/local/lib/python3.11/lib-dynload/_elementtree.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376   174216 1892869739 /usr/lib/x86_64-linux-gnu/libexpat.so.1.8.10
python  1838696 authentik  mem       REG  0,376    57648 1075279758 /usr/local/lib/python3.11/lib-dynload/pyexpat.cpython-311-x86_64-linux-gnu.so
python  1838696 authentik  mem       REG  0,376    27028 1346707067 /usr/lib/x86_64-linux-gnu/gconv/gconv-modules.cache
python  1838696 authentik  mem       REG  0,376   353616  809901860 /usr/lib/locale/C.utf8/LC_CTYPE
python  1838696 authentik  mem       REG  0,376   907784  809901993 /usr/lib/x86_64-linux-gnu/libm.so.6
python  1838696 authentik  mem       REG  0,376  1922136  809901954 /usr/lib/x86_64-linux-gnu/libc.so.6
python  1838696 authentik  mem       REG  0,376  5128248 1346706731 /usr/local/lib/libpython3.11.so.1.0
python  1838696 authentik  mem       REG  0,376   210968  809901872 /usr/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2
python  1838696 authentik    0u      CHR    1,3      0t0          5 /dev/null
python  1838696 authentik    1w     FIFO   0,14      0t0  170827659 pipe
python  1838696 authentik    2w     FIFO   0,14      0t0  170827660 pipe
python  1838696 authentik    3u  a_inode   0,15        0       1060 [eventpoll:4,6]
python  1838696 authentik    4u     sock    0,8      0t0  170847253 protocol: UNIX-STREAM
python  1838696 authentik    5u     sock    0,8      0t0  170847254 protocol: UNIX-STREAM
python  1838696 authentik    6r     FIFO   0,14      0t0  170847255 pipe
python  1838696 authentik    7w     FIFO   0,14      0t0  170847255 pipe
python  1838696 authentik    8u     sock    0,8      0t0  170847267 protocol: TCP
python  1838696 authentik    9r      REG   0,38 71727588    3356601 /geoip/GeoLite2-City.mmdb
python  1838696 authentik   10u  a_inode   0,15        0       1060 [eventpoll]
python  1838696 authentik   11u  a_inode   0,15        0       1060 [eventpoll]
python  1838696 authentik   12u     sock    0,8      0t0  170847281 protocol: TCP
python  1838696 authentik   13u  a_inode   0,15        0       1060 [eventpoll:27,30,32,40,45]
python  1838696 authentik   17u  a_inode   0,15        0       1060 [eventpoll]
python  1838696 authentik   18r      CHR    1,3      0t0          5 /dev/null
python  1838696 authentik   19w     FIFO   0,14      0t0  170847290 pipe

sammcj commented 11 months ago

Perhaps interesting - if I reduce the worker CPU limit to 0.1, I notice the following errors:

β”‚   {"event": "Timed out waiting for UP message from <ForkProcess(ForkPoolWorker-50, started daemon)>", "level": "error", "logger": "celery.concu   β”‚
β”‚   rrency.asynpool", "timestamp": 1697598225.8266332}                                                                                              β”‚
β”‚   Process 'ForkPoolWorker-48' pid:487 exited with 'signal 9 (SIGKILL)'                                                                            β”‚
β”‚   {"event": "Timed out waiting for UP message from <ForkProcess(ForkPoolWorker-51, started daemon)>", "level": "error", "logger": "celery.concu   β”‚
β”‚   rrency.asynpool", "timestamp": 1697598232.0265675}                                                                                              

This could be because it's only got 0.1 CPU and it's too slow for whatever it's checking there - but it might also be a hint as to what's going wrong.

Where is this "UP" message supposed to come from? Celery? The worker app? Something in another container? Perhaps this will give a hint.

sammcj commented 11 months ago

Found the bad file descriptor!

{
    "event": "Encountered OSError when accessing fd 39 ",
    "exception": [
        {
            "exc_type": "OSError",
            "exc_value": "[Errno 9] Bad file descriptor",
            "frames": [
                {
                    "filename": "/usr/local/lib/python3.11/site-packages/celery/concurrency/asynpool.py",
                    "line": "",
                    "lineno": 215,
                    "locals": {
                        "_meta_fd_argument_maker": "'<function iterate_file_descriptors_safely.<locals>._meta_fd_argument_maker at 0x'+13",
                        "args": "'(<function ResultHandler._make_process_result.<locals>.on_result_readable at 0x7'+21",
                        "fd": "39",
                        "fds_iter": "'{39: <ForkProcess(ForkPoolWorker-3, stopped[SIGTERM] daemon)>, 37: <ForkProcess('+35",
                        "hub_args": "'[<function ResultHandler._make_process_result.<locals>.on_result_readable at 0x7'+17",
                        "hub_kwargs": "{}",
                        "hub_method": "<bound method Hub.add_reader of <Hub@0x7faabc2d9190: R:1 W:0>>",
                        "kwargs": "{}",
                        "source_data": "'{39: <ForkProcess(ForkPoolWorker-3, stopped[SIGTERM] daemon)>, 37: <ForkProcess('+35",
                        "stale_fds": "[]"
                    },
                    "name": "iterate_file_descriptors_safely"
                },
                {
                    "filename": "/usr/local/lib/python3.11/site-packages/kombu/asynchronous/hub.py",
                    "line": "",
                    "lineno": 220,
                    "locals": {
                        "args": "(39,)",
                        "callback": "'<function ResultHandler._make_process_result.<locals>.on_result_readable at 0x7f'+11",
                        "fds": "39",
                        "self": "<Hub@0x7faabc2d9190: R:1 W:0>"
                    },
                    "name": "add_reader"
                },
                {
                    "filename": "/usr/local/lib/python3.11/site-packages/kombu/asynchronous/hub.py",
                    "line": "",
                    "lineno": 170,
                    "locals": {
                        "args": "(39,)",
                        "callback": "'<function ResultHandler._make_process_result.<locals>.on_result_readable at 0x7f'+11",
                        "consolidate": "False",
                        "fd": "39",
                        "flags": "25",
                        "self": "<Hub@0x7faabc2d9190: R:1 W:0>"
                    },
                    "name": "add"
                },
                {
                    "filename": "/usr/local/lib/python3.11/site-packages/kombu/utils/eventio.py",
                    "line": "",
                    "lineno": 66,
                    "locals": {
                        "events": "25",
                        "fd": "39",
                        "self": "<kombu.utils.eventio._epoll object at 0x7faabc2d9290>"
                    },
                    "name": "register"
                }
            ],
            "is_cause": false,
            "syntax_error": null
        }
    ],
    "level": "warning",
    "logger": "celery.concurrency.asynpool",
    "timestamp": 1697600502.6591609
}

null-affinity commented 11 months ago

After updating to 2023.8.3, I am also running into the issue where my authentik-worker container is running at 100% CPU. No relevant errors coming out of stderr, but I can reproduce the "Bad file descriptor" message others are reporting by running strace on the worker process. The application is otherwise running nominally, just consuming unnecessary resources. Deployed via docker compose.

sammcj commented 11 months ago

Anyone had any luck working around this?

sammcj commented 11 months ago

Something else weird about Authentik I noticed: of all the many docker containers I have running, Authentik is the only one that consistently shows incorrect uptime.

In the below example I created and started these Authentik containers around 3 minutes before taking this screenshot:

(screenshot: SCR-20231024-hmvp)

By chance Authentik isn't doing something funky with time is it?


This problem seems to be widespread, with both open and closed tickets.

I've also started a discussion on Discord as this isn't getting much traction on GitHub: https://discord.com/channels/809154715984199690/1166120303341084693/1166120303341084693

sammcj commented 11 months ago

Something changed with the new Authentik image (latest tag) pushed two days ago and my worker is no longer pegging the CPU:

(screenshot)

Any ideas what the fix was (if it is fixed)?

crisukbot commented 10 months ago

(screenshot) After the update, CPU is still at 100%.

sammcj commented 10 months ago

I spoke too soon - it's back to pegging the CPU again!

(screenshot: SCR-20231128-kvrq)

(note the container is limited to 0.25 CPU because of this, hence 24-25% CPU usage showing)

DKingAlpha commented 9 months ago

Collecting samples from 'python -m manage worker' (python v3.11.5)
Total Samples 1700
GIL: 26.00%, Active: 100.00%, Threads: 1

  %Own   %Total  OwnTime  TotalTime  Function (filename)                                                                                                                                    
100.00% 100.00%   17.00s    17.00s   close_open_fds (billiard/compat.py)
  0.00% 100.00%   0.000s    17.00s   handle (authentik/core/management/commands/worker.py)
  0.00% 100.00%   0.000s    17.00s   execute (django/core/management/base.py)
  0.00% 100.00%   0.000s    17.00s   _run_code (<frozen runpy>)
  0.00% 100.00%   0.000s    17.00s   execute (django/core/management/__init__.py)
  0.00% 100.00%   0.000s    17.00s   <module> (manage.py)
  0.00% 100.00%   0.000s    17.00s   _bootstrap (billiard/process.py)
  0.00% 100.00%   0.000s    17.00s   run_from_argv (django/core/management/base.py)
  0.00% 100.00%   0.000s    17.00s   _run_module_as_main (<frozen runpy>)
  0.00% 100.00%   0.000s    17.00s   execute_from_command_line (django/core/management/__init__.py)
  0.00% 100.00%   0.000s    17.00s   __init__ (billiard/popen_fork.py)
  0.00% 100.00%   0.000s    17.00s   _launch (billiard/popen_fork.py)
  0.00% 100.00%   0.000s    17.00s   start (celery/worker/worker.py)
  0.00% 100.00%   0.000s    17.00s   run (celery/beat.py)
  0.00% 100.00%   0.000s    17.00s   start (billiard/process.py)
  0.00% 100.00%   0.000s    17.00s   _Popen (billiard/context.py)
  0.00% 100.00%   0.000s    17.00s   start (celery/bootsteps.py)

The looping call stack:

Thread 36 (active): "MainThread"
    close_open_fds (billiard/compat.py:150)
    run (celery/beat.py:713)
    _bootstrap (billiard/process.py:323)
    _launch (billiard/popen_fork.py:77)
    __init__ (billiard/popen_fork.py:22)
    _Popen (billiard/context.py:331)
    _Popen (billiard/context.py:274)
    start (billiard/process.py:120)
    start (celery/bootsteps.py:365)
    start (celery/bootsteps.py:116)
    start (celery/worker/worker.py:202)
    handle (authentik/core/management/commands/worker.py:47)
    execute (django/core/management/base.py:458)
    run_from_argv (django/core/management/base.py:412)
    execute (django/core/management/__init__.py:436)
    execute_from_command_line (django/core/management/__init__.py:442)
    <module> (manage.py:31)
    _run_code (<frozen runpy>:88)
    _run_module_as_main (<frozen runpy>:198)

DKingAlpha commented 9 months ago

So this looks like an internal error of celery:

def get_fdmax(default=None):
    """Return the maximum number of open file descriptors
    on this system.

    :keyword default: Value returned if there's no file
                      descriptor limit.

    """
    try:
        return os.sysconf('SC_OPEN_MAX')    # container has value 1073741816
    except:
        pass
    if resource is None:  # Windows
        return default
    fdmax = resource.getrlimit(resource.RLIMIT_NOFILE)[1]
    if fdmax == resource.RLIM_INFINITY:
        return default
    return fdmax

# ............
# ............
    def close_open_fds(keep=None):  # noqa
        keep = [maybe_fileno(f)
                for f in (keep or []) if maybe_fileno(f) is not None]
        for fd in reversed(range(get_fdmax(default=2048))):
            if fd not in keep:
                try:
                    os.close(fd)
                except OSError as exc:
                    if exc.errno != errno.EBADF:
                        raise

close_open_fds won't return until it has attempted a close() on every fd from 1073741815 down to 0. Even at one microsecond per close() call, that's roughly 18 minutes of pure busy-looping - and the strace above shows 5-10µs per failed close.

I believe they should at least enumerate the valid fds from procfs instead of scanning over the whole range.
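
For illustration, a minimal sketch of the procfs approach suggested above (the function name is mine, not billiard's): list /proc/self/fd to close only descriptors that actually exist, so the loop is bounded by the number of open files rather than by SC_OPEN_MAX.

import os

def close_open_fds_procfs(keep=None):
    # Linux-only sketch: every entry in /proc/self/fd is a currently-open
    # descriptor of this process. Snapshot the list before closing anything,
    # because listdir() itself briefly opens one descriptor.
    keep = set(keep or [])
    open_fds = sorted(int(name) for name in os.listdir("/proc/self/fd"))
    for fd in open_fds:
        if fd in keep:
            continue
        try:
            os.close(fd)
        except OSError:
            # The descriptor listdir() used is already gone; EBADF is harmless.
            pass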

PKizzle commented 9 months ago

@DKingAlpha Could you report this over at the celery/billiard repository?

DKingAlpha commented 9 months ago

@DKingAlpha Could you report this over at the celery repository?

Well I may need to correct myself about "internal error of celery".

My host system reports os.sysconf('SC_OPEN_MAX') == 1024, while the container of authentik worker reports os.sysconf('SC_OPEN_MAX') == 1073741816.

Of course celery could have closed all fds in a safer manner, but authentik just doesn't need 1073741816 fds.

I believe the authentik container should lower its fd max, then all should be good.

EDIT: authentik has nothing to do with the value. It could be the docker daemon that changed it. Other unrelated containers also report an inconsistent SC_OPEN_MAX.
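
A quick way to see the discrepancy described above - run this both on the host and inside the worker container (a minimal sketch; in this thread the host reported 1024 and the container 1073741816):

import os
import resource

# SC_OPEN_MAX is what billiard's get_fdmax() consults first.
print("SC_OPEN_MAX:", os.sysconf("SC_OPEN_MAX"))
# The (soft, hard) RLIMIT_NOFILE pair is what `ulimit -Sn` / `ulimit -Hn` report.
print("RLIMIT_NOFILE:", resource.getrlimit(resource.RLIMIT_NOFILE))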

DKingAlpha commented 9 months ago

I dug into this and found many interesting things.

  1. libc.sysconf returns libc.getdtablesize() for _SC_OPEN_MAX, which is actually __getrlimit (RLIMIT_NOFILE, &ru) < 0 ? OPEN_MAX : ru.rlim_cur;.
  2. Bash ulimit -n basically returns libc.getdtablesize() too.
  3. The Linux kernel supports a getdtablesize() syscall which returns sysctl_nr_open (default 1024*1024), but it is not used by userland libc (bash ulimit / sysconf etc.).
  4. __getrlimit is a kernel syscall, and the values should be consistent across different containers on the same host since they share the same kernel. However, they are not.
  5. The default systemd docker.service file changes the ulimits for the dockerd process.
  6. Docker provides a command-line option, --ulimit, for users to change it when creating containers.

To summarize:

A. Commenting these "infinity" lines out should fix it globally.

/lib/systemd/system/docker.service

# Having non-zero Limit*s causes performance problems due to accounting overhead
# in the kernel. We recommend using cgroups to do container-local accounting.
LimitNOFILE=infinity
LimitNPROC=infinity
LimitCORE=infinity

B. Without changing the service file, adding these to docker-compose.yml instead should also fix it:

services:
  worker:
    ulimits:
      nofile:
        soft: 10240
        hard: 10240
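
Assuming the compose snippet above, one way to confirm the override took effect is to check the limit from inside the worker container (a sketch; e.g. via docker compose exec worker python3):

import resource

# Both values should now match the compose override (10240 in the example above).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft:", soft, "hard:", hard)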

sammcj commented 9 months ago

Great find @DKingAlpha !

Confirming that after adding the ulimits via docker-compose, the CPU drops down to 0% after ~30s:

(screenshot)

rissson commented 9 months ago

For reference:

Here's what I have at home:

$ ulimit -Sn
1048576
# ulimit -Hn
1048576

And we have the same in production at authentik

The weird thing is that (at home at least), I also have

LimitNOFILE=infinity

in my systemd docker.service unit (systemctl cat docker.service).

DKingAlpha commented 9 months ago

(quoting rissson's comment above in full)

The host and the docker container can return different ulimit values. Where is 1048576 from - the host or the container? Can you list both of them?

rissson commented 9 months ago

Sorry, I should have mentioned that both of those are directly from inside the container.

On the host (personal instance) it's:

# ulimit -Sn
1024
# ulimit -Hn
1048576

DKingAlpha commented 9 months ago

(quoting rissson's comment above in full)

Sorry for the late reply. ulimit -n returns rlim_cur, i.e. the soft limit (same as -Sn); I believe 1024 is what we are looking for.

As for infinity, my guess is that systemd on your Linux distribution defines it as 0x100000 (1048576), but I don't have time to check the source code right now.

DKingAlpha commented 9 months ago

To understand why infinity is NOT RLIM_INFINITY, these keywords are related; searching for them in https://github.dev/systemd/systemd may help.

systemd/systemd/src/basic/fd-util.c

int read_nr_open(void) {
        _cleanup_free_ char *nr_open = NULL;
        int r;

        /* Returns the kernel's current fd limit, either by reading it of /proc/sys if that works, or using the
         * hard-coded default compiled-in value of current kernels (1M) if not. This call will never fail. */

        r = read_one_line_file("/proc/sys/fs/nr_open", &nr_open);
        if (r < 0)
                log_debug_errno(r, "Failed to read /proc/sys/fs/nr_open, ignoring: %m");
        else {
                int v;

                r = safe_atoi(nr_open, &v);
                if (r < 0)
                        log_debug_errno(r, "Failed to parse /proc/sys/fs/nr_open value '%s', ignoring: %m", nr_open);
                else
                        return v;
        }

        /* If we fail, fall back to the hard-coded kernel limit of 1024 * 1024. */
        return 1024 * 1024;
}

kernel: fs/file_table.c

    {
        .procname   = "nr_open",
        .data       = &sysctl_nr_open,
        .maxlen     = sizeof(unsigned int),
        .mode       = 0644,
        .proc_handler   = proc_dointvec_minmax,
        .extra1     = &sysctl_nr_open_min,
        .extra2     = &sysctl_nr_open_max,
    },

kernel: fs/file.c

unsigned int sysctl_nr_open __read_mostly = 1024*1024;       // THIS EXPLAINS 1048576

DKingAlpha commented 9 months ago

I am still figuring out why my system returns 0x3FFFFFF8 from nr_open

EDIT: the kernel default value for sysctl_nr_open is 1024*1024; this is confirmed by static analysis. So it was changed by some runtime process, but I have lost interest in finding out which.
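
For anyone comparing systems, the runtime value can be read straight from procfs - the same file the systemd helper above reads:

# Default kernels print 1048576 (1024*1024); the affected host above
# reported 1073741816, so something raised it after boot.
with open("/proc/sys/fs/nr_open") as f:
    print(f.read().strip())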

rissson commented 9 months ago

Alright, I get why those might be different from system to system - thank you for your detailed analysis! Can we specify higher values than what is currently in #7762? The values I put above are what Debian and Amazon Linux (based on CentOS) are setting; they should be good defaults.

DKingAlpha commented 9 months ago

Hi rissson, I would rather not give you some random number and have it merged into your product. 10240 is a number I made up that LOOKS okay to me. Instead, I would advise you to write a test Python script that closes fds from TestMaxNoFile down to 3, one by one in a try/except, single-threaded, so you get a basic idea of how long it takes. Also pay attention to the peak fd count per 1000 units of concurrency. Then you will get a good number that is both small enough and large enough.
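
A rough sketch of the benchmark being suggested (TEST_MAX_NOFILE is a made-up placeholder; run it only as a throwaway standalone script, since it really does close the process's descriptors):

import os
import time

TEST_MAX_NOFILE = 10240  # candidate limit to evaluate

start = time.monotonic()
for fd in range(TEST_MAX_NOFILE, 2, -1):  # fds TEST_MAX_NOFILE down to 3
    try:
        os.close(fd)
    except OSError:
        pass  # almost none of these are open, so EBADF is the common case
elapsed = time.monotonic() - start
print(f"closing fds {TEST_MAX_NOFILE}..3 took {elapsed:.3f}s")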

BeryJu commented 9 months ago

I'd also prefer finding out what is causing this - whether it's our code or celery. I could imagine it's our blueprint-watching code.

DKingAlpha commented 9 months ago

I don't think this is anyone's fault. This whole mess is the result of a combination of historical debt, long-standing real-world solutions, and all-in-one components like systemd. We are currently considering mitigating this major issue by adding a few explicit parameters.

Let's just find a good number and move on.

tonydm commented 9 months ago

Thank you everybody for your efforts to resolve this issue. As @DKingAlpha suggested, I modified my docker-compose.yml file, adding the ulimits option. This change didn't resolve the high CPU % or file I/O. I then modified the systemd docker.service, commenting out the Limitxxxx entries, and ran sudo systemctl daemon-reload; sudo systemctl restart docker.service. Same behavior: no discernible improvement in the high CPU %, though I did see a slight improvement in file I/O - still not a workable resolution.

  worker:
    image: ${AUTHENTIK_IMAGE:-ghcr.io/goauthentik/server}:${AUTHENTIK_TAG:-2023.10.4}
    container_name: authentik_worker
    restart: unless-stopped
    security_opt:
      - no-new-privileges:true
    user: "1001:994"
    ulimits:
      nofile:
        soft: 10240
        hard: 10240
    command: worker
    networks:
      authentik_network:
        ipv4_address: 192.168.45.6
    environment:
      AUTHENTIK_REDIS__HOST: redis
      AUTHENTIK_POSTGRESQL__HOST: postgresql
      AUTHENTIK_POSTGRESQL__USER: ${PG_USER:-authentik}
      AUTHENTIK_POSTGRESQL__NAME: ${PG_DB:-authentik}
      AUTHENTIK_POSTGRESQL__PASSWORD: ${PG_PASS}
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /etc/timezone:/etc/timezone:ro
      - /var/run/docker.sock:/var/run/docker.sock
      - ./media:/media
      - ./certs:/certs
      - ./custom-templates:/templates
    depends_on:
      - server
      - postgresql
      - redis

(screenshot: Selection_080)

DKingAlpha commented 9 months ago

@tonydm Did you recreate the container rather than just restarting it?

Migration also took me a long time, and as far as I'm aware there was a bug (#7326) that broke migration between earlier versions, which needed another fix. See if that's related to this case.

If docker compose down/up does not help, try profiling the Python process in the container.

cenkalti commented 9 months ago

My issue is resolved after setting ulimits for the worker in the docker-compose file. The CPU didn't drop to zero immediately because the worker container had been stopped for the last few weeks and many tasks were queued up. After it processed all of them and the logs stopped flowing, the CPU usage did drop to zero.

tonydm commented 9 months ago

@DKingAlpha - I did an up -d --force-recreate. When you say "profiling python process in container", are you referring to profiling a process in the worker container, or in a separate container in the same stack?

@cenkalti - How long would you say it took? I have other services on this server that grind to a halt. I'm OK with giving it some time, but there's a limit to how long I can tolerate.

cenkalti commented 9 months ago

@tonydm I haven't measured it specifically but it felt like 10 minutes to me.

tonydm commented 9 months ago

@cenkalti - Thank you for that. In my case, even at 30 min, I was still seeing high CPU and file I/O. I admit I'm somewhat ignorant of what the "tasks that build up" refers to. I'm not understanding what tasks would build up/queue, given Authentik's role. In my case, no services are configured to send requests to this Authentik deployment yet. As soon as I created and spun up the stack, I immediately saw the issue, and therefore hadn't moved on to configuring any services to use Authentik.

DKingAlpha commented 9 months ago

@DKingAlpha - I did an up -d --force-recreate. When you say "profiling python process in container", are you referring to profiling a process in the worker container, or in a separate container in the same stack?

I will write a detailed guide here. Hope this helps other people with further investigation.

How to locate the hot spot

  1. Do a docker stats -a and find the hot container.
  2. In the hot container, do a top and find the hot process.
  3. In my case, luckily, the hot process is a Python process. There are tools like py-spy that help you find the hot Python code directly. Do a py-spy top --pid <PID>, which will give you output like this:
Collecting samples from 'python -m manage worker' (python v3.11.5)
Total Samples 1700
GIL: 26.00%, Active: 100.00%, Threads: 1

  %Own   %Total  OwnTime  TotalTime  Function (filename)                                                                                                                                    
100.00% 100.00%   17.00s    17.00s   close_open_fds (billiard/compat.py)
  0.00% 100.00%   0.000s    17.00s   handle (authentik/core/management/commands/worker.py)
  0.00% 100.00%   0.000s    17.00s   execute (django/core/management/base.py)
  ..............
  4. py-spy dump --pid <PID> will give you this:

The looping call stack:

Thread 36 (active): "MainThread"
    close_open_fds (billiard/compat.py:150)
    run (celery/beat.py:713)
    ............

This will be a great starting point. @tonydm

tonydm commented 9 months ago

@DKingAlpha - Awesome, and thank you for the detail. I wasn't able to get a good snapshot, though: once I start the Authentik stack, the terminal becomes so unresponsive that I'm lucky to get the stack shut down.

I'm using a DigitalOcean droplet. I admit it's not a beefy one - just a base config with 1 CPU and 1 GB RAM. The average CPU usage hovers around 5%. All my current production stack runs is CF Tunnel, Traefik, CrowdSec, CrowdSec Bouncer, Portainer, and WordPress with MySQL. I don't know what Authentik requires for resources, but I didn't expect this drastic a result.

With all that said, attempting to test as you so kindly detailed: docker stats -a output is only partially written to the screen, and I am not able to exec into the container due to the high CPU load. Since I can't test and get the info requested, perhaps the log can shed some light. And perhaps not.

Maybe this VM is simply under-provisioned for the stack I'm trying to test/run. I will try this on one of my homelab servers, where I can provision an LXC or VM with, say, 4 cores and 2 GB RAM. Sorry to have yelled "squirrel" prematurely if that's the case.

Attaching to authentik_server, authentik_worker, postgresql, redis
postgresql        | 
postgresql        | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgresql        | 
redis             | 1:C 23 Dec 2023 22:13:53.285 # WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition. Being disabled, it can also cause failures without low memory condition, see https://github.com/jemalloc/jemalloc/issues/1328. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
postgresql        | chmod: /var/run/postgresql: Operation not permitted
postgresql        | 2023-12-23 22:13:53.363 UTC [1] LOG:  starting PostgreSQL 12.17 on x86_64-pc-linux-musl, compiled by gcc (Alpine 13.2.1_git20231014) 13.2.1 20231014, 64-bit
postgresql        | 2023-12-23 22:13:53.363 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgresql        | 2023-12-23 22:13:53.363 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgresql        | 2023-12-23 22:13:53.370 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgresql        | 2023-12-23 22:13:53.402 UTC [13] LOG:  database system was shut down at 2023-12-23 22:06:27 UTC
postgresql        | 2023-12-23 22:13:53.412 UTC [1] LOG:  database system is ready to accept connections
authentik_server  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369636.5412555, "file": "/authentik/lib/default.yml"}
authentik_server  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369636.5454283, "count": 14}
authentik_server  | {"event": "Starting authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369636.5461152}
authentik_server  | {"event": "PostgreSQL connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369636.5973084}
authentik_server  | {"event": "Redis Connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369636.6148486}
authentik_server  | {"event": "Finished authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369636.6149015}
authentik_worker  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369637.7875323, "file": "/authentik/lib/default.yml"}
authentik_worker  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369637.7888894, "count": 14}
authentik_worker  | {"event": "Starting authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369637.7906113}
authentik_worker  | {"event": "PostgreSQL connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369637.8375165}
authentik_worker  | {"event": "Redis Connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369637.8541021}
authentik_worker  | {"event": "Finished authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369637.854802}
authentik_server  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369639.1881032, "file": "/authentik/lib/default.yml"}
authentik_server  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369639.189375, "count": 14}
authentik_worker  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369639.6442103, "file": "/authentik/lib/default.yml"}
authentik_worker  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369639.6490474, "count": 14}
authentik_server  | 2023-12-23 22:13:59 [info     ] applying django migrations
authentik_server  | 2023-12-23 22:13:59 [info     ] waiting to acquire database lock
authentik_worker  | 2023-12-23 22:14:00 [info     ] applying django migrations
authentik_worker  | 2023-12-23 22:14:00 [info     ] waiting to acquire database lock
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5621328, "path": "authentik.admin.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5653405, "path": "authentik.crypto.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.567512, "path": "authentik.events.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5702424, "path": "authentik.outposts.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5761812, "path": "authentik.policies.reputation.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.581808, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5854623, "path": "authentik.sources.ldap.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5873864, "path": "authentik.sources.oauth.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.589161, "path": "authentik.sources.plex.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.5951707, "path": "authentik.stages.authenticator_totp.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.6086872, "path": "authentik.blueprints.settings"}
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369644.609579, "version": "2023.10.4"}
authentik_server  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369644.6124024}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369644.613856, "path": "authentik.enterprise.settings"}
authentik_server  | {"event": "Loaded GeoIP database", "last_write": 1700595417.0, "level": "info", "logger": "authentik.events.geo", "pid": 8, "timestamp": "2023-12-23T22:14:06.650937"}
authentik_server  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 8, "task_id": "f545370c07be486693f29dd6ae3c810b", "task_name": "authentik.blueprints.v1.tasks.blueprints_discovery", "timestamp": "2023-12-23T22:14:08.616136"}
authentik_server  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 8, "task_id": "ed3349f311524861825e3f51cc79d049", "task_name": "authentik.blueprints.v1.tasks.clear_failed_blueprints", "timestamp": "2023-12-23T22:14:08.618707"}
authentik_server  | {"app_name": "authentik.stages.authenticator", "event": "Could not import app's URLs", "exc": "ModuleNotFoundError(\"No module named 'authentik.stages.authenticator.urls'\")", "level": "warning", "logger": "authentik.api.v3.urls", "pid": 8, "timestamp": "2023-12-23T22:14:09.030950"}
authentik_server  | Operations to perform:
authentik_server  |   Apply all migrations: auth, authentik_blueprints, authentik_core, authentik_crypto, authentik_enterprise, authentik_events, authentik_flows, authentik_outposts, authentik_policies, authentik_policies_dummy, authentik_policies_event_matcher, authentik_policies_expiry, authentik_policies_expression, authentik_policies_password, authentik_policies_reputation, authentik_providers_ldap, authentik_providers_oauth2, authentik_providers_proxy, authentik_providers_radius, authentik_providers_saml, authentik_providers_scim, authentik_rbac, authentik_sources_ldap, authentik_sources_oauth, authentik_sources_plex, authentik_sources_saml, authentik_stages_authenticator_duo, authentik_stages_authenticator_sms, authentik_stages_authenticator_static, authentik_stages_authenticator_totp, authentik_stages_authenticator_validate, authentik_stages_authenticator_webauthn, authentik_stages_captcha, authentik_stages_consent, authentik_stages_deny, authentik_stages_dummy, authentik_stages_email, authentik_stages_identification, authentik_stages_invitation, authentik_stages_password, authentik_stages_prompt, authentik_stages_user_delete, authentik_stages_user_login, authentik_stages_user_logout, authentik_stages_user_write, authentik_tenants, contenttypes, guardian, sessions
authentik_server  | Running migrations:
authentik_server  |   No migrations to apply.
authentik_server  | {"event": "Bootstrap completed", "level": "info", "logger": "bootstrap"}
authentik_server  | {"event":"Loaded config","level":"debug","path":"inbuilt-default","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"Loaded config","level":"debug","path":"/authentik/lib/default.yml","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"Loaded config from environment","level":"debug","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"not enabling debug server, set `AUTHENTIK_DEBUG` to `true` to enable it.","level":"info","logger":"authentik.go_debugger","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"Starting Metrics server","level":"info","listen":"0.0.0.0:9300","logger":"authentik.router.metrics","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"Starting HTTP server","level":"info","listen":"0.0.0.0:9000","logger":"authentik.router","timestamp":"2023-12-23T22:14:12Z"}
authentik_server  | {"event":"Starting HTTPS server","level":"info","listen":"0.0.0.0:9443","logger":"authentik.router","timestamp":"2023-12-23T22:14:14Z"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.558283, "path": "authentik.admin.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.5637033, "path": "authentik.crypto.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.5653694, "path": "authentik.events.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.567944, "path": "authentik.outposts.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.5764332, "path": "authentik.policies.reputation.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.58151, "path": "authentik.providers.scim.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.589524, "path": "authentik.sources.ldap.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.5916042, "path": "authentik.sources.oauth.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.5933368, "path": "authentik.sources.plex.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.6081371, "path": "authentik.stages.authenticator_totp.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.6186192, "path": "authentik.blueprints.settings"}
authentik_worker  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369657.6195323, "version": "2023.10.4"}
authentik_worker  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369657.6222053}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369657.623476, "path": "authentik.enterprise.settings"}
authentik_server  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369658.9339645, "file": "/authentik/lib/default.yml"}
authentik_server  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369658.9385712, "count": 14}
authentik_worker  | {"event": "Loaded GeoIP database", "last_write": 1700595417.0, "level": "info", "logger": "authentik.events.geo", "pid": 8, "timestamp": "2023-12-23T22:14:21.077450"}
authentik_server  | {"event": "Starting gunicorn 21.2.0", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369662.364476}
authentik_server  | {"event": "Listening at: unix:/dev/shm/authentik-core.sock (15)", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369662.3709905}
authentik_server  | {"event": "Using worker: lifecycle.worker.DjangoUvicornWorker", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369662.37169}
authentik_server  | {"event": "Booting worker with pid: 17", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369662.3822675}
authentik_server  | {"event": "Booting worker with pid: 18", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369662.409381}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 8, "task_id": "4ecdeee092ba49cf82d842a0cbaaae0c", "task_name": "authentik.blueprints.v1.tasks.blueprints_discovery", "timestamp": "2023-12-23T22:14:25.958191"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 8, "task_id": "df0d11a7b1ce45bcb9054b977d408521", "task_name": "authentik.blueprints.v1.tasks.clear_failed_blueprints", "timestamp": "2023-12-23T22:14:25.969579"}
authentik_worker  | {"app_name": "authentik.stages.authenticator", "event": "Could not import app's URLs", "exc": "ModuleNotFoundError(\"No module named 'authentik.stages.authenticator.urls'\")", "level": "warning", "logger": "authentik.api.v3.urls", "pid": 8, "timestamp": "2023-12-23T22:14:28.092892"}
authentik_server  | {"event": "WORKER TIMEOUT (pid:18)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369693.3348784}
authentik_server  | {"event": "Worker (pid:18) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369694.154926}
authentik_server  | {"event": "WORKER TIMEOUT (pid:17)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369694.1638644}
authentik_server  | {"event": "Worker (pid:17) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369694.1787603}
authentik_server  | {"event": "Booting worker with pid: 30", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369694.207149}
authentik_server  | {"event": "Booting worker with pid: 31", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369694.2853556}
authentik_worker  | Operations to perform:
authentik_worker  |   Apply all migrations: auth, authentik_blueprints, authentik_core, authentik_crypto, authentik_enterprise, authentik_events, authentik_flows, authentik_outposts, authentik_policies, authentik_policies_dummy, authentik_policies_event_matcher, authentik_policies_expiry, authentik_policies_expression, authentik_policies_password, authentik_policies_reputation, authentik_providers_ldap, authentik_providers_oauth2, authentik_providers_proxy, authentik_providers_radius, authentik_providers_saml, authentik_providers_scim, authentik_rbac, authentik_sources_ldap, authentik_sources_oauth, authentik_sources_plex, authentik_sources_saml, authentik_stages_authenticator_duo, authentik_stages_authenticator_sms, authentik_stages_authenticator_static, authentik_stages_authenticator_totp, authentik_stages_authenticator_validate, authentik_stages_authenticator_webauthn, authentik_stages_captcha, authentik_stages_consent, authentik_stages_deny, authentik_stages_dummy, authentik_stages_email, authentik_stages_identification, authentik_stages_invitation, authentik_stages_password, authentik_stages_prompt, authentik_stages_user_delete, authentik_stages_user_login, authentik_stages_user_logout, authentik_stages_user_write, authentik_tenants, contenttypes, guardian, sessions
authentik_worker  | Running migrations:
authentik_worker  |   No migrations to apply.
authentik_worker  | {"event": "Bootstrap completed", "level": "info", "logger": "bootstrap"}
authentik_worker  | {"event": "Not running as root, disabling permission fixes", "level": "info", "logger": "bootstrap"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8785748, "path": "authentik.admin.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8792868, "path": "authentik.admin.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.886467, "path": "authentik.crypto.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8871675, "path": "authentik.crypto.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8904266, "path": "authentik.events.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8924773, "path": "authentik.events.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8972383, "path": "authentik.outposts.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.8984458, "path": "authentik.outposts.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9114258, "path": "authentik.policies.reputation.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.912212, "path": "authentik.policies.reputation.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9291341, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9288552, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9369292, "path": "authentik.sources.ldap.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9374728, "path": "authentik.sources.ldap.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.940863, "path": "authentik.sources.oauth.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9418244, "path": "authentik.sources.oauth.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9448261, "path": "authentik.sources.plex.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9454045, "path": "authentik.sources.plex.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.955154, "path": "authentik.stages.authenticator_totp.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.955549, "path": "authentik.stages.authenticator_totp.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.986778, "path": "authentik.blueprints.settings"}
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369705.9880707, "version": "2023.10.4"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9859402, "path": "authentik.blueprints.settings"}
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369705.9892952, "version": "2023.10.4"}
authentik_server  | 
authentik_server  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369705.99451}{"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369705.9951987}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9986255, "path": "authentik.enterprise.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369705.9982572, "path": "authentik.enterprise.settings"}
authentik_server  | 
authentik_worker  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369706.8443182, "file": "/authentik/lib/default.yml"}
authentik_worker  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369706.8486323, "count": 14}
authentik_server  | {"event": "WORKER TIMEOUT (pid:30)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369724.5067801}
authentik_server  | {"event": "WORKER TIMEOUT (pid:31)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369724.7103846}
authentik_server  | {"event": "Worker (pid:30) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369724.747366}
authentik_server  | {"event": "Booting worker with pid: 44", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369724.790362}
authentik_server  | {"event": "Worker (pid:31) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369724.7996645}
authentik_server  | {"event": "Booting worker with pid: 45", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369724.8408933}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.7730258, "path": "authentik.admin.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.7767522, "path": "authentik.crypto.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.7792552, "path": "authentik.events.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.7863832, "path": "authentik.outposts.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.7955418, "path": "authentik.policies.reputation.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.8062887, "path": "authentik.providers.scim.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.811645, "path": "authentik.sources.ldap.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.813819, "path": "authentik.sources.oauth.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.8183668, "path": "authentik.sources.plex.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.8247154, "path": "authentik.stages.authenticator_totp.settings"}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.8402586, "path": "authentik.blueprints.settings"}
authentik_worker  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369727.84119, "version": "2023.10.4"}
authentik_worker  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369727.8457196}
authentik_worker  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369727.8472505, "path": "authentik.enterprise.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369731.9883084, "path": "authentik.admin.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369731.987739, "path": "authentik.admin.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369731.9937491, "path": "authentik.crypto.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369731.993806, "path": "authentik.crypto.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0014281, "path": "authentik.events.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0015624, "path": "authentik.events.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.00888, "path": "authentik.outposts.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0114033, "path": "authentik.outposts.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0228395, "path": "authentik.policies.reputation.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0243278, "path": "authentik.policies.reputation.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0366573, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.038751, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.047628, "path": "authentik.sources.ldap.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0483813, "path": "authentik.sources.ldap.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0518484, "path": "authentik.sources.oauth.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0524206, "path": "authentik.sources.oauth.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.05587, "path": "authentik.sources.plex.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.0584347, "path": "authentik.sources.plex.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.069924, "path": "authentik.stages.authenticator_totp.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.070767, "path": "authentik.stages.authenticator_totp.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.110405, "path": "authentik.blueprints.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.111113, "path": "authentik.blueprints.settings"}
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369732.1119606, "version": "2023.10.4"}
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369732.1127207, "version": "2023.10.4"}
authentik_server  | 
authentik_server  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369732.1190588}{"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369732.1201537}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.1235511, "path": "authentik.enterprise.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369732.1240745, "path": "authentik.enterprise.settings"}
authentik_server  | 
authentik_worker  | {"event": "Loaded GeoIP database", "last_write": 1700595417.0, "level": "info", "logger": "authentik.events.geo", "pid": 6, "timestamp": "2023-12-23T22:15:32.344977"}
authentik_server  | {"event": "WORKER TIMEOUT (pid:44)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369755.5706391}
authentik_server  | {"event": "WORKER TIMEOUT (pid:45)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369755.6769283}
authentik_server  | {"event": "Worker (pid:44) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369755.7222085}
authentik_server  | {"event": "Worker (pid:45) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369755.7751777}
authentik_server  | {"event": "Booting worker with pid: 57", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369755.7872906}
authentik_server  | {"event": "Booting worker with pid: 58", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369755.8505762}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "aa63664fce86454cb88597033d7328cb", "task_name": "authentik.blueprints.v1.tasks.blueprints_discovery", "timestamp": "2023-12-23T22:16:01.077896"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "856971d09bf44036ae2cf0139162e60d", "task_name": "authentik.blueprints.v1.tasks.clear_failed_blueprints", "timestamp": "2023-12-23T22:16:01.099224"}
authentik_worker  | {"app_name": "authentik.stages.authenticator", "event": "Could not import app's URLs", "exc": "ModuleNotFoundError(\"No module named 'authentik.stages.authenticator.urls'\")", "level": "warning", "logger": "authentik.api.v3.urls", "pid": 6, "timestamp": "2023-12-23T22:16:29.641705"}
authentik_server  | {"event": "WORKER TIMEOUT (pid:58)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369788.7257516}
authentik_server  | {"event": "Worker (pid:58) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369790.5062706}
authentik_server  | {"event": "WORKER TIMEOUT (pid:57)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369790.5137055}
authentik_server  | {"event": "Booting worker with pid: 70", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369790.5653915}
authentik_server  | {"event": "Worker (pid:57) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369790.5718057}
authentik_server  | {"event": "Booting worker with pid: 71", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369790.6467197}
authentik_worker  | /ak-root/venv/lib/python3.11/site-packages/celery/platforms.py:799: SecurityWarning: An entry for the specified gid or egid was not found.
authentik_worker  | We're assuming this is a potential security issue.
authentik_worker  | 
authentik_worker  |   warnings.warn(SecurityWarning(ASSUMING_ROOT))
authentik_worker  | /ak-root/venv/lib/python3.11/site-packages/celery/platforms.py:829: SecurityWarning: You're running the worker with superuser privileges: this is
authentik_worker  | absolutely not recommended!
authentik_worker  | 
authentik_worker  | Please specify a different user using the --uid option.
authentik_worker  | 
authentik_worker  | User information: uid=1001 euid=1001 gid=994 egid=994
authentik_worker  | 
authentik_worker  |   warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
authentik_worker  | {"event": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine\nwhether broker connection retries are made during startup in Celery 6.0 and above.\nIf you wish to retain the existing behavior for retrying connections on startup,\nyou should set broker_connection_retry_on_startup to True.\n  warnings.warn(\n", "level": "warning", "logger": "py.warnings", "timestamp": 1703369793.5174503}
authentik_worker  | {"event": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine\nwhether broker connection retries are made during startup in Celery 6.0 and above.\nIf you wish to retain the existing behavior for retrying connections on startup,\nyou should set broker_connection_retry_on_startup to True.\n  warnings.warn(\n", "level": "warning", "logger": "py.warnings", "timestamp": 1703369793.8222506}
authentik_server  | {"event": "WORKER TIMEOUT (pid:70)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369827.1808195}
authentik_server  | {"event": "Worker (pid:70) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369833.2668498}
authentik_server  | {"event": "WORKER TIMEOUT (pid:71)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369833.1472306}
authentik_server  | {"event": "Worker (pid:71) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369833.3834198}
authentik_server  | {"event": "Booting worker with pid: 72", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369833.3974397}
authentik_worker  | {"event": "consumer: Connection to broker lost. Trying to re-establish the connection...", "exception": [{"exc_type": "ConnectionError", "exc_value": "Error -3 connecting to redis:6379. Temporary failure in name resolution.", "frames": [{"filename": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py", "line": "", "lineno": 340, "locals": {"blueprint": "<celery.worker.consumer.consumer.Consumer.Blueprint object at 0x7f252c038f10>", "connection_retry": "True", "connection_retry_type": "broker_connection_retry", "exc": "\"ConnectionError('Error -3 connecting to redis:6379. Temporary failure in name re\"+11", "is_connection_loss_on_startup": "False", "recoverable_errors": "\"(<class 'amqp.exceptions.ConnectionError'>, <class 'kombu.exceptions.Inconsisten\"+200", "self": "<Consumer: celery@d669eeb6a6b9 (running)>"}, "name": "start"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/celery/bootsteps.py", "line": "", "lineno": 116, "locals": {"i": "3", "parent": "<Consumer: celery@d669eeb6a6b9 (running)>", "self": "<celery.worker.consumer.consumer.Consumer.Blueprint object at 0x7f252c038f10>", "step": "<step: Gossip>"}, "name": "start"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/gossip.py", "line": "", "lineno": 107, "locals": {"__class__": "'step:celery.worker.consumer.gossip.Gossip{[step:celery.worker.consumer.mingle.Mi'+108", "c": "<Consumer: celery@d669eeb6a6b9 (running)>", "self": "<step: Gossip>"}, "name": "start"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/celery/bootsteps.py", "line": "", "lineno": 397, "locals": {"c": "<Consumer: celery@d669eeb6a6b9 (running)>", "channel": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>", "self": "<step: Gossip>"}, "name": "start"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/gossip.py", "line": "", "lineno": 175, "locals": {"channel": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>", "ev": "<celery.events.receiver.EventReceiver object at 0x7f252c0f2fd0>", "self": "<step: Gossip>"}, "name": "get_consumers"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/messaging.py", "line": "", "lineno": 402, "locals": {"accept": "None", "auto_declare": "None", "callbacks": "None", "channel": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>", "no_ack": "True", "on_decode_error": "None", "on_message": "'functools.partial(<bound method Gossip.on_message of <step: Gossip>>, <bound met'+105", "prefetch_count": "None", "queues": "'[<unbound Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <unbound Exchan'+34", "self": "'<Consumer: [<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange ce'+62", "tag_prefix": "None"}, "name": "__init__"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/messaging.py", "line": "", "lineno": 424, "locals": {"channel": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>", "qname": "celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae", "queue": "'<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange celeryev(fanou'+48", "self": "'<Consumer: [<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange ce'+62"}, "name": "revive"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/messaging.py", "line": "", "lineno": 438, "locals": {"queue": "'<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange celeryev(fanou'+48", "self": "'<Consumer: [<Queue 
celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange ce'+62"}, "name": "declare"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/entity.py", "line": "", "lineno": 617, "locals": {"channel": "None", "nowait": "False", "self": "'<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange celeryev(fanou'+48"}, "name": "declare"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/entity.py", "line": "", "lineno": 626, "locals": {"channel": "None", "nowait": "False", "self": "'<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange celeryev(fanou'+48"}, "name": "_create_queue"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/entity.py", "line": "", "lineno": 655, "locals": {"channel": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>", "nowait": "False", "passive": "False", "queue_arguments": "{}", "self": "'<Queue celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae -> <Exchange celeryev(fanou'+48"}, "name": "queue_declare"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/transport/virtual/base.py", "line": "", "lineno": 538, "locals": {"kwargs": "\"{'durable': False, 'exclusive': False, 'auto_delete': True, 'arguments': {}, 'no\"+13", "passive": "False", "queue": "celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae", "self": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>"}, "name": "queue_declare"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/kombu/transport/redis.py", "line": "", "lineno": 997, "locals": {"client": "Redis<ConnectionPool<Connection<host=redis,port=6379,db=0>>>", "pipe": "Pipeline<ConnectionPool<Connection<host=redis,port=6379,db=0>>>", "pri": "9", "queue": "celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae", "self": "<kombu.transport.redis.Channel object at 0x7f252c7cb810>"}, "name": "_size"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/client.py", "line": "", "lineno": 1425, "locals": {"conn": "None", "execute": "'<bound method Pipeline._execute_transaction of Pipeline<ConnectionPool<Connectio'+31", "raise_on_error": "True", "self": "Pipeline<ConnectionPool<Connection<host=redis,port=6379,db=0>>>", "stack": "\"[(('LLEN', 'celeryev.cac168d7-c335-4f1b-ac61-29db147c79ae'), {}), (('LLEN', 'cel\"+207"}, "name": "execute"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/connection.py", "line": "", "lineno": 1073, "locals": {"command_name": "MULTI", "connection": "Connection<host=redis,port=6379,db=0>", "keys": "(None,)", "options": "{}", "self": "ConnectionPool<Connection<host=redis,port=6379,db=0>>"}, "name": "get_connection"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/connection.py", "line": "", "lineno": 265, "locals": {"self": "Connection<host=redis,port=6379,db=0>"}, "name": "connect"}], "is_cause": false, "syntax_error": null}, {"exc_type": "gaierror", "exc_value": "[Errno -3] Temporary failure in name resolution", "frames": [{"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/connection.py", "line": "", "lineno": 259, "locals": {"self": "Connection<host=redis,port=6379,db=0>"}, "name": "connect"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/retry.py", "line": "", "lineno": 46, "locals": {"do": "<function AbstractConnection.connect.<locals>.<lambda> at 0x7f2539112160>", "fail": "<function AbstractConnection.connect.<locals>.<lambda> at 0x7f252c0e3100>", "failures": "0", "self": "<redis.retry.Retry object at 0x7f252c101e10>"}, "name": "call_with_retry"}, {"filename": 
"/ak-root/venv/lib/python3.11/site-packages/redis/connection.py", "line": "", "lineno": 260, "locals": {"self": "Connection<host=redis,port=6379,db=0>"}, "name": "<lambda>"}, {"filename": "/ak-root/venv/lib/python3.11/site-packages/redis/connection.py", "line": "", "lineno": 585, "locals": {"err": "None", "self": "Connection<host=redis,port=6379,db=0>"}, "name": "_connect"}, {"filename": "/usr/local/lib/python3.11/socket.py", "line": "", "lineno": 962, "locals": {"addrlist": "[]", "family": "0", "flags": "0", "host": "redis", "port": "6379", "proto": "0", "type": "<SocketKind.SOCK_STREAM: 1>"}, "name": "getaddrinfo"}], "is_cause": false, "syntax_error": null}], "level": "warning", "logger": "celery.worker.consumer.consumer", "timestamp": 1703369828.154866}
authentik_worker  | {"event": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:391: CPendingDeprecationWarning: \nIn Celery 5.1 we introduced an optional breaking change which\non connection loss cancels all currently executed tasks with late acknowledgement enabled.\nThese tasks cannot be acknowledged as the connection is gone, and the tasks are automatically redelivered\nback to the queue. You can enable this behavior using the worker_cancel_long_running_tasks_on_connection_loss\nsetting. In Celery 5.1 it is set to False by default. The setting will be set to True by default in Celery 6.0.\n\n  warnings.warn(CANCEL_TASKS_BY_DEFAULT, CPendingDeprecationWarning)\n", "level": "warning", "logger": "py.warnings", "timestamp": 1703369833.3829083}
authentik_worker  | {"event": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine\nwhether broker connection retries are made during startup in Celery 6.0 and above.\nIf you wish to retain the existing behavior for retrying connections on startup,\nyou should set broker_connection_retry_on_startup to True.\n  warnings.warn(\n", "level": "warning", "logger": "py.warnings", "timestamp": 1703369833.3855643}
authentik_server  | {"event": "Booting worker with pid: 73", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369833.4969754}
authentik_worker  | {"event": "/ak-root/venv/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine\nwhether broker connection retries are made during startup in Celery 6.0 and above.\nIf you wish to retain the existing behavior for retrying connections on startup,\nyou should set broker_connection_retry_on_startup to True.\n  warnings.warn(\n", "level": "warning", "logger": "py.warnings", "timestamp": 1703369833.546229}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 54, "task_id": "ae07ef04985947bdb8e089559fe1b82c", "task_name": "authentik.core.tasks.clean_expired_models", "timestamp": "2023-12-23T22:17:13.569132"}
postgresql        | 2023-12-23 22:17:22.931 UTC [77] LOG:  could not send data to client: Broken pipe
postgresql        | 2023-12-23 22:17:22.935 UTC [77] FATAL:  connection to client lost
authentik_worker  | {"event": "Dispatching startup tasks...", "level": "info", "logger": "authentik.root.celery", "pid": 6, "timestamp": "2023-12-23T22:17:28.888536"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "70503033fb0c40298535ff0cf0c118aa", "task_name": "authentik.admin.tasks.clear_update_notifications", "timestamp": "2023-12-23T22:17:28.950062"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "37940c39cd7b462887f7f12af19c0a82", "task_name": "authentik.outposts.tasks.outpost_connection_discovery", "timestamp": "2023-12-23T22:17:29.136037"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "9497f24a719a4c85bda499e7507ff40d", "task_name": "authentik.outposts.tasks.outpost_controller_all", "timestamp": "2023-12-23T22:17:29.295290"}
authentik_worker  | {"event": "Task published", "level": "info", "logger": "authentik.root.celery", "pid": 6, "task_id": "740fdcefde384579a77d2dfaee15e3d7", "task_name": "authentik.providers.proxy.tasks.proxy_set_defaults", "timestamp": "2023-12-23T22:17:29.462171"}
authentik_server  | {"event": "WORKER TIMEOUT (pid:73)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369864.7710366}
authentik_server  | {"event": "Worker (pid:73) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369864.885364}
authentik_server  | {"event": "WORKER TIMEOUT (pid:72)", "level": "critical", "logger": "gunicorn.error", "timestamp": 1703369864.8922284}
authentik_server  | {"event": "Worker (pid:72) was sent SIGABRT!", "level": "error", "logger": "gunicorn.error", "timestamp": 1703369864.9235976}
authentik_server  | {"event": "Booting worker with pid: 86", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369864.9278958}
authentik_worker  | {"event": "Task started", "level": "info", "logger": "authentik.root.celery", "pid": 53, "task_id": "e70f3a65-74d3-4743-94aa-68a3cb8cfcc1", "task_name": "blueprints_discovery", "timestamp": "2023-12-23T22:17:31.091541"}
authentik_server  | {"event": "Booting worker with pid: 87", "level": "info", "logger": "gunicorn.error", "timestamp": 1703369865.0164363}
authentik_worker exited with code 0
authentik_worker  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369869.0300925, "file": "/authentik/lib/default.yml"}
authentik_worker  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369869.0316997, "count": 14}
authentik_worker  | {"event": "Starting authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369869.0324938}
authentik_worker  | {"event": "PostgreSQL connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369869.0912967}
authentik_worker  | {"event": "Redis Connection successful", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369869.1168923}
authentik_worker  | {"event": "Finished authentik bootstrap", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369869.1169467}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6563635, "path": "authentik.admin.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6587968, "path": "authentik.admin.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6664007, "path": "authentik.crypto.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6670341, "path": "authentik.crypto.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6704736, "path": "authentik.events.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.671064, "path": "authentik.events.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6788282, "path": "authentik.outposts.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.6799836, "path": "authentik.outposts.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.703838, "path": "authentik.policies.reputation.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.704351, "path": "authentik.policies.reputation.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.734674, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7353566, "path": "authentik.providers.scim.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7480862, "path": "authentik.sources.ldap.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.748839, "path": "authentik.sources.ldap.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7551134, "path": "authentik.sources.oauth.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7561867, "path": "authentik.sources.oauth.settings"}
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7585008, "path": "authentik.sources.plex.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.759309, "path": "authentik.sources.plex.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7748687, "path": "authentik.stages.authenticator_totp.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7752526, "path": "authentik.stages.authenticator_totp.settings"}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7975538, "path": "authentik.blueprints.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.7984152, "path": "authentik.blueprints.settings"}
authentik_server  | 
authentik_server  | {"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369871.8001134, "version": "2023.10.4"}{"event": "Booting authentik", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369871.8004303, "version": "2023.10.4"}
authentik_server  | 
authentik_server  | {"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369871.806917}{"event": "Enabled authentik enterprise", "level": "info", "logger": "authentik.lib.config", "timestamp": 1703369871.8081357}
authentik_server  | 
authentik_server  | {"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.8115244, "path": "authentik.enterprise.settings"}{"event": "Loaded app settings", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.8122318, "path": "authentik.enterprise.settings"}
authentik_server  | 
authentik_worker  | {"event": "Loaded config", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.8692417, "file": "/authentik/lib/default.yml"}
authentik_worker  | {"event": "Loaded environment variables", "level": "debug", "logger": "authentik.lib.config", "timestamp": 1703369871.8750205, "count": 14}
authentik_worker  | 2023-12-23 22:17:52 [info     ] applying django migrations
authentik_worker  | 2023-12-23 22:17:52 [info     ] waiting to acquire database lock
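
Reading the capture as a whole: gunicorn declares `WORKER TIMEOUT` and SIGABRTs its workers roughly every 30 seconds, the replacements spend that window re-importing settings, and in parallel the Celery worker briefly fails to resolve the `redis` hostname (`Error -3 ... Temporary failure in name resolution`), so both containers loop instead of coming up. The `CPendingDeprecationWarning` lines about `broker_connection_retry_on_startup` are upstream Celery noise, not the cause. One thing that at least removes the startup race is letting the authentik containers start only once Redis reports healthy. A minimal sketch, not the official compose file — the service name `redis`, the image tags, and the healthcheck command here are assumptions based on the container names in the logs:

```yaml
services:
  redis:
    image: redis:alpine
    healthcheck:
      # assumes an unauthenticated Redis; add "-a <password>" if AUTH is enabled
      test: ["CMD-SHELL", "redis-cli ping | grep PONG"]
      interval: 5s
      timeout: 3s
      retries: 5
  worker:
    image: ghcr.io/goauthentik/server:2023.10.4
    command: worker
    depends_on:
      redis:
        condition: service_healthy
```

This doesn't explain the 100% CPU by itself, but it stops the crash-restart churn long enough to actually profile the worker.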
authentik-automation[bot] commented 7 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

MyIgel commented 6 months ago

Even though this issue was closed automatically, the fix in the docker-compose file worked for me. Thanks for the investigation!
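
The comment above doesn't quote the exact compose change. If the fix being referred to is the file-descriptor cap discussed in similar reports — where the worker pegs a CPU spinning in a `close()` loop because the container inherits an effectively unlimited open-file limit — it looks roughly like this. This is a hypothetical sketch with illustrative values, not a confirmed fix from the maintainers:

```yaml
# Cap the open-file limit so code that sweeps 0..RLIMIT_NOFILE during
# daemonization doesn't iterate over billions of possible descriptors.
services:
  worker:
    ulimits:
      nofile:
        soft: 1024
        hard: 524288
```

The same `ulimits:` stanza can be added to the server container too; afterwards `docker compose exec worker sh -c 'ulimit -n'` should report the soft limit instead of a huge number.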

FloTheSysadmin commented 4 weeks ago

Worked for me, thanks