goauthentik / authentik

The authentication glue you need.
https://goauthentik.io

New install: Not Found on http://<ip>:9000/if/flow/initial-setup/ #11046

Open brunokc opened 2 months ago

brunokc commented 2 months ago

Describe the bug
First-time user, trying to install Authentik. I just followed the instructions for Docker Compose and at the end got an error.

To Reproduce

Did what the docs suggested. Literally:

  1. wget https://goauthentik.io/docker-compose.yml
  2. Created my .env file by following the instructions:

echo "PG_PASS=$(openssl rand -base64 36 | tr -d '\n')" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 60 | tr -d '\n')" >> .env

Then added some email settings -- just used the example and changed AUTHENTIK_EMAIL__HOST and AUTHENTIK_EMAIL__FROM.

Then:

docker-compose up

Waited for the spew to calm down a bit and then hit the http://<ip>:9000/if/flow/initial-setup/ URL (with or without the trailing /, it doesn't matter) and got this:

(screenshot: "Not Found" error page)

Expected behavior
I expected the initial flow to appear.

Logs
docker-compose-authentik-logs.zip

Version and Deployment (please complete the following information):

Additional Context
After many tries, I was sometimes able to see the initial-setup screen asking for the admin's email and password (maybe 1 in every 10-20 attempts?). However, even in those cases, after entering the email and password I'd still get either a "Not Found" error or an "Access Denied" error. After spending two hours on this yesterday, I have yet to log in successfully and see the dashboard in Authentik.

I've seen reports similar to mine, but unlike in those reports, nothing has helped so far.
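For what it's worth, the two documented secret-generation commands can be sanity-checked before starting the stack; a minimal sketch (assuming openssl is on PATH):

```shell
# Regenerate the two values exactly as the docs suggest.
PG_PASS=$(openssl rand -base64 36 | tr -d '\n')
AUTHENTIK_SECRET_KEY=$(openssl rand -base64 60 | tr -d '\n')

# base64 of 36 raw bytes is 48 characters; of 60 bytes, 80 characters.
# If the lengths differ or a newline sneaks in, the .env is malformed.
echo "PG_PASS length: ${#PG_PASS}"                            # expect 48
echo "AUTHENTIK_SECRET_KEY length: ${#AUTHENTIK_SECRET_KEY}"  # expect 80
```

Both values checked out for me, so a malformed .env can probably be ruled out here.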

MildlyInterested commented 2 months ago

I'm trying to find the issue I followed when I encountered this. I used an older version of authentik for the initial setup and then pulled the latest image. Found it: version 2023.1.1 worked for me. https://github.com/goauthentik/authentik/issues/9584

brunokc commented 2 months ago

@MildlyInterested, thanks for the reference. I had actually seen it while trying to find a solution. I tried some of the suggestions there and none worked for me. I'm also hesitant to go back to such an old version just to get this to work. The latest and greatest shouldn't have such a blocker of an issue, right?

The fact is, the setup is simple (a 3-step process), so hopefully there isn't a mistake on my part (though I'm open to being educated here). Hopefully the team will see this and take a look.

MildlyInterested commented 2 months ago

Agreed, not sure why this seems to still be a problem. I was following along this guide to get SWAG, Authentik and CrowdSec working. Authentik Version 2023.10.1 also seems to work, that's a bit more recent. Updating to the latest version afterwards worked without any issues.

CharlesChiuGit commented 2 months ago

@MildlyInterested

Authentik Version 2023.10.1

Thanks! I had been struggling with 2024.6.4 for a few hours; 2023.10.1 works for me, at least I can log in. Then upgrading to 2024.6.4 seemed to work fine, with no more crashing. Correction: I was wrong, it still crashed a few minutes after the service started. 2023.10.7 seems stable for me too.

brunokc commented 2 months ago

Agreed, not sure why this seems to still be a problem. I was following along this guide to get SWAG, Authentik and CrowdSec working. Authentik Version 2023.10.1 also seems to work, that's a bit more recent. Updating to the latest version afterwards worked without any issues.

Thanks for linking that guide. I took a quick look and it seemed well written; I liked it. I'll take a closer look later today. I might take a closer look at SWAG as well.

brunokc commented 2 months ago

Update: I tried with 2024.6.4 now that the docker-compose file was updated. Exact same issue.

Gabee01 commented 2 months ago

I was having this issue, and after a few tries I was able to fix it. I'm not 100% sure what fixed it, but my guess is it was how I was generating the secret: openssl rand -base64 40 | docker secret create authentik_secret_key -

This generates a secret like 9uz3Q1r1/KUW3LX9gq2RLeavb1gfGo7h0sclTZqjgnFhEazjSWDMVA== with some special characters (note the / and ==).

I've watched a video where the guy used pwgen -s 40 1 | docker secret create authentik_secret_key - to generate the secret (it generates something like this: ix08kia2C2lmE4dfPNXkzMdNSpCLmmSEUzMYwpd8).

I did the same and I think it helped. I had some issues with the DB password as well; I ended up using simple passwords (only letters and numbers) to fix it.

You also have to wait a while for things to start up.

I'm new to authentik, so I'm not sure if it should work like that or not... anyway, hope this helps someone!
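If pwgen isn't installed, an alphanumeric-only secret can be produced with standard tools; a sketch (the tr/head pipeline is my own substitution, not from the video):

```shell
# Emit N characters drawn only from A-Za-z0-9 (default 40, like `pwgen -s 40 1`).
gen_alnum() {
  LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "${1:-40}"
}

KEY=$(gen_alnum 40)
printf '%s\n' "$KEY"   # e.g. ix08kia2C2lmE4dfPNXkzMdNSpCLmmSEUzMYwpd8
```

The output can then be piped into `docker secret create` (or written into .env) the same way as above.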

brunokc commented 2 months ago

Thanks for trying to help, @Gabee01!

I've watched a video where the guy used pwgen -s 40 1 | docker secret create authentik_secret_key - to generate the secret (it generates something like this: ix08kia2C2lmE4dfPNXkzMdNSpCLmmSEUzMYwpd8).

I did the same and I think it helped, I had some issues with the db password as well, ended up using simple passwords (only letters and numbers) to fix it.

I tried using pwgen to generate passwords without special characters and also a bit shorter, but that didn't seem to help. I got the same "Not Found" error upon opening the initial-setup page.

You also have to wait a while for things to start up.

Yes, in all my attempts I always wait for the log output to calm down a bit. That's my only indication that setup is done (it takes a while).
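Rather than eyeballing the log spew, the wait can be scripted as a poll; a generic sketch (the /-/health/ready/ endpoint is my assumption about recent authentik builds, so substitute whatever check fits your deployment):

```shell
# Retry a command until it succeeds: up to WAIT_TRIES attempts, WAIT_DELAY seconds apart.
wait_for() {
  i=0
  while [ "$i" -lt "${WAIT_TRIES:-60}" ]; do
    "$@" && return 0
    i=$((i + 1))
    sleep "${WAIT_DELAY:-5}"
  done
  return 1
}

# Usage (hypothetical host/port) -- only open the browser once this returns:
# wait_for curl -fsS http://localhost:9000/-/health/ready/
```

That would at least remove "did I wait long enough?" as a variable when reproducing this.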

capdeveloping commented 2 months ago

I used the following compose file, which is not really different from the original. My password and token are generated with the Bitwarden generator. Just watch the worker logs; once it is ready, you can browse to the URL and set up the user. I used Portainer to deploy this compose file, and Portainer holds the env file, which is why I don't need to add it inside the container.



services:
  postgresql:
    image: postgres:16.4-alpine
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -d $${POSTGRES_DB} -U $${POSTGRES_USER}"]
      start_period: 20s
      interval: 30s
      retries: 5
      timeout: 5s
    volumes:
      - /path/to/authentik/database:/var/lib/postgresql/data
    environment:
      POSTGRES_PASSWORD: $PG_PASS
      POSTGRES_USER: $PG_USER
      POSTGRES_DB: $PG_DB
    networks:
      auth:

  redis:
    image: redis:7.4.0-alpine
    command: --save 60 1 --loglevel warning
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "redis-cli ping | grep PONG"]
      start_period: 20s
      interval: 30s
      retries: 5
      timeout: 3s
    volumes:
      - /path/to/authentik/redis:/data
    networks:
      auth:

  server:
    image: ghcr.io/goauthentik/server:2024.6.4
    restart: unless-stopped
    command: server
    hostname: authentik
    environment:
      AUTHENTIK_REDIS__HOST: redis
      AUTHENTIK_POSTGRESQL__HOST: postgresql
      AUTHENTIK_POSTGRESQL__USER: $PG_USER
      AUTHENTIK_POSTGRESQL__NAME: $PG_DB
      AUTHENTIK_POSTGRESQL__PASSWORD: $PG_PASS
      AUTHENTIK_SECRET_KEY: $AUTHENTIK_SECRET_KEY
      AUTHENTIK_DEBUG: True
    volumes:
      - /path/to/authentik/media:/media
      - /path/to/authentik/custom-templates:/templates
    ports:
      - 9900:9000
      - 9443:9443
    depends_on:
      - postgresql
      - redis
    networks:
      auth:

  worker:
    image: ghcr.io/goauthentik/server:2024.6.4
    restart: unless-stopped
    command: worker
    environment:
      AUTHENTIK_REDIS__HOST: redis
      AUTHENTIK_POSTGRESQL__HOST: postgresql
      AUTHENTIK_POSTGRESQL__USER: $PG_USER
      AUTHENTIK_POSTGRESQL__NAME: $PG_DB
      AUTHENTIK_POSTGRESQL__PASSWORD: $PG_PASS
      AUTHENTIK_SECRET_KEY: $AUTHENTIK_SECRET_KEY
    user: root
    volumes:
      - /path/to/authentik/media:/media
      - /path/to/authentik/certs:/certs
      - /path/to/authentik/custom-templates:/templates
    depends_on:
      - postgresql
      - redis
    networks:
      auth:

networks:
  auth:
    name: auth
    driver: bridge
    driver_opts:
      com.docker.network.bridge.name: br-docker-auth
f0o commented 2 months ago

I can confirm that simply rebooting the entire stack (or the server, if that's easier) does solve it.

Using the stock docker-compose.yml, all I did was wait for the migrations to pass in the worker container, see a lovely 404 on /if/flow/initial-setup/, and simply reboot the box. After the reboot, the initial setup was available.

tulexx commented 1 month ago

I had the same issue. Going back to 2023.1.1 as @MildlyInterested suggested fixed it (I removed all the docker-compose volume data between changing versions) and I was able to log in.

Two caveats though:

It seems that there is some problem with the initial setup on the 2024.8.* images (from my testing).

My docker-compose.yaml which didn't work was:

services:
  postgresql:
    image: docker.io/library/postgres:16-alpine
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -d $${POSTGRES_DB} -U $${POSTGRES_USER}"]
      start_period: 20s
      interval: 30s
      retries: 5
      timeout: 5s
    volumes:
      - ./postgress:/var/lib/postgresql/data
    environment:                                                                                                                                                                                                                                           
      POSTGRES_PASSWORD: ${PG_PASS:?database password required}
      POSTGRES_USER: ${PG_USER:-authentik}
      POSTGRES_DB: ${PG_DB:-authentik}
    env_file:
      - .env
  redis:
    image: docker.io/library/redis:alpine
    command: --save 60 1 --loglevel warning
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "redis-cli ping | grep PONG"]
      start_period: 20s
      interval: 30s
      retries: 5
      timeout: 3s
    volumes:
      - ./redis:/data
  server:
    image: ghcr.io/goauthentik/server:2024.8.2
    restart: unless-stopped
    command: server
    environment:
      AUTHENTIK_REDIS__HOST: redis
      AUTHENTIK_POSTGRESQL__HOST: postgresql
      AUTHENTIK_POSTGRESQL__USER: ${PG_USER:-authentik}
      AUTHENTIK_POSTGRESQL__NAME: ${PG_DB:-authentik}
      AUTHENTIK_POSTGRESQL__PASSWORD: ${PG_PASS}
      AUTHENTIK_BOOTSTRAP_PASSWORD: abcd
    volumes:
      - ./media:/media
      - ./custom-templates:/templates
    env_file:
      - .env
    ports:
      - "9000:9000"
      - "9443:9443"
    depends_on:
      - postgresql
      - redis
  worker:
    image: ghcr.io/goauthentik/server:2024.8.2
    restart: unless-stopped
    command: worker
    environment:
      AUTHENTIK_REDIS__HOST: redis
      AUTHENTIK_POSTGRESQL__HOST: postgresql
      AUTHENTIK_POSTGRESQL__USER: ${PG_USER:-authentik}
      AUTHENTIK_POSTGRESQL__NAME: ${PG_DB:-authentik}
      AUTHENTIK_POSTGRESQL__PASSWORD: ${PG_PASS}
    user: root
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./media:/media
      - ./certs:/certs
      - ./custom-templates:/templates
    env_file:
      - .env
    depends_on:
      - postgresql
      - redis
moneyblind commented 1 month ago

I was struggling with this as well.

Here is what I did to solve it.

Long form: I removed the networks, cleared the volumes, deleted the containers, and closed all browsers that could phone home to the site. I then ran docker compose up and made sure it was done before I loaded a fresh browser (incognito; not that it meant anything, just saying what I did).

After the instance started, I saw the email prompt and was able to navigate through the setup. Once the username/password was set up, I restarted the service with the networks in there (a separate DB network from the server), and it booted with no issues.

Summary:

My belief is that if the server receives a request too early, during the creation of volumes, database migrations, etc., it behaves sporadically. However, the inclusion/exclusion of external networks was a determining factor in two separate tests. Now that my instance is working for the time being, I am going to move on! Hope this info helps.

malwinmp commented 3 weeks ago

The problem still persists in 2024.8.3.

JensHKnudsen commented 3 weeks ago

Yes, I can confirm the issue does indeed exist in 2024.8.3...

juliu1902 commented 2 weeks ago

I can confirm too that the issue still persists. None of the suggestions worked for me either... Has anyone found another solution yet?

andrewlee102 commented 1 week ago

The issue exists as of 10/24/2024. As @moneyblind suggested, it perhaps has something to do with trying to access /initial-setup/ while the worker is still doing its thing. I got it to work this way:

  1. Exited all browsers except the one showing Portainer
  2. Deleted the last 24 hours of browser history and cookies
  3. Completely removed all containers
  4. Completely removed all images
  5. Completely removed all related volumes
  6. Ran docker compose up -d
  7. Walked away from the computer for 10 minutes (grabbed a drink, looked at pictures of cats)
  8. Tried accessing /initial-setup/ and bam, it let me create the admin user. No more "Not Found" or "Flow not set for this user" headaches.

I don't know exactly what the issue is, or even if my solution would work for anyone else. I just took @moneyblind's suggestion of letting the worker do its thing before trying anything in the browser, and it worked for me. Hope it works for you as well (baffling that this isn't fixed, or that there isn't some discussion about it on their Discord already, when so many people are having this issue).
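The steps above can be sketched as a script. It dry-runs by default (only printing the commands) because `docker compose down --volumes --rmi all` is destructive; the flags are my assumptions against Compose v2, so review before executing for real:

```shell
# Clean-slate reset as described above. Default runner is `echo` (dry run);
# call `reset_stack ""` to actually execute the commands.
reset_stack() {
  run=${1-echo}
  $run docker compose down --volumes --rmi all   # remove containers, volumes, images
  $run docker compose up -d                      # recreate everything from scratch
  $run sleep 600                                 # then leave it alone for ~10 minutes
}

reset_stack   # dry run: prints the three commands without touching Docker
```

Clearing browser history/cookies still has to happen by hand, of course.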

g3rzi commented 1 day ago

I had the same issue; it's funny, but it was because I didn't add the / at the end.
This is what I used, and it didn't work:

http://192.168.109.143:9000/if/flow/initial-setup  

This is the fix:

http://192.168.109.143:9000/if/flow/initial-setup/

Yes, funny.