cookiecutter / cookiecutter-django

Cookiecutter Django is a framework for jumpstarting production-ready Django projects quickly.
https://cookiecutter-django.readthedocs.io
BSD 3-Clause "New" or "Revised" License

REDIS_URL: unbound variable error #3769

Closed · BenA-SA closed this issue 2 years ago

BenA-SA commented 2 years ago

What happened?

I originally managed to get my project up and running on DigitalOcean with no problems, but then needed to make some changes. Now when I try to bring my project up, I get a REDIS_URL: unbound variable error. If I hardcode that value in my entrypoint file, I get the same error for the POSTGRES variables instead. This suggests to me that something is failing to read the .env files, but I am at a complete loss as to why. The postgres and traefik containers fail with exit code 1, and the other containers simply sit waiting for postgres. I've asked around on a few forums and been unable to resolve the issue, so I'm hoping that posting here will get me some assistance from people with more specific knowledge!

What should've happened instead?

Project should've run.

Additional details

I believe the above summarises my problem to the extent I am able to understand it! Thanks for any assistance anyone is able to provide!
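For anyone hitting the same message: if I remember the template right, the production entrypoint script (compose/production/django/entrypoint) runs with bash's nounset option enabled, which is what turns a missing environment variable into a hard failure instead of an empty string. A minimal sketch of that behaviour, not the actual template script:

```bash
#!/usr/bin/env bash
# Minimal reproduction of the "unbound variable" failure mode.
# With nounset enabled, expanding a variable that was never set
# aborts the script instead of substituting an empty string.
set -o nounset

# Fails with "REDIS_URL: unbound variable" unless docker-compose
# (or the shell) actually exported REDIS_URL into the environment.
echo "Connecting to ${REDIS_URL}"
```

In other words, the error usually means the container never received the variable at all, for example because the compose file's env_file entries are not pointing at the expected .envs/.production/* files.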

foarsitter commented 2 years ago

What changes did you make? How are you hosting your codebase? Check the paths at the top of config/settings/base.py and config/wsgi.py, and check READ_DOT_ENV_FILE at the top of base.py.
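For reference, the top of the generated config/settings/base.py looks roughly like this (a sketch: the exact path handling differs between template versions, and "my_project" below is a placeholder slug):

```python
# config/settings/base.py (approximate cookiecutter-django layout)
from pathlib import Path

import environ

# <repo>/config/settings/base.py -> three parents up is the project root
ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent.parent
APPS_DIR = ROOT_DIR / "my_project"  # placeholder project slug

env = environ.Env()

READ_DOT_ENV_FILE = env.bool("DJANGO_READ_DOT_ENV_FILE", default=False)
if READ_DOT_ENV_FILE:
    # Only read a .env file when explicitly asked to; under docker-compose
    # the variables normally arrive via env_file entries instead.
    env.read_env(str(ROOT_DIR / ".env"))
```

If ROOT_DIR resolves to the wrong place, or READ_DOT_ENV_FILE stays False while you rely on a .env file, the settings silently fall back to whatever the process environment provides.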

BenA-SA commented 2 years ago

I'm hosting my codebase on a DigitalOcean droplet. The changes were not relevant - they were to do with a template in one of the directories. However, the error began when I pulled the change from GitHub into my repository on the droplet. As far as I am aware, I haven't made any changes to the files you've mentioned.

My base.py has READ_DOT_ENV_FILE = env.bool("DJANGO_READ_DOT_ENV_FILE", default=False), but in my .envs/.production/.django file the following line is commented out:

DJANGO_READ_DOT_ENV_FILE=True

Should it be commented out? I believe I tried changing this yesterday, but it didn't appear to make any difference.

Could it be an issue with the ROOT_DIR path? I've not changed any of the file locations, so that should still be correct?

Finally, could it be an issue with docker-compose somehow having changed within my DigitalOcean droplet?
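For comparison, the generated env files under .envs/.production/ look roughly like this; the values below are placeholders rather than real template output:

```
# .envs/.production/.django (excerpt)
# DJANGO_READ_DOT_ENV_FILE=True
DJANGO_SETTINGS_MODULE=config.settings.production
DJANGO_SECRET_KEY=<generated-secret>
REDIS_URL=redis://redis:6379/0

# .envs/.production/.postgres (excerpt)
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DB=my_project
POSTGRES_USER=<generated-user>
POSTGRES_PASSWORD=<generated-password>
```

If REDIS_URL is missing from .envs/.production/.django, or that file is not listed under env_file in the compose file, the container starts without it and the entrypoint fails exactly as described above.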

foarsitter commented 2 years ago

If you are running without Docker, you should run merge_production_dotenvs_in_dotenv.py to create a .env file. Make sure DJANGO_READ_DOT_ENV_FILE=True is listed in the .env.
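In other words, something along these lines from the project root (the script name is as generated by the template; the grep is just an extra sanity check):

```bash
# Merge .envs/.production/* into a single .env at the project root,
# then confirm the flag that makes base.py actually read that file.
python merge_production_dotenvs_in_dotenv.py
grep DJANGO_READ_DOT_ENV_FILE .env   # expect: DJANGO_READ_DOT_ENV_FILE=True
```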

BenA-SA commented 2 years ago

I am using docker-compose to "build" and "up" the project - do I still need to do this?

I should say that this problem does not occur when running locally - only in my production environment.

> If you are running without Docker, you should run merge_production_dotenvs_in_dotenv.py to create a .env file. Make sure DJANGO_READ_DOT_ENV_FILE=True is listed in the .env.

foarsitter commented 2 years ago

Nevermind, forget what I said in my comment about the .env since you are running with docker-compose.

Do you run docker-compose with -f production.yml? Did you change production.yml? If so, can you post the contents? Are the .django and .postgres files available in .envs/.production? Sorry for my guessing here, trying to sum up the mistakes I made in the past.
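For reference, the usual production workflow is along these lines, assuming the file is still named production.yml and the service is still called django:

```bash
# Build and start the production stack from the project root.
docker-compose -f production.yml build
docker-compose -f production.yml up -d

# Tail one service's logs, e.g. django, to see the failing entrypoint output.
docker-compose -f production.yml logs django
```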

BenA-SA commented 2 years ago

> Nevermind, forget what I said in my comment about the .env since you are running with docker-compose.
>
> Do you run docker-compose with -f production.yml? Did you change production.yml? If so, can you post the contents? Are the .django and .postgres files available in .envs/.production? Sorry for my guessing here, trying to sum up the mistakes I made in the past.

No problem at all - I appreciate the guessing! I've been guessing for hours over the last few days, so having other people's input is great. To answer your questions: yes, I renamed my production.yml file to docker-compose.yml, and yes, I run docker-compose with -f docker-compose.yml. The .django and .postgres files are both present in .envs/.production. I have tried taking them out, which throws an error saying they're not there; when I put them back, that error disappears. So the project is definitely aware they exist, but for some reason I can't understand it is just not reading the values from them.

Separate question - I might just try to avoid the problem by changing my hosting provider. Do you know what is recommended for hosting a cookiecutter-django Docker project, or what do you use? I've no experience with Amazon hosting, but I'm willing to give it a go if it's going to be quicker than burning more time on this issue, haha.

foarsitter commented 2 years ago

Using Docker as an abstraction layer makes it machine agnostic, so changing the server does not make any sense to me. Instead, try building and running production.yml locally. This shortens the deployment cycle.

BenA-SA commented 2 years ago

I have the same issue running production.yml locally. Does this help us understand the issue?

EDIT: and using -f local.yml in my production environment works. So this must mean there is an issue with my docker-compose.yml (production.yml) file?

> Using Docker as an abstraction layer makes it machine agnostic, so changing the server does not make any sense to me. Instead, try building and running production.yml locally. This shortens the deployment cycle.
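Since local.yml works and production.yml does not, one way to narrow it down (my suggestion, not something from the template docs) is to diff what compose actually resolves for each file:

```bash
# Render the fully-resolved configuration for each compose file and diff them.
# The two stacks intentionally differ (traefik, Dockerfiles, env_file paths),
# but an unexpected difference such as a mistyped image name stands out.
diff <(docker-compose -f local.yml config) <(docker-compose -f production.yml config)
```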

foarsitter commented 2 years ago

Can you post production.yml here? Production depends on a different Dockerfile and uses a different start script. Did you make changes to the production Dockerfile?
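For anyone comparing against their own file, the django service in the generated production.yml looks roughly like this (the image name depends on your project slug; "my_project" is a placeholder, and details vary across template versions). The image: and env_file: lines are the usual places for a typo to hide:

```yaml
# production.yml (excerpt, approximate)
services:
  django: &django
    build:
      context: .
      dockerfile: ./compose/production/django/Dockerfile
    image: my_project_production_django  # placeholder slug
    depends_on:
      - postgres
      - redis
    env_file:
      - ./.envs/.production/.django
      - ./.envs/.production/.postgres
    command: /start
```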

BenA-SA commented 2 years ago

> Can you post production.yml here? Production depends on a different Dockerfile and uses a different start script. Did you make changes to the production Dockerfile?

I've resolved it. I had made changes to production.yml to include Selenium and thought I had not changed anything else; however, upon comparing my production file against the GitHub cookiecutter template, I found I had somehow fat-fingered an extra letter into one of the image names. I can't believe I spent so long resolving such a random and stupid accident.

Thanks for your assistance in resolving this! I can't express how much I appreciate it.