Closed: snowme34 closed this issue 5 years ago.
Remember to prefix the commit message with the number of the issue, so for this one it would be #50: ...
Rather than using `export SDSCHEDULE_SCRAPE=0`, we could just use `unset SDSCHEDULE_SCRAPE`. It's cleaner; that way we can be sure nothing persists.
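For illustration, a minimal shell sketch of the difference (this assumes the back-end only checks whether the variable is set at all; if it checks the value instead, `export SDSCHEDULE_SCRAPE=0` would behave the same as unsetting it):

```sh
export SDSCHEDULE_SCRAPE=0   # variable still exists in the environment, just with value "0"
env | grep SDSCHEDULE        # prints SDSCHEDULE_SCRAPE=0

unset SDSCHEDULE_SCRAPE      # variable is removed from the environment entirely
env | grep SDSCHEDULE        # prints nothing
```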
~~Also, I thought running the script from the command line doesn't persist variables? As in, exporting a variable from the script won't save it to the command-line session.~~ Ignore this portion. I'll think about a cleaner solution.
I've submitted some changes. Will remove myself as a reviewer.
My changes can be summed up as follows: rather than using the docker_install.sh script, we should just have the user run `docker-compose` themselves, since that's all the script was doing anyway. Then we can have the user switch commands depending on their current context (listed in the README; both commands are sketched below). I think it's a lot cleaner this way, and it also avoids much of the reliance on `export` that was being discussed earlier. Sure, the user could `export SDSCHEDULE_SCRAPE=1` for their bash/zsh session if they wanted to, but one of the commands I listed already does that and doesn't screw with the global environment.
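Roughly, the two README commands would look like this (a sketch; it assumes the compose file forwards `SDSCHEDULE_SCRAPE` from the host environment, e.g. via an `environment:` entry):

```sh
# Run without scraping: SDSCHEDULE_SCRAPE is simply absent from the environment.
docker-compose up

# Run with scraping: the variable is set for this one command only,
# so nothing leaks into the rest of the shell session.
SDSCHEDULE_SCRAPE=1 docker-compose up
```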
@snowme34 Can you spend some time trying to find a better way? Ideally there should be one command that runs with scraping and one that runs without.
Either we let the user set the environment variable, or we maintain different docker-compose files for different situations, or we separate the scraper from the back-end (a sketch of the second option is below). Something as trivial as this should not require such complex solutions! Please look this over and clean it up if possible.
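For the second option, a hypothetical override file could look like the following (the file name and service name are assumptions for illustration, not from this thread, and would have to match the project's base compose file):

```yaml
# docker-compose.scrape.yml: layered on top of the base file with
#   docker-compose -f docker-compose.yml -f docker-compose.scrape.yml up
version: "3"
services:
  backend:                      # must match the service name in docker-compose.yml
    environment:
      SDSCHEDULE_SCRAPE: "1"    # enable scraping only when this file is included
```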
LGTM! By doing that, it sets the bash variable in the context of the running session, correct?