Closed: TheGITofTeo997 closed this issue 2 years ago
I quickly read your debug log. I didn't find any obvious problem.
I tested one more time using the latest image (currently 2022.09.2) and I got the password set via WEBPASSWORD:
...
[i] Installing latest logrotate script...
[i] Existing logrotate file found. No changes made.
[i] Assigning password defined by Environment Variable
[✓] New password set
[i] Added ENV to php:
...
What messages do you see in docker's log when you start the container?
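If it is easier, the password-related lines can be filtered out of the log directly; a minimal example, assuming the container is named pihole:

docker logs pihole 2>&1 | grep -i password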
Thanks for the answer. I updated to 2022.09.2 too (I was using .1), but things are still not working.
What I've also noticed is that the behaviour I described here is actually a bit different: I don't get a new password on every restart, I always get the SAME one (which unfortunately is not mine).
If I inspect setupVars.conf after the container has started, I always see the line:
WEBPASSWORD=196dec5af4f0dfc9fbf53ee837e9d515482d8b40f248466907d548f699cd5b53
I got used to it and memorized it, but again, this is far from what I've set in the env var.
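For anyone following along, a quick way to check the stored value from the host (the container name pihole is an assumption):

docker exec pihole grep '^WEBPASSWORD=' /etc/pihole/setupVars.conf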
This is the log I get on my Raspberry Pi with my customized configuration (I just added some vars for upstream servers):
root@dns:~# docker logs pihole
s6-rc: info: service s6rc-oneshot-runner: starting
s6-rc: info: service s6rc-oneshot-runner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service legacy-cont-init: starting
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service cron: starting
s6-rc: info: service cron successfully started
s6-rc: info: service _uid-gid-changer: starting
s6-rc: info: service _uid-gid-changer successfully started
s6-rc: info: service _startup: starting
[i] Starting docker specific checks & setup for docker pihole/pihole
[i] Setting capabilities on pihole-FTL where possible
[i] Applying the following caps to pihole-FTL:
* CAP_CHOWN
* CAP_NET_BIND_SERVICE
* CAP_NET_RAW
[i] Ensuring basic configuration by re-running select functions from basic-install.sh
[i] Installing configs from /etc/.pihole...
[i] Existing dnsmasq.conf found... it is not a Pi-hole file, leaving alone!
[✓] Installed /etc/dnsmasq.d/01-pihole.conf
[✓] Installed /etc/dnsmasq.d/06-rfc6761.conf
[i] Installing latest logrotate script...
[i] Existing logrotate file found. No changes made.
[i] Assigning password defined by Environment Variable
[✓] New password set
[i] Added ENV to php:
"TZ" => "Europe/Rome",
"PIHOLE_DOCKER_TAG" => "2022.09.2",
"PHP_ERROR_LOG" => "/var/log/lighttpd/error-pihole.log",
"CORS_HOSTS" => "",
"VIRTUAL_HOST" => "0.0.0.0",
[i] Using IPv4 and IPv6
[i] Preexisting ad list /etc/pihole/adlists.list detected (exiting setup_blocklists early)
[i] Converting DNS1 to PIHOLE_DNS_
[i] Converting DNS2 to PIHOLE_DNS_
[i] Setting DNS servers based on PIHOLE_DNS_ variable
[i] Applying pihole-FTL.conf setting LOCAL_IPV4=0.0.0.0
[i] FTL binding to default interface: eth0
[i] Enabling Query Logging
[i] Testing lighttpd config: Syntax OK
[i] All config checks passed, cleared for startup ...
[i] Docker start setup complete
Pi-hole version is v5.12 (Latest: v5.12)
AdminLTE version is v5.14.2 (Latest: v5.14.2)
FTL version is v5.17 (Latest: v5.17)
Container tag is: 2022.09.2
[i] pihole-FTL (no-daemon) will be started as pihole
s6-rc: info: service _startup successfully started
s6-rc: info: service pihole-FTL: starting
s6-rc: info: service pihole-FTL successfully started
s6-rc: info: service lighttpd: starting
s6-rc: info: service lighttpd successfully started
s6-rc: info: service _gravityonboot: starting
s6-rc: info: service _gravityonboot successfully started
s6-rc: info: service legacy-services: starting
Checking if custom gravity.db is set in /etc/pihole/pihole-FTL.conf
s6-rc: info: service legacy-services successfully started
[i] Neutrino emissions detected...
[✓] Pulling blocklist source list into range
[✓] Preparing new gravity database
[i] Using libz compression
[i] Target: https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts
[✓] Status: Retrieval successful
[i] Analyzed 137497 domains
[i] List stayed unchanged
[i] Target: http://sysctl.org/cameleon/hosts
[✓] Status: No changes detected
[i] Analyzed 20562 domains
[i] Target: https://s3.amazonaws.com/lists.disconnect.me/simple_tracking.txt
[✗] Status: s3.amazonaws.com is blocked by . Using DNS on 172.18.0.2#5054 to download https://s3.amazonaws.com/lists.disconnect.me/simple_tracking.txt
[✓] Status: No changes detected
[i] Analyzed 34 domains
[i] Target: https://s3.amazonaws.com/lists.disconnect.me/simple_ad.txt
[✗] Status: s3.amazonaws.com is blocked by . Using DNS on 172.18.0.2#5054 to download https://s3.amazonaws.com/lists.disconnect.me/simple_ad.txt
[✓] Status: No changes detected
[i] Analyzed 2701 domains
[i] Target: http://phishing.mailscanner.info/phishing.bad.sites.conf
[✓] Status: Retrieval successful
[i] Analyzed 47574 domains
[i] List has been updated
[i] Target: https://gitlab.com/quidsup/notrack-blocklists/raw/master/notrack-blocklist.txt
[✓] Status: Retrieval successful
[i] Analyzed 14970 domains
[i] List stayed unchanged
[i] Target: https://gitlab.com/quidsup/notrack-blocklists/raw/master/notrack-malware.txt
[✓] Status: Retrieval successful
[i] Analyzed 341 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/crazy-max/WindowsSpyBlocker/master/data/hosts/spy.txt
[✓] Status: Retrieval successful
[i] Analyzed 347 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/w13d/adblockListABP-PiHole/master/Spotify.txt
[✓] Status: Retrieval successful
[i] Analyzed 47 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/StevenBlack/hosts/master/data/StevenBlack/hosts
[✓] Status: Retrieval successful
[i] Analyzed 2164 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/FadeMind/hosts.extras/master/add.Spam/hosts
[✓] Status: Retrieval successful
[i] Analyzed 57 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/FadeMind/hosts.extras/master/add.Risk/hosts
[✓] Status: Retrieval successful
[i] Analyzed 2190 domains
[i] List stayed unchanged
[i] Target: https://adaway.org/hosts.txt
[✓] Status: No changes detected
[i] Analyzed 7193 domains
[i] Target: https://v.firebog.net/hosts/AdguardDNS.txt
[✓] Status: No changes detected
[i] Analyzed 47292 domains
[i] Target: https://raw.githubusercontent.com/anudeepND/blacklist/master/adservers.txt
[✓] Status: Retrieval successful
[i] Analyzed 42553 domains
[i] List stayed unchanged
[i] Target: https://v.firebog.net/hosts/Easylist.txt
[✓] Status: No changes detected
[i] Analyzed 21021 domains, 1 domains invalid!
Sample of invalid domains:
- wytyphoji.pro&popup
[i] Target: https://pgl.yoyo.org/adservers/serverlist.php?hostformat=hosts&showintro=0&mimetype=plaintext
[✓] Status: No changes detected
[i] Analyzed 3671 domains
[i] Target: https://raw.githubusercontent.com/FadeMind/hosts.extras/master/UncheckyAds/hosts
[✓] Status: Retrieval successful
[i] Analyzed 9 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/bigdargon/hostsVN/master/hosts
[✓] Status: Retrieval successful
[i] Analyzed 17417 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/jdlingyu/ad-wars/master/hosts
[✓] Status: Retrieval successful
[i] Analyzed 1665 domains
[i] List stayed unchanged
[i] Target: https://raw.githubusercontent.com/DandelionSprout/adfilt/master/Alternate%20versions%20Anti-Malware%20List/AntiMalwareHosts.txt
[✓] Status: Retrieval successful
[i] Analyzed 8344 domains
[i] List stayed unchanged
[i] Target: https://osint.digitalside.it/Threat-Intel/lists/latestdomains.txt
[✓] Status: No changes detected
[i] Analyzed 62 domains
[i] Target: https://s3.amazonaws.com/lists.disconnect.me/simple_malvertising.txt
[✗] Status: s3.amazonaws.com is blocked by . Using DNS on 172.18.0.2#5054 to download https://s3.amazonaws.com/lists.disconnect.me/simple_malvertising.txt
[✓] Status: No changes detected
[i] Analyzed 2735 domains
[i] Target: https://v.firebog.net/hosts/Prigent-Malware.txt
[✓] Status: No changes detected
[i] Analyzed 64491 domains
[i] Target: https://bitbucket.org/ethanr/dns-blacklists/raw/8575c9f96e5b4a1308f2f12394abd86d0927a4a0/bad_lists/Mandiant_APT1_Report_Appendix_D.txt
[✓] Status: No changes detected
[i] Analyzed 2046 domains
[i] Target: https://phishing.army/download/phishing_army_blocklist_extended.txt
[✓] Status: Retrieval successful
[i] Analyzed 110008 domains
[i] List has been updated
[i] Target: https://v.firebog.net/hosts/Shalla-mal.txt
[✗] Status: Not found
[✗] List download failed: using previously cached list
[i] Analyzed 19239 domains
[i] Target: https://raw.githubusercontent.com/Spam404/lists/master/main-blacklist.txt
[✓] Status: Retrieval successful
[i] Analyzed 8147 domains
[i] List stayed unchanged
[i] Target: https://urlhaus.abuse.ch/downloads/hostfile/
[✓] Status: Retrieval successful
[i] Analyzed 718 domains
[i] List has been updated
[i] Target: https://raw.githubusercontent.com/HorusTeknoloji/TR-PhishingList/master/url-lists.txt
[✓] Status: Retrieval successful
[i] Analyzed 826138 domains
[i] List stayed unchanged
[i] Target: https://v.firebog.net/hosts/Airelle-hrsk.txt
[✗] Status: Not found
[✗] List download failed: no cached list available
[i] Target: https://blocklistproject.github.io/Lists/drugs.txt
[✓] Status: No changes detected
[i] Analyzed 26588 domains
[i] Target: https://blocklistproject.github.io/Lists/fraud.txt
[✓] Status: No changes detected
[i] Analyzed 196088 domains
[i] Target: https://blocklistproject.github.io/Lists/malware.txt
[✓] Status: No changes detected
[i] Analyzed 435308 domains
[i] Target: https://blocklistproject.github.io/Lists/phishing.txt
[✓] Status: No changes detected
[i] Analyzed 190241 domains
[i] Target: https://blocklistproject.github.io/Lists/ransomware.txt
[✓] Status: No changes detected
[i] Analyzed 1904 domains
[i] Target: https://blocklistproject.github.io/Lists/scam.txt
[✓] Status: No changes detected
[i] Analyzed 1265 domains
[i] Target: https://blocklistproject.github.io/Lists/ads.txt
[✓] Status: No changes detected
[i] Analyzed 154558 domains
[i] Target: https://github.com/mhxion/pornaway/blob/8a807e4bf0cda9cc94879624e8674889d5c81dd5/hosts/porn_ads.txt
[✓] Status: Retrieval successful
[i] Analyzed 15 domains, 15 domains invalid!
Sample of invalid domains:
- content="github.com">
- content="github.com">
- data-target="input-demux.source"
- data-targets="input-demux.sinks"
- cache-key="v0:1613426343.298404"
[i] List stayed unchanged
[✓] Creating new gravity databases
[✓] Storing downloaded domains in new gravity database
[✓] Building tree
[✓] Swapping databases
[✓] The old database remains available.
[i] Number of gravity domains: 2417184 (2060291 unique domains)
[i] Number of exact blacklisted domains: 22
[i] Number of regex blacklist filters: 0
[i] Number of exact whitelisted domains: 47
[i] Number of regex whitelist filters: 0
[✓] Cleaning up stray matter
[✓] FTL is listening on port 53
[✓] UDP (IPv4)
[✓] TCP (IPv4)
[✓] UDP (IPv6)
[✓] TCP (IPv6)
[✓] Pi-hole blocking is enabled
I'm guessing here, but maybe your current password has "different" characters or a space, and the final password ends up different from the one you expected.
Can you test again using a very simple password (like 123456)?
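A minimal throwaway test along those lines (the container name and tag are assumptions, and the sleep just gives the startup scripts time to run):

docker run --rm -d --name pihole_pwtest -e WEBPASSWORD='123456' pihole/pihole:2022.09.2
sleep 15
docker logs pihole_pwtest 2>&1 | grep -i password
docker stop pihole_pwtest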
Okay, so some heavy misunderstanding was going on here. I've found the issue.
This is actually a bit embarrassing, so I hope I will be forgiven. Here's what was going on:
I've had my launch script for a long time (probably years, back to when docker-run.sh was in the root of this repo, before it was reorganized). For some reason that I don't even remember, I had my env variable WEBPASSWORD set equal to the HASH and not the actual password! (Yes, you can be mad at me now.)
So I had it like -e WEBPASSWORD="eb7650860e0b1b80adb014f96f075fe9e515937907c95dde7a70762483aa0972"
The thing is, for all these years this worked flawlessly, and now I'm wondering why. Of course, now I've found out that I have to store the password as plain text, which means my actual password was being hashed TWO TIMES; in fact, I was able to log in using my hash as the password.
This raises some questions for me. First, of course: how did this work for years if I was doing it the wrong way (maybe the use of this variable changed over time)? If it was wrong, why did it break at the same time the WEBPASSWORD issue was introduced? And finally, why did removing the "hashed hash" from the running container's setupVars.conf work? By doing so I was able to log in with my ACTUAL password.
Now I've learned the lesson, and I will store it in plaintext if there is no better way. Thanks for understanding.
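To illustrate the "hashed two times" point: Pi-hole v5 stores the web password as a double SHA-256 hash, so if the env var already contains the hash, it simply gets hashed again. A minimal sketch (hash_password is a stand-in for the real installer function, and the sample password is made up):

hash_password() {
    # hash twice with SHA-256, keeping only the hex digest each time
    local first
    first=$(echo -n "$1" | sha256sum | awk '{print $1}')
    echo -n "$first" | sha256sum | awk '{print $1}'
}

stored=$(hash_password 'my-real-password')  # what setupVars.conf should hold
double=$(hash_password "$stored")           # the "hashed hash" you get when
                                            # WEBPASSWORD is set to the hash itself
echo "$stored"
echo "$double"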
The behavior was changed a few months ago.
The original behavior was to always use the setupVars password, if set. We only used the environment variable if there was nothing set.
Now, the behavior is to always use the WEBPASSWORD var. If not present, use setupVars. If not present, generate a new one.
In your case:
When you removed the WEBPASSWORD var, nothing new was set; the hash (already set in setupVars) was used and you could log in with your password.
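In pseudo-shell, the current precedence roughly looks like this (a sketch of the described logic, not the actual startup script):

if [ -n "${WEBPASSWORD+x}" ]; then
    # env var takes priority and is hashed into setupVars.conf on every start
    pihole -a -p "$WEBPASSWORD"
elif grep -q '^WEBPASSWORD=' /etc/pihole/setupVars.conf; then
    :   # keep the hash already stored in setupVars.conf
else
    # nothing set anywhere: generate a random password
    pihole -a -p "$(tr -dc 'A-Za-z0-9' </dev/urandom | head -c 8)"
fi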
Now I've learned the lesson, and I will store it in plaintext if there is no better way

If you really want to avoid storing your password, you can execute pihole -a -p after you create your container.
Note: storing the hash is also not safe (you shouldn't post your hash if your system can be accessed externally).
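In a container, that would look something like this (the container name pihole is an assumption); run without a password argument, pihole -a -p prompts interactively, so nothing ends up in your compose file or shell history:

docker exec -it pihole pihole -a -p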
I've actually restored the original, desired behaviour. I ran the container once with the env var WEBPASSWORD, so that this time it could create the right hash in setupVars; after that, I stopped the container and deleted the plaintext env var. By doing so, as you suggested, the password is now picked up from the hashed value in setupVars, and the login works. Sorry again for this
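For completeness, a sketch of that sequence (container name, volume name, and image tag are assumptions; the key point is that /etc/pihole persists between the two runs):

docker run -d --name pihole -v pihole-etc:/etc/pihole \
    -e WEBPASSWORD='my-real-password' pihole/pihole:2022.09.2
docker stop pihole && docker rm pihole
# re-create without WEBPASSWORD; the hash already stored in
# setupVars.conf on the volume is picked up instead
docker run -d --name pihole -v pihole-etc:/etc/pihole pihole/pihole:2022.09.2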
This is a: Bug
Details
Hello, as already discussed in #1087, I am unfortunately still getting this behavior. Whenever I restart the container (or the machine it is hosted on), the password I specified through -e WEBPASSWORD gets overwritten/changed. The only way to log into the interface is to manually remove the
WEBPASSWORD="different stuff from what I've specified"
from setupVars.conf, and then it starts to work. To make sure it was not something in my config, I ran many tests (I hope it's actually not something in my config). As stated in #1087, I first tried to run the script on my Raspberry Pi without any -v volume, in a new container with the latest 2022.09.1 tag, but this did not fix it. So I ran a final test: I installed a fresh Debian 9 in VMware, git cloned this repo, and ran the container with the vanilla script, just including my WEBPASSWORD and leaving everything else as-is. I still got the same behaviour.
At this point I don't know if I am missing something on my end; if so, I apologize for opening a duplicate issue, and in that case please just close this.
As I was asked, I generated a debug token from my Raspberry Pi, which you can find at: https://tricorder.pi-hole.net/yBqCXD9x/
Thanks in advance
Related Issues
#1087
How to reproduce the issue
Environment data
Note: Since I get this issue on my Raspberry Pi 3, where I use Pi-hole all the time, I decided to run an additional fresh session on a new Debian machine to rule out possible issues. I will state both.
docker-compose.yml contents, docker run shell command, or paste a screenshot of any UI based configuration of containers here
any additional info to help reproduce
These common fixes didn't work for my issue
- Tried the default docker run example(s) in the readme (removing any customizations I added)

If the above debugging / fixes revealed any new information, note it here. Add any other debugging steps you've taken or theories on root cause that may help.