Closed — jkaberg closed this 8 years ago
Tested and confirmed working
I hit that issue as well. The weird thing is that robots.txt isn't disallowing any robot as far as I could tell. Also, it happened only when reusing a container, and doing a docker-compose rm first fixes it for the next start.
Did you encounter any SSL issue? The Plex website should be signed, and removing that check makes me uncomfortable. Maybe split it into another PR.
@wernight yeah, I did get an SSL error (see OP), don't know why?
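For reference, disabling certificate verification (the check the patch removes) amounts to something like the following with Python's stdlib `ssl` module; this is only a sketch of what such a patch typically does, not the actual script in question:

```python
import ssl

# Default context verifies the server certificate, which is the behavior
# @wernight would prefer to keep.
strict = ssl.create_default_context()
print(strict.verify_mode == ssl.CERT_REQUIRED)  # True

# Disabling verification accepts any certificate, valid or not.
# check_hostname must be disabled before verify_mode can be relaxed.
insecure = ssl.create_default_context()
insecure.check_hostname = False
insecure.verify_mode = ssl.CERT_NONE
print(insecure.verify_mode == ssl.CERT_NONE)  # True
```

This is why removing the check is uncomfortable: it silently opens the door to man-in-the-middle attacks on every download.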
I may merge this as a temporary fix, but I'd like to find the root cause. Are you using docker or docker-compose and/or do you have steps to reproduce the issue?
Just normal docker. I simply run the container as in the readme.md and restart when I see an update (or, like today, when Plex was acting up), which gave me the errors you see in the OP.
Steps then seem to be:
docker[-compose] up -d ...
docker[-compose] stop && docker[-compose] up -d ...
Does that match your steps? For example, did you just kill the container, or did you docker rm it?
@wernight I only did docker restart <container id> (after initial setup)
Yes, so that still follows the steps. docker rm also fixes it, so something must persist across restarts, some caching. I'm disappointed that Mechanize doesn't properly document its caching mechanism.
By default, Mechanize checks robots.txt before doing anything more, and since Plex is now serving a robots.txt we need to bypass this check to make the container work. Obviously not what the Plex team wants.
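The check in question can be sketched with Python's stdlib `urllib.robotparser` (a disallow-all robots.txt is assumed here purely for illustration; in Python's mechanize the check is skipped with `browser.set_handle_robots(False)`):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; the actual file Plex serves may differ.
robots_txt = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(robots_txt)

# A polite client (as Mechanize is by default) consults this before
# fetching any URL, and refuses when the path is disallowed.
print(rp.can_fetch("Mozilla/5.0", "https://plex.tv/downloads"))  # False
```

Bypassing the check simply means never consulting this parser before issuing the request.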
This fixes:
and (when applying the ignore-robots.txt patch you get:)