Closed MikeHofmann closed 4 years ago
I’m uncertain how I can action this. I could set up a proxy and see if my mirroring fails, but I would expect it to just work and be hard to reproduce.
The same files failing each time says to me your proxy is maybe caching the upstream errors it got?
I’d suggest trying:
P.S. I’d also suggest taking the example blacklist out of your config :)

Also, the tips I gave here on things to try tweaking might help too: https://github.com/pypa/bandersnatch/issues/511#issuecomment-626748283
I moved to a host with a direct internet connection. The base problem seems to persist, even if there is no proxy involved.
> The same files failing each time says to me your proxy is maybe caching the upstream errors it got?
The proxy is only for whitelisting specific sites and doesn't do any caching.
> 1 or 2 workers - do you still fail?
switched to 1 worker
> Use tcpdump to compare wget (working) behavior and aiohttp...
working on it.
I changed my config a little:
```
root@5a77221e3b7d:/# grep -Ev "^(;|$)" /etc/bandersnatch.conf
[mirror]
directory = /mnt/pypi
json = false
master = https://pypi.org
timeout = 10
workers = 1
hash-index = false
stop-on-error = false
verifiers = 3
[plugins]
enabled =
    whitelist_project
[whitelist]
packages =
    tf-nightly
```
so it just tries to mirror tf-nightly (the `operations` package seems to be gone from pypi.org).
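As a quick sanity check that the multi-line list values in a config like the one above parse the way an INI reader expects, here is a small sketch using Python's `configparser` (the inline config is a trimmed copy of the one shown, not the full file):

```python
import configparser

# Trimmed copy of the bandersnatch.conf above; multi-line list values
# are indented under their key in INI continuation style.
CONFIG = """\
[mirror]
directory = /mnt/pypi
timeout = 10
workers = 1

[plugins]
enabled =
    whitelist_project

[whitelist]
packages =
    tf-nightly
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG)

# Continuation lines come back as one newline-joined string.
packages = parser.get("whitelist", "packages").split()
print(packages)  # ['tf-nightly']
```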
I got this running inside a docker container (based on ubuntu:bionic). This allows me to test this on a local machine, as well as from our internal site, using mostly the same conditions (except for the proxy).
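For reference, that container setup can be sketched roughly like this. This is a hypothetical Dockerfile, not the one actually used; the install steps and the assumption that bandersnatch reads `/etc/bandersnatch.conf` by default are mine:

```
# Hypothetical sketch of the setup described above
FROM ubuntu:bionic
RUN apt-get update && apt-get install -y python3-pip
RUN pip3 install bandersnatch
# assumed default config location
COPY bandersnatch.conf /etc/bandersnatch.conf
VOLUME /mnt/pypi
CMD ["bandersnatch", "mirror"]
```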
ok, I believe I found the problem. I changed `timeout` to 120 in `bandersnatch.conf` and now the mirror runs. As it doesn't have anything to do with our proxy, I'm closing this but will make a contribution in #511
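One plausible reading of why raising `timeout` helped (an assumption about the semantics, not verified against bandersnatch or aiohttp internals): if the timeout bounds the whole transfer rather than each read, a slow-but-steady download through a constrained link can trip it even though no single read ever stalls, while a per-read timeout (closer to how wget behaves by default) would not. A small asyncio sketch of the difference:

```python
import asyncio

async def chunk(delay=0.02):
    # One piece of a download; each piece arrives promptly.
    await asyncio.sleep(delay)
    return b"x"

async def whole_transfer(n=20):
    # Steady progress, but roughly 0.4s in total.
    data = b""
    for _ in range(n):
        data += await chunk()
    return data

async def main():
    # A *total* deadline trips even though no single read stalls:
    # ~0.4s of steady progress exceeds the 0.2s budget.
    try:
        await asyncio.wait_for(whole_transfer(), timeout=0.2)
        total_ok = True
    except asyncio.TimeoutError:
        total_ok = False

    # A *per-read* deadline succeeds: every 0.02s chunk
    # comfortably beats the 0.2s budget.
    per_read_ok = True
    try:
        for _ in range(20):
            await asyncio.wait_for(chunk(), timeout=0.2)
    except asyncio.TimeoutError:
        per_read_ok = False

    return total_ok, per_read_ok

result = asyncio.run(main())
print(result)  # (False, True)
```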
I'm behind a firewall and can't access `pypi.org` directly. Instead I use a `squid3` with `squidguard` proxy to access `pypi.org`. We provide the proxy address via env variable like so:

Please note, the proxy uses `http` (without s).

Mirroring now always fails each time on the same set of packages, with for example:

I can however download the packages with, for example, `wget` without any problems.

Additional info: