molu8bits / s3bucket_exporter

S3 bucket exporter exposing size and number-of-objects metrics
MIT License
20 stars 10 forks

cannot make it work on S3 compatible storage #14

Closed · pierreloicq closed 5 months ago

pierreloicq commented 5 months ago

Hi, I can't make it work on an S3-compatible storage that is not AWS. On my local computer (Windows 10, Docker Desktop, Git Bash) I run:

docker run -p 9655:9655 -d \
  -e LISTEN_PORT=:9655 \
  -e S3_DISABLE_SSL=False \
  -e S3_ENDPOINT=https://s3.waw2-1.cloudferro.com \
  -e S3_ACCESS_KEY=xxxxxxxxx \
  -e S3_SECRET_KEY=xxxxxxxxxxxxxxxx \
  -e S3_NAME=s3_cloudferro \
  -e S3_DISABLE_ENDPOINT_HOST_PREFIX=True \
  -e LOG_LEVEL=Debug \
  -e S3_FORCE_PATH_STYLE=True \
  docker.io/molu8bits/s3bucket_exporter:1.0.2

Then I run: curl -v http://127.0.0.1:9655/metrics

I get:

*   Trying 127.0.0.1:9655...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0* Connected to 127.0.0.1 (127.0.0.1) port 9655 (#0)
> GET /metrics HTTP/1.1
> Host: 127.0.0.1:9655
> User-Agent: curl/7.84.0
> Accept: */*
>
  0     0    0     0    0     0      0      0 --:--:--  0:07:56 --:--:--     0

In the logs I have:

time="2024-04-12T10:13:08Z" level=info msg="Beginning to serve on port:9655"
time="2024-04-12T10:13:08Z" level=info msg="s3 name 's3_wekeo' available at s3 endpoint 'https://s3.waw2-1.cloudferro.com' will be monitored"
time="2024-04-12T10:13:08Z" level=info msg="listenPort ::9655"

In the Docker statistics, after a minute or so, the CPU usage increases and the network input grows by ~3 Mb/sec, reaching 500 Mb after 2-3 minutes.

S3_NAME is not supposed to match anything, right? It's just a string I choose. Do you see any mistake?

Thank you

molu8bits commented 5 months ago

Hi, it is supposed to work with S3-compatible storage. S3_NAME doesn't matter; it's just a string you choose. Do you know how many objects are in this storage? I've never tested it with a really large number of objects in a bucket, and I suspect that might be the problem.
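As a rough back-of-the-envelope illustration (this is not the exporter's actual code): S3's ListObjectsV2 API returns at most 1,000 keys per response, so computing total size and object count over a very large bucket requires a long chain of sequential paginated calls, which would explain the sustained CPU and network load during a scrape:

```python
import math

# ListObjectsV2 returns at most 1,000 keys per response page,
# so a full inventory of a bucket must be paginated sequentially.
PAGE_SIZE = 1000

def list_calls_needed(n_objects: int, page_size: int = PAGE_SIZE) -> int:
    """Number of ListObjectsV2 calls needed to enumerate n_objects keys."""
    return math.ceil(n_objects / page_size)

# A bucket with 50 million objects needs 50,000 sequential API calls,
# each returning a sizeable XML body, for every single metrics scrape.
print(list_calls_needed(50_000_000))  # 50000
```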

pierreloicq commented 5 months ago

Alright, thank you. I do indeed have a lot of objects, more than 50 million.

pierreloicq commented 5 months ago

In the end, since my goal was just to check that my S3 is reachable, I created a bucket with 1 file, targeted it via the S3_ENDPOINT URL, and it works. Thank you
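For a pure reachability check, a probe that never lists objects stays cheap regardless of bucket size. A minimal sketch, assuming a boto3-style client; the `s3_reachable` helper and the client/bucket names are placeholders, not part of the exporter:

```python
def s3_reachable(client, bucket: str) -> bool:
    """Return True if a HeadBucket call against the endpoint succeeds.

    `client` is assumed to expose a boto3-style head_bucket(Bucket=...)
    method. HeadBucket transfers no object listing, so the check stays
    fast even on buckets holding tens of millions of objects.
    """
    try:
        client.head_bucket(Bucket=bucket)
        return True
    except Exception:  # unreachable endpoint, bad credentials, missing bucket
        return False
```

With boto3 this would be called as `s3_reachable(boto3.client("s3", endpoint_url="https://s3.waw2-1.cloudferro.com", ...), "my-bucket")`. The one-file-bucket workaround above achieves the same effect through the exporter, since listing a single object completes almost instantly.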