lancachenet / monolithic

A monolithic lancache service capable of caching all CDNs in a single instance
https://hub.docker.com/r/lancachenet/monolithic

Running multiple instances of lancachenet/monolithic #116

Closed · darki73 closed 3 years ago

darki73 commented 3 years ago

Describe the issue you are having

How are you running the container(s)

Running them in Unraid via the Docker interface, each container on its own IP address (192.168.2.201-204 for monolithic and 192.168.2.200 for DNS)
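For reference, a minimal CLI sketch of this kind of setup; the network name, parent interface, and volume paths are assumptions (Unraid's "Custom: br0" network plays the same role as the macvlan created here):

```bash
# Assumed macvlan network matching the 192.168.2.0/24 LAN above;
# "lan" and "eth0" are placeholders for this sketch.
docker network create -d macvlan \
  --subnet=192.168.2.0/24 --gateway=192.168.2.1 \
  -o parent=eth0 lan

# One monolithic instance pinned to its own IP; the host paths are
# examples, the /data/cache and /data/logs mount points are the
# container's documented cache and log locations.
docker run -d --name lancache-1 \
  --network lan --ip 192.168.2.201 \
  -v /mnt/user/lancache/cache-1:/data/cache \
  -v /mnt/user/lancache/logs-1:/data/logs \
  lancachenet/monolithic:latest
```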


DNS Configuration

Not using the generic cache; specifying an IP for each service I want to cache data from
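As a rough sketch of that configuration (the `<SERVICE>CACHE_IP` variable names and the service-to-IP mapping are assumptions for illustration; check the lancache-dns README for the exact names):

```bash
# Hedged sketch: lancache-dns answering with a different cache IP per
# service, instead of one generic LANCACHE_IP for everything.
docker run -d --name lancache-dns \
  --network lan --ip 192.168.2.200 \
  -e UPSTREAM_DNS=8.8.8.8 \
  -e STEAMCACHE_IP=192.168.2.201 \
  -e BLIZZARDCACHE_IP=192.168.2.202 \
  -e ORIGINCACHE_IP=192.168.2.203 \
  -e EPICGAMESCACHE_IP=192.168.2.204 \
  lancachenet/lancache-dns:latest
```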


Server Specs

Processor: Ryzen Threadripper 2970WX
Memory: 128GB Kingston ECC 3200MHz
Network Interface Card: Aquantia 10Gbit

All files are initially downloaded to a 1TB Samsung 970 Evo NVMe drive, so I/O is not a bottleneck; they are then transferred to a 7200 RPM HDD.

Output of container(s)

My question is: I am seeing an increase in download speed on all clients (initial download); however, when I watch docker stats, the amount of data received by the containers is NOT what the clients show.

So is it worth running multiple monolithic containers with one DNS container? And is the mismatch between the amount downloaded shown in docker stats and the actual client download an error?

One more question: is monolithic able to determine whether a cached version of a chunk already exists, or does it simply overwrite the data? I followed the Adding more IPs guide and ended up with 1.2G downloaded on the client and 1.2G × 4 containers in docker stats.

Here is what my nslookup output looks like (screenshot).
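For reference, a query of the kind shown in the screenshot, using lancache.steamcontent.com as an example of a cached hostname:

```bash
# Query the lancache-dns container directly; a cached hostname should
# resolve to the cache IP rather than the real CDN.
nslookup lancache.steamcontent.com 192.168.2.200
# Expected: an answer of 192.168.2.201, if Steam is mapped to that instance
```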

MathewBurnett commented 3 years ago

When a file is requested, monolithic will fetch it from the internet if it is missing, or hand it out if it is already cached.
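A quick way to see this from a client is to request the same object twice against a cache IP; the second response should come from disk and return noticeably faster. The Host header is a real cached hostname, but the depot path here is a made-up example:

```bash
# First request warms the cache; repeat it and compare timings.
curl -s -o /dev/null -w 'time: %{time_total}s\n' \
  -H 'Host: lancache.steamcontent.com' \
  http://192.168.2.201/depot/123456/chunk/0123456789abcdef
```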

I have not found docker stats to be particularly accurate when it comes to net/disk io.
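A more trustworthy signal is the cache's own access log, which records the cache status per request. A minimal sketch, assuming the container name and default /data/logs mount from the run example above (adjust the pattern if your log format differs):

```bash
# Tally HIT vs MISS entries in the monolithic access log.
docker exec lancache-1 sh -c \
  'grep -oE "\"(HIT|MISS)\"" /data/logs/access.log | sort | uniq -c'
```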

stale[bot] commented 3 years ago

This issue has been automatically marked as inactive because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 3 years ago

This issue has been automatically closed after being inactive for 30 days. If you require further assistance, please reopen the issue with more details or talk to us on Discord.