Right now, when storing a large amount of data, the CID provider system can consume a lot of memory. It's also probably unnecessary to re-announce to the network that you're providing x TB of data multiple times a day (or however long the reprovide interval is).
Instead, we should enable a slightly tweaked provider system: one that randomly selects a subset of the data we're storing and reprovides that, rather than announcing the entire dataset all at once. In theory, if you were storing a sufficiently large amount of data, say a petabyte or so, you probably couldn't even provide the entire set within a 24 hour window.
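
One way this could look (a rough sketch, not a concrete implementation proposal): reservoir-sample a fixed-size batch of CIDs from the blockstore each cycle, so memory stays bounded regardless of how much data the node stores, and announce only that batch. The `listCIDs` / `provide` hooks below are hypothetical stand-ins, not the actual go-ipfs APIs.

```go
package reprovider

import (
	"context"
	"math/rand"
	"time"
)

// sampleCIDs draws k CIDs uniformly at random from a stream using
// reservoir sampling, so memory use stays O(k) no matter how many
// blocks the node stores.
func sampleCIDs(ctx context.Context, cids <-chan string, k int) []string {
	reservoir := make([]string, 0, k)
	seen := 0
	for {
		select {
		case <-ctx.Done():
			return reservoir
		case c, ok := <-cids:
			if !ok {
				return reservoir
			}
			seen++
			if len(reservoir) < k {
				reservoir = append(reservoir, c)
			} else if j := rand.Intn(seen); j < k {
				// Each CID ends up in the batch with equal probability k/seen.
				reservoir[j] = c
			}
		}
	}
}

// reprovideLoop announces a fresh random sample every interval instead
// of re-announcing the entire dataset. listCIDs and provide are
// hypothetical hooks into the blockstore and DHT announce path.
func reprovideLoop(ctx context.Context,
	listCIDs func(context.Context) <-chan string,
	provide func(context.Context, string) error,
	k int, interval time.Duration) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			for _, c := range sampleCIDs(ctx, listCIDs(ctx), k) {
				if err := provide(ctx, c); err != nil {
					continue // a real node would log and move on
				}
			}
		}
	}
}
```

With something like this, the batch size `k` and the interval become the knobs: over enough cycles every CID still gets re-announced with high probability, but no single cycle has to walk (or hold in memory) the whole dataset.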