Closed prodalex closed 6 months ago
@alanshaw Running scripts to gather upload data + dag cargo metrics -- they will run for a few days.
Summary:
2021 140.08TB (23,876,266)
mar: 48.75GB (4,306)
apr: 123.62GB (7,920)
may: 949.65GB (596,463)
jun: 1.36TB (579,641)
jul: 2.38TB (703,663)
aug: 3.52TB (1,481,545)
sep: 42.84TB (2,287,320)
oct: 8.59TB (3,046,295)
nov: 12.99TB (9,022,112)
dec: 67.3TB (6,147,001)
2022 524.1TB (138,265,968)
jan: 45.02TB (14,600,026)
feb: 19.58TB (13,929,981)
mar: 15.07TB (12,852,344)
apr: 18.66TB (12,474,496)
may: 15.24TB (7,253,530)
jun: 44.57TB (6,019,972)
jul: 36.29TB (13,596,979)
aug: 45.97TB (11,821,222)
sep: 44.71TB (11,466,150)
oct: 47.07TB (11,192,823)
nov: 74TB (9,973,740)
dec: 117.92TB (13,084,705)
2023 281.15TB (61,704,599)
jan: 33.8TB (12,401,359)
feb: 30.01TB (8,175,807)
mar: 62.83TB (9,476,979)
apr: 33.55TB (5,666,328)
may: 12.07TB (4,215,548)
jun: 16.16TB (3,761,764)
jul: 35.7TB (4,645,763)
aug: 9.59TB (2,375,961)
sep: 9.75TB (2,507,484)
oct: 7.13TB (3,333,733)
nov: 11.22TB (2,851,351)
dec: 19.35TB (2,292,522)
2024 23.06TB (8,581,017)
jan: 6.32TB (2,320,213)
feb: 7.52TB (2,205,368)
mar: 8.04TB (2,694,061)
apr: 1.19TB (1,361,375)
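For reference, the per-month totals above can be produced by an aggregation like the following. This is a minimal sketch, not the actual `count-*.mjs` scripts; the record shape (`created` ISO timestamp, `size` in bytes) is an assumption.

```javascript
// Hypothetical sketch of the per-month aggregation (not the real
// count-*.mjs code). Assumes upload records shaped like
// { created: "<ISO timestamp>", size: <bytes> }.
function aggregateByMonth(uploads) {
  const buckets = {};
  for (const { created, size } of uploads) {
    const key = created.slice(0, 7); // "YYYY-MM"
    if (!buckets[key]) buckets[key] = { bytes: 0, count: 0 };
    buckets[key].bytes += size; // total stored bytes for the month
    buckets[key].count += 1;    // number of uploads for the month
  }
  return buckets;
}

const sample = [
  { created: '2024-01-03T10:00:00Z', size: 1024 },
  { created: '2024-01-05T12:00:00Z', size: 2048 },
  { created: '2024-02-01T09:00:00Z', size: 512 },
];
const byMonth = aggregateByMonth(sample);
// byMonth['2024-01'] → { bytes: 3072, count: 2 }
```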
Using https://www.csvplot.com/ for the graphs; it is excellent!
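csvplot takes plain CSV input, so the `month: size (count)` lines above need a small conversion first. A hypothetical helper (the format match and the 1 TB = 1024 GB normalization are my assumptions, not taken from the actual scripts):

```javascript
// Hypothetical helper: turn "mon: 1.23TB (456)" summary lines into CSV
// rows that csvplot.com can chart. Sizes are normalized to TB
// (assuming 1 TB = 1024 GB) so one column plots directly.
function summaryToCsv(year, lines) {
  const rows = ['month,size_tb,count'];
  for (const line of lines) {
    const m = line.match(/^(\w+): ([\d.]+)(TB|GB) \(([\d,]+)\)$/);
    if (!m) continue; // skip lines that don't match the summary format
    const [, month, num, unit, count] = m;
    const tb = unit === 'GB' ? Number(num) / 1024 : Number(num);
    rows.push(`${year}-${month},${tb},${count.replace(/,/g, '')}`);
  }
  return rows.join('\n');
}

const csv = summaryToCsv(2024, [
  'jan: 6.32TB (2,320,213)',
  'apr: 1.19TB (1,361,375)',
]);
// csv → "month,size_tb,count\n2024-jan,6.32,2320213\n2024-apr,1.19,1361375"
```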
Summary:
2022 369.23TB
apr: 163.82TB
may: 12.94TB
jun: 15.06TB
jul: 31.5TB
aug: 18.16TB
sep: 21.6TB
oct: 23.3TB
nov: 24.68TB
dec: 58.17TB
2023 86.12TB
jan: 380.66GB
apr: 16.31TB
may: 22.71TB
jun: 6.22TB
jul: 12.92TB
aug: 4.2TB
sep: 4.9TB
oct: 4.04TB
nov: 4.88TB
dec: 9.57TB
2024 11.14TB
jan: 3.32TB
feb: 3.88TB
mar: 3.34TB
apr: 604.25GB
Scripts to generate this data can be found in https://github.com/w3s-project/nftstorage-tools, specifically count-*.mjs.
Explanation on discrepancy between our stored data vs filecoin deal data: https://www.notion.so/w3sat/NFT-Storage-stored-data-vs-dagcargo-stored-data-on-filecoin-476fcf03c8a64e9391badf9d06ed6443
Rationale: Molly needs more justification for how much data has been stored on nft.storage (and the respective filecoin deals), as she remembered lower numbers than the 600+ TB we communicated.
Data required (without spam):
Out of scope for now: