fidlabs / Open-Data-Pathway


[DataCap Application] <<Re-forecast>> #19

Laycuicui opened this issue 1 month ago

Laycuicui commented 1 month ago

Data Owner Name

NOAA

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

http://www.noaa.gov/

Social Media Handle

https://twitter.com/NOAA

Social Media Type

Other

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

390TiB

Number of replicas to store

8

Weekly allocation of DataCap requested

1000TiB

On-chain address for first allocation

f1wmz2bbj3xu7e23daodegr45bxrnz2373wftcifq

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

NOAA is an agency that enriches life through science. Our reach goes from the surface of the sun to the depths of the ocean floor as we work to keep the public informed of the changing environment around them. I'm a data preparer (DP) with some experience in distributing data.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

NOAA has generated a multi-decadal reanalysis and reforecast data set to accompany the next-generation version of its ensemble prediction system, the Global Ensemble Forecast System, version 12 (GEFSv12).

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

None

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

No response

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

https://registry.opendata.aws/noaa-gefs-reforecast/

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Sporadic

For how long do you plan to keep this dataset stored on Filecoin

2 to 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, South America, Europe, Australia (continent)

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Shipping hard drives

How did you find your storage providers

Slack, Partners

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f03074583  
f03074586   
f03074587   
f03074589   
f03074592

How do you plan to make deals to your storage providers

Boost client, Lotus client, Singularity

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 1 month ago

Application is waiting for allocator review

kevzak commented 1 month ago

Hi @Laycuicui - thank you for applying here.

Two initial requests:

1. Please complete a KYC check of your GitHub ID using https://filplus.storage/kyc. There you can log in and complete a quick third-party check; no personal information is shared on GitHub.
2. Please share more information about the SP miner IDs you are working with. We need to know the entity and location for each:

f03074583
f03074586
f03074587
f03074589
f03074592

You say you want to store 8 copies, but you have 5 miner IDs. Our minimum requirement is 2 copies stored across 2 entities; if you want to store 8 copies, we need more proof of a plan for that.

Laycuicui commented 1 month ago
image
  1. Verified.

  2. Because the first-round amount is only 500TiB, the plan is to work with these 5 SPs first:

     f03074583 Fly, Japan
     f03074586 Datastone, HK
     f03074587 Hanku, Korea
     f03074589 Xing, HK
     f03074592 HKblockchain, USA

We will continue to look for new SPs. The SPs we are currently communicating with are f03028318, f02363305, f02182802.

kevzak commented 1 month ago

@Laycuicui

First round is 500TiB? I'm confused. You are applying for 5PiB with a dataset size of 390TiB. Please clarify, and please show proof of the dataset size.

Laycuicui commented 1 month ago

image

image

image

Based on the storage buckets, it can be assumed that there are at least 400TiB of real data. I plan to store 8 copies, each holding 390TiB of raw data. Since there is a conversion rate between the raw data size and the DataCap consumed, each copy would require around 650TiB of DataCap, for a total of about 5PiB. Also, the first round is 500TiB.
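The arithmetic above can be sketched as follows. Note the ~1.67x raw-to-DataCap overhead factor is an assumption for illustration only (actual DataCap consumption depends on how the raw data packs into Filecoin's padded, power-of-two-sized sectors):

```python
# Sketch of the DataCap arithmetic described above. The overhead factor
# is an illustrative assumption, not a measured value.
RAW_PER_COPY_TIB = 390        # raw dataset size per copy (from the application)
COPIES = 8                    # replicas planned
OVERHEAD = 650 / 390          # assumed raw -> DataCap expansion (~1.67x)

datacap_per_copy_tib = RAW_PER_COPY_TIB * OVERHEAD
total_tib = datacap_per_copy_tib * COPIES
total_pib = total_tib / 1024  # 1 PiB = 1024 TiB

print(f"per copy: {datacap_per_copy_tib:.0f} TiB")            # 650 TiB
print(f"total: {total_tib:.0f} TiB (~{total_pib:.2f} PiB)")   # 5200 TiB (~5.08 PiB)
```

So 8 copies at ~650TiB of DataCap each comes to roughly 5.08PiB, consistent with the 5PiB requested.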

kevzak commented 1 month ago

Hello @Laycuicui, regarding the miner IDs you shared:

f03074583 Fly, Japan
f03074586 Datastone, HK
f03074587 Hanku, Korea
f03074589 Xing, HK
f03074592 HKblockchain, USA

We'll need a better representation of each entity and location. This list shows five IDs that are likely in the same datacenter. We need more detail on who is storing which copies.

Also, all five have 0% retrieval or are unavailable on the Spark Dashboard. The SPs you choose need to make retrievals available, or you will not be able to receive DataCap from this allocator. Please advise.

image

Laycuicui commented 1 week ago

@kevzak Sorry for the late reply.

  1. Each entity will store one backup copy. Of course, this is just the first round; as more quota is issued, more SPs will be brought in for storage.
  2. Regarding Spark retrieval: the SPs report that the Spark team is working to cover all DataCap paths, including V4 and V5, but currently only covers LDN in V4 mode, so the retrieval rate for new storage under V5 shows as 0. Rest assured, the SPs are fully compliant; you can follow the subsequent reports.

kevzak commented 1 week ago

@Laycuicui please complete this form with more detailed client and SP information so I can review: https://form.jotform.com/240786057753667