filecoin-project / filecoin-plus-large-datasets

Hub for client applications for DataCap at a large scale

[DataCap Application] <DATADAO> - <NOAA-HRRR> #2327

Closed yse88483 closed 8 months ago

yse88483 commented 8 months ago

Data Owner Name

NOAA-HRRR

What is your role related to the dataset

Data Preparer

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

https://registry.opendata.aws/

Social Media

https://registry.opendata.aws/

Total amount of DataCap being requested

12PiB

Expected size of single dataset (one copy)

2PiB

Number of replicas to store

6

Weekly allocation of DataCap requested

1PiB

On-chain address for first allocation

f1snhspx7pphhlr5i4ent6d7k66nz2sfkosmutbwy

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

The Registry of Open Data on AWS is now available on AWS Data Exchange
All datasets on the Registry of Open Data are now discoverable on AWS Data Exchange alongside 3,000+ existing data products from category-leading data providers across industries. Explore the catalog to find open, free, and commercial data sets. Learn more about AWS Data Exchange

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

There are no other stakeholders

Describe the data being stored onto Filecoin

The Registry of Open Data on AWS is now available on AWS Data Exchange
All datasets on the Registry of Open Data are now discoverable on AWS Data Exchange alongside 3,000+ existing data products from category-leading data providers across industries. Explore the catalog to find open, free, and commercial data sets. Learn more about AWS Data Exchange

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer. What is your location (Country/Region)

United States

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

No response

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2020-08-25 17:22:52  394.6 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf15.grib2
2020-08-25 17:22:52  390.2 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf16.grib2
2020-08-25 17:23:08  387.1 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf17.grib2
2020-08-25 17:23:05  384.8 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf18.grib2
2021-09-28 03:48:22   31.6 KiB index.html

Total Objects: 43282174
   Total Size: 2.1 PiB
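
For context, the listing above looks like recursive S3 listing output from the public bucket behind the registry.opendata.aws entry. As an illustration only, here is a minimal boto3 sketch that reproduces this kind of inventory with anonymous access; the bucket name and key prefix below are placeholders (assumptions), not values taken from this application.

```python
# Minimal sketch (illustrative only): list a public, anonymously readable S3
# bucket and tally object count and total size, similar to the sample above.
# BUCKET and PREFIX are placeholders; use the values published on
# https://registry.opendata.aws/ for the NOAA HRRR dataset.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

BUCKET = "example-hrrr-bucket"        # placeholder bucket name
PREFIX = "hrrr_v2.20160823/conus/"    # prefix matching the sample paths above

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
paginator = s3.get_paginator("list_objects_v2")

total_objects = 0
total_bytes = 0
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        total_objects += 1
        total_bytes += obj["Size"]
        print(f'{obj["LastModified"]}  {obj["Size"] / 2**20:8.1f} MiB  {obj["Key"]}')

print(f"Total Objects: {total_objects}")
print(f"   Total Size: {total_bytes / 2**40:.1f} TiB")
```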

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, Europe

How will you be distributing your data to storage providers

HTTP or FTP server, IPFS, Shipping hard drives, Lotus built-in data transfer

How do you plan to choose storage providers

Slack, Filmine

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

No response

How do you plan to make deals to your storage providers

Boost client, Lotus client, Bidbot

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

large-datacap-requests[bot] commented 8 months ago

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

yse88483 commented 8 months ago

f02850067 | ninadaidata@gmail.com | datastone | Hong Kong
f01992563 | ksjhsux888@gmail.com | SAcloud | China
f01996719 | ksjhsux888@gmail.com | SAcloud | China
f01996817 | ksjhsux888@gmail.com | SAcloud | China
f02812304 | ndlabsipollo@gmail.com | SY-Tech | Singapore
f02320270 | ukvsusv@gmail.com | R1 | US

yse88483 commented 8 months ago
[Screenshot attached: WX20240229-175336@2x]
yse88483 commented 8 months ago

We submitted the form and hope to see more progress, thank you.

Sunnyiscoming commented 8 months ago
  1. There is a contradiction between the two values. Please correct them.

Total Size: 2.1 PiB
Expected size of single dataset (one copy): 3 PiB

  2. Please answer the following questions.

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details? No response

If you are not preparing the data, who will prepare the data? (Provide name and business) No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution. No response

Sunnyiscoming commented 8 months ago

SP List provided:

[
  {"providerID": "f02850067", "City": "HongKong", "Country": "china", "SPOrg": "datastone"},
  {"providerID": "f01992563", "City": "XYZ", "Country": "china", "SPOrg": "SAcloud"},
  {"providerID": "f01996719", "City": "XYZ", "Country": "china", "SPOrg": "SAcloud"},
  {"providerID": "f01996817", "City": "XYZ", "Country": "china", "SPOrg": "SAcloud"},
  {"providerID": "f02812304", "City": "XYZ", "Country": "Singapore", "SPOrg": "SY-Tech"},
  {"providerID": "f02320270", "City": "XYZ", "Country": "US", "SPOrg": "R1"}
]

yse88483 commented 8 months ago

Hi, we have modified the form:

Total amount of DataCap being requested 12PiB

Expected size of single dataset (one copy) 2PiB

Number of replicas to store 6

yse88483 commented 8 months ago

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details? Yes, I am a data preparer. DATADAO has a technical team that specializes in providing CAR data packaging services to our clients, which includes:

  1. Technical expertise: our team is skilled in using advanced tools such as go-graphsplit and singularity to ensure efficient and accurate data packaging.
  2. Data processing flow: we use self-developed programs to cut, package, and compress the data provided by our clients into 28-32 GB zip files to meet the storage requirements of the Filecoin network.

Through these services, we are able to effectively help our clients process and prepare data for storage and use on the Filecoin network.
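
To illustrate the flow described above, here is a hypothetical Python sketch of the batching step (it is not DATADAO's self-developed program): it walks a source directory and groups files into batches of roughly the size stated above, each of which could then be handed to go-graphsplit or singularity for packaging.

```python
# Hypothetical sketch of the batching step described above, not DATADAO's
# actual tooling: group source files into ~32 GiB batches so that each batch
# can be packaged (e.g. with go-graphsplit or singularity) as a single unit.
import os

TARGET_MAX = 32 * 2**30  # upper bound per batch; most batches land in the 28-32 GiB range

def batch_files(root):
    """Walk `root` and yield lists of file paths whose combined size stays
    under TARGET_MAX (the final batch may be smaller)."""
    batch, batch_size = [], 0
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if batch and batch_size + size > TARGET_MAX:
                yield batch
                batch, batch_size = [], 0
            batch.append(path)
            batch_size += size
    if batch:
        yield batch  # final, possibly undersized batch

if __name__ == "__main__":
    # "/data/hrrr" is a placeholder path for the downloaded dataset.
    for i, files in enumerate(batch_files("/data/hrrr")):
        print(f"batch {i:05d}: {len(files)} files")
```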

If you are not preparing the data, who will prepare the data? (Provide name and business) We are data preparer.

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution. The dataset has been stored on Filecoin before, but there are three reasons why we are applying to store it again:

  1. This is a relatively precious dataset, and we will only store 6 replicas.
  2. We started downloading the dataset three months ago and have already downloaded a large amount of data. The SPs we are cooperating with are finally ready to launch, and we now hope to get your approval.
  3. We have observed that many previous applicants stored duplicate copies of parts of this dataset. We are downloading the entire dataset rather than only a portion of it.

Sunnyiscoming commented 8 months ago

Datacap Request Trigger

Total DataCap requested

12 PiB

Expected weekly DataCap usage rate

1 PiB

Client address

f1snhspx7pphhlr5i4ent6d7k66nz2sfkosmutbwy

large-datacap-requests[bot] commented 8 months ago

DataCap Allocation requested

Multisig Notary address

f02049625

Client address

f1snhspx7pphhlr5i4ent6d7k66nz2sfkosmutbwy

DataCap allocation requested

512TiB

Id

313730bc-e3f9-4ccb-ad35-e1cdaddfd833

yse88483 commented 8 months ago

Considering that v3.1 has no remaining quota, this application is temporarily closed.