PluskitOfficial / bookkeeping


[DataCap Application] <NOAA-HRRR> #5

Open dataapplier opened 1 week ago

dataapplier commented 1 week ago

Data Owner Name

NOAA-HRRR

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

https://github.com/awslabs/open-data-docs/tree/main/docs/noaa/noaa-hrrr

Social Media Handle

/

Social Media Type

Other

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

10PiB

Expected size of single dataset (one copy)

2PiB

Number of replicas to store

5

Weekly allocation of DataCap requested

780TiB
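The figures requested above are internally consistent; a quick sketch (variable names are my own, purely illustrative) checks the arithmetic:

```python
# Sanity check of the requested figures (names are illustrative).
PIB = 2**50  # bytes in one PiB
TIB = 2**40  # bytes in one TiB

single_copy = 2 * PIB        # expected size of one copy
replicas = 5                 # number of replicas
total_requested = 10 * PIB   # total DataCap requested
weekly = 780 * TIB           # weekly allocation requested

# Replica count times copy size should equal the total request.
assert single_copy * replicas == total_requested

# At 780 TiB/week, consuming 10 PiB takes roughly 13 weeks.
weeks = total_requested / weekly
print(round(weeks, 1))  # ≈ 13.1
```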

On-chain address for first allocation

f1xr7g7h7mvi6lay74qkarkzmb4lnasr6caaruvhq

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

The HRRR is a NOAA real-time, 3 km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized on a 3 km grid with 3 km radar assimilation. Radar data is assimilated into the HRRR every 15 minutes over a 1-hour period, adding further detail to that provided by the hourly data assimilation from the 13 km radar-enhanced Rapid Refresh.
HRRR implementations at NCEP
HRRRv1 - 30 Sept 2014
HRRRv2 - 23 Aug 2016
HRRRv3 - 12 July 2018
HRRRv4 - 2 Dec 2020

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

The High-Resolution Rapid Refresh (HRRR) is produced by the Global Systems Laboratory of the National Oceanic and Atmospheric Administration (NOAA). It is a real-time, 3 km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized on a 3 km grid with 3 km radar assimilation. Radar data is assimilated into the HRRR every 15 minutes over a 1-hour period, adding further detail to that provided by the hourly data assimilation from the 13 km radar-enhanced Rapid Refresh.
The HRRR dataset stored on AWS is the archive dating back to 2014, with a total size of 2 PiB across 38,526,457 objects.
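Individual objects in the AWS archive are addressed by date, model cycle, and forecast hour. A small sketch of the key layout (bucket name and naming convention as documented for the NOAA Open Data archive; treat both as assumptions to verify against the linked docs):

```python
# Build the S3 key for one HRRR GRIB2 file, following the documented
# layout of the NOAA Open Data bucket (assumed here, not guaranteed).
from datetime import date

BUCKET = "noaa-hrrr-bdp-pds"  # assumed bucket name from the AWS docs

def hrrr_key(day: date, cycle_hour: int, forecast_hour: int,
             domain: str = "conus", product: str = "wrfsfc") -> str:
    """Return the object key for one HRRR output file."""
    return (f"hrrr.{day:%Y%m%d}/{domain}/"
            f"hrrr.t{cycle_hour:02d}z.{product}f{forecast_hour:02d}.grib2")

key = hrrr_key(date(2021, 1, 1), cycle_hour=0, forecast_hour=1)
print(f"s3://{BUCKET}/{key}")
# s3://noaa-hrrr-bdp-pds/hrrr.20210101/conus/hrrr.t00z.wrfsfcf01.grib2
```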

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

United States

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

Mainly official tools such as Boost and Lotus.
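For context on the preparation step: data is typically packed into CAR files sized to fit Filecoin's common 32 GiB sectors, and the Fr32 encoding expands the payload by 128/127 before it is padded up to a power-of-two piece size. A rough sketch of that sizing math (helper names are my own):

```python
import math

SECTOR = 32 * 2**30  # 32 GiB piece size commonly targeted by SPs

def padded_piece_size(payload_bytes: int) -> int:
    """Smallest power-of-two piece that holds the payload after
    Fr32 expansion (every 127 payload bytes occupy 128 in the piece)."""
    fr32 = math.ceil(payload_bytes * 128 / 127)
    return 1 << (fr32 - 1).bit_length()

# Largest CAR payload that still fits a 32 GiB piece:
max_payload = SECTOR * 127 // 128
assert padded_piece_size(max_payload) == SECTOR
print(max_payload)  # 34091302912 bytes ≈ 31.75 GiB
```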

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

It was stored before, but I want to store additional copies because this is a very important dataset.

Please share a sample of the data

https://rapidrefresh.noaa.gov/hrrr/

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

Permanently

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3)

How did you find your storage providers

Slack

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f02127525 HK
f02234029 HK
f02125293 HK
f02094399 CN GUANGDONG
f02061213 CN HEILONGJIANG
f02128256 CANADA
f02041447 CN HUBEI
I will continue to cooperate with storage providers I have worked with before, and I will ensure that no single node receives more than 20% of the dataset.
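The 20% cap stated above can be checked mechanically as allocations are made. A sketch (provider IDs copied from the list; the per-provider byte counts are hypothetical placeholders, not real figures):

```python
# Hypothetical allocation check: no provider should hold more than
# 20% of the total DataCap, per the cap stated in the application.
CAP = 0.20

# Illustrative per-provider allocations in TiB (placeholder numbers).
allocations = {
    "f02127525": 2000, "f02234029": 2000, "f02125293": 1500,
    "f02094399": 1500, "f02061213": 1500, "f02128256": 1000,
    "f02041447": 740,
}

total = sum(allocations.values())
over = {sp: tib / total for sp, tib in allocations.items()
        if tib / total > CAP}
assert not over, f"providers over the {CAP:.0%} cap: {over}"
print("all providers within the 20% cap")
```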

How do you plan to make deals to your storage providers

Boost client, Lotus client, Singularity

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 1 week ago

Application is waiting for allocator review

dataapplier commented 1 week ago

https://github.com/PluskitOfficial/bookkeeping/issues/4

datacap-bot[bot] commented 1 week ago

Datacap Request Trigger

Total DataCap requested

10PiB

Expected weekly DataCap usage rate

780TiB

DataCap Amount - First Tranche

390TiB

Client address

f1xr7g7h7mvi6lay74qkarkzmb4lnasr6caaruvhq

datacap-bot[bot] commented 1 week ago

DataCap Allocation requested

Multisig Notary address

Client address

f1xr7g7h7mvi6lay74qkarkzmb4lnasr6caaruvhq

DataCap allocation requested

390TiB

Id

88a202c7-ad9a-4832-8dc0-05116094b589

datacap-bot[bot] commented 1 week ago

Application is ready to sign

datacap-bot[bot] commented 1 week ago

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedxwi3rtygxsciuj3x7jzscffpf2hnhhvnqvq4nwwx3jlmmipeazm

Address

f1xr7g7h7mvi6lay74qkarkzmb4lnasr6caaruvhq

Datacap Allocated

390TiB

Signer Address

f1tgnlhtcmhwipfm7thsftxhn5k52velyjlazpvka

Id

88a202c7-ad9a-4832-8dc0-05116094b589

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedxwi3rtygxsciuj3x7jzscffpf2hnhhvnqvq4nwwx3jlmmipeazm

datacap-bot[bot] commented 1 week ago

Application is Granted