Carohere / Caro-Allocator


[DataCap Application] National Blend of Models #1

Open bernie414qq opened 4 months ago

bernie414qq commented 4 months ago

Version

1

DataCap Applicant

Bernie

Project ID

1

Data Owner Name

National Oceanic and Atmospheric Administration

Data Owner Country/Region

Afghanistan

Data Owner Industry

Life Science / Healthcare

Website

https://www.noaa.gov/

Social Media Handle

https://twitter.com/NOAA

Social Media Type

Twitter

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

10PiB

Expected size of single dataset (one copy)

1084TiB

Number of replicas to store

9

Weekly allocation of DataCap requested

512TiB
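
For context, the totals above are roughly consistent with one another. The figures in the sketch below come straight from this application; the script itself is only an illustrative back-of-the-envelope check, not part of the applicant's submission.

```python
# Back-of-the-envelope check of the figures stated in this application (illustrative only).
copy_size_tib = 1084   # expected size of a single copy, in TiB
replicas = 9           # number of replicas to store
weekly_tib = 512       # requested weekly allocation, in TiB

total_tib = copy_size_tib * replicas      # 9756 TiB
total_pib = total_tib / 1024              # ~9.53 PiB, consistent with the 10 PiB requested
weeks_to_fill = total_tib / weekly_tib    # ~19 weeks at the requested weekly rate

print(f"{total_pib:.2f} PiB total, roughly {weeks_to_fill:.0f} weeks at {weekly_tib} TiB/week")
```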

On-chain address for first allocation

f1o7zezcse2zkyrrvp6oeqrztchxwu3d62qeej45i

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

This dataset is continuously updated, so we plan to store the data on Filecoin incrementally.
The National Blend of Models (NBM) is a nationally consistent and skillful suite of calibrated forecast guidance based on a blend of both NWS and non-NWS numerical weather prediction model data and post-processed model guidance. The goal of the NBM is to create a highly accurate, skillful and consistent starting point for the gridded forecast.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

National Blend of Models (NBM) COG Format
New data notifications for NBM-COG Format, only Lambda and SQS protocols allowed
National Blend of Models (NBM) Grib2 Format
New data notifications for NBM-Grib2 Format, only Lambda and SQS protocols allowed

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)?

Korea, Republic of

If you are a data preparer, how will the data be prepared? Please include the tooling used and technical details.

We will split and pack the data into ZIP archives using our own program, then convert these files into CAR files with Lotus and transfer them to the storage providers both online and offline.
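
As a rough illustration of the preparation pipeline described above (split the raw data into fixed-size ZIP archives, then convert each archive into a CAR file), a minimal Python sketch follows. The directory names, chunk size, and the use of the legacy `lotus client generate-car` command are assumptions made for illustration, not a description of the applicant's actual tooling.

```python
import subprocess
import zipfile
from pathlib import Path

SRC_DIR = Path("nbm_raw")      # hypothetical directory of downloaded NBM files
OUT_DIR = Path("nbm_packed")   # hypothetical output directory for ZIP/CAR files
CHUNK_BYTES = 16 * 1024**3     # pack roughly 16 GiB of source data per archive (illustrative)

def write_chunk(files, index):
    zip_path = OUT_DIR / f"nbm_{index:05d}.zip"
    car_path = zip_path.with_suffix(".car")
    # Pack the batch into a ZIP archive (stored, since GRIB2/COG data is already compressed).
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_STORED) as zf:
        for f in files:
            zf.write(f, arcname=str(f.relative_to(SRC_DIR)))
    # Convert the archive into a CAR file with Lotus (legacy client command; assumed here).
    subprocess.run(
        ["lotus", "client", "generate-car", str(zip_path), str(car_path)],
        check=True,
    )

def pack_and_convert():
    OUT_DIR.mkdir(exist_ok=True)
    batch, batch_size, index = [], 0, 0
    for path in sorted(SRC_DIR.rglob("*")):
        if not path.is_file():
            continue
        batch.append(path)
        batch_size += path.stat().st_size
        if batch_size >= CHUNK_BYTES:
            write_chunk(batch, index)
            batch, batch_size, index = [], 0, index + 1
    if batch:
        write_chunk(batch, index)

if __name__ == "__main__":
    pack_and_convert()
```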

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

s3://noaa-nbm-pds/
s3://noaa-nbm-grib2-pds/
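
The sample buckets above are public AWS Open Data buckets, so they can be browsed anonymously. A minimal sketch using boto3 with unsigned requests is shown below; the bucket name comes from this application, while everything else is illustrative.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous S3 client: the NOAA NBM buckets are public, so no credentials are needed.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List a few objects from the GRIB2 bucket named in the application.
resp = s3.list_objects_v2(Bucket="noaa-nbm-grib2-pds", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```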

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

More than 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, South America, Europe, Australia (continent)

How will you be distributing your data to storage providers

HTTP or FTP server, Shipping hard drives, Lotus built-in data transfer

How did you find your storage providers

Slack, Big Data Exchange

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f02130185|Guangdong
f01843749|Hong Kong
f02029743|Singapore
f0420161|Canada
f02363999|Singapore

How do you plan to make deals to your storage providers

Lotus client
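
Since the Lotus client is named as the deal-making path, a hedged sketch of the legacy flow (import a CAR file, then propose a deal to each SP listed above) might look like the following. The miner IDs come from this application; the price, duration, output parsing, and command usage are illustrative assumptions about a legacy Lotus markets setup, not the applicant's confirmed workflow.

```python
import subprocess

# SP IDs listed in this application; deal parameters are illustrative placeholders.
PROVIDERS = ["f02130185", "f01843749", "f02029743", "f0420161", "f02363999"]
PRICE = "0"           # price per GiB per epoch (placeholder)
DURATION = "1468800"  # deal duration in epochs (~510 days; placeholder)

def propose_deals(car_path: str):
    # Import the CAR file into the local Lotus node; the last whitespace-separated
    # token of the output is taken as the data root CID (assumed legacy client output).
    out = subprocess.run(
        ["lotus", "client", "import", "--car", car_path],
        check=True, capture_output=True, text=True,
    ).stdout
    data_cid = out.strip().split()[-1]
    # Propose one deal per storage provider (legacy `lotus client deal` positional usage);
    # in practice the deals would be marked as verified so they consume DataCap.
    for sp in PROVIDERS:
        subprocess.run(
            ["lotus", "client", "deal", data_cid, sp, PRICE, DURATION],
            check=True,
        )

if __name__ == "__main__":
    propose_deals("nbm_00000.car")  # hypothetical CAR file from the preparation step
```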

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 4 months ago

Application is waiting for allocator review

Carohere commented 4 months ago

Already communicated with applicant on slack

datacap-bot[bot] commented 4 months ago

Datacap Request Trigger

Total DataCap requested

10PiB

Expected weekly DataCap usage rate

512TiB

DataCap Amount - First Tranche

1PiB

Client address

f1o7zezcse2zkyrrvp6oeqrztchxwu3d62qeej45i

datacap-bot[bot] commented 4 months ago

DataCap Allocation requested

Multisig Notary address

Client address

f1o7zezcse2zkyrrvp6oeqrztchxwu3d62qeej45i

DataCap allocation requested

1PiB

Id

1b6b661a-9ad8-4edd-91e0-33d4e8c0c409

datacap-bot[bot] commented 4 months ago

Application is ready to sign

datacap-bot[bot] commented 4 months ago

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzaceca3wa4wvgqx3hqa3qw6c4ynp2ptkwibsupda6d7qjqqx7k3kpz22

Address

f1o7zezcse2zkyrrvp6oeqrztchxwu3d62qeej45i

Datacap Allocated

1PiB

Signer Address

f1w4uyrtnvie2bf2ya4i6xne7fn6ifmkm5tdamali

Id

1b6b661a-9ad8-4edd-91e0-33d4e8c0c409

You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceca3wa4wvgqx3hqa3qw6c4ynp2ptkwibsupda6d7qjqqx7k3kpz22

datacap-bot[bot] commented 4 months ago

Application is Granted

filecoin-watchdog commented 4 months ago

checker:manualTrigger

datacap-bot[bot] commented 4 months ago

DataCap and CID Checker Report[^1]

No active deals found for this client.

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

filecoin-watchdog commented 4 months ago

Giving 1PiB to a brand-new GitHub handle as a first allocation, and 5 SP IDs with no information or retrievability. Seems like this allocator @Carohere is off to a good start here...

Carohere commented 3 months ago

I thought the first allocation needed to match the size of a single copy; I'm still familiarising myself with the rules. Thank you for your input. @filecoin-watchdog

Carohere commented 3 months ago

I will test again with a smaller amount.

filecoin-watchdog commented 1 month ago

checker:manualTrigger

datacap-bot[bot] commented 1 month ago

DataCap and CID Checker Report Summary[^1]

Storage Provider Distribution

⚠️ 1 storage provider sealed more than 90% of total datacap - f03010701: 100.00%

⚠️ All storage providers are located in the same region.

⚠️ The average retrieval success rate is 0.44%

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients[^3]

✔️ No CID sharing has been observed.

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger

[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

Full report

Click here to view the CID Checker report.

datacap-bot[bot] commented 1 month ago

Client used 75% of the allocated DataCap. Consider allocating next tranche.

Carohere commented 3 weeks ago

KYC record (image attachments)