filecoin-project / filecoin-plus-large-datasets

Hub for client applications for DataCap at a large scale

[DataCap Application] <NOAA-HRRR> (1/5) #1868

Closed pan1012 closed 1 year ago

pan1012 commented 1 year ago

Data Owner Name

National Oceanic and Atmospheric Administration

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

https://registry.opendata.aws/noaa-hrrr-pds

Social Media

Website: https://www.noaa.gov/
Twitter: https://twitter.com/NOAA

Total amount of DataCap being requested

5PiB

Weekly allocation of DataCap requested

600TiB

On-chain address for first allocation

f1jgqycg7yo7jkrsajw23cxcu226f65rjc4vmk27y

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

NOAA is an agency that enriches life through science. Its reach extends from the surface of the sun to the depths of the ocean floor as it works to keep the public informed of the changing environment around them.

The HRRR is a NOAA real-time, 3 km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3 km grids with 3 km radar assimilation. Radar data is assimilated in the HRRR every 15 minutes over a 1-hour period, adding further detail to that provided by the hourly data assimilation from the 13 km radar-enhanced Rapid Refresh.

Is this project associated with other projects/ecosystem stakeholders?

Yes

If answered yes, what are the other projects/ecosystem stakeholders

NOAA is an agency that enriches life through science. Its reach extends from the surface of the sun to the depths of the ocean floor as it works to keep the public informed of the changing environment around them.

The HRRR is a NOAA real-time, 3 km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3 km grids with 3 km radar assimilation. Radar data is assimilated in the HRRR every 15 minutes over a 1-hour period, adding further detail to that provided by the hourly data assimilation from the 13 km radar-enhanced Rapid Refresh.

Describe the data being stored onto Filecoin

```
2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2020-08-25 17:22:52  394.6 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf15.grib2
2020-08-25 17:22:52  390.2 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf16.grib2
2020-08-25 17:23:08  387.1 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf17.grib2
2020-08-25 17:23:05  384.8 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf18.grib2
2021-09-28 03:48:22   31.6 KiB index.html

Total Objects: 43282174
   Total Size: 2.1 PiB
```
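
For reference, a listing in this format (per-object timestamps, human-readable sizes, and a Total Objects/Total Size summary) can be produced with the AWS CLI. The bucket name and prefix in the sketch below are assumptions based on the registry page linked above, not values taken from this application, and may differ from the exact source paths shown in the listing.

```sh
# Sketch: list HRRR objects with a size summary, using anonymous access to the
# public bucket. Bucket name and prefix are assumptions (see
# https://registry.opendata.aws/noaa-hrrr-pds); adjust to the actual source paths.
aws s3 ls "s3://noaa-hrrr-bdp-pds/hrrr.20160823/conus/" \
  --recursive --human-readable --summarize --no-sign-request
```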

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

How do you plan to prepare the dataset

lotus
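
Since lotus is named as the prep tool, below is a minimal sketch of how a batch of downloaded GRIB2 files could be packed into a CAR file and its piece commitment computed with the stock lotus client. The directory and file names are hypothetical examples; the application does not specify how files will be batched to reach a target piece size.

```sh
# Sketch (hypothetical paths): pack one day's CONUS files into a CAR, then
# compute the piece CID (CommP) needed for an offline deal proposal.
lotus client generate-car ./hrrr_v2.20160823/conus ./hrrr_v2.20160823_conus.car
lotus client commP ./hrrr_v2.20160823_conus.car
```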

If you answered "other/custom tool" in the previous question, enter the details here

No response

Please share a sample of the data

```
2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2020-08-25 17:22:52  394.6 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf15.grib2
2020-08-25 17:22:52  390.2 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf16.grib2
2020-08-25 17:23:08  387.1 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf17.grib2
2020-08-25 17:23:05  384.8 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf18.grib2
2021-09-28 03:48:22   31.6 KiB index.html

Total Objects: 43282174
   Total Size: 2.1 PiB
```

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, South America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3)

How do you plan to choose storage providers

Slack, Big Data Exchange

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

f02050599   Bang Sai, Phra Nakhon Si Ayutthaya, TH
f02063202   Mueang Nonthaburi, Nonthaburi, TH
f02064089   Singapore, Singapore, SG
f02045964   Sham Shui Po, Sham Shui Po, HK
f02055638   Kuala Lumpur, Kuala Lumpur, MY
f02048990   Hong Kong, Central and Western, HK
f02041085   Tokyo, Tokyo, JP
f02046736   Curug, Banten, ID
f02029895   Seoul, Seoul, KR
f01170282   Hong Kong, Central and Western, HK
f02042992   Singapore, Singapore, SG

How do you plan to make deals to your storage providers

Boost client, Lotus client
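
Given the Boost and Lotus clients named above, the sketch below shows the general shape of an offline, verified (DataCap-funded) deal proposal using the stock lotus client; Boost offers an analogous `boost deal` flow. The payload CID, piece CID, piece size, price, and duration are hypothetical placeholders; only the provider ID is taken from the SP list above.

```sh
# Sketch of an offline verified deal proposal with the lotus client.
# All CIDs, the piece size, the price (0 FIL), and the duration (in epochs) are
# hypothetical placeholders; f01170282 is one provider from the list above.
lotus client deal \
  --verified-deal=true \
  --manual-piece-cid=baga6ea4seaqexampleexampleexampleexamplepiececid \
  --manual-piece-size=34091302912 \
  bafybeiexamplepayloadrootcid f01170282 0 1051200
```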

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

large-datacap-requests[bot] commented 1 year ago

Thanks for your request!

Heads up, you’re requesting more than the typical weekly onboarding rate of DataCap!
large-datacap-requests[bot] commented 1 year ago

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

Sunnyiscoming commented 1 year ago

Can you introduce yourself or your organization? How did you find these SPs?

pan1012 commented 1 year ago

@Sunnyiscoming My boss is very familiar with the Filecoin project and plans to invest in it for the long term, preparing to onboard 25 PiB of DataCap. We have also talked with many SPs, which are still in the selection stage. My boss has already prepared a lot of coins to start the project in April.

Carohere commented 1 year ago

Hi @pan1012, I appreciate the enthusiasm you have for being part of the network. 25 PiB of DataCap requires professional technical support and significant pledge costs. However, I noticed your account is brand new, and I would like to learn more about your organization/business.

pan1012 commented 1 year ago

@Carohere Hello, we are a Chinese company and it is not convenient to disclose any other information. Please understand. The node list comes from the SPs we are currently in contact with, so there may be some differences when it comes to actual sealing.

Sunnyiscoming commented 1 year ago

Please provide more information about your organization to build trust in the community.

pan1012 commented 1 year ago

@Sunnyiscoming @Carohere My boss is just an individual investor. Everyone should know that Chinese companies cannot do this. We will find multiple SPs to cooperate with at the technical level. If you have SPs to introduce, I would be very grateful.

Carohere commented 1 year ago

> Hello, we are a Chinese company and it is not convenient to disclose any other information. Please understand.
>
> My boss is just an individual investor. Everyone should know that Chinese companies cannot do this.

I'm not sure what you mean. To move your application forward, you have to prove who you are and which organization you belong to.

Sunnyiscoming commented 1 year ago

Any update here?

Sunnyiscoming commented 1 year ago

It will be closed if there is no reply in 3 days.