FroghubMan / filplus


[DataCap Application] Hubble Space Telescope Public Data #29

Open zzslkj opened 3 months ago

zzslkj commented 3 months ago

Version

1

DataCap Applicant

Revelation Limited

Project ID

1

Data Owner Name

Space Telescope Science Institute

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

https://astroquery.readthedocs.io/en/latest/mast/mast.html#module-astroquery.mast

Social Media Handle

https://astroquery.readthedocs.io/en/latest/mast/mast.html#module-astroquery.mast

Social Media Type

Other

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

5PiB

Unit for total amount of DataCap being requested

PiB

Expected size of single dataset (one copy)

500TiB

Unit for expected size of single dataset

TiB

Number of replicas to store

10

Weekly allocation of DataCap requested

350TiB

Unit for weekly allocation of DataCap requested

TiB

On-chain address for first allocation

f13xtiyivvhnmlue5xlvfbyp6dxyvejvrh3lxhqhi

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

Revelation Limited is a technology company based in Hong Kong whose main business is blockchain. Our work covers Bitcoin, ETH, Filecoin, Chia, Spacemesh, Livepeer, io.net, Subspace, and others, and we have extensive experience in development and server operations and maintenance. We are also one of Filecoin's storage providers.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

The Hubble Space Telescope (HST) is one of the most productive scientific instruments ever created. This dataset contains calibrated and raw data for all of the currently active instruments on HST: ACS, COS, STIS and WFC3.
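For reference, below is a minimal sketch of pulling a small sample of this HST data through the astroquery.mast module linked in the Website field above. The instrument filter, slice sizes, and download directory are illustrative assumptions, not details from this application.

```python
# Minimal sketch (instrument filter, slice sizes, and download directory are
# illustrative assumptions) of querying public HST observations through
# astroquery.mast and fetching a few calibrated science products.
from astroquery.mast import Observations

# Query public HST observations taken with one of the active instruments.
obs = Observations.query_criteria(obs_collection="HST",
                                  instrument_name="WFC3/IR",
                                  dataproduct_type="image")

# List data products for a few observations, keep science files only,
# and download them to a local sample directory.
products = Observations.get_product_list(obs[:3])
science = Observations.filter_products(products, productType="SCIENCE")
manifest = Observations.download_products(science[:5], download_dir="./hst_sample")
print(manifest)
```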

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer. What is your location (Country/Region)

Hong Kong

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

No response

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

It has been stored on the network before, but most of those deals have begun to expire. We would like to keep this data stored on the network.

Please share a sample of the data

https://registry.opendata.aws/hst/
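As a quick way to browse that sample, here is a small sketch that lists a few objects from the AWS Open Data bucket behind the registry page. The bucket name and prefix are assumptions read off that page; if the bucket is requester-pays, the anonymous client below would need to be replaced with a credentialed one that passes RequestPayer.

```python
# Sketch: list a few objects from the HST bucket behind
# https://registry.opendata.aws/hst/. Bucket name and prefix are assumptions
# taken from that page; if the bucket is requester-pays, anonymous access
# will be rejected and a credentialed client with RequestPayer="requester"
# is needed instead.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="stpubdata", Prefix="hst/public/", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```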

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China

How will you be distributing your data to storage providers

HTTP or FTP server, Shipping hard drives

How did you find your storage providers

Partners, Others

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f02831252 Guangdong
f02831211 Hong Kong
f02051391 Hong Kong
f02824414 Tokyo, Japan
f02831203 Hong Kong
f01989372 Beijing
Additional SPs will be added later.

How do you plan to make deals to your storage providers

Boost client
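For illustration only, here is a rough sketch of what one deal proposal through the Boost client could look like, wrapped in Python so it can be scripted per CAR piece. Every value is a placeholder, and the subcommand and flag names are assumptions to verify against `boost offline-deal --help` for the installed Boost version.

```python
# Rough sketch of proposing one offline (shipped-drive) deal per CAR piece
# with the Boost client. All values are placeholders, and the subcommand
# and flag names are assumptions to check against the installed Boost.
import subprocess

deal_cmd = [
    "boost", "offline-deal",
    "--provider=f02831252",      # one of the SPs listed in this application
    "--commp=baga...",           # piece CID (CommP) of the CAR file
    "--piece-size=34359738368",  # padded piece size in bytes (example: 32 GiB)
    "--car-size=33000000000",    # raw CAR file size in bytes
    "--payload-cid=bafy...",     # root CID of the packed payload
    "--verified=true",           # deal is backed by DataCap
]
subprocess.run(deal_cmd, check=True)
```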

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 3 months ago

Application is waiting for governance review

FroghubMan commented 3 months ago

Do you have any experience in preparing data? Please describe how you will process the data. (e.g., tools)

zzslkj commented 3 months ago

Do you have any experience in preparing data? Please describe how you will process the data. (e.g., tools)

Sure, I know how to process the data:

  1. Download the data from AWS.
  2. Use Singularity to pack the files into .car files (sketched below).
  3. Ship hard drives containing the .car files to the SPs.
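A rough sketch of steps 2 and 3, assuming the data has already been synced down from AWS; the dataset name, paths, and the Singularity invocation are illustrative, since the exact CLI differs between Singularity versions.

```python
# Sketch of the preparation steps above, assuming the HST files are already
# downloaded locally. Dataset name, paths, and the Singularity subcommand
# are illustrative assumptions; verify them against the installed version.
import subprocess

SOURCE_DIR = "/data/hst_raw"   # files already synced down from AWS (step 1)
CAR_DIR = "/data/hst_car"      # output directory for the generated .car files

def run(cmd: list[str]) -> None:
    """Run a command and fail loudly if it returns a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Step 2: pack the downloaded files into .car pieces with Singularity.
# Hypothetical invocation; subcommand and argument order vary by version.
run(["singularity", "prep", "create", "hst-public", SOURCE_DIR, CAR_DIR])

# Step 3: the .car files under CAR_DIR are copied onto hard drives and
# shipped to the SPs listed earlier in this application.
```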

FroghubMan commented 3 months ago

In the above application, you plan to store 10 copies of this data. But you don't provide enough SPs. Is there any further planning? If you lack SP, I can recommend some SP for you.

FroghubMan commented 3 months ago

Please send the entity certificate to team@froghub.io by official email.

zzslkj commented 3 months ago

In the above application, you plan to store 10 copies of this data. But you don't provide enough SPs. Is there any further planning? If you lack SP, I can recommend some SP for you.

That would be great. Please do introduce some SPs to me.

zzslkj commented 3 months ago

Please send the entity certificate to team@froghub.io by official email.

OK