filecoin-project / filecoin-plus-large-datasets

Hub for client applications for DataCap at a large scale

[DataCap Application] Sentinel-2 Cloud-Optimized GeoTIFFs - stage2 #2320

Closed FroghubMan closed 3 months ago

FroghubMan commented 3 months ago

Data Owner Name

sinergise

What is your role related to the dataset

Data Preparer

Data Owner Country/Region

Afghanistan

Data Owner Industry

Information, Media & Telecommunications

Website

https://www.sinergise.com/

Social Media

https://twitter.com/sinergise
https://github.com/cirrus-geo/cirrus-earth-search

Total amount of DataCap being requested

15PiB

Expected size of single dataset (one copy)

1-3 PiB

Number of replicas to store

10

Weekly allocation of DataCap requested

1PiB
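
As a sanity check on the figures above, a short sketch is given below; it assumes a single-copy size at the midpoint of the stated 1-3 PiB range (the midpoint itself is an assumption, not stated in the application):

```python
import math

# Figures from the application; copy size is an assumed midpoint of 1-3 PiB.
copy_size_pib = 1.5
replicas = 10
weekly_allocation_pib = 1.0
requested_total_pib = 15.0

# Total on-chain storage if every replica stores a full copy.
total_pib = copy_size_pib * replicas

# Weeks needed to consume the requested DataCap at the weekly rate.
weeks = math.ceil(requested_total_pib / weekly_allocation_pib)

print(total_pib, weeks)  # 15.0 PiB total, 15 weeks
```

At the midpoint copy size, 10 replicas exactly account for the 15 PiB requested, which takes roughly 15 weeks at the requested 1 PiB/week allocation.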

On-chain address for first allocation

f1rfq3ne2gz5bobejpsoejbsznfs3dhbrnaaolswi

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

FrogHub has always defined itself as a tool developer and infrastructure builder in the Filecoin ecosystem. In 2019, we began focusing on technical solutions and development based on the IPFS protocol and the Filecoin network, and we have been working hard to become a qualified builder in the Filecoin ecosystem.

Our team is a pure development team: more than 90% of our members are developers, and more than half have over 5 years of development experience in the communications, Internet, blockchain, and other industries. We hope to earn users' recognition by delivering useful tools and platforms.

To contribute to the Filecoin community, we have developed the open-source sector repair tool Filecoin-Sealer-Recover and the free NFT authoring platform NFT-Creator.
In addition, we plan to provide a sector browser for the community in 2023 and to build the liquid staking platform STFIL on the FVM.

See the links below for details.
- GitHub: https://github.com/orgs/froghub-io

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

The Sentinel-2 mission is a land monitoring constellation of two satellites that provide high-resolution optical imagery and continuity for the current SPOT and Landsat missions. The mission provides global coverage of the Earth's land surface every 5 days, making the data of great use in ongoing studies. This dataset is the same as the Sentinel-2 dataset, except the JP2K files were converted into Cloud-Optimized GeoTIFFs (COGs). Additionally, SpatioTemporal Asset Catalog (STAC) metadata is stored in a JSON file alongside the data, and a STAC API called Earth-search is freely available for searching the archive. This dataset contains all of the scenes in the original Sentinel-2 Public Dataset and will grow as that one does. L2A data are available from April 2017 over the wider Europe region, and globally since December 2018.

Update Frequency
New Sentinel data are added regularly, usually within a few hours after they are available on Copernicus OpenHub.

Source: https://registry.opendata.aws/sentinel-2-l2a-cogs/

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer. What is your location (Country/Region)

Singapore

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

After we download the data from the Internet, we pack it into CAR files with Singularity and write them to hard drives, which are then shipped to the SPs.
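
To give a rough sense of the logistics per stage, the sketch below estimates the number of sectors and shipping drives involved; the 32 GiB sector size, the 2 PiB stage size, and the 18 TB drive capacity are all assumptions for illustration, not figures from the application:

```python
import math

PIB = 2**50   # bytes in one PiB
GIB = 2**30   # bytes in one GiB
TB = 10**12   # drive vendors quote decimal terabytes

stage_bytes = 2 * PIB     # assumed upper end of a 1-2 PiB stage
sector_bytes = 32 * GIB   # common Filecoin sector size (assumption)
drive_bytes = 18 * TB     # assumed 18 TB shipping drives

# One CAR file roughly fills one sector, so sector count approximates
# the number of CAR files Singularity would produce for the stage.
sectors = math.ceil(stage_bytes / sector_bytes)
drives = math.ceil(stage_bytes / drive_bytes)

print(sectors, drives)  # 65536 sectors, 126 drives
```

Even a single 2 PiB stage corresponds to tens of thousands of 32 GiB sectors and over a hundred 18 TB drives, which is why the preparation is staged rather than done in one pass.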

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No. I haven't found anyone storing this dataset in its entirety on the Filecoin network.

Please share a sample of the data

s3://sentinel-cogs/ (16.4 PiB)
s3://sentinel-cogs-inventory/ (3.4 TiB)

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Shipping hard drives

How do you plan to choose storage providers

Slack, Partners

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

| MinerID   | Entity      | City      | Country/Region |
| --------- | ----------- | --------- | -------------- |
| f0870354  | Personal    | Beijing   | CN |
| f01989372 | Personal    | Beijing   | CN |
| f02816837 | B.B Tech    | Guangdong | CN |
| f01907578 | ssdminer    | Putian    | CN |
| f0123931  | ssdminer    | Fuzhou    | CN |
| f02806894 | Chimsen     | Seoul     | KR |
| f02372022 | Chimsen     | Tokyo     | JP |
| f02810687 | DR66 Tech   | Hong Kong | CN |
| f02636860 | DR66 Tech   | Hong Kong | CN |
| f01854510 | 3cloud tech | Hong Kong | CN |
| f02803754 | Personal    | Osaka     | JP |

These are some of the SPs we are currently cooperating with; more SPs may join in the future. I have already completed KYC in other applications and can do it again if necessary.

How do you plan to make deals to your storage providers

Boost client, Lotus client

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

large-datacap-requests[bot] commented 3 months ago

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

FroghubMan commented 3 months ago

This is a very large dataset (~16 PiB). We plan to process it in multiple stages; this application covers another 1-2 PiB of the dataset.

Sunnyiscoming commented 3 months ago

It is advised to adjust the amount of DataCap requested to the total amount. It is not recommended to split the dataset; a total request greater than 15 PiB is now supported.

FroghubMan commented 3 months ago

If you think it's better that way, sure.