filplus-bookkeeping / IPFSCN

Bookkeeping repo for Allocator #1086

[DataCap Application] Smithsonian Institution #31

Open lijo76618 opened 5 hours ago

lijo76618 commented 5 hours ago

Version

1

DataCap Applicant

Smithsonian Institution

Project ID

01

Data Owner Name

Smithsonian Institution

Data Owner Country/Region

United States

Data Owner Industry

Web3 / Crypto

Website

https://www.si.edu/visit

Social Media Handle

https://www.si.edu/visit

Social Media Type

Slack

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

8PiB

Expected size of single dataset (one copy)

1PiB

Number of replicas to store

8

Weekly allocation of DataCap requested

512TiB
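
As a sanity check, the figures above are internally consistent: 1 PiB per copy times 8 replicas equals the 8 PiB requested, and at 512 TiB per week the full request would take roughly 16 weeks to allocate. A minimal sketch of the arithmetic, assuming binary (base-2) units:

```python
# Cross-check of the requested figures, assuming binary (base-2) units.
PIB = 2**50  # bytes in one pebibyte
TIB = 2**40  # bytes in one tebibyte

single_copy = 1 * PIB       # expected size of one copy of the dataset
replicas = 8                # number of replicas to store
total_requested = 8 * PIB   # total DataCap requested
weekly_rate = 512 * TIB     # weekly allocation requested

assert single_copy * replicas == total_requested
print("weeks to consume the full request:", total_requested / weekly_rate)  # 16.0
```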

On-chain address for first allocation

f1j7vbm3p2qadwjn3yupi2bk4sye3zucwqwahnu2y

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

The Smithsonian’s mission is the "increase and diffusion of knowledge," and it has been collecting since 1846. Through its efforts to digitize its multidisciplinary collections, the Smithsonian has created millions of digital assets and related metadata describing the collection objects. On February 25th, 2020, the Smithsonian released over 2.8 million CC0 interdisciplinary 2-D and 3-D images and related metadata, as well as research data from researchers across the Smithsonian. These 2.8 million "open access" collections are a subset of the Smithsonian’s 155 million objects, 2.1 million library volumes, and 156,000 cubic feet of archival collections held in 19 museums, 9 research centers, libraries, archives, and the National Zoo. Digitization of collections is ongoing.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

The Smithsonian’s mission is the "increase and diffusion of knowledge," and it has been collecting since 1846. Through its efforts to digitize its multidisciplinary collections, the Smithsonian has created millions of digital assets and related metadata describing the collection objects. On February 25th, 2020, the Smithsonian released over 2.8 million CC0 interdisciplinary 2-D and 3-D images and related metadata, as well as research data from researchers across the Smithsonian. These 2.8 million "open access" collections are a subset of the Smithsonian’s 155 million objects, 2.1 million library volumes, and 156,000 cubic feet of archival collections held in 19 museums, 9 research centers, libraries, archives, and the National Zoo. Digitization of collections is ongoing.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

None

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

After we download the data from the Internet, we prepare it with Singularity and split it across hard drives, which are then mailed to the SPs.
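
The application does not spell out the batching step beyond naming Singularity, but as a rough illustration of how prepared CAR files might be grouped to fit individual drives, here is a minimal Python sketch; the directory layout and the 16 TiB drive capacity are hypothetical, not the applicant's actual setup:

```python
# Illustrative sketch only: greedily assign prepared CAR files to drive-sized
# batches (first-fit decreasing). The path and drive capacity are hypothetical.
from pathlib import Path

DRIVE_CAPACITY_BYTES = 16 * 1024**4  # assume 16 TiB usable per drive


def batch_car_files(car_dir: str) -> list[list[Path]]:
    """Group CAR files into batches, each fitting on one drive."""
    files = sorted(Path(car_dir).glob("*.car"),
                   key=lambda p: p.stat().st_size, reverse=True)
    batches: list[list[Path]] = []
    free_bytes: list[int] = []  # remaining capacity per batch
    for f in files:
        size = f.stat().st_size
        for i, free in enumerate(free_bytes):
            if size <= free:            # first drive with enough room
                batches[i].append(f)
                free_bytes[i] -= size
                break
        else:                           # no existing drive fits: start a new one
            batches.append([f])
            free_bytes.append(DRIVE_CAPACITY_BYTES - size)
    return batches


if __name__ == "__main__":
    for i, batch in enumerate(batch_car_files("./prepared_cars"), start=1):
        print(f"drive-{i:03d}: {len(batch)} CAR files")
```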

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

s3://smithsonian-open-access/
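
The sample is a public S3 bucket, so it should be browsable without AWS credentials using unsigned requests; a minimal boto3 sketch (the us-west-2 region and the 10-object limit are assumptions, not stated in the application):

```python
# Illustrative sketch only: list a few objects from the public sample bucket
# using anonymous (unsigned) requests. Region and MaxKeys are assumptions.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client(
    "s3",
    region_name="us-west-2",                    # assumed bucket region
    config=Config(signature_version=UNSIGNED),  # no AWS credentials needed
)

resp = s3.list_objects_v2(Bucket="smithsonian-open-access", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```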

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, South America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), Shipping hard drives, Lotus built-in data transfer

How did you find your storage providers

Slack, Filmine

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f02852273 London
f02984331 Singapore
f02883857 Singapore
f02973061 Russia
f02889193 Vietnam

How do you plan to make deals to your storage providers

Boost client, Lotus client, Singularity

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 5 hours ago

Application is waiting for allocator review

ipfscn commented 4 hours ago

You mentioned that the data will be cut by the Singularity tool and then mailed to Storage Providers (SPs). How do you ensure the security and integrity of the data during the mailing process? Additionally, how do you plan to handle and store this data to ensure it meets the requirements of the Filecoin network?

lijo76618 commented 1 hour ago

We have developed our own data processing system based on Singularity. After processing the data, we transfer it to hard drives in batches. Each hard drive is labeled to ensure that data is not lost, duplicated, or mixed up, and that the whole process complies with Fil+ requirements.
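
One concrete way to back the labeling approach described above would be a per-drive checksum manifest written before shipping and re-verified by the receiving SP. The following Python sketch is illustrative only and not the applicant's actual tooling; file names and the manifest format are hypothetical:

```python
# Illustrative sketch only: write a SHA-256 manifest for the CAR files on a
# drive before shipping, then re-verify it on arrival at the SP.
import hashlib
import json
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks to avoid loading it into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def write_manifest(drive_root: str, manifest_path: str = "manifest.json") -> None:
    """Record a checksum for every CAR file on the drive."""
    root = Path(drive_root)
    manifest = {str(p.relative_to(root)): sha256_of(p)
                for p in sorted(root.rglob("*.car"))}
    Path(manifest_path).write_text(json.dumps(manifest, indent=2))


def verify_manifest(drive_root: str, manifest_path: str = "manifest.json") -> bool:
    """Re-hash every file on arrival and compare against the shipped manifest."""
    root = Path(drive_root)
    manifest = json.loads(Path(manifest_path).read_text())
    ok = True
    for rel, expected in manifest.items():
        if sha256_of(root / rel) != expected:
            print(f"MISMATCH: {rel}")
            ok = False
    return ok
```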