NDLABS-Leo / Allocator-Pathway-ND-CLOUD

Notary Allocator Pathway Name : ND CLOUD

[DataCap Application] Commoncrawl #39

Open nike-mp opened 7 hours ago

nike-mp commented 7 hours ago

Version

1

DataCap Applicant

FileTech

Project ID

FileTech-02

Data Owner Name

Commoncrawl

Data Owner Country/Region

United States

Data Owner Industry

Life Science / Healthcare

Website

https://commoncrawl.org/

Social Media Handle

https://commoncrawl.org/

Social Media Type

Slack

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

512TiB

Number of replicas to store

10

Weekly allocation of DataCap requested

300TiB
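
As a sanity check on the figures above (an illustrative calculation, not part of the application form): the requested 5 PiB equals the single-copy dataset size times the replica count, and at 300 TiB per week the full allocation would be consumed in roughly 17 weeks.

```python
# Sanity-check the DataCap figures from the application (illustrative only).
TIB_PER_PIB = 1024

dataset_tib = 512    # expected size of a single copy (TiB)
replicas = 10        # number of replicas to store
requested_pib = 5    # total DataCap requested (PiB)
weekly_tib = 300     # weekly allocation requested (TiB)

total_tib = dataset_tib * replicas                # 5120 TiB
assert total_tib == requested_pib * TIB_PER_PIB   # matches the 5 PiB request

weeks_to_consume = total_tib / weekly_tib         # roughly 17 weeks
print(f"{total_tib} TiB total, ~{weeks_to_consume:.1f} weeks at {weekly_tib} TiB/week")
```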

On-chain address for first allocation

f17eeci7jslxmqrotwj4j6j6xgerjaffmbx6lx53i

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

FileTech focuses on providing high-quality data storage solutions. Our passionate, knowledgeable team has extensive experience and expertise across data storage, data management, data recovery, and data center design and construction.

At FileTech, we understand the importance of data to modern businesses. We offer not only high-performance data storage devices and solutions, but also comprehensive data management tools that help clients efficiently organize, classify, and protect their data assets.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Common Crawl maintains a free, open repository of web crawl data that can be used by anyone. It serves as a primary training corpus for LLMs, supplying 82% of the raw tokens used to train GPT-3. The corpus has been free and open since 2007, is cited in over 8,000 research papers, and grows by 3–5 billion new pages each month.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

United States

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

No response

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

357 results
2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2020-08-25 17:22:52  394.6 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf15.grib2
2020-08-25 17:22:52  390.2 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf16.grib2
2020-08-25 17:23:08  387.1 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf17.grib2
2020-08-25 17:23:05  384.8 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf18.grib2
2021-09-28 03:48:22   31.6 KiB index.html

Total Objects: 43282174
   Total Size: 2.1 PiB
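
The sample above follows an `aws s3 ls --recursive --human-readable`-style layout. As an illustrative sketch (assuming that four-column date/time/size/key format, with a hypothetical inline sample), a listing like this can be summarized as follows:

```python
# Summarize an `aws s3 ls`-style listing (illustrative only; assumes the
# date / time / size / unit / key column layout shown in the sample above).
sample = """\
2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2021-09-28 03:48:22   31.6 KiB index.html
"""

UNITS = {"KiB": 1 / 1024, "MiB": 1.0, "GiB": 1024.0}  # normalize to MiB

total_mib = 0.0
count = 0
for line in sample.splitlines():
    _date, _time, size, unit, _key = line.split(maxsplit=4)
    total_mib += float(size) * UNITS[unit]
    count += 1

print(f"{count} objects, {total_mib:.1f} MiB total")
```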

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), IPFS

How did you find your storage providers

Slack, Filmine

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

f03215853 | United States
f03218576 | Portland, Oregon, United States
f03157910 | China
f03157905 | China
f03220176 | Hong Kong
f03220172 | Singapore

How do you plan to make deals to your storage providers

Boost client, Lotus client

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 7 hours ago

Application is waiting for allocator review

NDLABS-Leo commented 3 hours ago

@nike-mp Hello, thank you for your application. I hope our review proceeds smoothly. I would like to conduct some additional checks on the submitted information:

  1. Please provide information to help complete KYB (Know Your Business) certification, so we can verify that this application is indeed a voluntary action by the entity.
  2. You have stated a total data volume of 2.1 PiB. Please provide additional samples so I can understand what kind of data is being stored (e.g., files that can be opened, or details about the file formats).
  3. Are you familiar with the Fil+ program and the rules of our channel? Do the storage nodes meet the project's basic requirements?