TOPPOOL-LEE / Allocator-Pathway-TOP-POOL


[DataCap Application] Commoncrawl #1

Closed · yse88483 closed this 7 months ago

yse88483 commented 7 months ago

Version

1

DataCap Applicant

DATADAO

Project ID

DATADAO-01

Data Owner Name

Commoncrawl

Data Owner Country/Region

United States

Data Owner Industry

Life Science / Healthcare

Website

https://commoncrawl.org/

Social Media Handle

https://commoncrawl.org/

Social Media Type

Twitter

What is your role related to the dataset

Dataset Owner

Total amount of DataCap being requested

12

Unit for total amount of DataCap being requested

PiB

Expected size of single dataset (one copy)

2

Unit for expected size of single dataset

PiB

Number of replicas to store

6

Weekly allocation of DataCap requested

250

Unit for weekly allocation of DataCap requested

TiB

On-chain address for first allocation

f1snhspx7pphhlr5i4ent6d7k66nz2sfkosmutbwy

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

DATADAO is built on technical expertise and uses proprietary programs and advanced tools to provide customized CAR (Content Addressable aRchive) data packaging services to clients. Our team has extensive experience with cutting-edge tools such as go-graphsplit and singularity, which we use to ensure efficient and accurate data packaging. Our service process covers the following key points:

Technical Expertise: Our team is proficient with advanced tools such as go-graphsplit and singularity. These tools help us process and package client-provided data to meet the storage requirements of the Filecoin network.

Data Processing Flow: We employ proprietary programs to segment, package, and compress client-provided data, ultimately generating CAR files of 28-32 GiB to align with the storage requirements of the Filecoin network. Through these services, we effectively assist clients in processing and preparing data for storage and use on the Filecoin network.
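
For illustration, the chunking step can be driven by go-graphsplit's chunk subcommand. This is a minimal sketch with flag names as shown in the go-graphsplit README; the paths, graph name, and slice size are placeholders, not DATADAO's actual configuration:

    # Split a dataset directory into ~30 GiB slices, one CAR file per slice.
    # --slice-size is in bytes; 32212254720 = 30 GiB, inside the 28-32 GiB range above.
    graphsplit chunk \
      --car-dir=/mnt/out/cars \
      --slice-size=32212254720 \
      --parallel=4 \
      --graph-name=commoncrawl-batch-01 \
      --calc-commp=true \
      --parent-path=/mnt/data/commoncrawl \
      /mnt/data/commoncrawl

With --calc-commp=true the tool also records each slice's piece CID (CommP), which is needed later when proposing deals.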

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Common Crawl maintains a free, open repository of web crawl data that can be used by anyone.
- The primary training corpus in every LLM; 82% of the raw tokens used to train GPT-3.
- A free and open corpus since 2007.
- Cited in over 8,000 research papers.
- 3–5 billion new pages added each month.
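
Common Crawl publishes these archives over public HTTPS (and S3, see below). As a sketch, one monthly crawl's WARC listing can be fetched like this; the crawl label CC-MAIN-2023-50 is only an example of the naming scheme:

    # Fetch the WARC file listing for one monthly crawl from the public endpoint.
    curl -sO https://data.commoncrawl.org/crawl-data/CC-MAIN-2023-50/warc.paths.gz
    gunzip -c warc.paths.gz | head -3   # each line is a path under data.commoncrawl.org/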

Where was the data currently stored in this dataset sourced from

AWS Cloud
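
Since the source is AWS, the dataset can also be pulled straight from the public Common Crawl S3 bucket without credentials; a sketch with the AWS CLI (the object key is a placeholder following the bucket's layout):

    # Unsigned download from the public bucket (region us-east-1).
    aws s3 cp --no-sign-request \
      s3://commoncrawl/crawl-data/CC-MAIN-2023-50/warc.paths.gz .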

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

United States

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

Yes, I am the data preparer. DATADAO has a technical team that specializes in providing CAR data packaging services to our clients, which includes: 1. Technical expertise: our team is skilled in using advanced tools such as go-graphsplit and singularity to ensure efficient and accurate data packaging. 2. Data processing flow: we use self-developed programs to cut, package, and compress the data provided by our clients into 28-32 GiB CAR files to meet the storage requirements of the Filecoin network. Through these services, we are able to effectively help our clients process and prepare data for storage and use on the Filecoin network.
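
A note on why the chunks are capped near 32 GiB: Filecoin's Fr32 padding expands a payload by a factor of 128/127 before sealing, and piece sizes are powers of two, so the largest payload that fits a 32 GiB sector is 32 GiB x 127/128. A quick check of the arithmetic (illustrative only, not DATADAO's tooling):

    # Largest raw payload that still fits a 32 GiB piece after Fr32 padding:
    echo $(( 32 * 2**30 * 127 / 128 ))   # 34091302912 bytes, about 31.75 GiB

Keeping each chunk a bit below this bound ensures the resulting CAR file seals into a single 32 GiB sector.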

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No response

Please share a sample of the data

357-result
2020-08-25 17:22:47  398.4 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf13.grib2
2020-08-25 17:22:52  396.0 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf14.grib2
2020-08-25 17:22:52  394.6 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf15.grib2
2020-08-25 17:22:52  390.2 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf16.grib2
2020-08-25 17:23:08  387.1 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf17.grib2
2020-08-25 17:23:05  384.8 MiB hrrr_v2.20160823/conus/hrrr.t09z.wrfprsf18.grib2
2021-09-28 03:48:22   31.6 KiB index.html

Total Objects: 43282174
   Total Size: 2.1 PiB
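
For reference, a recursive listing ending in totals like the one above is what the AWS CLI's summarize option produces; a sketch, with bucket and prefix as placeholders (the actual source of this sample is not stated in the application):

    # Recursive listing ending in "Total Objects / Total Size" lines, as above.
    aws s3 ls --no-sign-request --recursive --human-readable --summarize \
      s3://<bucket>/<prefix>/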

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

2 to 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Shipping hard drives, Lotus built-in data transfer

How did you find your storage providers

Slack, Filmine

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

[{"providerID":"f02850067","City":"HongKong","Country":"china","SPOrg","datastone"},
{"providerID":"f01992563","City":"XYZ","Country":"china","SPOrg","SAcloud"},
{"providerID":"f01996719","City":"XYZ","Country":"china","SPOrg","SAcloud"},
{"providerID":"f01996817","City":"XYZ","Country":"china","SPOrg","SAcloud"},
{"providerID":"f02812304","City":"XYZ","Country":"Singapore","SPOrg","SY-Tech"},
{"providerID":"f02320270","City":"XYZ","Country":"US","SPOrg","R1"},]

How do you plan to make deals to your storage providers

Boost client, Lotus client
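
As a sketch of what a deal proposal with these clients looks like: the Boost form below uses the flags from the Boost documentation, and every value is a placeholder (the provider ID, URL, CIDs, and sizes are not from this application):

    # Propose a verified deal with Boost; the SP fetches the CAR over HTTP afterwards.
    boost deal --verified=true \
               --provider=f0xxxxxx \
               --http-url=https://example.com/cars/chunk-0001.car \
               --commp=baga6ea4sea... \
               --car-size=32212254720 \
               --piece-size=34359738368 \
               --payload-cid=bafybeig...

    # The Lotus client's equivalent takes positional arguments:
    lotus client deal <dataCid> <provider> <price> <duration>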

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

yse88483 commented 7 months ago

Hey, this is a continuation of filecoin-project/filecoin-plus-large-datasets#2327 (https://github.com/filecoin-project/filecoin-plus-large-datasets/issues/2327). Please help us move it forward @TOPPOOL-LEE

datacap-bot[bot] commented 7 months ago

Application is waiting for governance review

TOPPOOL-LEE commented 7 months ago

@yse88483 1 PiB per week seems too much. Can you adjust 1 PiB to 200 TiB or 300 TiB?

yse88483 commented 7 months ago

Two days have passed and I finally saw your reply. We have modified it as requested, so can you help us?

yse88483 commented 7 months ago

Our application in filecoin-project/filecoin-plus-large-datasets#2327 was submitted very early, but it was delayed because there was no quota under v3.1. I think we should be able to get the quota quickly.

TOPPOOL-LEE commented 7 months ago

I wanted to try to help you, but it seems nothing has changed here. Maybe you should close this application and resubmit a new one.

TOPPOOL-LEE commented 7 months ago

Sorry, the weekly volume you are applying for is too large; it cannot be approved.

TOPPOOL-LEE commented 7 months ago

It is recommended that you close your application and reapply.

yse88483 commented 7 months ago

If I reapply, will you approve it?

TOPPOOL-LEE commented 7 months ago

It looks like you applied for too much; we can only start with one.