fidlabs / Open-Data-Pathway


[DataCap Application] <zinc15> <2024-06-21T05:04:25.948Z> #39

martapiekarska opened this issue 4 months ago (status: Open)

martapiekarska commented 4 months ago

Version

2024-06-21T05:04:25.948Z

DataCap Applicant

@dos2un1x

Data Owner Name

zinc15

Data Owner Country/Region

Life Science / Healthcare

Website

zinc15.docking.org

Social Media Handle

https://registry.opendata.aws/zinc15/

Social Media Type

Slack

What is your role related to the dataset

Other

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

989TiB

Number of replicas to store

4

Weekly allocation of DataCap requested

1PiB

On-chain address for first allocation

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Identifier

Share a brief history of your project and organization

Welcome to ZINC15, a research tool for ligand discovery, chemical biology and pharmacology. We don't believe documentation should be necessary. Our goal is to make ZINC so blindingly obvious to use that it requires none.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

3D models for molecular docking screens.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

If you are a data preparer. What is your location (Country/Region)

Singapore

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

If you are not preparing the data, who will prepare the data? (Provide name and business)

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

Please share a sample of the data

sudo aws s3 ls --no-sign-request s3://zinc3d/ --recursive --human-readable --summarize | grep Total
Total Objects: 5840977
Total Size: 989.7 TiB
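
For reference, anyone can reproduce this size check against the public bucket and pull a single object for inspection. The commands below assume the same public zinc3d bucket; the object key is a placeholder:

```bash
# List the top-level prefixes of the public bucket (no AWS account needed)
aws s3 ls --no-sign-request s3://zinc3d/

# Reproduce the size summary quoted above
aws s3 ls --no-sign-request s3://zinc3d/ --recursive --human-readable --summarize | grep "Total"

# Copy one object locally for inspection (replace <object-key> with a key
# taken from the listing above)
aws s3 cp --no-sign-request s3://zinc3d/<object-key> ./sample-file
```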

Confirm that this is a public dataset that can be retrieved by anyone on the Network

Confirm

If you chose not to confirm, what was the reason

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1 to 1.5 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America, South America, Europe, Australia (continent), Antarctica

How will you be distributing your data to storage providers

Cloud storage (i.e. S3)

How did you find your storage providers

Slack, Partners, Others

If you answered "Others" in the previous question, what is the tool or platform you used

Please list the provider IDs and location of the storage providers you will be working with.

1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)

How do you plan to make deals to your storage providers

Boost client

If you answered "Others/custom tool" in the previous question, enter the details here

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 4 months ago

Application is waiting for allocator review

kevzak commented 4 months ago

Hello @dos2un1x

Can you list SP entity names for each SP?

1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)

These SPs are not prepared to provide retrievals. Please confirm or choose other SPs.

datacap-bot[bot] commented 4 months ago

KYC has been requested. Please complete KYC at https://kyc.allocator.tech/?owner=fidlabs&repo=Open-Data-Pathway&client=f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy&issue=39

kevzak commented 4 months ago

Also we're asking you to complete a gitcoin KYC check above @dos2un1x

The other KYC option is

datacap-bot[bot] commented 4 months ago

KYC completed for client address f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy with Optimism address 0xA2011DC10c4eB5db5600ccC91AB40da093f69897 and passport score 30.

dos2un1x commented 4 months ago

@kevzak Dear Notary, our KYC has been completed. I need your signature. Thank you!

kevzak commented 4 months ago

Hi @dos2un1x - KYC is confirmed.

Please reply to this request before we can begin: https://github.com/fidlabs/Open-Data-Pathway/issues/39#issuecomment-2182566119

You need to confirm SPs that meet retrieval requirements upfront. Thanks.

dos2un1x commented 4 months ago

> Hi @dos2un1x - KYC is confirmed.
>
> Please reply to this request before we can begin: #39 (comment)
>
> You need to confirm SPs that meet retrieval requirements upfront. Thanks.

I have discussed this issue with SPs, and they support Graphsync/HTTP retrieval mode. We will also add new SPs later to ensure data distribution.
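
For reference, a minimal retrieval spot-check once deals are on chain could use Lassie, which supports both Graphsync and HTTP transports. The payload CID below is a placeholder for a CID from one of the actual deals:

```bash
# Fetch a deal's payload CID from the network and write it out as a CAR file.
# Replace the placeholder with a real payload CID from this client's deals.
lassie fetch -o retrieved.car bafy...<placeholder-payload-cid>
```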

kevzak commented 4 months ago

Please provide SP entity information @dos2un1x

https://github.com/fidlabs/Open-Data-Pathway/issues/39#issuecomment-2194729145

dos2un1x commented 4 months ago

1. f02837226 (UK): Jerry, fuqiancheng@kinghash.com, kinghash
2. f02864300 (US): miaozi, Cordarell@chainup.com, chainup
3. f02894286 (KR): Lee, karl@hs88.kr, HS88
4. f02870401 (ID): akcd4040, Jordan@bitwind.com, bitwind

datacap-bot[bot] commented 4 months ago

Datacap Request Trigger

Total DataCap requested

5PiB

Expected weekly DataCap usage rate

DataCap Amount - First Tranche

50TiB

Client address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

datacap-bot[bot] commented 4 months ago

DataCap Allocation requested

Multisig Notary address

Client address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

DataCap allocation requested

50TiB

Id

5c52eb6e-c375-4972-94de-f369ec4de17a

datacap-bot[bot] commented 4 months ago

Application is ready to sign

datacap-bot[bot] commented 4 months ago

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44

Address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

Datacap Allocated

50TiB

Signer Address

f1v24knjbqv5p6qrmfjj5xmlaoddzqnon2oxkzkyq

Id

5c52eb6e-c375-4972-94de-f369ec4de17a

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44

datacap-bot[bot] commented 4 months ago

Application is Granted

kevzak commented 4 months ago

OK @dos2un1x - 50TiB was allocated - let's see matching SPs and retrievals. Thank you

dos2un1x commented 4 months ago

@kevzak Hi, dear notary, when can we start the next round? The first round is so small!

datacap-bot[bot] commented 3 months ago

Issue has been modified. Changes below:

(OLD vs NEW)

State: ChangesRequested vs Granted

kevzak commented 3 months ago

checker:manualTrigger

datacap-bot[bot] commented 3 months ago

Issue information change request has been approved.

datacap-bot[bot] commented 3 months ago

DataCap and CID Checker Report Summary[^1]

Storage Provider Distribution

✔️ Storage provider distribution looks healthy.

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients[^3]

✔️ No CID sharing has been observed.

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger

[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

Full report

Click here to view the CID Checker report.

kevzak commented 3 months ago

@dos2un1x - we've implemented some new data preparation questions for open datasets. Please take some time to answer these for us. Thanks.

1: How the data is being transformed into deals for Filecoin

What is the transformation process from the files available for download to what will be stored on Filecoin? When we sample your deals, how will we be able to confirm that the data has come from this dataset? Given a 32GB payload, what steps can an independent entity take to confirm it comes from the relevant upstream dataset?

2: How the data is made available

When a deal is sampled for verification, how will we be able to confirm that it is part of this dataset (i.e. how is it chunked into CAR files)? We want to see how a client could make use of this dataset; can you share the documentation? This could be:

- a client script showing how to iterate through / process the data
- a web site allowing browsing / identification of specific pieces of data from the dataset as stored
- identification of clients making use of the data

(A sketch of one possible verification flow follows below.)
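
As one possible answer to these questions, verifying a sampled 32GB payload might look like the sketch below. It assumes the payload is a CAR of unmodified source files and uses the go-car `car` CLI, whose exact flags vary by release; the object key is a placeholder:

```bash
# List the files packed into a sampled deal's CAR
car ls deal-payload.car

# Extract the CAR contents to a local directory
# (flag layout differs slightly across go-car releases)
car extract -f deal-payload.car ./extracted

# Re-download the same object from the upstream public bucket and compare
# checksums; <object-key> is a placeholder matching a file seen in the listing
aws s3 cp --no-sign-request s3://zinc3d/<object-key> ./upstream-copy
sha256sum ./extracted/<object-key> ./upstream-copy
```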

datacap-bot[bot] commented 3 months ago

Issue has been modified. Changes below:

(OLD vs NEW)

Weekly Allocation: 1PiB vs State: ChangesRequested vs Granted

datacap-bot[bot] commented 3 months ago

Issue information change request has been approved.

datacap-bot[bot] commented 3 months ago

Client used 75% of the allocated DataCap. Consider allocating next tranche.

dos2un1x commented 2 months ago

1. You can use aws s3 sync to download the corresponding dataset listed at https://registry.opendata.aws/zinc15/.
2. We archive the downloaded dataset and generate the CAR files with the boost tool according to reasonable rules.
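
As an illustration only, a concrete version of these two steps might look like the sketch below. It assumes an older boostx release that still ships the generate-car command, and the one-directory-per-CAR batching rule is a placeholder rather than the actual "reasonable rules":

```bash
# Step 1: mirror the public dataset locally (no AWS credentials needed)
aws s3 sync --no-sign-request s3://zinc3d/ ./zinc3d/

# Step 2: pack each deal-sized batch of files into a CAR.
# The batch directory here is a placeholder, and generate-car is only
# available in older boostx releases.
boostx generate-car ./zinc3d/<batch-dir> ./cars/<batch-dir>.car
```
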
willscott commented 2 months ago

The boost tool does not generate a car file - it makes a deal with a car file. We are asking you to explain these "reasonable rules".

dos2un1x commented 2 months ago

Just because the generate-car parameter has been removed from the new version of boostx doesn't mean I can't use another version, right?

willscott commented 2 months ago

"the boost tool according to reasonable rules" does not provide enough detail for us to understand the transformation you are proposing to perform. boostx is a different tool that you are now specifying somewhat more specifically, but you still have not provided enough information for anyone else to be able to reconstruct the original data set from the deals you make to filecoin.

dos2un1x commented 2 months ago

I think you're taking the term boost tool too seriously!

willscott commented 2 months ago

Indeed - I'm asking you to actually provide the details that allow for technical replication of how the open dataset will be stored on Filecoin. This was always the intention of storing public datasets on Filecoin, and it is now being enforced - at least by this DataCap pathway.

dos2un1x commented 2 months ago

Why do we need to provide technical details? Why isn't this asked in other issues?

dos2un1x commented 2 months ago

Do I have to share my code with you?

martapiekarska commented 1 month ago

Since June/July this year we have been trying to raise the standards for data preparation for our clients. All of our clients are now asked to share details of how they prepare data so that we can spot-check it. It is important that anyone in the community is able to make use of the dataset, and in order to do that, they will need to be able to reconstruct it.

dos2un1x commented 1 month ago

I think you can provide a standard for healthy community development. There won't be so many disputes when everyone meets the same criteria.

martapiekarska commented 1 month ago

Hi @dos2un1x, please review our policies at https://github.com/fidlabs/Open-Data-Pathway/wiki and let me know if you have further questions.

When you are ready to share the data prep plan (or connect us with the team doing it for you) and to ensure that your data is available for download as an open dataset, please let us know.