Open martapiekarska opened 4 months ago
Application is waiting for allocator review
Hello @dos2un1x
Can you list SP entity names for each SP?
1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)
These SPs are not prepared to provide retrievals. Please confirm or choose other SPs.
KYC has been requested. Please complete KYC at https://kyc.allocator.tech/?owner=fidlabs&repo=Open-Data-Pathway&client=f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy&issue=39
Also, we're asking you to complete the Gitcoin KYC check linked above, @dos2un1x.
The other KYC option is
KYC completed for client address f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy with Optimism address 0xA2011DC10c4eB5db5600ccC91AB40da093f69897 and passport score 30.
@kevzak Dear Notary, our KYC has been completed. I need your signature. Thank you!
Hi @dos2un1x - KYC is confirmed.
Please reply to this request before we can begin: https://github.com/fidlabs/Open-Data-Pathway/issues/39#issuecomment-2182566119
You need to confirm SPs that meet retrieval requirements upfront. Thanks.
I have discussed this issue with SPs, and they support Graphsync/HTTP retrieval mode. We will also add new SPs later to ensure data distribution.
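For context, here is a minimal sketch of how Graphsync/HTTP retrievability could later be spot-checked against these SPs. The endpoint and CIDs below are placeholders rather than values from this application, and flag names may differ by tool version.

```bash
# Hypothetical spot-check of HTTP retrieval from an SP running booster-http.
# <sp-http-endpoint> and <piece-cid> are placeholders, not real values.
curl -o piece.car "http://<sp-http-endpoint>/piece/<piece-cid>"

# Trustless retrieval of a deal payload with lassie (flags may vary by version).
lassie fetch -o payload.car <payload-cid>
```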
Please provide SP entity information @dos2un1x
https://github.com/fidlabs/Open-Data-Pathway/issues/39#issuecomment-2194729145
1. f02837226 (UK) - Jerry - fuqiancheng@kinghash.com - kinghash
2. f02864300 (US) - miaozi - Cordarell@chainup.com - chainup
3. f02894286 (KR) - Lee - karl@hs88.kr - HS88
4. f02870401 (ID) - akcd4040 - Jordan@bitwind.com - bitwind
Total DataCap requested
5PiB
Expected weekly DataCap usage rate
1PiB
DataCap Amount - First Tranche
50TiB
Client address
f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy
Client address
f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy
DataCap allocation requested
50TiB
Id
5c52eb6e-c375-4972-94de-f369ec4de17a
Application is ready to sign
Your Datacap Allocation Request has been approved by the Notary
Message sent to Filecoin Network
bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44
Address
f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy
Datacap Allocated
50TiB
Signer Address
f1v24knjbqv5p6qrmfjj5xmlaoddzqnon2oxkzkyq
Id
5c52eb6e-c375-4972-94de-f369ec4de17a
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44
Application is Granted
OK @dos2un1x - 50TiB was allocated - let's see matching SP and retrievals. Thank you.
@kevzak Hi, dear notary, when can we start the next round? The first round is so small!
(OLD vs NEW)
State: ChangesRequested vs Granted
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report.
@dos2un1x - we've implemented some new data preparation questions for open datasets. Please take some time to answer these for us. Thanks.
1: How the data is being transformed into deals for Filecoin
- What is the transformation process from the files available for download to what will be stored on Filecoin?
- When we sample your deals, how will we be able to confirm that they came from the dataset?
- Given a 32GB payload, what steps can an independent entity take to confirm it comes from the relevant upstream dataset?
2: How the data is made available
- When a deal is sampled for verification, how will we be able to confirm that it is part of this dataset? (How is it chunked into CAR files?)
- We want to see how a client could make use of this dataset; can you share the documentation? This could be a client script for iterating through / processing the data, a website allowing browsing / identification of specific pieces of data from the dataset as stored, or identification of clients making use of the data.
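As an illustration of the kind of independent spot-check these questions describe, here is a hedged sketch. It assumes the CAR files pack objects from s3://zinc3d unmodified; <payload-cid>, <piece-cid> and <key> are placeholders for values from a hypothetical deal, and exact tool flags may differ by version.

```bash
# 1. Retrieve the deal payload as a CAR and check it against the on-chain piece CID.
lassie fetch -o sample.car <payload-cid>   # placeholder CID from a real deal
boostx commp sample.car                    # should report the deal's <piece-cid>

# 2. Unpack the CAR and compare a file against the same object in the source bucket.
car extract --file sample.car ./unpacked   # go-car CLI; exact flags may differ
aws s3 cp --no-sign-request "s3://zinc3d/<key>" ./upstream/
sha256sum ./unpacked/<key> ./upstream/<key>   # digests should match
```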
(OLD vs NEW)
Weekly Allocation: 1PiB
State: ChangesRequested vs Granted
Client used 75% of the allocated DataCap. Consider allocating next tranche.
The boost tool does not generate a car file - it makes a deal with a car file. We are asking you to explain these "reasonable rules".
Just because the generate-car subcommand has been removed from the new version of boostx doesn't mean I can't use another version, right?
"the boost tool according to reasonable rules" does not provide enough detail for us to understand the transformation you are proposing to perform. boostx
is a different tool that you are now specifying somewhat more specifically, but you still have not provided enough information for anyone else to be able to reconstruct the original data set from the deals you make to filecoin.
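For reference, the level of detail being requested is roughly the following: pinned tool versions, fixed inputs, and the exact commands that turn them into a deal. This is only an illustrative sketch under those assumptions; the paths are placeholders, boostx generate-car is only present in older releases, and this is not the applicant's actual pipeline.

```bash
# Illustrative only; not the applicant's pipeline. Assumes an older boostx
# release that still ships generate-car; argument order may differ by release.

# 1. Pack a fixed, documented slice of the dataset into a CAR file.
boostx generate-car ./zinc3d-batch-0001/ ./out/

# 2. Compute the piece CID (commP) that will appear in the deal proposal.
boostx commp ./out/<generated>.car

# 3. Propose the deal to an SP with the boost client (remaining flags elided).
boost deal --provider=f02837226 ...
```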
I think you're taking the term boost tool too seriously!
Indeed - I'm asking you to actually provide the details that allow for technical replication of how the open dataset will be stored to Filecoin. This was always the intention of public datasets being stored to Filecoin, and is now being enforced - at least by this DataCap pathway.
Why do we have to provide technical details? Why isn't this required in other issues?
Do I have to share my code with you?
Since June/July this year we have been trying to raise the standards on data prep for our clients. All of our clients are now asked to share details of how they prepare data so that we can spot-check it. It is important that anyone in the community is able to make use of the dataset, and in order to do that, they will need to reconstruct it.
I think you can provide a standard for healthy community development. There won't be so many disputes when everyone meets the same criteria.
Hi @dos2un1x, please review our policies at https://github.com/fidlabs/Open-Data-Pathway/wiki and let me know if you have further questions.
When you are ready to share the data prep plan (or connect us with the team doing it for you) and to ensure that your data is available for download as an open dataset, please let us know.
Version
2024-06-21T05:04:25.948Z
DataCap Applicant
@dos2un1x
Data Owner Name
zinc15
Data Owner Country/Region
Data Owner Industry
Life Science / Healthcare
Website
zinc15.docking.org
Social Media Handle
https://registry.opendata.aws/zinc15/
Social Media Type
Slack
What is your role related to the dataset
Other
Total amount of DataCap being requested
5PiB
Expected size of single dataset (one copy)
989TiB
Number of replicas to store
4
Weekly allocation of DataCap requested
1PiB
On-chain address for first allocation
f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy
Data Type of Application
Public, Open Dataset (Research/Non-Profit)
Identifier
Share a brief history of your project and organization
Welcome to ZINC15, a research tool for ligand discovery, chemical biology and pharmacology. We don't believe documentation should be necessary. Our goal is to make ZINC so blindingly obvious to use that it requires none.
Is this project associated with other projects/ecosystem stakeholders?
No
If answered yes, what are the other projects/ecosystem stakeholders
3D models for molecular docking screens.
Where was the data currently stored in this dataset sourced from
AWS Cloud
If you answered "Other" in the previous question, enter the details here
If you are a data preparer. What is your location (Country/Region)
Singapore
If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?
If you are not preparing the data, who will prepare the data? (Provide name and business)
Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
Please share a sample of the data
sudo aws s3 ls --no-sign-request s3://zinc3d/ --recursive --human-readable --summarize | grep Total
Total Objects: 5840977
Total Size: 989.7 TiB
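For anyone who wants to inspect the source data directly, the bucket is public and individual objects can be pulled without credentials; <some-key> below is a placeholder for any key from the listing above.

```bash
# List a few objects from the public bucket and download one for inspection.
aws s3 ls --no-sign-request s3://zinc3d/ --recursive | head
aws s3 cp --no-sign-request "s3://zinc3d/<some-key>" ./sample/   # <some-key> is a placeholder
```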
Confirm that this is a public dataset that can be retrieved by anyone on the Network
Confirm
If you chose not to confirm, what was the reason
What is the expected retrieval frequency for this data
Yearly
For how long do you plan to keep this dataset stored on Filecoin
1 to 1.5 years
In which geographies do you plan on making storage deals
Greater China, Asia other than Greater China, Africa, North America, South America, Europe, Australia (continent), Antarctica
How will you be distributing your data to storage providers
Cloud storage (i.e. S3)
How did you find your storage providers
Slack, Partners, Others
If you answered "Others" in the previous question, what is the tool or platform you used
Please list the provider IDs and location of the storage providers you will be working with.
1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)
How do you plan to make deals to your storage providers
Boost client
If you answered "Others/custom tool" in the previous question, enter the details here
Can you confirm that you will follow the Fil+ guideline
Yes