Closed Megan008 closed 1 year ago
Thanks for your request! Everything looks good. :ok_hand:
A Governance Team member will review the information provided and contact you soon.
Do you have permission from Folding@home?
Who are the SPs you plan to work with, and what exactly is your data transfer plan? The outlined plan is unclear.
@raghavrmadya Thank you for your questions. COVID-19 is a public dataset and is not exclusive to any specific organization, so it is not necessary to get permission from Folding@home in advance to download and store it. It is similar to how programmers do not need permission from GitHub to use public code hosted there. The SPs we have worked with and contacted include f01854755, f01823070, and f01878693, among others. After our application is approved, we plan to distribute the data to 8-10 SPs via the BDE platform.
Thanks @Megan008. We have had cases before where clients needed approval from the dataset's manager even for public datasets. I also see that you have many applications open. Can you share more about yourself and any organization you are representing? Onboarding many PiBs of data through multiple applications requires a team effort. I'm tagging @Kernelogic, as they have dealt with such challenges before with clients working on public datasets.
The Folding@home dataset is Creative Commons licensed, so license-wise it should be fine.
It consists of about 450 TiB of raw data from AWS S3:

- arn:aws:s3:::fah-public-data-covid19-antibodies | us-east-2 | 8.6 TiB
- arn:aws:s3:::fah-public-data-covid19-cryptic-pockets | us-east-2 | 71.0 TiB
- arn:aws:s3:::fah-public-data-covid19-absolute-free-energy | us-east-2 | 369.5 TiB
- arn:aws:s3:::fah-public-data-covid19-moonshot-dynamics | us-east-2 | 1.8 TiB
However, I would have the following questions:
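As a quick sanity check on the ~450 TiB figure, the four bucket sizes listed above can simply be summed (a hypothetical arithmetic sketch; the sizes are taken verbatim from the list):

```python
# Bucket sizes in TiB, as listed in the comment above.
bucket_sizes_tib = {
    "fah-public-data-covid19-antibodies": 8.6,
    "fah-public-data-covid19-cryptic-pockets": 71.0,
    "fah-public-data-covid19-absolute-free-energy": 369.5,
    "fah-public-data-covid19-moonshot-dynamics": 1.8,
}

total_tib = sum(bucket_sizes_tib.values())
print(f"Total: {total_tib:.1f} TiB")  # → Total: 450.9 TiB
```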
@raghavrmadya I'm a community member. As I mentioned before, I'm going to contact more SPs to distribute data via the BDE platform next. I also have SPs I have worked with before who will continue to work with me, so I think we can complete it.
@kernelogic Thank you for your points and concerns! I am currently in Singapore, but I look forward to contacting SPs around the world. I am not participating in Slingshot, so I think I need to follow LDN's rules rather than Slingshot's.
Total DataCap requested
5PiB
Expected weekly DataCap usage rate
100TiB
Client address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
50TiB
fd3d1516-a183-467e-9ad7-01964bb49b11
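For scale, the totals requested above (5 PiB at 100 TiB/week) imply roughly a year of deal-making if the rate is sustained (a hypothetical back-of-the-envelope sketch, assuming binary units, i.e. 1 PiB = 1024 TiB):

```python
TOTAL_PIB = 5     # total DataCap requested
WEEKLY_TIB = 100  # expected weekly usage rate

weeks = TOTAL_PIB * 1024 / WEEKLY_TIB  # 1 PiB = 1024 TiB
print(f"{weeks:.1f} weeks (~{weeks / 52:.2f} years)")  # → 51.2 weeks (~0.98 years)
```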
The applicant contacted me via DM and went through our due diligence. I am willing to support the first round and will keep an eye on later allocations. Looking forward to seeing your next milestone.
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacedq4hx2jsgyifcgazexuerxpsrkxlpvjl6f4e25667gzgaq7gexfc
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
50.00TiB
Signer Address
f15impf3j2zcaex4lhyxndxswuuhv24vzstuqtxsi
Id
fd3d1516-a183-467e-9ad7-01964bb49b11
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedq4hx2jsgyifcgazexuerxpsrkxlpvjl6f4e25667gzgaq7gexfc
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacecubgphhwp2m6spj7idtuvaaopy2ohlpawpkrkxpruz62lgcqxeks
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
50.00TiB
Signer Address
f1vxbqrf7rfum3n6m5u6eb4re6xj7amvsaqnzu64y
Id
fd3d1516-a183-467e-9ad7-01964bb49b11
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacecubgphhwp2m6spj7idtuvaaopy2ohlpawpkrkxpruz62lgcqxeks
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
100TiB
bec108fc-6849-40c6-a516-a3362ca51c28
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
100% of weekly dc amount requested
100TiB
50TiB
4.95PiB
Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
---|---|---|---|---|
null | null | 50TiB | null | 352GiB |
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacedemxymy7etyr37dk57soef4zczfyxvoqmafdthz4lvyjnjaaug76
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
100.00TiB
Signer Address
f1irqs2gmctiv3jcdfwuch7oxvf4ixh3k4b2wc24i
Id
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedemxymy7etyr37dk57soef4zczfyxvoqmafdthz4lvyjnjaaug76
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacebluals2lpiz7lbjkcuxivocn6drljn5umrfmlzghkvqaosn6b63s
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
100.00TiB
Signer Address
f15impf3j2zcaex4lhyxndxswuuhv24vzstuqtxsi
Id
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebluals2lpiz7lbjkcuxivocn6drljn5umrfmlzghkvqaosn6b63s
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
200TiB
cbbb6625-7682-4b65-880f-c0d6f5d5c06d
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
200% of weekly dc amount requested
200TiB
9094.9YiB
-1.09B
Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
---|---|---|---|---|
null | null | 100TiB | null | 33.81TiB |
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacedxtl7kpxvlndxflemhl2cqlgu5w3gtxjatmdtnhv7fbsl3mqdqmi
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
200.00TiB
Signer Address
f174fg3bqbln3zjnkxtyf6s54txqkr7yqkj6cig7y
Id
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedxtl7kpxvlndxflemhl2cqlgu5w3gtxjatmdtnhv7fbsl3mqdqmi
Reported for datacap abuse and violation of code of conduct.
@raghavrmadya @dkkapur
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the full report.
The client followed the allocation plan as stated, and the report looks good in all dimensions. Keep it up!
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacecmz55642hdg5pgzop7blkrc52pe3qkrp5ch2xx3sdofcl4ybwove
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
200.00TiB
Signer Address
f1bwugfihrmn3iyunzyxst5nttql3dge4khwmurtq
Id
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacecmz55642hdg5pgzop7blkrc52pe3qkrp5ch2xx3sdofcl4ybwove
I just saw records showing that the applicant has some non-compliant usage history, though not on this application. I don't know the community's position on whether applicants with non-compliant usage records are allowed to apply for an LDN again; I hope to have an answer soon.
I am willing to abide by the consensus opinion of the community.
@Meibuy
Yes, the opinion is that if someone harms the community, due diligence should be done in a way that ensures it cannot happen again.
You signed an application that is disputed in Notion. I will add this to the dispute list.
I noticed that the applicant had a violation record before signing. However, she repeatedly assured me in DMs that it was a cooperation issue with miners and that there would be no such problem in the future. I think the applicant, as a hackathon participant, knows enough about Filecoin, so I decided to give her a chance to prove herself again. So far, the reports on this application look healthy.
We all know that GitHub accounts are easy to sign up for, and applicants can use different accounts to file LDN applications. After the filplus checker went live, many applications turned out to have problems with duplicate data. I would like to know the community's view of these applications and applicants.
If the community prohibits applicants involved in violations from applying again, I recommend that @Megan008 or a governance team member close this issue.
I agree with the points above. There are no explicit rules for clients who have been involved in disputes before. I will abide by community feedback and official recommendations on how to handle such applications.
> I noticed that the applicant had a violation record before signing. However, she repeatedly assured me in DMs that it was a cooperation issue of miners.
How? @Megan008 is a data preparer. She sends her data from her wallet to an SP as a verified deal. What happened is that she did not pack the dataset as she said: she packed only a few files and shared those thousands of times with the people involved in that scheme. Miners just accept a deal at a price; they have no influence over what is stored on their drives, so don't come here with those fairytales.
Again... you decided to sign a disputed application. The rules for this are clear: you should not. The reasons why are also clear.
From my point of view the dispute is valid, and the DataCap should be revoked. If we got an upfront answer I would reconsider, but (a) we don't get any KYC, and (b) we don't get any honest answer on what happened. Thirdly, it is beyond me that we get dozens of new GitHub handles per day (from the same people), and this one specifically needs attention to continue.
@dkkapur @raghavrmadya your call.
@cryptowhizzard Sorry for raising disputes. I understand that past mistakes may have reduced my trustworthiness. I'm deeply sorry about it, and I wish I could get a chance to prove that I'm fixing it. Please tell me what I should do.
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400TiB
dc5d06c4-9f85-4bc9-8090-0060f99e192c
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400% of weekly dc amount requested
400TiB
18189894035458574336.0YiB
-2.19B
Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
---|---|---|---|---|
null | null | 200TiB | null | 47.15TiB |
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzaceb3c3zbbm46n76nfl4rmju6esbwaznh6mlf3aptyp4qj4e42bic66
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
400.00TiB
Signer Address
f1d4yb3wags3mtddzesxoo63jv7dmlec3bq4yteni
Id
dc5d06c4-9f85-4bc9-8090-0060f99e192c
You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceb3c3zbbm46n76nfl4rmju6esbwaznh6mlf3aptyp4qj4e42bic66
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacebwgiuxbzfcmuga4cpudzlu5hxwgghhkbnt2a54goaulifwscryms
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
400.00TiB
Signer Address
f15impf3j2zcaex4lhyxndxswuuhv24vzstuqtxsi
Id
dc5d06c4-9f85-4bc9-8090-0060f99e192c
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebwgiuxbzfcmuga4cpudzlu5hxwgghhkbnt2a54goaulifwscryms
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400TiB
b1b3a935-36cc-4800-90cb-7c5b98cd6d12
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400% of weekly dc amount requested
400TiB
18189894035458574336.0YiB
-2.19B
Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
---|---|---|---|---|
null | null | 200TiB | null | 3.78TiB |
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacebktvx7k47limisrv73ioz6c7xgkbgjyc4mxtu2wq2bth4p4a5voa
Address
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
Datacap Allocated
400.00TiB
Signer Address
f1ihv7gz3vn3xqvikpt4rwryecgisl7745lodx3yi
Id
b1b3a935-36cc-4800-90cb-7c5b98cd6d12
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebktvx7k47limisrv73ioz6c7xgkbgjyc4mxtu2wq2bth4p4a5voa
This is public data with a clear allocation plan; we are willing to help onboard more valuable datasets.
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400TiB
ce15296d-34d7-4f78-a693-db943fcaeec6
f02049625
f1au3nipqjprr5xp2mwsarr7obvpx2dwy4is6qn4y
400% of weekly dc amount requested
400TiB
3.6379788070917164e+49YiB
3.6379788070917164e+49YiB
Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
---|---|---|---|---|
25124 | 13 | 400TiB | 13.61 | 100.09TiB |
This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.
checker:manualTrigger
⚠️ All retrieval success ratios are below 1%.
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.
checker:manualTrigger
⚠️ All retrieval success ratios are below 1%.
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.
All these SPs are involved in CID sharing and do not support retrieval.
Again, nothing works.
This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.
Large Dataset Notary Application
To apply for DataCap to onboard your dataset to Filecoin, please fill out the following.
Core Information
Please respond to the questions below by replacing the text saying "Please answer here". Include as much detail as you can in your answer.
Project details
Share a brief history of your project and organization.
What is the primary source of funding for this project?
What other projects/ecosystem stakeholders is this project associated with?
Use-case details
Describe the data being stored onto Filecoin
Where was the data in this dataset sourced from?
Can you share a sample of the data? A link to a file, an image, a table, etc., are good ways to do this.
Confirm that this is a public dataset that can be retrieved by anyone on the Network (i.e., no specific permissions or access rights are required to view the data).
What is the expected retrieval frequency for this data?
For how long do you plan to keep this dataset stored on Filecoin?
DataCap allocation plan
In which geographies (countries, regions) do you plan on making storage deals?
How will you be distributing your data to storage providers? Is there an offline data transfer process?
How do you plan on choosing the storage providers with whom you will be making deals? This should include a plan to ensure the data is retrievable in the future both by you and others.
How will you be distributing deals across storage providers?
Do you have the resources/funding to start making deals as soon as you receive DataCap? What support from the community would help you onboard onto Filecoin?