Closed Netonline2016 closed 1 year ago
Thanks for your request! Everything looks good. :ok_hand:
A Governance Team member will review the information provided and get back to you soon.
Dear Applicant,
Due to the recent increase in erroneous Filecoin+ data, and on behalf of the entire community, we feel compelled to scrutinize datacap requests more deeply. This is to ensure that the overall value of the Filecoin network and the Filecoin+ program increases and that the program is not abused.
Please answer the questions below as comprehensively as possible.
Customer data
We expect that onboarding a customer at the scale of an LDN would have been preceded by at least multiple emails and perhaps several chat conversations. A single email with an agreement does not qualify here.
Did the customer specify the amount of data involved in this relevant correspondence?
Why does the customer in question want to use the Filecoin+ program?
If this is solely about acquiring datacap, it is of course out of the question. The customer must have a legitimate reason for wanting to use the Filecoin+ program, which is intended to store useful, public datasets on the network.
(As an intermediate solution, Filecoin offers the FIL-E program or the glif.io website for business datasets that do not meet the requirements for a Filecoin+ dataset.)
Files and Processing
Hopefully you understand the caution the community exercises around onboarding the wrong data. We understand the increased need for Filecoin+; however, we must not allow the program to be misused. Everything depends on a valuable and useful network, so let's do our best to make this happen. Together.
@herrehesse Hello. Regarding the above questions, our responses are as follows:
1. Could you demonstrate exactly how and to what extent customer contact occurred? We hold online or offline meetings with customers every month to discuss how much data storage is generated each month and which data will be stored in Filecoin+.
2. Did the customer specify the amount of data involved in this relevant correspondence? Yes.
3. Why does the customer in question want to use the Filecoin+ program? The customer wants to share the medical platform's new products, features, and technologies on Filecoin+ so that more of the medical industry can exchange ideas, helping to address the current difficulties in patient care and hospital management.
4. Why is the customer data considered Filecoin+ eligible? The data consists of valuable material produced during the customer's business development and project R&D. Sharing it on Filecoin+ lets more people in the same industry learn about us, so we can progress together.
5. Could you please demonstrate how you envision processing and transporting the customer data to any location for preparation? We have had preliminary discussions with storage provider f01926777, who has committed that all storage devices will be housed in an IDC with bandwidth of no less than 500M.
6. Would you demonstrate that the customer, the preparer, and the intended storage providers all have adequate bandwidth to process a dataset of this size? This is our IDC bandwidth speed test.
7. Would you tell us how the data set preparer prevents duplicates in order to avoid datacap abuse? Before uploading data, we use a self-made program to filter it, removing duplicates, and then manually filter out invalid data to ensure that the uploaded data is valid and meaningful.
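For context, content-hash filtering is the usual way this kind of duplicate removal is done. The sketch below is illustrative only, not the applicant's actual program; file names and the choice of SHA-256 are assumptions:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def deduplicate(paths):
    """Keep the first file seen for each content digest, drop the rest."""
    seen = {}
    unique, duplicates = [], []
    for p in paths:
        d = file_digest(p)
        if d in seen:
            duplicates.append(p)
        else:
            seen[d] = p
            unique.append(p)
    return unique, duplicates
```

Hashing by content rather than by file name catches renamed copies, which is the abuse pattern datacap reviewers look for.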
How much data do you have? How many copies will you store? What is the relationship between you and the organization? Can the SPs you have chosen support data retrieval?
Hello,
I see that the retrieval frequency for this data is 1 or 2 times a day; I think this would be a challenge in the short term. The only way to do this is to use Boost, and that is not an "entry level" assignment that is easy to accomplish.
However, I will ask some respectable SPs and data preparers from your region to join this chat and see if they can make this a working project.
On a side note... the data you have should really belong in FIL-E and not in FIL+, as it is not public data. Medical records should be kept private and encrypted.
5PiB
100TiB
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
f02049625
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
50TiB
ebb9f6e7-d14e-41c5-ab07-4165622d9257
There is no previous allocation for this issue.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
Dear applicant,
Thank you for applying for datacap. As a Filecoin FIL+ notary, I am screening your application and conducting due diligence.
Looking at your application, I have some questions: can you show us visible proof of the size of your data and of the storage systems you have there?
As a last question, I would like you to fill out this form to provide us with the information necessary to make an educated decision on whether to support your LDN request.
Thanks!
@cryptowhizzard Hello, this is the storage system we use, which stores the amount of data generated by the organization's business.
As a last question, I would like you to fill out this form to provide us with the information necessary to make an educated decision on whether to support your LDN request.
?
@cryptowhizzard Hello, I have completed the form.
Hi @Netonline2016
Thanks. It is well received.
You provided me with only one SP ID where you are going to store data. I need a minimum of 4. What I need is the 4 SP IDs and their contact information. If you prefer, you can provide the SP IDs here and the contact information by e-mail to kyc@dcent.nl.
Cheers.
@cryptowhizzard Hello, I have sent the email to the designated address from kcolle69@163.com.
Hi
The SPs you provided are not qualified for FIL+. They do not have their IP addresses published on chain and are not open for retrieval.
Please, please, please... just read the rules. If you have questions or need help, ask on Slack.
RULES: In order to be eligible for the Filecoin+ incentive program, which provides 10x the power (QAP) for storing data on the Filecoin network, there are certain rules that must be followed:
- Data must be stored with multiple independent service providers, not just one organization running multiple SPs.
- Data must be distributed across multiple regions. Using a VPN to fake presence is not allowed.
- Data must be publicly retrievable for verification that it is being stored as claimed. It is highly recommended to run Boost.
- Data must align with the mission of FIL+, meaning it must be valuable for humanity, whether scientific or unique in some other way.
- Data must be open and not encrypted. If you wish to encrypt your data, consider applying for the FIL-E program instead.

Note: As a data preparer, you are allowed to store one copy yourself for redundancy at the request of the client. However, if your copy is stored in the USA, the other organizations/SPs must be outside your region. Dcent encourages applications with a minimum spread of 2 continents and 3 different organizations.
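As a rough illustration, the spread guideline above (at least 2 continents and 3 distinct organizations) can be checked mechanically. The SP metadata below is made up for the example; real org and location data would come from due-diligence KYC, not from the chain alone:

```python
def meets_spread(sps, min_continents=2, min_orgs=3):
    """Check the spread guideline: the SP plan must span at least
    `min_continents` continents and `min_orgs` distinct organizations.
    Each SP is a dict with 'id', 'org', and 'continent' keys."""
    continents = {sp["continent"] for sp in sps}
    orgs = {sp["org"] for sp in sps}
    return len(continents) >= min_continents and len(orgs) >= min_orgs

# Hypothetical SP plan -- organizations and locations are illustrative only.
plan = [
    {"id": "f01955028", "org": "OrgA", "continent": "Asia"},
    {"id": "f01955030", "org": "OrgA", "continent": "Asia"},
    {"id": "f01955034", "org": "OrgA", "continent": "Asia"},
]
print(meets_spread(plan))  # one org, one continent -> False
```

A plan like the one in this thread, with all SPs under one organization in one city, fails both conditions at once.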
@cryptowhizzard Hello, very sorry that the SP information was provided incorrectly. It has been re-sent from the kcolle69@163.com email address. The newly sent SPs support retrieval testing.
Hi,
I have not received any e-mail from kcolle69@163.com. Please send the SPs to kyc@dcent.nl.
@cryptowhizzard Hello, I have sent it again; thank you for checking.
Hello,
Thanks. f02017390 does not qualify and has been involved in non-compliant storage of FIL+ data. The other three (f01955028, f01955030, f01955034) are all in Hangzhou, Zhejiang, CN.
Let me state the FIL+ rules again:
RULES: In order to be eligible for the Filecoin+ incentive program, which provides 10x the power (QAP) for storing data on the Filecoin network, there are certain rules that must be followed:
- Data must be stored with multiple independent service providers, not just one organization running multiple SPs.
- Data must be distributed across multiple regions.
- Data must be publicly retrievable for verification that it is being stored as claimed. It is highly recommended to run Boost.
- Data must align with the mission of FIL+, meaning it must be valuable for humanity, whether scientific or unique in some other way.
- Data must be open and not encrypted. If you wish to encrypt your data, consider applying for the FIL-E program instead.

Note: As a data preparer, you are allowed to store one copy yourself for redundancy at the request of the client. However, if your copy is stored in the USA, the other organizations/SPs must be outside your region. Dcent encourages applications with a minimum spread of 2 continents and 3 different organizations.
@cryptowhizzard I am so sorry. The SP information has been carefully sent to the designated mailbox kyc@dcent.nl as required.
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacecvuw25zsezvh2gt4glzq2xnmd3z7gbbkgibdrp6ljdu237ubqy5o
Address
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
Datacap Allocated
50.00TiB
Signer Address
f1krmypm4uoxxf3g7okrwtrahlmpcph3y7rbqqgfa
Id
ebb9f6e7-d14e-41c5-ab07-4165622d9257
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacecvuw25zsezvh2gt4glzq2xnmd3z7gbbkgibdrp6ljdu237ubqy5o
Thanks, well received.
Looking forward to your first milestone!
@cryptowhizzard Thank you!
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacebprzwmxaixrzdigoohrrujhohjsgqjbxh646lrjn5obqyndtmpck
Address
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
Datacap Allocated
50.00TiB
Signer Address
f1zffqhxwq2rrg7rtot6lmkl6hb2xyrrseawprzsq
Id
ebb9f6e7-d14e-41c5-ab07-4165622d9257
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebprzwmxaixrzdigoohrrujhohjsgqjbxh646lrjn5obqyndtmpck
f02049625
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
100TiB
9105a64f-3a6d-4acd-a3fb-e1d122e31571
f01858410
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
GaryGJG & cryptowhizzard
100% of weekly dc amount requested
100TiB
64GiB
4.99PiB
| Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
|---|---|---|---|---|
| 1 | 1 | 50TiB | 100 | 32GiB |
⚠️ 1 storage provider sealed more than 70% of total datacap - f01955034: 100.00%
⚠️ All storage providers are located in the same region.
⚠️ 100.00% of deals are for data replicated across less than 3 storage providers.
⚠️ CID sharing has been observed. (Top 3)
Glif auto verified
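The 70% warning in the report above follows from per-provider deal sizes. A minimal sketch of that calculation (the threshold matches the report; deal sizes are illustrative, not taken from chain data):

```python
def provider_shares(deals):
    """deals: list of (provider_id, bytes_sealed) tuples. Returns each
    provider's share of the total sealed datacap as a fraction."""
    total = sum(size for _, size in deals)
    by_provider = {}
    for provider, size in deals:
        by_provider[provider] = by_provider.get(provider, 0) + size
    return {p: s / total for p, s in by_provider.items()}

def over_threshold(deals, threshold=0.70):
    """Flag providers that sealed more than `threshold` of the total."""
    return {p: share for p, share in provider_shares(deals).items()
            if share > threshold}

TiB = 1 << 40
deals = [("f01955034", 50 * TiB)]  # one provider sealed everything
print(over_threshold(deals))       # {'f01955034': 1.0}
```

With every deal going to a single SP, that SP's share is 100%, which is exactly the condition the checker flags.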
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the full report.
checker:manualTrigger
⚠️ 1 storage provider sealed more than 70% of total datacap - f02233608: 97.87%
⚠️ 100.00% of deals are for data replicated across less than 3 storage providers.
⚠️ CID sharing has been observed. (Top 3)
Glif auto verified
Click here to view the CID Checker report. Click here to view the Retrieval report.
checker:manualTrigger
⚠️ 1 storage provider sealed more than 70% of total datacap - f02233608: 99.75%
⚠️ 100.00% of deals are for data replicated across less than 3 storage providers.
⚠️ CID sharing has been observed. (Top 3)
Glif auto verified
Click here to view the CID Checker report. Click here to view the Retrieval report.
This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.
Hello, the existing quota has been used up; please allocate the remainder.
checker:manualTrigger
⚠️ 1 storage provider sealed more than 70% of total datacap - f02233608: 100.00%
⚠️ All storage providers are located in the same region.
⚠️ 100.00% of deals are for data replicated across less than 3 storage providers.
⚠️ CID sharing has been observed. (Top 3)
Glif auto verified
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
⚠️ CID sharing has been observed. (Top 3)
Glif auto verified
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.
@Sunnyiscoming Hello, can you help me? The quota is used up, but the bot did not move on to the next round.
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
⚠️ CID sharing has been observed. (Top 3)
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.
@Netonline2016 CID sharing has been observed. What caused it?
@zcfil Hello, our first allocation was completed with technical assistance from a third-party company, and the CIDs may have been mixed up due to mistakes during that operation. We discovered this later and corrected the behavior quickly. In the future, we will strictly abide by the FIL+ rules.
Okay. CID duplication must be controlled. We will pass this round of review and continue to monitor the data status going forward.
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzaceb3hbjzjyl4eaco77bmnp5ynyfu3wvlxpqsgh2yzxabuxreseyzs6
Address
f1zwqkfljcsal74irleiia2yqeq6qcdfngqpfxpyy
Datacap Allocated
100.00TiB
Signer Address
f1cjzbiy5xd4ehera4wmbz63pd5ku4oo7g52cldga
Id
9105a64f-3a6d-4acd-a3fb-e1d122e31571
You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceb3hbjzjyl4eaco77bmnp5ynyfu3wvlxpqsgh2yzxabuxreseyzs6
Hello @Netonline2016
As this upcoming allocation is relatively small, I won't dispute it, but it seems that retrieval is not yet what it should be, especially with SP f02233608.
Please perform regular checks to make sure you are able to retrieve the data needed for proper due diligence.
Cheers!
```
cat 1517-f02002405-f02233608-46446961-baga6ea4seaqicpcdjhsor7yv5fnb4ao4tyny7kmhrqm3h3gj774gwdgfwwsrgfq.log
2023-07-27T03:24:11.010+0200 WARN rpc go-jsonrpc@v0.3.1/client.go:615 unmarshaling failed {"message": "{\"Err\":\"exhausted 5 attempts but failed to open stream, err: protocols not supported: [/fil/retrieval/qry/1.0.0]\",\"Root\":null,\"Piece\":null,\"Size\":0,\"MinPrice\":\"\u003cnil\u003e\",\"UnsealPrice\":\"\u003cnil\u003e\",\"PricePerByte\":\"\u003cnil\u003e\",\"PaymentInterval\":0,\"PaymentIntervalIncrease\":0,\"Miner\":\"f02233608\",\"MinerPeer\":{\"Address\":\"f02233608\",\"ID\":\"12D3KooWDJ6hxpTidDnxnHu9Efz8kYHn7brFgd82bvJYmpowMLNx\",\"PieceCID\":null}}"}
2023-07-27T03:24:11.010+0200 INFO retry retry/retry.go:17 Retrying after error:RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
2023-07-27T03:25:00.819+0200 WARN rpc go-jsonrpc@v0.3.1/client.go:615 unmarshaling failed {"message": "{\"Err\":\"exhausted 5 attempts but failed to open stream, err: protocols not supported: [/fil/retrieval/qry/1.0.0]\",\"Root\":null,\"Piece\":null,\"Size\":0,\"MinPrice\":\"\u003cnil\u003e\",\"UnsealPrice\":\"\u003cnil\u003e\",\"PricePerByte\":\"\u003cnil\u003e\",\"PaymentInterval\":0,\"PaymentIntervalIncrease\":0,\"Miner\":\"f02233608\",\"MinerPeer\":{\"Address\":\"f02233608\",\"ID\":\"12D3KooWDJ6hxpTidDnxnHu9Efz8kYHn7brFgd82bvJYmpowMLNx\",\"PieceCID\":null}}"}
2023-07-27T03:25:00.819+0200 INFO retry retry/retry.go:17 Retrying after error:RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
2023-07-27T03:26:20.772+0200 WARN rpc go-jsonrpc@v0.3.1/client.go:615 unmarshaling failed {"message": "{\"Err\":\"exhausted 5 attempts but failed to open stream, err: protocols not supported: [/fil/retrieval/qry/1.0.0]\",\"Root\":null,\"Piece\":null,\"Size\":0,\"MinPrice\":\"\u003cnil\u003e\",\"UnsealPrice\":\"\u003cnil\u003e\",\"PricePerByte\":\"\u003cnil\u003e\",\"PaymentInterval\":0,\"PaymentIntervalIncrease\":0,\"Miner\":\"f02233608\",\"MinerPeer\":{\"Address\":\"f02233608\",\"ID\":\"12D3KooWDJ6hxpTidDnxnHu9Efz8kYHn7brFgd82bvJYmpowMLNx\",\"PieceCID\":null}}"}
2023-07-27T03:26:20.773+0200 INFO retry retry/retry.go:17 Retrying after error:RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
2023-07-27T03:28:12.434+0200 WARN rpc go-jsonrpc@v0.3.1/client.go:615 unmarshaling failed {"message": "{\"Err\":\"exhausted 5 attempts but failed to open stream, err: protocols not supported: [/fil/retrieval/qry/1.0.0]\",\"Root\":null,\"Piece\":null,\"Size\":0,\"MinPrice\":\"\u003cnil\u003e\",\"UnsealPrice\":\"\u003cnil\u003e\",\"PricePerByte\":\"\u003cnil\u003e\",\"PaymentInterval\":0,\"PaymentIntervalIncrease\":0,\"Miner\":\"f02233608\",\"MinerPeer\":{\"Address\":\"f02233608\",\"ID\":\"12D3KooWDJ6hxpTidDnxnHu9Efz8kYHn7brFgd82bvJYmpowMLNx\",\"PieceCID\":null}}"}
2023-07-27T03:28:12.435+0200 INFO retry retry/retry.go:17 Retrying after error:RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
2023-07-27T03:28:56.199+0200 WARN rpc go-jsonrpc@v0.3.1/client.go:615 unmarshaling failed {"message": "{\"Err\":\"exhausted 5 attempts but failed to open stream, err: protocols not supported: [/fil/retrieval/qry/1.0.0]\",\"Root\":null,\"Piece\":null,\"Size\":0,\"MinPrice\":\"\u003cnil\u003e\",\"UnsealPrice\":\"\u003cnil\u003e\",\"PricePerByte\":\"\u003cnil\u003e\",\"PaymentInterval\":0,\"PaymentIntervalIncrease\":0,\"Miner\":\"f02233608\",\"MinerPeer\":{\"Address\":\"f02233608\",\"ID\":\"12D3KooWDJ6hxpTidDnxnHu9Efz8kYHn7brFgd82bvJYmpowMLNx\",\"PieceCID\":null}}"}
2023-07-27T03:28:56.199+0200 ERROR retry retry/retry.go:29 Failed after 5 attempts, last error: RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
ERROR: RPC client error: unmarshaling result: failed to parse big string: '"\u003cnil\u003e"'
```
Adding a second one for SP f02237295.
```
cat /var/www/html/filplus/1517-f02002405-f02237295-48400251-baga6ea4seaqcef2fmcwlk2gdlljp6c5durc7e7f5e2jocz6qcwi75bf7rdeh6nq.log
Recv 0 B, Paid 0 FIL, Open (New), 9ms [1690024162200371807|0]
Recv 0 B, Paid 0 FIL, DealProposed (WaitForAcceptance), 46ms [1690024162200371807|0]
Recv 0 B, Paid 0 FIL, DealRejected (RetryLegacy), 35h5m34.584s [1690024162200371807|0]
Recv 0 B, Paid 0 FIL, DealProposed (WaitForAcceptanceLegacy), 35h5m34.637s [1690024162200371807|0]
```
If you can come up with a plan for improving data retrieval, we'd like to sign it.
@MRJAVAZHAO Hi, thanks for the reminder. We are continuously improving. The reason for the low retrieval numbers is that our SPs are still sealing data, and the remaining retrievable data is distributed across these SPs. Once all storage is complete, retrieval will return to normal.
name: Large Dataset Notary application
about: Clients should use this application form to request a DataCap allocation via a LDN for a dataset
title: "Medical platform"
labels: 'application, Phase: Diligence'
assignees: ''
Large Dataset Notary Application
To apply for DataCap to onboard your dataset to Filecoin, please fill out the following.
Core Information
Please respond to the questions below by replacing the text saying "Please answer here". Include as much detail as you can in your answer.
Project details
Share a brief history of your project and organization.
What is the primary source of funding for this project?
What other projects/ecosystem stakeholders is this project associated with?
Use-case details
Describe the data being stored on Filecoin
Where was the data in this dataset sourced from?
Can you share a sample of the data? A link to a file, an image, a table, etc., are good ways to do this.
Confirm that this is a public dataset that can be retrieved by anyone on the Network (i.e., no specific permissions or access rights are required to view the data).
What is the expected retrieval frequency for this data?
For how long do you plan to keep this dataset stored on Filecoin?
DataCap allocation plan
In which geographies (countries, regions) do you plan on making storage deals?
How will you be distributing your data to storage providers? Is there an offline data transfer process?
How do you plan on choosing the storage providers with whom you will be making deals? This should include a plan to ensure the data is retrievable in the future both by you and others.
How will you be distributing deals across storage providers?
Do you have the resources/funding to start making deals as soon as you receive DataCap? What support from the community would help you onboard onto Filecoin?