filecoin-project / Allocator-Governance


Community Review of NewHuoPool #139

Open filecoin-watchdog opened 3 weeks ago

filecoin-watchdog commented 3 weeks ago

This allocator had an allocation issue https://github.com/filecoin-project/Allocator-Registry/issues/81 - not sure if it was ever clarified by governance team?

Allocator application: https://github.com/filecoin-project/notary-governance/issues/1040

Allocator compliance review: https://compliance.allocator.tech/report/f03019924/1724112375/report.md

filecoin-watchdog commented 3 weeks ago

One example: https://github.com/NewHuoPool/NewHuoPoolPathway/issues/1

DataCap was given to one client over several allocations.

SPs continuously change, retrievals are low.

The allocator claims they used Lotus to check retrievals. However, there is no detail on data preparation, all questions in the application are left blank, and the client's answers in the comments are vague. I'd challenge the allocator to push for more detail: demonstrate proof of how the dataset is prepared and documented, and show how the client can actually retrieve the open files.
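For context, the kind of retrieval spot-check being asked for can be scripted around the Lotus CLI. This is a minimal sketch, not this allocator's actual process: it assumes a local `lotus` binary with a running daemon, and the miner ID and CID in the usage example below are hypothetical placeholders.

```python
import shutil
import subprocess

def lotus_retrieve(data_cid: str, miner: str, out_path: str, timeout_s: int = 300):
    """Attempt one retrieval via `lotus client retrieve`.

    Returns True on success, False on failure, and None when the
    lotus binary is not installed (so callers can skip the check).
    """
    if shutil.which("lotus") is None:
        return None  # lotus not available on this machine
    try:
        result = subprocess.run(
            ["lotus", "client", "retrieve", "--miner", miner, data_cid, out_path],
            capture_output=True,
            timeout=timeout_s,
        )
    except subprocess.TimeoutExpired:
        return False
    return result.returncode == 0

def success_rate(results):
    """Share of successful retrievals, ignoring skipped (None) checks."""
    attempted = [r for r in results if r is not None]
    return sum(attempted) / len(attempted) if attempted else 0.0
```

Running `lotus_retrieve("bafyHypotheticalCid", "f01234", "/tmp/out.car")` against a sample of open files per SP, then reporting `success_rate`, would be one way for a client to document retrievability in the application thread.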

NewHuoPool commented 3 weeks ago

@filecoin-watchdog Thank you for your review. I would like to explain the points above.

Regarding the allocation anomaly: while processing the application for client yse88483, we ran into an unexpected issue caused by a temporary DMOB API glitch, which led to an over-allocation of DataCap. Thanks to the quick actions of kevzak and galen-mcandrew, the issue has now been fully resolved.

Regarding our allocator review process (https://github.com/NewHuoPool/NewHuoPoolPathway/issues/1, allocated DC: 150T - 450T - 900T): we always stress the importance of keeping SP information up to date, and we remind our clients about this throughout the review process.

The client promptly updated the changes in the SP and explained the uncertainties in their collaboration with the SP. Under these circumstances, I expressed my understanding and continued to support the client's application.

NewHuoPool commented 3 weeks ago

The client's Spark retrieval rate has improved with each allocation, which matches our phased assessment standards.

During the ongoing optimization of retrieval tools, our review principles will prioritize Spark but also accommodate other methods like HTTP, Graphsync, and Lotus commands. However, as an allocator, we acknowledge our technical limitations. Moving forward, we will require clients to provide more detailed demonstrations of successful retrievals to ensure data accessibility.
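To illustrate the phased assessment described above, here is a sketch of tier gating on Spark retrieval rates. The tier sizes (150T, 450T, 900T) come from the earlier comment, but the minimum-rate thresholds and the improve-each-round rule are hypothetical illustrations, not NewHuoPool's published criteria.

```python
# Hypothetical phased-assessment sketch: each tier requires the client's
# Spark retrieval rate to clear a minimum AND to improve on the prior round.
TIERS = [
    ("150T", 0.0),   # first allocation: no retrieval history yet
    ("450T", 0.30),  # hypothetical minimum Spark rate for the second round
    ("900T", 0.60),  # hypothetical minimum Spark rate for the third round
]

def next_allocation(spark_history):
    """Return the next tier size, or None if the client fails the gate.

    spark_history: Spark retrieval rates (0.0-1.0) from completed rounds,
    oldest first. Round N is gated on the most recent rate.
    """
    completed = len(spark_history)
    if completed >= len(TIERS):
        return None  # phased tiers exhausted; manual review needed
    size, min_rate = TIERS[completed]
    if completed == 0:
        return size  # nothing to gate on yet
    latest = spark_history[-1]
    improving = completed < 2 or latest > spark_history[-2]
    return size if latest >= min_rate and improving else None
```

Under this sketch, a client with history `[0.35, 0.65]` would qualify for the 900T round, while `[0.35, 0.25]` would be paused pending intervention.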

NewHuoPool commented 3 weeks ago

We’re committed to following the review principles of Fil+ and of our allocator pathway, and we also agree with Kevin’s proposal to make some improvements to our review process. This includes:

- Asking clients to provide detailed info about their datasets and how they’re being stored.
- Thoroughly checking the data transformation and packaging process to make sure it meets Fil+ storage requirements.
- Verifying storage transactions to ensure the datasets can be independently checked and retrieved.
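One way to make the deal-verification point concrete is to check how a client's deals are spread across SPs, since over-concentration on a few (possibly unretrievable) SPs is one of the flags raised in this thread. A minimal sketch; the 25% concentration threshold is an assumption for illustration, not an official Fil+ limit:

```python
from collections import Counter

def sp_distribution(deals):
    """Fraction of total deal bytes held by each SP, e.g. {'f01234': 0.5}.

    deals: iterable of (sp_id, size_bytes) pairs from on-chain deal data.
    """
    totals = Counter()
    for sp_id, size_bytes in deals:
        totals[sp_id] += size_bytes
    grand_total = sum(totals.values())
    return {sp: size / grand_total for sp, size in totals.items()}

def flag_concentration(deals, max_share=0.25):
    """Return SPs holding more than max_share of the client's data.

    0.25 is a hypothetical review threshold for this sketch only.
    """
    return sorted(sp for sp, share in sp_distribution(deals).items()
                  if share > max_share)
```

For example, a client whose deals put 75% of bytes on a single SP would have that SP flagged for follow-up before the next allocation.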

galen-mcandrew commented 6 days ago

Given the above information and additional investigation, this allocator is working to perform diligence and to intervene with their clients to enforce compliance. We want to call attention to some specific issues that should be addressed with existing and new clients going forward:

As a reminder, the allocator team is responsible for verifying, supporting, and intervening with their clients. If a client is NOT providing accurate deal-making info (such as incomplete or inaccurate SP details) or making deals with noncompliant unretrievable SPs, then the allocator needs to intervene and require client updates before more DataCap should be awarded.

Despite the flags above, we are seeing evidence of allocator diligence and interventions. We are requesting an additional 5PiB of DataCap from RKH, to allow this allocator to show increased diligence and alignment.

Please verify that you will instruct, support, and require your clients to work with retrievable storage providers. @NewHuoPool can you verify that you will enforce retrieval requirements, such as through Spark?

NewHuoPool commented 3 days ago

@galen-mcandrew Thank you for your guidance. We will strictly follow the official guidelines, ensuring thorough reviews of client information, SP distribution, and data retrieval success rates. We are committed to enforcing retrieval requirements and will instruct and require our clients to work with compliant SPs.