filecoin-project / Allocator-Governance


[Allocator Application] <Wuhan Jinyu Information Technology Center> <IPFSYUN> PR #156 #158

Open martapiekarska opened 2 months ago

martapiekarska commented 2 months ago

Allocator Application

Application Number

recV6ImitX6ZB2UrQ

Organization Name

Wuhan Jinyu Information Technology Center

Organization On-chain Identity

f22yzezugt24abipxkm3ncqss2s4dd6tryiyvxx7q

Allocator Pathway Name

IPFSYUN

Github PR Number

156

Region of Operation

Asia minus GCR, Europe, Greater China Region, North America, South America

GitHub ID

martapiekarska

On-chain address

I will provide an address at a later date

Type of Allocator

RFA

Filecoin Community Agreement

As a member in the Filecoin Community, I acknowledge that I must adhere to the Community Code of Conduct, as well as other End User License Agreements for accessing various tools and services, such as GitHub and Slack. Additionally, I will adhere to all local & regional laws & regulations that may relate to my role as a business partner, organization, notary, allocator, or other operating entity.

Acknowledge

Type of Allocator and RFA List

RFA: Market-based - Client/SP Fees

Allocator Description

I. Clarifying Distribution Goals and Principles

First, as RFA distributors, we will clearly define our distribution goals and basic principles. This includes, but is not limited to, ensuring fair distribution of DC, optimizing distribution efficiency, supporting the sustainable development of key businesses or projects, and complying with relevant laws, regulations, and industry standards. By establishing these goals and principles, we can provide clear guidance for subsequent distribution efforts.

II. Developing Detailed Distribution Strategies

1. Customer Demand Analysis: Conduct a thorough analysis of the needs of customers and service providers (SPs) requiring distribution, understanding the specific demand and priorities for DC from different users.

2. Customer Usage Evaluation: Perform a comprehensive assessment of market conditions, actively monitor real-time data from bots, and evaluate customers' actual usage to provide foundational data support for rational distribution.

3. Strategy Formulation: Based on the results of demand analysis and resource evaluation, develop detailed distribution strategies. These strategies will consider multiple aspects such as fairness, efficiency, and sustainability of resource allocation, and may include specific measures such as prioritization, quota systems, and dynamic adjustment mechanisms.

III. Implementing a Dynamic Adjustment Mechanism

To ensure that distribution can be flexibly adjusted according to actual conditions, we will establish a dedicated FIL+ department to create a dynamic adjustment mechanism. This includes regularly assessing resource allocation and making necessary adjustments to the distribution strategies based on evaluation results.

IV. Strengthening Communication and Collaboration

As RFA distributors, we will actively communicate and collaborate with all parties involved. This includes participating in notary meetings, establishing effective communication channels and cooperation mechanisms with customers and SPs, and sharing information and progress on resource distribution through regular meetings and reports.

V. Ensuring Compliance and Sustainability

During the implementation of RFA distribution, we will strictly adhere to FIL+ requirements, ensuring compliance, fairness, and decentralization in distribution. We will actively optimize resource allocation based on data to help the Filecoin network better store valuable data.

Contributions to Ecosystem

Onboard >10PiBs of Data, Produce educational materials, Host or sponsor community events such as meetups, hackathons, conferences, FilDev summits etc

Monetization and Fee structure

SP fees, Client fees, Block rewards, pools.

Target Clients

Web3 developers, Nonprofit organizations, Commercial/Enterprise

Client Diligence Check

3rd party Know your business (KYB) service, Client promise/attestation, Manual verification, Proof of provenance, NDA with additional verification

Description of client diligence

How to Mitigate Sybil Attacks:

We will introduce KYC and other identity verification mechanisms for SPs to raise the entry threshold. Additionally, SPs will be required to provide a certain amount of FIL as collateral to reduce the incentive for malicious SP participation. We will also require a minimum of four SPs to participate, with no single SP receiving more than 25% of the allocation, thereby reducing the influence of any one SP. We will encourage community members to actively participate in node supervision and reporting; for nodes that repeatedly engage in malicious behavior, we will permanently refuse to collaborate with them.
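Purely as an illustration of how the distribution rules above (a minimum of four SPs, no single SP receiving more than 25% of an allocation) could be checked programmatically before signing, here is a minimal Go sketch; the type names, function names, and miner IDs are hypothetical and not part of any existing Fil+ tooling.

```go
package main

import (
	"errors"
	"fmt"
)

// AllocationPlan maps an SP's miner ID (e.g. "f01234") to the DataCap
// (in bytes) a client proposes to store with it. Names are illustrative only.
type AllocationPlan map[string]uint64

// CheckDistribution enforces the two anti-Sybil rules described above:
// at least minSPs distinct storage providers, and no single SP taking
// more than maxShare of the total allocation.
func CheckDistribution(plan AllocationPlan, minSPs int, maxShare float64) error {
	if len(plan) < minSPs {
		return fmt.Errorf("only %d SPs in plan, need at least %d", len(plan), minSPs)
	}
	var total uint64
	for _, amt := range plan {
		total += amt
	}
	if total == 0 {
		return errors.New("plan allocates no DataCap")
	}
	for sp, amt := range plan {
		if share := float64(amt) / float64(total); share > maxShare {
			return fmt.Errorf("SP %s would receive %.0f%% of the allocation (limit %.0f%%)",
				sp, share*100, maxShare*100)
		}
	}
	return nil
}

func main() {
	plan := AllocationPlan{"f01111": 100, "f02222": 100, "f03333": 100, "f04444": 100}
	fmt.Println(CheckDistribution(plan, 4, 0.25)) // <nil>: 4 SPs at 25% each passes
}
```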

We will conduct client investigations using a combination of manual and automated methods.

1. We have applied for LDN on five Allocators, and each application requires filling out a GitHub application form. If we are elected as Allocators, we will continue to follow this method, ensuring that all applications must be submitted via GitHub. The link to the GitHub application will be permanently available for anyone to view, serving as our primary means of verifying clients and establishing initial trust.

2. We will review the materials submitted by applicants, including verifying their information on social media and platforms like Qichacha to ensure its authenticity. To further understand the applicant, we will inquire about their country and region. If they are nearby, we will conduct an on-site visit; if they are far away, we will require a video conference, which is our main way of familiarizing ourselves with clients.

3. We will primarily assess the relationship between the applicant and the SP, as well as the SP's technical background, experience, capabilities, and their understanding of compliance, data processing, and Spark retrieval.

4. In the event of a dispute, whether with a new or existing client, we will immediately halt DC allocation and form an investigation team consisting of five members. If necessary, we may involve the governance team in the investigation or discuss it at a notary committee meeting.

How to Verify the Authenticity of Customer Data Ownership Claims:

1. We will focus on the applicant's technical solution to ensure data compliance and authenticity. For example, we will verify how the data is converted into Filecoin storage deals.

2. We will conduct manual spot checks on sectors, examining data consistency and comparing how the sampled data becomes part of the dataset.

3. We will require clients to explain which sector corresponds to which piece of data.

How to Provide Evidence and Documentation to the Governance Team for Auditing Your Customer Due Diligence:

1. We will synchronize emails sent by corporate clients to the governance team’s mailbox.

2. Communications with clients, including emails and phone calls, will also be documented and synchronized on GitHub for the governance team to audit.

3. Before each round of DC allocation, we will participate in meetings to publicly discuss our allocation status.

Type of data

Public, open, and retrievable

Description of Data Diligence

How to Execute Data Due Diligence to Verify Clients:

First, prior to collaboration, we will verify client identities through domain email, KYC, and KYB processes. After establishing cooperation, we will fully utilize the check-bot's functionality to review check data before each round of signing. We will mainly observe whether the SP's disclosures align with actual cooperation, whether there is CID sharing, the proportion of DC obtained by the SP, and the level of support for Spark. If any doubts arise, we will suspend signing until the client provides a reasonable explanation. If the client fails to resolve the issues, we will cease support for them. Second, we have a dedicated legal compliance team responsible for monitoring and ensuring that our data processing activities comply with local and regional legal requirements. Additionally, we have established clear data protection policies and implemented appropriate technical and organizational measures to ensure the confidentiality, integrity, and availability of data. We also conduct regular risk assessments and compliance reviews to ensure our data processing activities meet local and regional legal standards.

Types of Data Sampling and Tools Used to Confirm Data Consistency:

We employ random sampling methods to regularly check and verify the data provided by clients. For instance, before each round of signing, we will download 10 sectors, decompress the data, and compare it with the source data provided by the client. We also use professional data audit tools, such as integrity checks, to confirm that the data is consistent with the initial declarations. We will establish a client management system to record the results of each sampling and verification, ensuring full-chain management of clients and SPs to eliminate collaboration with dishonest clients.

Presenting Evidence to the Governance Team:

Clients submit their applications on GitHub, and most records are synchronized there, making them readily accessible. If clients send us domain emails and other corporate information, such as business licenses and shareholder information, via email, we will forward these emails to the governance team's inbox. Additionally, our team will conduct manual investigations of clients. We will document the data due diligence process and results, creating detailed audit reports. The reports will specify the steps taken, tools used, issues discovered, and ratings given to clients. We will communicate the audit results to the governance team through emails, participation in notary meetings, communication on Slack, and establishing GitHub proposals.
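As a rough sketch of the sampling step described above (downloading a handful of sectors, unpacking them, and comparing the contents with the client's declared source data), the snippet below hashes each sampled file and its declared counterpart and flags any mismatch. The helper names and file paths are illustrative; in practice the CAR unpacking would be done with go-car or Singularity tooling before a comparison like this runs.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"os"
)

// fileDigest returns the SHA-256 digest of a file on disk.
func fileDigest(path string) ([32]byte, error) {
	f, err := os.Open(path)
	if err != nil {
		return [32]byte{}, err
	}
	defer f.Close()
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return [32]byte{}, err
	}
	var out [32]byte
	copy(out[:], h.Sum(nil))
	return out, nil
}

// compareSample checks each retrieved (already unpacked) sample file against
// the corresponding file from the client's declared source dataset.
func compareSample(retrieved, source map[string]string) error {
	for name, rPath := range retrieved {
		sPath, ok := source[name]
		if !ok {
			return fmt.Errorf("sample %s has no counterpart in the source dataset", name)
		}
		rd, err := fileDigest(rPath)
		if err != nil {
			return err
		}
		sd, err := fileDigest(sPath)
		if err != nil {
			return err
		}
		if rd != sd {
			return fmt.Errorf("sample %s does not match the declared source data", name)
		}
	}
	return nil
}

func main() {
	// Example usage with hypothetical local paths.
	retrieved := map[string]string{"doc-001.bin": "./samples/doc-001.bin"}
	source := map[string]string{"doc-001.bin": "./source/doc-001.bin"}
	fmt.Println(compareSample(retrieved, source))
}
```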

Data Preparation

Singularity, Go-CAR, Other existing ecosystem tooling

Replicas required, verified by CID checker

5+

Distribution required

Equal distribution of deals across regions

Number of Storage Providers required

5+

Retrieval Requirements

Public data highly retrievable over Spark.

Allocation Tranche Schedule Type

Single size allocation.

256

Will you use FIDL tooling, such as allocator.tech and other bots?

Yes, all available tools

GitHub Bookkeeping Repo Link

https://github.com/nike-mp/IPFSYUN-Allocator

Success metrics

Number of clients, Retrievability of data, Amount of data onboarded, daily & aggregate, Number of returning client customers, Number of paid deals, Ecosystem marketing/comms

Timeline to begin allocating to clients

1 month from RKH approval

Funnel: Expected DataCap usage over 12 months

75-100PiB

Risk mitigation strategies

In applying to serve as a Filecoin notary/allocator, protecting the organization and its reputation and preventing abuse are key to ensuring fairness, security, and sustainability. Below are the detailed measures we will take:

Operational Security (OpSec) Standards

Rigorous Due Diligence: Before accepting new clients, we will implement comprehensive due diligence procedures, including but not limited to business legitimacy verification, financial status assessment, and compliance checks for data processing. This helps ensure that new clients have a reliable business background and data processing capabilities.

KYB Process: Conduct in-depth KYB audits on service providers (SPs) to ensure they have the capability and reputation to provide high-quality services. This will help us filter out reliable partners and offer better service guarantees to clients.

Risk-Based Assessment Model: Employ a refined risk assessment model, where new clients receive initial DataCap allocations that match their compliance records and company profiles, with adjustments based on subsequent performance. This helps ensure rational resource allocation while incentivizing clients to comply with rules.
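As a purely illustrative sketch of such a risk-based starting point, the initial DataCap could be derived from the diligence outcome as below; the tiers and sizes are placeholders we would tune over time, not program-defined amounts.

```go
package main

import "fmt"

// RiskTier is an illustrative classification produced by our due-diligence
// review (KYB result, compliance record, company profile).
type RiskTier int

const (
	LowRisk RiskTier = iota
	MediumRisk
	HighRisk
)

// initialDataCap returns a starting allowance for a new client based on its
// risk tier. The sizes here are placeholders, not program-defined amounts.
func initialDataCap(tier RiskTier) uint64 {
	const tib = uint64(1) << 40
	switch tier {
	case LowRisk:
		return 256 * tib
	case MediumRisk:
		return 128 * tib
	default: // HighRisk or unknown: start small and adjust after review
		return 64 * tib
	}
}

func main() {
	fmt.Printf("medium-risk client starts with %d TiB\n", initialDataCap(MediumRisk)/(1<<40))
}
```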

Security Technologies and Tools: Utilize advanced encryption technologies, firewalls, intrusion detection systems, and other means to ensure the physical and network security of systems and data. Additionally, tools like datacapstats.io and CID check bots will be used to monitor DataCap allocation and usage in real time.

User Agreements and Compliance Measures

Clear User Agreements: Develop detailed user agreements that explicitly define the rights and obligations between users and notaries, including data ownership, usage rules, and violation handling. The agreement will fully protect user rights while clearly stating the consequences of violations.

Compliance Training and Promotion: Regularly conduct compliance training for employees and users to enhance their understanding of Filecoin network rules, data protection, and security awareness. Simultaneously, promote compliance policies through multiple channels to foster a positive compliance atmosphere.

Data Privacy Protection: Strictly adhere to data protection regulations to ensure that the collection, storage, processing, and transmission of user data comply with relevant legal requirements. Use encryption technologies to protect user data and prevent data leakage and abuse.

Alert and Restriction Mechanisms

Real-Time Monitoring and Anomaly Detection: Establish a real-time monitoring system to continuously track DataCap allocation, usage, and user behavior. Utilize intelligent algorithms to analyze data, promptly detecting abnormal behavior and triggering alerts.

Resource Limitations and Isolation: Set resource usage limits for new or high-risk users, such as DataCap quotas and storage speeds. Upon discovering violations, immediate isolation measures will be taken to prevent risk spread.

Audits and Inspections: Conduct regular remote or on-site audits to verify the consistency between stored data and DataCap allocation. Record and analyze audit results to provide a basis for subsequent compliance evaluations.

Preliminary Verification and Trust Assessment

Multi-Dimensional Verification: Combine due diligence, KYB audits, risk assessments, and user agreement signing as multi-dimensional means to comprehensively verify new clients. Ensure clients have legitimate identities, good reputations, and compliant operational capabilities.

Community Feedback and Transparency: Actively listen to community opinions and feedback, regularly reporting to the community on DataCap allocation, compliance check results, and improvement measures. By enhancing transparency, we aim to build trust with the community.

Continuous Supervision and Assessment: Establish a long-term supervision and assessment mechanism to continuously track and evaluate client behavior. Adjust DataCap quotas and resource limitations based on assessment results to ensure fairness and security.

In summary, we will effectively protect the organization, reputation, and the Filecoin ecosystem from abuse through stringent operational security standards, clear user agreements, efficient alert and restriction mechanisms, and comprehensive preliminary verification and trust assessment processes.

Dispute Resolutions

In the Filecoin network, disputes that may arise during the DataCap allocation process are inevitable. However, we can effectively resolve these disputes through a series of efficient, fair, and transparent measures, ensuring that the rights of all participants are protected while maintaining the stability and reliability of the network.

  1. Response Time

     We commit to initiating the dispute resolution process within 24 hours of receiving any dispute application or relevant information. By responding quickly, we can intervene in a timely manner, minimizing the impact of the dispute on all parties and the overall network.

  2. Transparency Assurance

     GitHub Platform Handling: All dispute resolution processes will be conducted on GitHub, leveraging the platform's openness and traceability to ensure transparency.

     Proposals and Comments: For each dispute, we will create a detailed proposal outlining the background and issues involved, and allow the parties in dispute to comment and submit evidence below the proposal. This way, all participants can clearly understand the full picture of the dispute and the perspectives of each party.

     Public Communication: In addition to written communication on GitHub, we will maintain open and direct communication with the parties involved through video conferencing (e.g., Zoom meetings) and instant messaging tools (e.g., Slack) to ensure real-time information transfer and in-depth discussions.

  3. Processing Workflow

     Proposal Construction: First, we will create a comprehensive proposal that clearly outlines the background of the dispute, the issues involved, and preliminary suggestions for resolution.

     Community Comment Period: The proposal will be made public on GitHub, with a 15-day comment period. During this time, parties in dispute and community members can freely express their views, submit evidence, or propose improvements.

     Zoom Meeting Discussion: To gain a deeper understanding of each party's stance and perspective, we will organize a Zoom meeting for face-to-face discussions. During the meeting, parties can exchange opinions and provide more detailed explanations and evidence.

     Further Discussion on Slack: After the meeting, if necessary, we will continue discussions on Slack to gain a more comprehensive understanding of the dispute and explore possible solutions.

     Involving External Participants: If a dispute is complex and difficult to resolve internally, we will invite the Fil+ governance team or other notaries to participate in the discussion, utilizing their expertise and experience to assist in resolving the dispute.

     Results Announcement and Accountability: The final decision will be published on GitHub, including penalties for the offending party (such as revocation of DataCap, designation as an unreliable storage provider, inclusion on a blacklist, etc.) and rewards for the honest party (such as increased DataCap allocation and more opportunities to participate in network development). We will also ensure the effective execution of accountability mechanisms to maintain the fairness and reliability of the network.

Compliance Audit Check

  1. Manual Checks and Assessments: We will regularly conduct rigorous audits and evaluations of our clients and service providers' (SPs) data practices to ensure complete compliance with legal requirements as well as specific requirements of the Filecoin program and pathways. For example, we will use https://datacapstats.io/ and filplus-backend to assess the usage, distribution, and proportions of service providers (SPs) to evaluate their credibility. At the same time, we will strengthen the management of LDN and continue to provide an application portal on GitHub, utilizing filplus-backend for management.

  2. Supplementary Checks via Bots: In addition to internal audits, we will utilize a check bot for supplementary compliance checks. The bot can monitor our operations and data practices in real-time, ensuring that no potential compliance risks are overlooked.

  3. Focus on Spark's Retrieval Rate: We will place emphasis on the retrieval rates of Spark, seeking to collaborate with SPs that support Spark while reducing partnerships with those that do not.

  4. Submission of Audit Reports After Each Round of DataCap Usage: After each round of DC usage, we will submit an audit report via https://github.com/filecoin-project/Allocator-Governance/issues. When our DC usage reaches 80%, we will provide a proposal for DC usage for the governance team to review and actively participate in notary meetings.

  5. Providing KYB Audit Content, Client Data Reports, and Compliance Reports to the Governance Team: To specifically demonstrate our compliance, we will prepare KYB audit content, client data reports, and compliance reports. These materials will be sent to the governance team’s email and will detail our operational processes, data storage practices, and client interactions, providing intuitive and comprehensive support for the audit process.

Compliance Report content presented for audit

Success metric: onchain report of data onboarded, Success metric: onchain data report, Client Diligence: Client statements, client provided verification, Client Diligence: KYC/KYB report on clients, Data Compliance: Proof of provenance report, Data Compliance: Data Samples, Data Compliance: Manual report, Compliance: CID report.

Connections to Filecoin Ecosystem

Storage provider, Big data contributor, Event sponsor

Slack ID

Ke di

Kevin-FF-USA commented 2 months ago

Hello!

Thanks for applying to serve as an Allocator in the Filecoin Plus Program. In this application, the TYPE OF ALLOCATOR is currently set to REQUEST FOR ALLOCATOR (RFA), and under the type of Allocator you have selected Market-based - Client/SP Fees, but the descriptions of the services read like a Manual review. Can you please describe what type of Allocator pathway you wish to operate as?

For more details on the types of Allocators please check out this blog. https://blog.allocator.tech/2024/04/allocator-tech-blog.html

Kevin-FF-USA commented 2 months ago

For MANUAL pathways, I would like to make you aware that these types of applications are being processed as the network need becomes available. Currently there are ~50 pathways available for Manual reviews, so priority for onboarding new Allocators is given to pathways doing something novel to support the network.


Kevin-FF-USA commented 1 month ago

Hi Wuhan Jinyu Information Technology Center,

In this application the listed GitHub handle is Marta - the Executive Director of FIDL. As a result, I can't tag you, and I'm not sure if you are seeing any of the comms. A friendly check-in: are you able to make it to any of the calls?

Thanks for your patience while we collected and reviewed new applicants to the Fil+ program. Wanted to share some feedback about this application: in reading this description, what is being described is a Manual pathway, with no real mention of how this market would operate. Also, many of the detailed fields are either empty or lack the detail needed to score high enough to onboard as a new Allocator. This information will play a key part in the scoring mechanisms for this pathway.

Recommending a few options.

  1. Check out some of the other applications and the details of their programs. Specifics are key - how will this pathway be measured for success and diligence? Right now, there is not much detail. Here is an example of a recently onboarded Application.

  2. If you have clients, you can demonstrate this by bringing them to the network with an existing Allocator. Bring one of your clients into the ecosystem with an existing Allocator pathway. Establish that you have real clients and can maintain the diligence standards of this application. Demonstrate to the community the ability and value of onboarding this new Manual pathway.

Steps

  1. Work with any existing Allocator to create an application on behalf of your client. FIDL runs an enterprise Allocator if you are looking for a pathway with existing support in place to help with questions.

  2. Once the data is onboarded, reply back to this application with the following

    1. Client ID
    2. Links to the DataCap Application
    3. Spark retrieval %
    4. Verification that the data reached the targeted number of SPs
    5. What the data type was

Onboarding

Once the ability to onboard clients through the application process has been verified, this application will receive a KYC check and begin onboarding as an Allocator to onboard clients directly.

For questions or support

nike-mp commented 1 month ago

@Kevin-FF-USA Hello, we are discussing the automation platform. We plan to participate in the meeting and speak in the future. We have applied for about 2.5P of DC across 5 Allocators. Next, we will screen SPs that support Spark and apply for DC. Our team is reviewing the recordings of the calls. Thank you.

Kevin-FF-USA commented 1 month ago

Hi @nike-mp Wonderful - looking forward to it!

If you'd like to discuss on the upcoming 29Oct call just let me know here in this sign up issue: https://github.com/filecoin-project/Allocator-Governance/issues/197

nike-mp commented 4 weeks ago

Currently, applications for DataCap are submitted via issues on the GitHub pages of various allocators, which is not convenient for the widespread use of Filecoin: there are language barriers and differences in operational habits. We should provide a more convenient way for users to apply for DataCap. Therefore, we plan to create a dedicated website to accept DataCap applications. Users can register as website users through WeChat, GitHub, Twitter, and other platforms, and submit application information and SP data via a form. We will review applications and allocate quotas through an administrator account. For compliance, we will integrate with the Spark interface to monitor the usage of DC. A snapshot of the DC usage will be recorded before each signature for regulatory review.
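To make the pre-signature snapshot idea concrete, here is a minimal, hypothetical sketch of the kind of record the planned site could persist before each signing round; the field names, file layout, and the source of the Spark retrieval figure are assumptions rather than an existing API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"time"
)

// DCSnapshot is an illustrative record of a client's DataCap usage, captured
// before each signature so reviewers can audit the decision later. Field
// names are our own; the Spark figure would come from whatever retrieval-
// checking service the pathway integrates with.
type DCSnapshot struct {
	ClientID       string            `json:"client_id"`
	ApplicationURL string            `json:"application_url"`
	GrantedSoFar   uint64            `json:"granted_so_far_bytes"`
	UsedSoFar      uint64            `json:"used_so_far_bytes"`
	SparkRetrieval float64           `json:"spark_retrieval_rate"` // 0.0 - 1.0
	SPAllocations  map[string]uint64 `json:"sp_allocations"`
	TakenAt        time.Time         `json:"taken_at"`
}

// persistSnapshot writes the snapshot to a JSON file named after the client
// and the capture time, so every signing round leaves an auditable record.
func persistSnapshot(s DCSnapshot) error {
	s.TakenAt = time.Now().UTC()
	data, err := json.MarshalIndent(s, "", "  ")
	if err != nil {
		return err
	}
	name := fmt.Sprintf("snapshot-%s-%s.json", s.ClientID, s.TakenAt.Format("20060102T150405Z"))
	return os.WriteFile(name, data, 0o644)
}

func main() {
	_ = persistSnapshot(DCSnapshot{
		ClientID:       "f1exampleclient",
		ApplicationURL: "https://github.com/example/issues/1",
		GrantedSoFar:   512 << 30,
		UsedSoFar:      400 << 30,
		SparkRetrieval: 0.72,
		SPAllocations:  map[string]uint64{"f01111": 200 << 30, "f02222": 200 << 30},
	})
}
```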

Development Plan:

  1. By mid-January, we will complete the development of the basic testing version of the website, including user registration and login, LDN application submission, SP update management, and other functionalities.
  2. By mid-March, the beta version will be launched, allowing users to apply and integrating Spark to track the usage of DC issued to LDNs. Administrators will decide whether to grant quotas through self-review.
  3. After April, we plan to gradually enable the automatic quota issuance feature. We will implement an automatic check mechanism in the system backend that triggers the next round of quota signing once all conditions for the corresponding LDN meet the threshold for the next batch of quota issuance (a rough sketch of such a check follows this list).

    Of course, we may need to start manually, progressing from manual to hybrid and then to fully automated, which will take some time. We will try our best.
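Below is a rough, hypothetical sketch of the kind of threshold check the backend could run before auto-issuing the next tranche; the specific fields and cutoff values (75% usage, 50% Spark retrieval, four SPs) are illustrative placeholders, not agreed program rules.

```go
package main

import "fmt"

// LDNStatus captures the signals the planned backend would check before
// auto-issuing the next tranche; every field here is illustrative.
type LDNStatus struct {
	Granted        uint64  // DataCap granted so far, in bytes
	Used           uint64  // DataCap consumed so far, in bytes
	SparkRetrieval float64 // retrieval success rate reported for the client's SPs
	DistinctSPs    int     // distinct SPs that have sealed the client's data
	OpenDisputes   int     // unresolved disputes against the client
}

// readyForNextTranche applies the kind of threshold described above: most of
// the previous tranche is used, retrieval and replication look healthy, and
// nothing is in dispute. Only then would the backend queue the next
// allocation for signing.
func readyForNextTranche(s LDNStatus) bool {
	if s.Granted == 0 {
		return false
	}
	usedRatio := float64(s.Used) / float64(s.Granted)
	return usedRatio >= 0.75 &&
		s.SparkRetrieval >= 0.50 &&
		s.DistinctSPs >= 4 &&
		s.OpenDisputes == 0
}

func main() {
	status := LDNStatus{Granted: 1 << 40, Used: 900 << 30, SparkRetrieval: 0.8, DistinctSPs: 5}
	fmt.Println(readyForNextTranche(status)) // true under these sample numbers
}
```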

nike-mp commented 4 weeks ago

Since the bot data has not been updated, there are some duplicate data problems. We want to wait until the data is updated before presenting on a call, which is expected to be in November. Thank you.

Kevin-FF-USA commented 4 weeks ago

Sounds good @nike-mp, looking forward to seeing the development.

While you work on the page, one thing to note is the DC issued. For Filecoin Plus new Allocator applications, the processing and onboarding take place AFTER an MVP for the Market is stood up. Should you need access to DC in testing, recommend you reach out to the FIDL team at https://www.fil.org/ecosystem-explorer/filecoin-incentive-designlabs.

Once you have a working website, we can then review it for this application.

If you would like to speak or get dedicated support for questions, the next community governance calls are taking place on 12 November, 03 December, and 17 December.

Thanks and Best!

nike-mp commented 3 weeks ago

https://github.com/filplus-bookkeeping/IPFSCN/issues/15


https://github.com/NDLABS-Leo/Allocator-Pathway-ND-CLOUD/issues/39


These are the SPs we are cooperating with. These SPs support Spark.