martapiekarska opened 3 months ago
Linking to discussion from https://github.com/filecoin-project/Allocator-Governance/issues/144 2nd Notary Call.
Hi @chainupk,
Thanks for applying. Wanted to check in to see if you are still interested in operating as an Allocator. One of the scoring mechanisms for pathways is their ability to onboard quality data to the network. Given your existing clients, here is a proposal to help establish your ability within the ecosystem to serve as an Allocator performing MANUAL diligence.
Proposal
Bring one of your clients into the ecosystem through an existing Allocator pathway. Establish that you have real clients and can maintain the diligence standards of this application, demonstrating to the community the ability and value of onboarding this new Manual pathway.
Steps
Work with any existing Allocator to create an application on behalf of your client. FIDL runs an enterprise Allocator if you are looking for a pathway with existing support in place to help with questions.
Once the data is onboarded, reply back to this application with the following:
- Client ID
- Links to the DataCap application
- Spark retrieval percentage
- Verification that the data reached the targeted number of SPs
- Description of the data type
Onboarding
Once the ability to onboard clients through the application process has been verified, this application will receive a KYC check, and you can begin onboarding clients directly as an Allocator.
For questions or support
Dear @Kevin-FF-USA,
Thank you for reaching out and for your consideration of my application. Indeed we are still interested in operating as a Manual Allocator within the Filecoin ecosystem.
I appreciate your proposal to help establish my ability to onboard quality data and perform manual diligence. I will proceed as suggested:
Collaboration with an Existing Allocator: I will work with an existing Allocator to create a DataCap application on behalf of one of my clients.
Data Onboarding: Once the data is onboarded to the network, I will ensure that all diligence standards are met and that the data reaches the targeted number of Storage Providers.
Providing Required Information: After successful onboarding, I will reply back with the following details:
- Client ID
- Links to the DataCap application
- Spark retrieval percentage
- Verification that the data reached the targeted number of SPs
- Description of the data type

Please let me know if there are any specific Allocators you recommend I work with or any additional steps I should be aware of.
Thank you for your guidance and support.
Best regards K
Hi @chainupk, Excellent, that's exactly it. Show that you can bring a client to another Allocator, then maintain full accountability on that deal. Post metrics in this thread when completed.
Allocator Application
Application Number
rechK2uembjbdcqZ5
Organization Name
Origin Storage
Organization On-chain Identity
f1q6bpjlqia6iemqbrdaxr2uehrhpvoju3qh4lpga
Allocator Pathway Name
future-storage
Github PR Number
85
Region of Operation
Africa, Asia minus GCR, Europe, Greater China Region, North America, Oceania, South America
GitHub ID
chainupk
On-chain address
I have a multisig I want to provide now
Type of Allocator
Similar to existing allocator pathways
Filecoin Community Agreement
As a member in the Filecoin Community, I acknowledge that I must adhere to the Community Code of Conduct, as well as other End User License Agreements for accessing various tools and services, such as GitHub and Slack. Additionally, I will adhere to all local & regional laws & regulations that may relate to my role as a business partner, organization, notary, allocator, or other operating entity.
Acknowledge
Type of Allocator and RFA List
Manual - Existing similar diligence pathway
Allocator Description
We have existing clients: a mix of research and enterprise.
Contributions to Ecosystem
Develop Open-Source Tools, Onboard >10PiBs of Data, Build better data onboarding pathway, Produce educational materials, Data Stewardship: Curate and provide high-quality datasets to be stored on the Filecoin network, enhancing the overall value and utility of the network.
Monetization and Fee structure
Client fees, SP fees, Block rewards, pools, Client staking.
Target Clients
Web3 developers, Nonprofit organizations, Commercial/Enterprise, Individuals, Open/Public
Client Diligence Check
3rd party Know your business (KYB) service, 3rd party Know your customer (KYC) service, Automated deterministic, Client promise/attestation, Manual verification, NDA with additional verification
Description of client diligence
We are going for the manual approach. During the consultancy phase, KYC procedures are performed. Below is the approach to our due diligence coverage.

Initial Contact Assessment: We conduct an initial review of the client's inquiry, gathering basic information about their business, the nature of their data storage needs, and their overall objectives. This helps in preliminary filtering and understanding the client's background.

Requirement Analysis: We delve deeper into the specific data storage requirements of the client, including volume, nature of data (such as whether it is sensitive or regulated), and specific service needs. This helps in assessing the complexity and risk associated with the data storage.

Business Verification and Background Check: We conduct a thorough background check on the client's business. This includes verifying business registration, understanding the ownership structure, and checking for any past legal issues or controversies. We also review their financial stability through available financial statements or credit reports.

Compliance and Legal Check: We ensure that the client is in compliance with relevant laws and regulations, especially those related to data handling, privacy, and security. This is crucial for clients dealing with sensitive or regulated data.

Risk Assessment: We evaluate potential risks associated with engaging the client, including data security, legal and compliance, and reputational risks. Based on this assessment, we decide on the necessary risk mitigation strategies.

Reference Check: If applicable, we contact previous or current partners or service providers of the client for references. This helps in understanding the client's business conduct and reputation in the industry.

Meeting and Discussion: We hold direct meetings (virtual or in-person) with key representatives of the client for detailed discussions. This helps in assessing the seriousness and genuineness of their requirements and in building a relationship.

Documentation and Agreement Review: Before finalizing any partnership, we thoroughly review all legal documents and agreements to ensure that they are in line with our service standards and legal requirements.

Ongoing Monitoring: After onboarding a client, we conduct regular reviews and monitoring to ensure ongoing compliance and to promptly address any emerging issues.

Optionally, we encourage our clients to make use of the KYC tooling that Filecoin recently introduced, where the client goes to the GitHub issue and completes their KYC there as well. We will push all clients to use this, although some larger and more traditional Web 2.0 clients might find it difficult. We have also created a basic KYC information section in our GitHub repository, holding comments and basic information about clients, so that our DataCap allocation issues can refer to KYC information. All employees are required to comment on the KYC issue AND the application before the next action is triggered. We have implemented checker and signer roles: signers must "watch" the GitHub repository and look out for the checker's comments before triggering an action, while the checker must "watch" the repository for new applications or comments and respond with their own set of comments.
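Because the checker/signer gate currently relies on employees watching the repository, a small script could back it up. Below is a minimal sketch, assuming an approval-comment convention such as "Checker: approved" on the application issue (the marker text, issue number, and function names are illustrative, not our fixed process); it uses the standard GitHub REST endpoint for listing issue comments.

```python
# Minimal sketch of the checker/signer gate, assuming an approval-comment
# convention such as "Checker: approved" (marker text is illustrative).
import requests

REPO = "Origin-Storage-IO/future-storage"
APPROVAL_MARKER = "Checker: approved"  # hypothetical convention

def checker_has_approved(issue_number: int, token: str) -> bool:
    """Return True if any comment on the application issue contains
    the checker's approval marker."""
    url = f"https://api.github.com/repos/{REPO}/issues/{issue_number}/comments"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return any(APPROVAL_MARKER in comment["body"] for comment in resp.json())

# A signer would run this before triggering any allocation action:
# if not checker_has_approved(123, token):
#     raise RuntimeError("No checker approval on the issue; do not sign.")
```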
Type of data
Private encrypted with on-chain deal pricing; Public, open, and retrievable; Proof of concept, network utilities
Description of Data Diligence
Our process for verifying a client's data ownership is meticulous and multi-faceted, ensuring thorough validation while respecting client confidentiality and data protection laws. Our verification process involves the following steps:

Initial Data Ownership Verification: We request that clients provide documentation proving their ownership of, or legitimate rights to, the data they wish to store. This documentation may include data acquisition agreements, licenses, or legal attestations. For data generated internally by the client, we may request organizational charts, process descriptions, or other relevant documents that illustrate how the data is generated and used within their operations. We have since created a public KYC form: go to https://github.com/Origin-Storage-IO/future-storage/issues and create a new issue.

KYB Process: For enterprise clients, our own internal compliance department performs KYB verification of the legitimacy of their business and their lawful right to the data. Our Legal and Compliance department specializes in comprehensive business verification, including legal status, beneficial ownership, regulatory compliance, and reputation checks. These KYB processes also help us assess the risk level associated with the client, including checks against global sanctions lists, Politically Exposed Persons (PEP) lists, and adverse media screenings.

Data Provenance Checks: We conduct data provenance checks to understand the origin, movement, and lifecycle of the data. This involves reviewing the client's data management practices, historical data logs, and any relevant data transfer agreements. For sensitive or regulated data, we may require additional evidence of compliance with relevant data protection regulations (e.g., GDPR, HIPAA).

Client Interviews and Meetings: Direct interactions with the client, such as interviews or meetings, are conducted to gather more insight into their data handling practices and to clarify any doubts or ambiguities arising from the documentation provided.

Ongoing Monitoring and Audits: After onboarding, we implement periodic reviews and audits to ensure continued compliance with data ownership and legitimacy. Any significant changes in the client's business or data usage are subject to additional verification checks.

By integrating these steps into our client onboarding process, we ensure a robust and reliable method for verifying data ownership. We remain adaptable to evolving regulatory requirements and technological advancements in KYB services to maintain the highest standards of client verification.

To ensure that the data stored and managed by our clients meets local and regional legal requirements, we have implemented a comprehensive compliance framework. This framework is designed to adapt to the diverse legal landscapes in which we and our clients operate, and includes the following key components:

Legal and Regulatory Research: We conduct ongoing research to stay updated on local and regional laws and regulations pertaining to data storage, privacy, and security, including GDPR in Europe, CCPA in California, and other relevant data protection and cybersecurity laws worldwide. We engage with legal experts and consultants who specialize in data law to help us interpret and apply these regulations effectively.

Customized Compliance Strategies: Based on our understanding of various legal requirements, we develop tailored compliance strategies for each geographic region. This involves customizing our data storage solutions to meet specific local and regional legal standards. For clients operating in multiple regions, we provide multi-jurisdictional compliance advice to ensure their data management practices are lawful across all regions.

Client Education and Consultation: We actively educate our clients about the legal requirements relevant to their data and provide guidance on best practices for compliance, including workshops, training sessions, and detailed compliance guides. For complex scenarios, we offer personalized consultation services to address specific legal compliance challenges.

Data Processing and Transfer Agreements: We use robust data processing and transfer agreements that clearly outline the responsibilities and obligations of all parties involved, ensuring compliance with local and regional laws. These agreements include clauses on data sovereignty, cross-border data transfer restrictions, and data handling requirements as per applicable laws.

Regular Compliance Audits and Assessments: We conduct regular audits and assessments of our infrastructure and our clients' data practices to ensure ongoing compliance with legal requirements. Any non-compliance issues identified during these audits are promptly addressed and rectified.

Data Protection Officer (DPO) and Legal Team: We have appointed a Data Protection Officer and a dedicated legal team responsible for overseeing compliance with data-related laws and regulations. The DPO and legal team also handle any legal queries or concerns from clients and ensure that our policies and procedures remain aligned with current legal standards.

Through these measures, we ensure that the data managed and stored by our clients is in full compliance with all relevant local and regional legal requirements, safeguarding both our clients and our business against legal risks and liabilities. Please refer to the reference sheet for geographical data governance, sent on Slack. We have an annual internal audit process; this proof can be shared with the governance team.

Also, the team may not have been familiar with the earlier changes to the process. Some tranches of DataCap allocation might have been crossed because the team believed they were familiar customers with whom we had a prior relationship. We have rectified the issue with the manager, and moving forward we will closely monitor the approval process: all employees are required to check for the checker's comments before signing off. As noted above, we have implemented the checker and signer roles, so signers must "watch" the GitHub repository and look out for the checker's comments before triggering an action, while the checker must "watch" the repository for new applications or comments and respond with their own set of comments.
Data Preparation
Client-provided, Other existing ecosystem tooling, IPFS Kubo, Go-CAR
Replicas required, verified by CID checker
5+
Distribution required
Single region of SPs
Number of Storage Providers required
5+
Retrieval Requirements
Public data highly retrievable over Spark.
Allocation Tranche Schedule Type
Manual or other allocation schedule.
• First: 1 PiB
• Second: 2 PiB
• Third: 2 PiB
• Fourth: 5 PiB
• Subsequent tranches: a maximum of 10 PiB each
We will create an additional automated check in our onboarding procedure to ensure that the tranche schedules are met. While the automated checker is a work in progress, we have implemented the checker and signer roles, where employees are required to "watch" the GitHub repositories, act according to their role, and add their relevant comments on the GitHub repository.
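For illustration, here is a minimal sketch of what that planned automated tranche check could look like, assuming allocation history is tracked per client in PiB (the function and variable names are hypothetical, not the actual tool):

```python
# Minimal sketch of the planned tranche-schedule check (names are
# illustrative); amounts are in PiB, per the schedule above.
TRANCHE_SCHEDULE_PIB = [1, 2, 2, 5]  # first, second, third, fourth tranches
MAX_TRANCHE_PIB = 10                 # ceiling for every later tranche

def max_next_tranche_pib(prior_allocations_pib: list[float]) -> float:
    """Largest DataCap amount the next tranche may be, given the
    client's prior allocations in order."""
    n = len(prior_allocations_pib)
    return TRANCHE_SCHEDULE_PIB[n] if n < len(TRANCHE_SCHEDULE_PIB) else MAX_TRANCHE_PIB

def request_is_within_schedule(prior_allocations_pib: list[float],
                               requested_pib: float) -> bool:
    """Signer-side guard: reject any request above the schedule."""
    return requested_pib <= max_next_tranche_pib(prior_allocations_pib)

assert request_is_within_schedule([], 1)             # first tranche: up to 1 PiB
assert not request_is_within_schedule([1], 3)        # second tranche capped at 2 PiB
assert request_is_within_schedule([1, 2, 2, 5], 10)  # later tranches: up to 10 PiB
```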
Will you use FIDL tooling, such as allocator.tech and other bots?
Yes, all available tools
GitHub Bookkeeping Repo Link
https://github.com/Origin-Storage-IO/future-storage
Success metrics
Retrievability of data, Number of clients, Amount of data onboarded (daily & aggregate), Number of paid deals, Number of returning client customers, Speed of allocations (TTD), Ecosystem marketing/comms
Timeline to begin allocating to clients
1 week from RKH approval
Funnel: Expected DataCap usage over 12 months
Risk mitigation strategies
Our compliance check mechanism for our clients is designed to ensure responsible allocation and utilization of DataCap, maintaining the integrity of the storage marketplace. Our approach includes multiple layers of checks and balances, outlined as follows:

Initial Due Diligence and Trust Evaluation: For new clients, we conduct a comprehensive due diligence process to assess their credibility and the legitimacy of their data storage needs. This includes verifying business registration, financial health, and data handling practices. Any high-risk clients are flagged and require enhanced due diligence at the management level. A trust evaluation system is implemented, where new clients start with a lower DataCap allocation; as they demonstrate compliance and reliability, their trust score, and consequently their DataCap allocation, can be increased. This is currently simplistic and primarily dependent on the period of collaboration with us (< 1 year, < 2 years, or > 3 years) or the existing reputation of the company (whether it is well known in its industry, etc.). We note that this might have contradicted our tranche schedule; moving forward, all companies will be treated as new companies in the Filecoin Allocator program. As described above, we have implemented the checker and signer roles, so signers must "watch" the GitHub repository and look out for the checker's comments before triggering an action, while the checker must "watch" the repository for new applications or comments and respond with their own set of comments.

Regular Check-ins and Audits: We establish a schedule of regular bi-annual check-ins and audits with our clients, designed to review their DataCap usage and ensure adherence to agreed terms. Audits are conducted both remotely and, if necessary, on-site. They focus on verifying the actual data stored against the DataCap allocated and checking for any signs of misuse or non-compliance.

DataCap Distribution Tracking: We utilize tools such as datacapstats.io and CID checker bots to monitor the distribution of DataCap. These tools help track which CIDs (Content Identifiers) are being stored, ensuring that the storage aligns with the allocated DataCap. Metrics such as allocation frequency, size of DataCap per allocation, and the frequency of client requests are closely monitored, reported, and discussed on a weekly basis.

Downstream Usage Monitoring: We track the downstream usage of our clients using tools like the Retrievability Bot, which helps ensure that the data stored is retrievable and aligns with the intended purpose of the DataCap. Quarterly reports are generated to analyze usage patterns and detect any anomalies or deviations from normal usage.

Client Demographics and Time Metrics Analysis: Understanding client demographics is crucial for customizing our compliance checks and interventions. We gather and analyze data about our clients' industries, size, and storage behavior. Time metrics, such as the duration of DataCap usage and the retention period of data, are monitored and reported weekly during the same meeting mentioned above to identify any unusual patterns.

Interventions and Dispute Management: Our policy for interventions is clear and strictly enforced. In cases of non-compliance, we are prepared to take actions such as removing DataCap, and we keep detailed records of such incidents. We have a defined process for handling disputes, ensuring timely and fair resolution. This process involves a thorough investigation and engagement with all relevant parties; disputes are to be picked up and handled within 120 hours.

Transparency and Community Engagement: We maintain transparency in our operations and decisions regarding DataCap allocations. Regular updates and summaries of our activities are shared with the community on channels such as GitHub. We actively engage with the community to gather feedback and insights, which helps us continuously improve our compliance mechanisms. Our public relations team holds weekly meetings to discuss, measure, and brainstorm new engagement ideas.

Our tolerance for non-compliance is minimal, especially for new clients, as we are committed to upholding the highest standards of responsibility and fairness in the storage marketplace. This comprehensive compliance check mechanism ensures effective management and monitoring of DataCap distribution and utilization, safeguarding the ecosystem from misuse.
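As one example of how the weekly distribution-tracking metrics could be screened automatically, here is a minimal sketch, assuming allocation events (client, timestamp, PiB) are exported from our bookkeeping repo or datacapstats.io; the event format and thresholds are illustrative assumptions, not policy:

```python
# Minimal sketch of a weekly allocation-frequency screen; thresholds and
# the event format (client, timestamp, PiB) are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta, timezone

def flag_unusual_clients(events, window_days=7, max_allocs=2, max_pib=10.0):
    """Flag clients whose allocation count or total size within the
    window exceeds the expected cadence, for the weekly review."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=window_days)
    counts = defaultdict(int)
    totals = defaultdict(float)
    for client, ts, pib in events:
        if ts >= cutoff:
            counts[client] += 1
            totals[client] += pib
    return sorted(c for c in counts if counts[c] > max_allocs or totals[c] > max_pib)

# Example: two allocations totalling 12 PiB in one week gets flagged.
now = datetime.now(timezone.utc)
events = [("f1client", now - timedelta(days=1), 5.0),
          ("f1client", now - timedelta(days=2), 7.0)]
print(flag_unusual_clients(events))  # -> ['f1client']
```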
Dispute Resolutions
Initial Assessment and Response Time: Upon receiving a dispute notification, we conduct an initial assessment within 120 hours. This initial step helps in understanding the nature of the dispute, whether it is over DataCap distribution, data compliance, or execution of storage deals. We acknowledge the dispute with all involved parties and inform them of the estimated timeline for resolution.

Information Gathering and Analysis: Our team collects all relevant information and documentation related to the dispute, including communication logs, DataCap allocation records, storage deal agreements, and any other pertinent data. We analyze this information to identify the root cause of the dispute and to assess the validity of the claims made by each party.

Dispute Resolution Meetings: We schedule meetings with the involved parties to discuss the dispute. For internal disputes (between ourselves and our client), these meetings aim to understand each party's perspective and find a mutually acceptable solution. For external disputes (involving another notary or the Fil+ Governance Team), we prepare a detailed defense of our decisions, ensuring that all our actions were in compliance with the established guidelines and were transparently documented.

Mediation and Conflict Resolution: If needed, we engage in mediation to facilitate a resolution. This involves an impartial third-party mediator who helps negotiate a solution acceptable to all parties. Our goal is to resolve disputes amicably while upholding the principles of fairness and adherence to the Filecoin network's rules and standards.

Transparency and Documentation: Throughout the dispute resolution process, we maintain high levels of transparency. All decisions, discussions, and outcomes are documented and shared with the relevant parties, and non-sensitive data will also be shared publicly on GitHub. We also keep records of all disputes and their resolutions as part of our internal audit and compliance process.

Accountability and Review: If the dispute resolution identifies any faults or errors on our part, we take full responsibility and implement corrective actions promptly. We also review our policies and procedures post-dispute to learn from and improve our processes, preventing similar issues in the future.

Community and Governance Engagement: In cases involving the broader Filecoin community or the Fil+ Governance Team, we actively engage with the community to explain our stance and gather feedback. We respect the decisions made by the Fil+ Governance Team and comply with any directives issued as part of the dispute resolution.

Our dispute resolution process aims to address issues efficiently and fairly, ensuring that all parties are heard and that resolutions are in line with the overarching goals and rules of the Filecoin network.
Compliance Audit Check
Regular Compliance Audits and Assessments: We will conduct regular audits and assessments of our infrastructure and our clients' data practices to ensure ongoing compliance with legal requirements. Any non-compliance issues identified during these audits are promptly addressed and rectified. In addition, automated bots from the Filecoin project are able to perform additional checks on our compliance. We will also release basic information about our clients in the public repository, including basic information from our internal audit reports.
Compliance Report content presented for audit
Client Diligence: Client statements, client-provided verification; Client Diligence: KYC/KYB report on clients; Client Diligence: Financial Audits and Credit Check reports; Client Diligence: Legal Review documents; Data Compliance: Proof of provenance report; Data Compliance: Data Samples; Data Compliance: Manual report; Compliance: CID report; Client/Data Compliance: external third-party audit; Success metric: Proof of Payments from clients; Success metric: onchain data report; Contributions: Educational Materials Developed; Contributions: Github repos with the tools developed; Success metric: onchain report of data onboarded; More.

What I want to add here is that we will definitely provide the client diligence and data compliance reports, as newly implemented in our public repository, and we will provide other proofs and reports, such as success metric contributions, when required and relevant. We are doing our best to automate this process at the moment, especially for the success metrics.
Connections to Filecoin Ecosystem
Previous allocator, Previous notary, Code contributor, Storage provider, Big data contributor, Developer
Slack ID
kenneth goh