Hi, when will the review start?
Hi @Hunter3576, wanted to let you know this application has been received; thank you for submitting to become an Allocator in the Filecoin Plus Program. How did you hear about Fil+?
Since there are over 40 DataCap pathways currently available for applicants to request DataCap, we are prioritizing onboarding applications that implement systems other than Manual. When this application starts, you will be contacted via email and here in this issue with the next KYC and onboarding steps.
Please join us on the next call; we would love to hear more and answer any questions you may have. https://docs.google.com/presentation/d/1Ek0TO9AzhLKQLFoFPsIUwKSY7Wp9N0Gefwdb-wU_c34/edit?usp=drive_link
Every other Tuesday (next: 03 Sep)
Call 1: 0900 PDT / 1600 UTC
Call 2: 1900 PDT / 0200 UTC (next day)
https://calendar.google.com/calendar/embed?src=c_k1gkfoom17g0j8c6bam6uf43j0%40group.calendar.google.com&ctz=America%2FLos_Angeles
Hi @Kevin-FF-USA, sorry it took so long to get back to you.
I learned about Fil+ in 2023 and managed to store data with an SP in 2024.
I hope to attend the next meeting.
Hi @Hunter3576
One of the scoring mechanisms for pathways is their ability to onboard quality data to the network. Given your existing clients, we suggest the following proposal to help establish your ability within the ecosystem to serve as an Allocator performing MANUAL diligence.
Proposal: Bring one of your clients into the ecosystem through an existing Allocator pathway. Establish that you have real clients and can maintain the diligence standards of this application. Demonstrate to the community the ability and value of onboarding this new Manual pathway.
Steps
Work with any existing Allocator to create an application on behalf of your client. FIDL runs an enterprise Allocator if you are looking for a pathway with existing support in place to help with questions.
Once the data is onboarded, reply to this application with the following details.
Onboarding
Once the ability to onboard clients through the application process has been verified, this application will receive a KYC check and begin onboarding as an Allocator able to onboard clients directly.
For questions or support
https://github.com/MikeH1999/RFfil/issues/43
Hi @Kevin-FF-USA, this is the medical data I have stored under RFfil; you can check it out, thanks!
Hi @Hunter3576,
If you've taken a client through with RFfil, then once the data is onboarded, reply to this application with the following:
- Client ID
- Links to the DataCap Application
https://github.com/MikeH1999/RFfil/issues/43
The SPs' retrieval rate has been unstable. I have also contacted Mike, who has been assisting with communication and working with the SPs to improve the retrieval rate.
- Verification that the Data reached the targeted number of SPs
- What the data type was
Healthcare
@Kevin-FF-USA Thanks
Allocator Application
Application Number
recvmuY09rObs0NHa
Organization Name
Shenyang Dongya Medical Research Institute Co.
Organization On-chain Identity
f1hyqz2misaxpnwbnvlipz2kd32ibk2tlhymbhgqy
Allocator Pathway Name
SDMR
Github PR Number
75
Region of Operation
Asia minus GCR, Europe, Greater China Region, North America, South America, Oceania
GitHub ID
Hunter3576
On-chain address
I have a multisig I want to provide now
Type of Allocator
Similar to existing allocator pathways
Filecoin Community Agreement
As a member of the Filecoin Community, I acknowledge that I must adhere to the Community Code of Conduct, as well as other End User License Agreements for accessing various tools and services, such as GitHub and Slack. Additionally, I will adhere to all local and regional laws and regulations that may relate to my role as a business partner, organization, notary, allocator, or other operating entity.
Acknowledge
Type of Allocator and RFA List
Manual - Existing similar diligence pathway
Allocator Description
https://github.com/MikeH1999/RFfil/issues/6
Contributions to Ecosystem
Build better data onboarding pathway; Onboard >10PiBs of Data; Data Stewardship: Curate and provide high-quality datasets to be stored on the Filecoin network, enhancing the overall value and utility of the network; Host or sponsor community events such as meetups, hackathons, conferences, FilDev summits etc.; Develop Open-Source Tools
Monetization and Fee structure
Client fees, None.
Target Clients
Web3 developers, Nonprofit organizations, Commercial/Enterprise, Individuals, Open/Public
Client Diligence Check
3rd party Know your business (KYB) service, 3rd party Know your customer (KYC) service, Automated deterministic, Client promise/attestation, Manual verification, NDA with additional verification, Proof of provenance
Description of client diligence
To ensure the authenticity of customer information, we will verify it using the diligence checks selected above, and assign dedicated personnel to continuously follow up and verify usage after allocating DataCap.
Type of data
Private encrypted with on-chain deal pricing; Public, open, and retrievable; Proof of concept, network utilities
Description of Data Diligence
Data Preparation
Singularity
Replicas required, verified by CID checker
3+
Distribution required
Equal distribution of deals across regions
Number of Storage Providers required
3+
Retrieval Requirements
Public data highly retrievable over Spark.
Allocation Tranche Schedule Type
Manual or other allocation schedule.
50 TiB for the first tranche, then 10% for the second, 20% for the third, 30% for the fourth, and 40% for the fifth.
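For illustration only, a minimal sketch of how these tranche sizes might be computed, assuming (this is not stated explicitly in the application) that the percentages apply to a client's total requested DataCap and that the first tranche is a flat 50 TiB:

```python
# Hypothetical illustration of the tranche schedule described above.
# Assumption: percentages are taken against the client's total requested
# DataCap; the first tranche is a fixed 50 TiB.

TIB = 1024**4  # bytes in a tebibyte
PIB = 1024**5  # bytes in a pebibyte

def tranche_sizes(total_requested_bytes: int) -> list[int]:
    """Return the five tranche sizes, in bytes, for one client request."""
    first = 50 * TIB
    percentages = [0.10, 0.20, 0.30, 0.40]
    return [first] + [int(total_requested_bytes * p) for p in percentages]

if __name__ == "__main__":
    # Example: a client requesting 1 PiB of DataCap in total.
    for i, size in enumerate(tranche_sizes(1 * PIB), start=1):
        print(f"Tranche {i}: {size / TIB:.0f} TiB")
```

Under that assumption, a 1 PiB request would be released as roughly 50 TiB, 102 TiB, 205 TiB, 307 TiB, and 410 TiB across the five tranches.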
Will you use FIDL tooling, such as allocator.tech and other bots?
Yes, all available tools
GitHub Bookkeeping Repo Link
https://github.com/Hunter3576/SDMR
Success metrics
Number of clients; Amount of data onboarded, daily & aggregate; Retrievability of data; Number of returning client customers; Speed of allocations (TTD); Ecosystem marketing/comms
Timeline to begin allocating to clients
1 week from RKH approval
Funnel: Expected DataCap usage over 12 months
100-200 PiB
Risk mitigation strategies
The signing wallet is managed with a hardware wallet (Ledger). The hardware wallet is kept by the team, and signatures are completed by the manager who reviews the customer's application.
Dispute Resolutions
When receiving a dispute request, we will:
- Record the request within 72 hours, ensuring that all relevant details and evidence are captured for subsequent processing.
- Investigate and analyze the background and reasons for the dispute. This may involve communicating with relevant parties, collecting evidence and data, and conducting necessary due diligence.
- Based on the results of the investigation and analysis, make a fair and reasonable decision that complies with relevant compliance requirements and policies.
- Maintain transparency and actively communicate with relevant parties throughout the process, explaining the basis and reasoning for the decision and providing necessary information and support.
- If the dispute involves other notaries or the Fil+ governance team, provide necessary support and cooperation, following the guidelines and procedures of the Fil+ governance team to resolve the dispute.
Compliance Audit Check
Based on the current bot report, first screen for issues such as CID sharing, replica count, and duplication rate. Conduct regular audits to check whether each customer's storage usage matches the allocated DataCap. During the audit, compare actual usage against the allocation to ensure that customers do not exceed or abuse their DataCap.
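As a rough illustration of the kind of screen described above, here is a minimal sketch assuming a hypothetical bot-report format; the field names (`cid_sharing`, `replica_count`, `duplication_rate`, `used_datacap_bytes`) and thresholds are illustrative only and do not reflect the actual allocator.tech or CID checker output schema.

```python
# Hypothetical compliance screen over a bot/CID-checker style report.
# The report fields and thresholds below are assumptions for illustration.

MIN_REPLICAS = 3             # this pathway requires 3+ replicas
MAX_DUPLICATION_RATE = 0.20  # illustrative duplication-rate threshold

def screen_client(report: dict, allocated_bytes: int) -> list[str]:
    """Return a list of compliance issues found for one client."""
    issues = []
    if report.get("cid_sharing"):
        issues.append("CID sharing detected with another client")
    if report.get("replica_count", 0) < MIN_REPLICAS:
        issues.append(f"fewer than {MIN_REPLICAS} replicas")
    if report.get("duplication_rate", 0.0) > MAX_DUPLICATION_RATE:
        issues.append("duplication rate above threshold")
    if report.get("used_datacap_bytes", 0) > allocated_bytes:
        issues.append("usage exceeds allocated DataCap")
    return issues

if __name__ == "__main__":
    example_report = {
        "cid_sharing": False,
        "replica_count": 4,
        "duplication_rate": 0.05,
        "used_datacap_bytes": 40 * 1024**4,  # 40 TiB used
    }
    # Compare against a 50 TiB allocation; an empty list means no issues found.
    print(screen_client(example_report, allocated_bytes=50 * 1024**4))
```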
Compliance Report content presented for audit
Success metric: Proof of Payments from clients; Success metric: onchain report of data onboarded; Success metric: onchain data report; Client Diligence: Client statements, client provided verification; Client Diligence: Legal Review documents; Client Diligence: Financial Audits and Credit Check reports; Client Diligence: KYC/KYB report on clients; Data Compliance: Proof of provenance report; Data Compliance: Data Samples; Data Compliance: Manual report; Compliance: CID report; Client/Data Compliance: external third-party audit.
Connections to Filecoin Ecosystem
Storage provider, Event sponsor, Big data contributor
Slack ID
U05LPA27T8V