filecoin-project / notary-governance


v5 Notary Allocator Application: FogMeta #1051

Open Normalnoise opened 5 months ago

Normalnoise commented 5 months ago

v5 Notary Allocator Application

To apply to be an allocator, organizations will submit one application for each proposed pathway to DataCap. If you will be designing multiple specific pathways, you will need to submit multiple applications.

Please complete the following steps:

1. Fill out the information below and create a new GitHub Issue

  1. Notary Allocator Pathway Name (This can be your name, or the name of your pathway/program. For example "E-Fil+"): Leo Zhang
  2. Organization Name: FogMeta
  3. On-chain address for Allocator (Provide a NEW unique address. During ratification, you will need to initialize this address on-chain): f1vk5p7jbblg6hh7zyedf3srmqpqrxbrkdibpjvdi
  4. Country of Operation (Where your organization is legally based): GCR
  5. Region of Operation (What region will you serve?): All Regions
  6. Type of Allocator, diligence process (Automated/programmatic, Market-based, or Manual (human-in-the-loop at some phase)): Market-based and Manual
  7. DataCap requested for allocator for 12 months of activity (This should be an estimate of overall expected activity. Estimate the total amount of DataCap you will be distributing to clients in 12 months, in TiB or PiB): 250PiB

2. Access allocator application (download to save answers)

Click link below to access a Google doc version of the allocator application that can be used to save your answers if you are not prepared to fully submit the application in Step 3. https://docs.google.com/document/d/1-Ze8bo7ZlIJe8qX0YSFNPTka4CMprqoNB1D6V7WJJjo/copy

3. Submit allocation application

Click link below to access the full allocator questionnaire and officially submit your answers: https://airtable.com/appvyE0VHcgpAkt4Z/shrQxaAIsD693e1ns

Note: Sections of your responses WILL BE posted back into the GitHub issue tracking your application. The final section (Additional Disclosures) will NOT be posted to GitHub, and will be maintained by the Filecoin Foundation. Application information for notaries not accepted and ratified in this round will be deleted.

Kevin-FF-USA commented 5 months ago

Hi @Normalnoise Wanted to let you know this application has been received. Also verifying you have submitted the Airtable form with your detailed Allocator plan - the public answers will be posted in a thread below soon. If you have any questions - please let me know.

Normalnoise commented 5 months ago

Thanks Kevin

ghost commented 5 months ago

Basic Information

1. Notary Allocator Pathway Name: Leo Zhang

2. Organization: FogMeta

3. On Chain Address for Allocator: f1vk5p7jbblg6hh7zyedf3srmqpqrxbrkdibpjvdi

4. Country of Operation: China

5. Region(s) of operation: Africa , Asia minus GCR, Greater China, Europe, Oceania, Japan, North America, South America, Other

6. Type of Allocator: Market-based

7. DataCap requested for allocator for 12 months of activity: 250PiB

8. Is your allocator providing a unique, new, or diverse pathway to DataCap? How does this allocator differentiate itself from other applicants, new or existing?: No. We will continue following what we have done in the past and use the same due diligence process as previous notary rounds.

9. As a member in the Filecoin Community, I acknowledge that I must adhere to the Community Code of Conduct, as well other End User License Agreements for accessing various tools and services, such as GitHub and Slack.: Acknowledge

Client Diligence

10. Who are your target clients?: Individuals learning about Filecoin, Small-scale developers or data owners, Enterprise Data Clients, Other (specified above)

11. Describe in as much detail as possible how you will perform due diligence on clients. If you are proposing an automated pathway, what diligence mechanism will you use to determine client eligibility?: (1) Individuals and Small Developers/Data Owners:

(2) Enterprise Data Clients:

12. Please specify how many questions you’ll ask, and provide a brief overview of the questions.: The basic rule is that the larger the DataCap request, the more information the client will be required to provide. First, we will ask for relevant information about the customer and their organization.

(1) Personal Information:

(2) Company or organization information:

(3) Dataset information:

(4) How DataCap will be distributed:

13. Will you use a 3rd-party Know your client (KYC) service?: no

14. Can any client apply to your pathway, or will you be closed to only your own internal clients? (eg: bizdev or self-referral): Yes, any client can apply.

15. How do you plan to track the rate at which DataCap is being distributed to your clients?: We will track it using the “CID checker” tool and “aggregation-and-compliance”.

Data Diligence

16. As an operating entity in the Filecoin Community, you are required to follow all local & regional regulations relating to any data, digital and otherwise. This may include PII and data deletion requirements, as well as the storing, transmit: Acknowledge

17. What type(s) of data would be applicable for your pathway?: Public Open Dataset (Research/Non-Profit), Public Open Commercial/Enterprise, Private Commercial/Enterprise, Private Non-Profit/Social Impact, Other (specified elsewhere)

18. How will you verify a client’s data ownership? Will you use 3rd-party KYB (know your business) service to verify enterprise clients?: We will check the client's publicly available websites, social media, and other information to see whether it makes sense for the client to have provenance over the claimed data. If anything is suspicious, we will meet with the client to confirm. We will not use a 3rd-party KYB service.

19. How will you ensure the data meets local & regional legal requirements?: 1. First, ask DataCap applicants whether they have reviewed their data against local and regional legal requirements. 2. Check sample data to confirm it does not violate the rights of individuals or other institutions. 3. Check whether the data contains illegal or harmful content according to common standards.

20. What types of data preparation will you support or require?: We recommend using CAR files as the final data type for encapsulation. There is no size limitation for CAR files themselves, but each file should be below the maximum sector capacity of 32/64 GiB. Tool support for non-technical teams: for customers without a technical team or with an immature technical solution, we offer a user-friendly tool for packaging CAR files (Swan-client: https://github.com/filswan/go-swan-client). Automated processing for small data: for small data, such as NFT image data, we provide an SDK service for automated processing. Customers can upload source data through the SDK, and our system will automatically package and distribute the data to Storage Providers (SPs) (SDK service: https://github.com/FogMeta/go-mc-sdk). Global support: with offices in mainland China, Vietnam, and Canada, we offer flexibility.
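The sector-capacity constraint above can be checked mechanically before deal-making. A minimal sketch (Python; `oversized_cars` is an illustrative helper name, not part of Swan-client, and the cap assumes 32 GiB sectors):

```python
import os

MAX_SECTOR_BYTES = 32 * 1024 ** 3  # 32 GiB sectors; raise to 64 GiB where SPs support it

def oversized_cars(car_dir):
    """Return the names of CAR files in car_dir that exceed sector capacity."""
    return [
        name
        for name in sorted(os.listdir(car_dir))
        if name.endswith(".car")
        and os.path.getsize(os.path.join(car_dir, name)) > MAX_SECTOR_BYTES
    ]
```

Any file this flags would need to be re-split before it can be sealed into a single sector.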

21. What tools or methodology will you use to sample and verify the data aligns with your pathway?: We will extract a large number of random data samples and retrieve them from the SPs for random spot-checks. We will require each piece of data to fill at least 50% of a sector; otherwise, it will be considered “sector-size abuse”.
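The sampling and sector-fill checks described above can be sketched as follows (Python; `sample_pieces` and `is_sector_size_abuse` are illustrative names, not existing tooling):

```python
import random

SECTOR_BYTES = 32 * 1024 ** 3  # assuming 32 GiB sectors

def sample_pieces(pieces, k):
    """Pick k random pieces from a client's deals for retrieval spot-checks."""
    return random.sample(pieces, min(k, len(pieces)))

def is_sector_size_abuse(piece_bytes, sector_bytes=SECTOR_BYTES):
    """A piece filling less than 50% of a sector is flagged as sector-size abuse."""
    return piece_bytes < sector_bytes // 2
```

For example, a 10 GiB piece sealed into a 32 GiB sector would be flagged, while a 20 GiB piece would pass.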

Data Distribution

22. How many replicas will you require to meet programmatic requirements for distribution?: 4+

23. What geographic or regional distribution will you require?: 2 or more physical locations and 2 or more separate geopolitical regions. We have no areas to exclude; all areas are treated equally.

24. How many Storage Provider owner/operators will you require to meet programmatic requirements for distribution?: 2+
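Taken together, the minimums in questions 22-24 can be expressed as a single check. A minimal sketch (Python; the deal-record fields are assumptions about how one might record deals, not an existing schema):

```python
def meets_distribution_minimums(deals):
    """deals: list of dicts with 'sp' (owner/operator ID), 'location'
    (physical site), and 'region' (geopolitical region).
    Enforces: 4+ replicas, 2+ locations, 2+ regions, 2+ SP owner/operators."""
    return (
        len(deals) >= 4
        and len({d["location"] for d in deals}) >= 2
        and len({d["region"] for d in deals}) >= 2
        and len({d["sp"] for d in deals}) >= 2
    )
```

A client concentrating all replicas on one SP or in one region would fail this check even with four or more copies stored.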

25. Do you require equal percentage distribution for your clients to their chosen SPs? Will you require preliminary SP distribution plans from the client before allocating any DataCap?: I will ask for a preliminary SP allocation plan. However, considering unpredictable factors, I will not force DataCap to be distributed to SPs in equal percentages, and I am willing to let clients add new SPs.

26. What tooling will you use to verify client deal-making distribution?: https://datacapstats.io/, https://filecoin.tools/, and the CID checker bot can be used to verify client deal-making distribution.

27. How will clients meet SP distribution requirements?: I will make sure I fully understand the client's needs for data distribution. If the client lacks SPs in some regions, I will help them join the filswan platform, which provides plenty of SPs located around the world who have performed well in historical audits.

28. As an allocator, do you support clients that engage in deal-making with SPs utilizing a VPN?: Yes. Before starting, I will ask the client whether they are using a VPN, and I support their use of one. If a VPN is used, the client will need to provide documentation proving that the physical devices are not all in the same region.

DataCap Allocation Strategy

29. Will you use standardized DataCap allocations to clients?: No, client specific

30. Allocation Tranche Schedule to clients:

• First: lesser of 5% of total DataCap requested or 50% of weekly allocation rate
• Second: lesser of 10% of total DataCap requested or 100% of weekly allocation rate
• Third: lesser of 20% of total DataCap requested or 200% of weekly allocation rate
• Fourth: lesser of 40% of total DataCap requested or 400% of weekly allocation rate
• Max per client overall: lesser of 80% of total DataCap requested or 800% of weekly allocation rate
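The tranche schedule above is a "lesser of" rule at each step. A small sketch of the arithmetic (Python; `tranche_caps` is an illustrative name):

```python
def tranche_caps(total_requested, weekly_rate):
    """Per-tranche caps for the first four allocations: the lesser of a
    percentage of total DataCap requested or a multiple of the client's
    weekly allocation rate (both amounts in the same unit, e.g. TiB)."""
    schedule = [(0.05, 0.5), (0.10, 1.0), (0.20, 2.0), (0.40, 4.0)]
    return [min(pct * total_requested, mult * weekly_rate) for pct, mult in schedule]
```

For example, a client requesting 100 TiB with a 10 TiB/week rate would be capped at 5, 10, 20, and 40 TiB across the four tranches; a slower weekly rate lowers the early caps accordingly.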

31. Will you use programmatic or software based allocations?: No, manually calculated & determined

32. What tooling will you use to construct messages and send allocations to clients?: Use the existing notary registry tool (https://filplus.fil.org/#/).

33. Describe the process for granting additional DataCap to previously verified clients.: I’ll use the Subsequent Allocation (SA) bot.

34. Describe in as much detail as possible the tools used for: • client discoverability & applications • due diligence & investigation • bookkeeping • on-chain message construction • client deal-making behavior • tracking overall allocator health • dispute resolution & community updates:

• client discoverability & applications: spreadsheet tracking and meetings
• due diligence & investigation: publicly available information and meetings
• bookkeeping: spreadsheet
• on-chain message construction: current filplus.fil.org website
• client deal-making behavior: current CID checker bot
• tracking overall allocator health: current CID checker bot
• dispute discussion & resolution: GitHub notary governance repo
• community updates & comms: Slack and GitHub

Tools and Bookkeeping

35. Will you use open-source tooling from the Fil+ team?: GitHub repo, Google Spreadsheet

36. Where will you keep your records for bookkeeping? How will you maintain transparency in your allocation decisions?: GitHub and our internal and external spreadsheets

Risk Mitigation, Auditing, Compliance

37. Describe your proposed compliance check mechanisms for your own clients.: We will track DataCap distribution using the CID checker bot. As long as the client communicates well about any blockers or issues, we will be understanding of their demographic choices and time metrics. Trust evaluations are done at the start and checked between DataCap tranche allocations. For new clients, I will give them one chance to return to compliance after the first allocation, but the client must provide sufficient proof.

38. Describe your process for handling disputes. Highlight response times, transparency, and accountability mechanisms.: If there is any dispute, I will respond within 1-2 days (taking into account some uncontrollable factors). Disputes will be handled by communicating with the client and SPs. If needed, we will open an issue on the notary governance GitHub repo.

39. Detail how you will announce updates to tooling, pathway guidelines, parameters, and process alterations.: If needed we will announce updates via GitHub issues in the notary governance repo.

40. How long will you allow the community to provide feedback before implementing changes?: We are always open to community feedback and will implement changes if the feedback makes logical sense to our team. We can continue to use the notary governance GitHub repo to engage with the community and monitor the feedback there.

41. Regarding security, how will you structure and secure the on-chain notary address? If you will utilize a multisig, how will it be structured? Who will have administrative & signatory rights?: We use a Ledger X to create the address. The private key of this address is held by the company's legal representative. I'm in charge of GitHub and Slack. Each use requires joint management by Charles and me.

42. Will you deploy smart contracts for program or policy procedures? If so, how will you track and fund them?: If necessary, we will develop simple smart contracts. But there are currently no plans to do so.

Monetization

43. Outline your monetization models for the services you provide as a notary allocator pathway.: There are currently no plans for monetization.

44. Describe your organization's structure, such as the legal entity and other business & market ventures.: Our team is registered in mainland China and is a development team focusing on technology products and services. We maintain swan-client and swan-provider, which are powerful tools for data onboarding to the Filecoin network.

45. Where will accounting for fees be maintained?: None

Past Experience, Affiliations, Reputation

46. If you've received DataCap allocation privileges before, please link to prior notary applications.: FogMeta was a V4 round notary and has performed due diligence on many DataCap applications. https://github.com/filecoin-project/notary-governance/issues/772

47. How are you connected to the Filecoin ecosystem? Describe your (or your organization's) Filecoin relationships, investments, or ownership. These may include Storage Providers, existing client applications, developer service providers, and more.: As a member of the Filecoin community, FogMeta has developed and maintained many tools and services for clients and storage providers: "Filecoin-IPFS Data Rebuilder", "Swan Provider", "Swan Client", "Chainnode", the Extend DataCap Terms Service, go-mc-sdk, and fs3.

(1) Filecoin-IPFS Data Rebuilder is a data build-and-rebuild tool between the IPFS network and the Filecoin network.

(2) Swan Client is an important Web3 toolkit. It provides different tools to help users connect to the Web3 world, including:
- Filecoin Deal Sender
- Blockchain RPC Service

(3) Swan Provider listens for offline deals that come from the Swan platform. It provides the following functions:
- Downloads offline deals automatically, using aria2 as the download service.
- Imports deals using lotus once the download is completed.
- Synchronizes deal status to the Swan platform, so that both clients and miners know status changes in real time.
- Auto-bids on tasks from the FilSwan bidding market.

(4) Chainnode is a project for maintaining blockchain snapshots. It enables the storing and retrieval of snapshots for blockchain projects including Filecoin, Polygon, Ethereum, Binance Smart Chain, and Near.

(5) Extend DataCap Terms Service is designed to support Storage Providers in extending the expiration of their sectors' terms to maintain the 10x QA (quality-adjusted) power.

(6) go-mc-sdk is a Golang SDK for the MetaArk product, providing an easy interface for developers to work with the Filecoin network. It streamlines the process of securely storing, retrieving, and recovering data on the IPFS and Filecoin networks.

(7) fs3 is high-performance, Kubernetes-native object storage for the Filecoin network.

Related links (GitHub and docs):
- Filecoin-IPFS Data Rebuilder: https://github.com/FogMeta/Rebuilder
- Filecoin-IPFS Data Rebuilder website: http://rebuilder.fogmeta.com
- Swan Client: https://github.com/filswan/go-swan-client
- Swan Provider: https://github.com/filswan/go-swan-provider
- Chainnode: https://chainnode.io
- Extend Datacap Terms Service: https://datacap.swanchain.io/extend
- go-mc-sdk: https://github.com/FogMeta/go-mc-sdk
- fs3: https://github.com/FogMeta/fs3 ; https://fs3.fogmetalabs.com/minio/

48. How are you estimating your client demand and pathway usage? Do you have existing clients and an onboarding funnel?: We connect with many storage providers and developers and encourage them to store valuable public datasets on Filecoin. We are proficient in this process, but we have not yet established an onboarding funnel.