NateWebb03 / FilTestRepo

A test repository for allocator application automation

Test app 1038 #1038

Open NateWebb03 opened 8 months ago

NateWebb03 commented 8 months ago

Notary Allocator Pathway Name:

STCloud

Organization:

Shenzhen SpaceTime Cloud Technology Co., Ltd.

Allocator's On-chain address:

f1n5jufq5n5gdm3tiycpi3ujqgyh3yyj52rd72vdq

Country of Operation:

China

Region(s) of operation:

Asia minus GCR, Greater China, Europe, North America, Japan

Type of allocator: What is your overall diligence process? Automated (programmatic), Market-based, or Manual (human-in-the-loop at some phase). Initial allocations to these pathways will be capped.

Manual

Amount of DataCap Requested for allocator for 12 months:

100PiB

Is your allocator providing a unique, new, or diverse pathway to DataCap? How does this allocator differentiate itself from other applicants, new or existing?

As a major service provider that joined the decentralized storage industry at an early stage and served as a notary in the V3 and V4 rounds, we have always strived to help Filecoin store precious data for all of humanity, aiming to make Filecoin a storage giant like AWS. From 2018 to 2020, our focus was on building a solid foundation for Filecoin, and as a result we brought in 1.2 EiB of committed capacity (CC). From 2021 to 2023, with the launch of FIL+, we stored valuable data in the Filecoin network using DataCap and accurately tracked DataCap usage through algorithms and tools.

On the data front, we actively communicate with enterprises and persuade them to store their data in the Filecoin network. For example, with our assistance, "gouhong" participated in E-FIL+ and "China Auto Parts Network" joined FIL+. We also help our partners download and store public datasets. As a result, the data we have onboarded with DataCap over the past two years has been diverse, abundant, and unique.

At the algorithm level, we have developed our own data processing system that automatically deduplicates all data. We do not want to see any sharing of Content Identifiers (CIDs) as it would waste DataCap and harm the reputation of FIL+. Additionally, we have developed platforms for handling DataCap and submitting orders, enabling better end-to-end tracking of DataCap usage.
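As an illustration of this kind of duplicate-CID screening, the sketch below keeps a local record of previously onboarded piece CIDs and flags any resubmission. The file name, function names, and example CIDs are hypothetical and do not describe our actual processing system.

```python
# Illustrative sketch only: a local duplicate piece-CID check.
# The file name, function names, and example CIDs are hypothetical.
import json
from pathlib import Path

SEEN_CIDS_FILE = Path("onboarded_piece_cids.json")  # hypothetical local record


def load_seen_cids() -> set:
    """Load the set of piece CIDs that have already been onboarded."""
    if SEEN_CIDS_FILE.exists():
        return set(json.loads(SEEN_CIDS_FILE.read_text()))
    return set()


def check_new_pieces(new_piece_cids):
    """Return piece CIDs that were already onboarded (possible DataCap waste)."""
    seen = load_seen_cids()
    duplicates = [cid for cid in new_piece_cids if cid in seen]
    # Record the new CIDs so later batches are checked against them as well.
    SEEN_CIDS_FILE.write_text(json.dumps(sorted(seen | set(new_piece_cids))))
    return duplicates


if __name__ == "__main__":
    dups = check_new_pieces(["baga6ea4seaq-example-piece-1", "baga6ea4seaq-example-piece-2"])
    if dups:
        print("Duplicate piece CIDs detected:", dups)
    else:
        print("No duplicates found in this batch.")
```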

At the implementation level, we have established the ST-DC project, which conducts auditing, tracking, and management for both new and existing customers. The application link is https://github.com/stcloudlisa/STDC/issues/new/choose.

As a member of the Filecoin Community, I acknowledge that I must adhere to the Community Code of Conduct, as well as other End User License Agreements for accessing various tools and services, such as GitHub and Slack. Additionally, I will adhere to all local & regional laws & regulations that may relate to my role as a business partner, organization, notary, or other operating entity. * You can read the Filecoin Code of Conduct here: https://github.com/filecoin-project/community/blob/master/CODE_OF_CONDUCT.md

Acknowledgment: Acknowledge

Client Diligence Section:

This section pertains to client diligence processes.

Who are your target clients?

Enterprise Data Clients, Small-scale developers or data owners

Describe in as much detail as possible how you will perform due diligence on clients.

ST-DC aims to conduct due diligence and DC allocation to customers in a "public and transparent" manner. We propose a combination of automation and manual processes. Customers are required to register on GitHub and submit a DC application form: https://github.com/stcloudlisa/STDC/issues/new/choose. We will review the applications on GitHub using both manual and automated approaches, which include reviewing the application form, signing data storage contracts, and conducting third-party KYB verification of customer identities.

In our due diligence process, we primarily focus on the following aspects regarding customers:

  1. Data ownership: We are concerned about the ownership of customer data and how customers authenticate their identity to establish ownership.
  2. Data authenticity: Customers need to prove ownership of the data they claim to possess. For example, if a customer applies for 10 PiB of DC in order to store 10 backups, they must demonstrate that they have 1 PiB of original data.
  3. Technical capabilities: We evaluate whether customers have a technical team, their familiarity with Filecoin, their experience in handling data, their data processing plans, and whether they require our assistance.
  4. Compliance of customer data: We assess where the data is distributed and whether it complies with local and international regulations.

We also gather and analyze publicly available information, such as corporate news reports, social media platforms like Twitter and LinkedIn, and analysis and ratings from professional organizations, to understand and assess the true situation of the enterprise.

Furthermore, we employ various methods, including KYB (Know Your Business), to verify the authenticity of the information submitted by customers. This may involve using publicly accessible corporate information search systems, securities regulatory authorities, third-party business databases such as Twitteraudit, Dun & Bradstreet, Bloomberg, Hoover's, and more.

Please specify how many questions you'll ask, and provide a brief overview of the questions.

https://github.com/stcloudlisa/STDC/issues/new/choose

Company name

Social media

Company introduction

Requested total amount of DataCap

Expected size of individual datasets (per copy)

Number of copies to be stored

DataCap allocated per week

On-chain address of initial allocation

Data type of your company

Amount of data your company has

How do you prove ownership of the data

Please provide data use cases

Are you applying for FIL+ or E-FIL+

Is the data being stored on Filecoin for the first time

Why do you want to store data on the Filecoin network

Which regions do you want your data to be stored in

How many backups

How did you learn about us

Do you have experience with data handling

How will you handle your data

Please introduce your technical team

Do you need assistance from us

If you already have a list of storage providers to work with, fill out their names and provider IDs below

How will you negotiate the deals with the SPs

When do you expect to start

How long do you want to store the data

What retrieval frequency do you desire

Can your data be publicly retrieved

Can you confirm that you will follow the Fil+ guideline

Will you use a 3rd-party "Know your client" (KYC) service?

Yes, we will use KYC to verify and collect information about our customers. Our goal is to gather detailed information about the customers, which may include personal identification details, address proof, business registration documents, financial statements, and other relevant information.

Integrating KYC services into our workflow involves securely transmitting the necessary customer information to the service provider for verification. This may involve sharing customer files and data through secure channels or APIs provided by the KYC service. The service provider then performs the necessary checks and verification processes to validate the provided information and provides us with a comprehensive report or confirmation of the customer's identity. The main KYC tools used include LinkedIn, Twitter, Facebook, Namescan, Toggle, Uqudo, Qichacha (企查查), national information query systems, and more.

The specific KYC service providers may vary depending on the regulatory requirements of the customer's jurisdiction. We prioritize partnering with reputable and reliable KYC service providers to ensure the security and privacy of customer data during the verification process. When conducting KYC, we primarily focus on the following aspects:

  1. Identity verification: At the individual level, customers provide identification documents such as passports, ID cards, or driver's licenses. At the company level, customers provide the company's business license, data ownership declaration (stamped with an official seal), office lease agreements, utility bills, bank transfer statements, and more. These documents are used to verify the customer's and company's identity information and authenticity.

  2. Data verification and background checks: We confirm the credibility and creditworthiness of customers by verifying the provided enterprise data cases. We primarily focus on two aspects: a. Data ownership: We assess whether the data cases provided by the customer are genuine and the extent of overlap with data found on the internet. We estimate the amount of data that the customer's company's main business can generate and compare it to the declared data volume. b. Genuine business operations: We examine the customer's actual operational details, such as the establishment date, company size, and business activities.

  3. Anti-Money Laundering (AML) and Counter Financing of Terrorism (CFT) scrutiny: For certain customers, we prioritize data security and conduct AML and CFT checks to ensure that customers are not involved in illegal activities.

Can any client apply to your pathway, or will you be closed to only your own internal clients? (eg: bizdev or self-referral)

Any customer can apply to us. Customers can contact us through email, Slack, WhatsApp, Telegram, phone, or our business development department. They can also visit our company in person for an inspection or participate in offline meetings that we host. All customers are treated equally, whether they are a large company or an individual who loves Filecoin and wants to store data on it; as long as they can prove ownership of the data and the amount of data, and can provide a reasonable data distribution plan, they are welcome to apply to us. During the application process, if customers need help, they can raise it directly on GitHub, and we will respond in detail within 24 hours.

How do you plan to track the rate at which DataCap is being distributed to your clients?

When customers submit the application form, they need to include an allocation plan and the desired allocation speed. We will communicate with the customer based on their actual situation to determine the final allocation plan. We plan to deploy an automated system, such as a program that inspects DataCap usage through the Lotus API and displays the data on a dedicated dashboard for easy tracking and analysis. We will also integrate with the community bot, which works well for this purpose. We use every 500 TiB as a check cycle: when the DataCap distributed to a customer exceeds 500 TiB and the next 500 TiB needs to be unlocked, we will manually check the following:

  1. Is the number of SPs greater than 4?
  2. Are the SP addresses distributed across three different regions?
  3. Retrievability: whether retrieval is supported. We will download complete data samples to check whether the data is authentic.
  4. Whether the SPs' distribution plan is consistent with their commitment.

If there is a gap between the customer's actual rate and the initial allocation plan, and the difference is greater than 20%, we will immediately stop allocation to the customer until they provide an explanation and an accurate data allocation plan.
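Below is a minimal sketch of the kind of automated check described above, assuming a reachable Lotus JSON-RPC endpoint (a public endpoint URL is used purely as an example) and applying the 500 TiB check cycle and 20% deviation rule; the helper names and the figures in the example call are illustrative, not our production tooling.

```python
# Sketch only: poll a Lotus JSON-RPC endpoint for a client's remaining DataCap,
# compare actual spending with the agreed plan, and flag the 500 TiB checkpoint
# and the 20% deviation rule. The endpoint URL and the figures in __main__
# are assumptions for illustration.
import requests

LOTUS_RPC = "https://api.node.glif.io/rpc/v1"  # assumed public Lotus endpoint
TIB = 1 << 40


def remaining_datacap(client_address: str) -> int:
    """StateVerifiedClientStatus returns the client's remaining DataCap in bytes."""
    resp = requests.post(LOTUS_RPC, json={
        "jsonrpc": "2.0",
        "id": 1,
        "method": "Filecoin.StateVerifiedClientStatus",
        "params": [client_address, []],
    })
    resp.raise_for_status()
    result = resp.json()["result"]
    return int(result) if result is not None else 0


def check_spend_rate(client_address: str, granted: int, planned_spend: int) -> None:
    spent = granted - remaining_datacap(client_address)
    deviation = abs(spent - planned_spend) / planned_spend
    print(f"spent {spent / TIB:.1f} TiB, planned {planned_spend / TIB:.1f} TiB")
    if deviation > 0.20:
        print("Deviation above 20%: pause allocation and request an explanation.")
    if spent >= 500 * TIB:
        print("500 TiB checkpoint reached: run the manual SP/retrievability review.")


if __name__ == "__main__":
    # Hypothetical client address and plan figures.
    check_spend_rate("f1exampleclientaddress", 1000 * TIB, 600 * TIB)
```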

Data Diligence

This section will cover the types of data that you expect to notarize.

As a reminder: The Filecoin Plus program defines quality data as all content that meets local regulatory requirements AND
• is data the owner wants to see on the network, including private/encrypted data,
• or is open and retrievable,
• or demonstrates proof of concept or utility of the network, such as efforts to improve onboarding.

As an operating entity in the Filecoin Community, you are required to follow all local & regional regulations relating to any data, digital and otherwise. This may include PII and data deletion requirements, as well as the storing, transmitting, or accessing of data.

Acknowledgement: Acknowledge

What type(s) of data would be applicable for your pathway?

Public Open Dataset (Research/Non-Profit), Public Open Commercial/Enterprise

How will you verify a client's data ownership? Will you use 3rd-party KYB (know your business) service to verify enterprise clients?

WikiGlobal: www.wikiglobal.com
x315: https://www.x315.cn/searchworld
LinkedIn: https://www.linkedin.com/
Twitter: https://twitter.com/
Facebook: www.facebook.com
Medium: https://medium.com

We will use KYB services to verify our corporate customers.

The key to verifying data ownership is as follows: a) Verify the identity of the applicant to confirm that they are representing the company in the application. b) Verify how much data the company possesses to determine the maximum amount they can request, based on the formula: data owned by the company multiplied by the number of backup copies equals the total requested capacity. For example, if a company has 512 TiB of data and requires 10 backups, the maximum they can apply for is 5 PiB of DataCap.
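The sizing rule above can be restated as a small calculation; the sketch below simply reproduces the 512 TiB / 10-copy example and assumes binary units.

```python
# A worked restatement of the sizing rule above: maximum requestable DataCap
# equals the client's verified source data multiplied by the number of backups.
# Binary units (TiB/PiB) are assumed here.
TIB = 1 << 40
PIB = 1 << 50


def max_requestable_datacap(source_data_bytes: int, backup_copies: int) -> int:
    return source_data_bytes * backup_copies


if __name__ == "__main__":
    # The example from the text: 512 TiB of source data stored as 10 copies.
    cap = max_requestable_datacap(512 * TIB, 10)
    print(f"maximum request: {cap / PIB:.2f} PiB")  # -> 5.00 PiB
```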

Therefore, we emphasize verifying the identity of the applicant and the company's operational situation. In the following verification process, we will utilize KYB services:

1. Verify the applicant's identity and employment status: Confirm the legitimacy of the applicant representing the company by verifying their identity documents and authorization documents, such as admission notices, employment contracts, payslips, etc. This may require the use of third-party KYB services to verify the applicant's identity and employment status.

2. Verify that the applicant is representing the company in the application: To ensure that the applicant is indeed representing the company, we can ask the applicant to send an email from the company's domain email address and attach the company's data license certificate stamped with the company's seal. By matching the email address with the one listed on the company's official website, the identity and authorization of the applicant can be verified.

3. Verify the amount of data owned by the company: Validate the amount of data owned by the company. This step requires the company to provide relevant data storage records, backup strategies, data management documents, and screenshots of the total data storage on platforms such as AWS. By comparing the provided data volume with the requested capacity, it can be confirmed whether the company's data ownership aligns with the application.

4. Third-party KYB investigation: Conduct a KYB investigation through a third-party service provider to examine the company's operations, data management strategies, backup plans, and compliance with data ownership regulations. This can provide additional verification and background information to ensure the legality and credibility of the customer.

5. Contract signing: Signing a contract with the customer is a crucial step in ensuring data ownership and legality. The contract should clearly outline the customer's data volume, provisions regarding data ownership, and requirements for the public accessibility of data stored on Filecoin. Customers must adhere to the contractual provisions to ensure the integrity and compliance of the data.

We will conduct ongoing audits and reviews to ensure their continuous compliance with data ownership and legality standards. This includes regular checks and verification of the data provided by the customer and monitoring the implementation of their data management and backup strategies.

How will you ensure the data meets local & regional legal requirements?

First, in the application form, we ask the customer the question "Are you sure the data complies with local legal requirements?" Only when the customer answers "yes" will we proceed to the next step. Selecting "yes" means the customer has made a commitment to data compliance; if the data is later questioned on compliance grounds, the customer bears a corresponding responsibility. Second, we will hire a professional legal team, implement comprehensive data management and compliance strategies, formulate compliance strategies for specific regions, and customize our data storage services to meet different legal requirements. Third, our legal team will save and manage the GitHub application form, KYC/KYB form, signed data contract, and other materials filled out by the customer. Only when the legal team confirms that the customer's data is compliant will we proceed with the next step of cooperation. Fourth, we will ask third-party KYB service providers to download the source data provided by customers and analyze its compliance. During the cooperation process, we will randomly download sectors of the data, decompress them, and inspect the data's authenticity and compliance.

What types of data preparation will you support or require?

We have a mature technical team and rich experience in DC data processing, so we can help customers process data. We are also willing to provide our data solutions to customers to help them download and process data. When customers choose to cooperate with us, we can provide data processing toolkits, data processing guidance, and more, so customers do not need to worry about data preparation. We have a team of 20 technical engineers who can respond to and answer customer questions within one hour.

What tools or methodology will you use to sample and verify the data aligns with your pathway?

To verify the authenticity of the data, we approach it from the following aspects:

  1. We use various methods to determine the customer's data volume and ensure that we do not allocate more DC than necessary.
  2. We recommend that our team provide data processing for clients, which allows us to ensure that customers are not inflating or misusing their data space.
  3. If customers use our data processing system, we can monitor the download quantity and processing speed of their data in the system backend. By comparing the amount of data downloaded by the customer, the number of backups they have, and the amount of storage capacity they have consumed, we can assess whether the customer is downloading data honestly. For example, if a customer downloads 100 TB of data and makes 8 backups, resulting in a total of 800 TB of data, but they have consumed 1 PB of storage capacity, it suggests that the customer may have inflated their data by 25% (see the sketch after this list).
  4. We will use tools such as Lassie Fetch, Boost Retrieve, Lotus Client, and HTTP retrieval to obtain the data and compare it with its metadata, including size. We have sufficient experience in identifying sector padding and misuse. If we detect any misuse, we will immediately stop providing additional DataCap until the customer provides a detailed resolution plan.
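The consistency check in point 3 can be expressed as a short computation; the sketch below reproduces the 100 TB / 8 backups / 1 PB worked example (decimal units assumed) and is not our actual monitoring code.

```python
# Sketch of the consistency check in point 3: compare the DataCap actually
# consumed with what the downloaded source data and backup count should require.
# Decimal units (TB/PB) are used to match the worked example in the text.
TB = 10 ** 12


def inflation_ratio(downloaded_bytes: int, backups: int, consumed_bytes: int) -> float:
    """Fraction by which consumption exceeds the expected (downloads x backups) volume."""
    expected = downloaded_bytes * backups
    return (consumed_bytes - expected) / expected


if __name__ == "__main__":
    # The example from the text: 100 TB downloaded, 8 backups, 1 PB consumed.
    ratio = inflation_ratio(100 * TB, 8, 1000 * TB)
    print(f"apparent inflation: {ratio:.0%}")  # -> 25%
```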

Data Distribution

This section covers deal-making and data distribution.

As a reminder, the Filecoin Plus program currently defines distributed onboarding as multiple physical locations AND multiple storage provider entities to serve client requirements.

Recommended Minimum: 3 locations, 4 to 5 storage providers, 5 copies

How many replicas will you require to meet programmatic requirements for distribution?

5+

What geographic or regional distribution will you require?

The number of SPs needs to be greater than 5 and distributed across 3 different regions. There are no specific regions that need to be excluded.

How many Storage Provider owner/operators will you require to meet programmatic requirements for distribution?

5+

Do you require equal percentage distribution for your clients to their chosen SPs? Will you require preliminary SP distribution plans from the client before allocating any DataCap?

We adhere to a market-based adjustment principle; there are no specific requirements for equal percentages. Specifically, when customers submit their initial applications, they are required to provide an SP allocation plan that outlines the collaborating SPs, their storage capacity, and SP data backups in detail. We expect customers to adhere to the initial allocation plan. While we do not require each SP to hold an equal proportion of storage capacity, we do require SPs to ensure data backups; this means that the storage capacity and speed of at least four SPs should be relatively similar. If there is a significant discrepancy in data backups, we will request that the customer revise the allocation plan or actively seek additional SPs to participate in the storage process.

What tooling will you use to verify client deal-making distribution?

As far as we know, the bot is open source, and our team is discussing whether to integrate it. If we adopt the bot, we will follow its rules. In our opinion, the bot already has powerful functions, such as CID checking, SP distribution, retrieval queries, and data backup queries. Of course, there are also many tools in the community that support verifying clients' deal-making distribution: 1. datacapstats.io 2. filecoin.tools 3. filfox.io 4. the CID checker bot 5. https://filplus.info

How will clients meet SP distribution requirements?

We will assist customers with data processing and work together with them to serve SPs. Before implementing automated deal allocation, we will review customers and assess their reliability and compliance with community standards. If customers require technical assistance, we will provide full support. We believe that data preparation can enhance the efficiency, accuracy, and transparency of the allocation process, providing a positive experience for SPs. Additionally, we offer insights into the allocation progress through real-time monitoring and reporting tools (such as datacapstats.io). If any issues arise during the process, our technical team will promptly intervene and address them. The data transfer approach will be determined based on the specific circumstances of the customer and the SPs: if they are in the same region, we recommend disk transfer; if not, we suggest online downloading. We will also consider using dedicated software (such as a data clearinghouse) to programmatically choose and distribute data from clients to SPs.

As an allocator, do you support clients that engage in deal-making with SPs utilizing a VPN?

We do not support customers using VPN for transactions with SPs. We will utilize tools such as https://seon.io/, Filecoin Green, and Filstats.io to check whether nodes have manipulated their actual location through VPN. We will not collaborate with SPs who use VPN to modify their real address location.

DataCap Allocation Strategy

In this section, you will explain your client DataCap allocation strategy.

Keep in mind the program principle of Limited Trust Over Time. Parties, such as clients, start with a limited amount of trust and power. Additional trust and power need to be earned over time through good-faith execution of their responsibilities and transparency of their actions.

Will you use standardized DataCap allocations to clients?

Yes, standardized

Allocation Tranche Schedule to clients:

We will refer to the current allocation rules of FIL+:

The first allocation: 50% of the weekly quota;
the second: 100% of the weekly allocation;
the third: 200% of the weekly allocation;
the fourth: 400% of the weekly allocation;
and so on.

The maximum quota in each round will not exceed 2 PiB, and the next round of allocation is triggered only when the remaining DC falls below 20%. Before the next round of allocation, we will check the distribution of SPs through datacapstats.io, check SP retrieval through filecoin.tools, check whether SPs use a VPN through https://seon.io/, and so on.

Once these checks confirm that everything is reasonable and in compliance with the allocation rules, we will allocate the next round of quotas to the customer. A sketch of this tranche schedule is shown below.
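The following is a minimal sketch of that doubling schedule, assuming the "remaining below 20%" trigger refers to 20% of the previous tranche and using a hypothetical weekly figure; it is illustrative only.

```python
# Sketch of the doubling tranche schedule described above: 50% of the agreed
# weekly allocation in the first round, doubling each round, capped at 2 PiB.
# The trigger is read here as "less than 20% of the last tranche remains";
# the weekly figure in __main__ is hypothetical.
PIB = 1 << 50
TRANCHE_CAP = 2 * PIB


def tranche_size(weekly_allocation: int, round_number: int) -> int:
    """round_number starts at 1, which corresponds to 50% of the weekly allocation."""
    size = weekly_allocation * (2 ** (round_number - 1)) // 2
    return min(size, TRANCHE_CAP)


def next_tranche_due(remaining: int, last_tranche: int) -> bool:
    """The next round is triggered when less than 20% of the last tranche remains."""
    return remaining < 0.20 * last_tranche


if __name__ == "__main__":
    weekly = 1 * PIB  # hypothetical weekly allocation agreed with the client
    for rnd in range(1, 6):
        print(f"round {rnd}: {tranche_size(weekly, rnd) / PIB:.2f} PiB")
```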

Will you use programmatic or software based allocations?

Yes, standardized and software based

What tooling will you use to construct messages and send allocations to clients?

Use https://filplus.fil.org/#/ or https://filplus-registry.netlify.app/

Describe the process for granting additional DataCap to previously verified clients.

We will refer to the community rules and integrate the community's bot (https://github.com/filecoin-project/filplus-ssa-bot). Regarding the DataCap (DC) allocation process and the triggering rules for each round, we will largely follow the rules set by the bot: 1. The next round of allocation will only be triggered when the remaining DC is less than 25%. 2. Each round of allocation cannot exceed 2 PiB. We will check whether the customer's initially committed allocation plan matches the actual allocation. If they match and there is no misuse of data, the next round of allocation will be granted; if there is any misuse, allocation will be immediately halted until the customer provides a reasonable explanation.

Tooling & Bookkeeping

This program relies on many software tools in order to function. The Filecoin Foundation and PL have invested in many different elements of this end-to-end process, and will continue to make those tools open-sourced. Our goal is to increase adoption, and we will balance customization with efficiency.

This section will cover the various UX/UI tools for your pathway. You should think high-level (GitHub repo architecture) as well as tactical (specific bots and API endpoints).

Describe in as much detail as possible the tools used for: • client discoverability & applications • due diligence & investigation • bookkeeping • on-chain message construction • client deal-making behavior • tracking overall allocator health • dispute discussion & resolution • community updates & comms

Customer Discovery and Application: Customers can contact us through Slack (Lisa), GitHub (Stcloudlisa), official email, or our company's Twitter. They can also reach out to our business development department. We have branch offices in Shenzhen, Hong Kong, and Singapore.

Due Diligence: Customers are required to fill out the "ST-DC" application form. We conduct a review of the customer based on the information provided, such as their official website, establishment date, and social media presence. We also perform KYB (Know Your Business) checks using third-party platforms like LexisNexis and Qichacha. Throughout the collaboration process, we investigate the customer's allocation plans through tools like ACbot and retrieval systems.

Bookkeeping: All applications and communication records are documented on GitHub, ensuring transparency as anyone can access them. If video conferences are conducted, they are archived on Zoom. Signed contracts are stored in our email, which has cloud storage enabled for easy access at any time.

On-chain Message Construction: On-chain messages are constructed using Lotus or the official tools available at https://filplus.fil.org/#/ or https://filplus-registry.netlify.app/.

Customer Transactions and Tracking Overall Allocator Health: Customer deals can be conducted through methods such as disk transfer, online data transfer, and downloading. The allocation status of customers can be viewed on GitHub through the bot. Tools like datacapstats.io and https://dashboard.starboard.ventures/dashboard can be used for querying.

Dispute Resolution: Slack and GitHub serve as the primary platforms for community discussions. We recommend resolving disputes by creating proposals on GitHub. If the dispute is relatively simple, our PR (Public Relations) team can handle it. For more significant disputes, other notaries and the official governance team will be invited as committee members to resolve the dispute.

Community Updates: Our team stays updated on FIL+ policies and tool updates. We actively participate in discussions on Slack and GitHub and engage in notary meetings. We believe that GitHub is the best tool for publishing specific update content, while Slack and Zoom serve as ideal platforms for discussing updates.

Will you use open-source tooling from the Fil+ team?

Thanks to the FIL+ team's efforts, the community has many useful tools. We will use the FIL+ team's open-source tools to manage our own LDN:

Lassie: used to retrieve data
https://filplus.fil.org/#/: signature system for decentralized DC
filfox.io: query SP computing power and luck value
filplus-ssa-bot: query LDN balance
datacapstats.io: query SP distribution and the DC proportion used by each SP
filplus-backend: used for bookkeeping management and LDN management

Where will you keep your records for bookkeeping? How will you maintain transparency in your allocation decisions?

GitHub is an open platform that keeps allocation decisions transparent and thoroughly documented. The GitHub platform will serve as a central repository for recording our allocator operations, such as:

The ST-DC application form will be stored on GitHub. ST-DC's due diligence process will be displayed on GitHub. The time and amount of each allocation will be displayed on GitHub. GitHub will also serve as our platform for handling and resolving disputes.

GitHub's open and transparent approach not only keeps our allocator operations open but also makes our activities and decisions easily accessible to community members and the Fil+ governance team.

Some private information will be exchanged in Slack communications, and important contract documents (ID cards, legal representative information, contracts, data authorization statements) may also be retained via email. At the same time, any major conclusions or decisions reached in private conversations will be summarized and published on GitHub. If there is a dispute, we are willing to share documents from the private conversations with the governance team.

Risk Mitigation, Auditing, Compliance

This framework ensures the responsible allocation of DataCap by conducting regular audits, enforcing strict compliance checks, and requiring allocators to maintain transparency and engage with the community. This approach safeguards the ecosystem, deters misuse, and upholds the commitment to a fair and accountable storage marketplace.

In addition to setting their own rules, each notary allocator will be responsible for managing compliance within their own pathway. You will need to audit your own clients, manage interventions (such as removing DataCap from clients and keeping records), and respond to disputes.

Describe your proposed compliance check mechanisms for your own clients.

Compliance inspection mechanism: 1. Combined automatic and manual inspection: We will automatically check the distribution and compliance of data through Checker bot, and determine whether the data has been amplified by retrieving data, downloading data, and comparing metadata. 2. Regular inspections and audits: We will conduct regular inspections with customers to review their DataCap usage and storage transaction performance to ensure compliance with assigned DataCaps and their intended use.

Tracking DataCap distribution metrics: We will closely monitor metrics related to DataCap distributions, such as the number of distributions, retrieval rate, and address location distribution.

Understand customer demographics and timing metrics: We will query specific SP numbers and distribution through datacapstats.io. We will also track timing metrics related to DataCap requests, allocations, and utilization to evaluate the efficiency and responsiveness of our processes.

Trust Rating: Customers must fill out an onboarding application and undergo an onboarding check. To receive a larger DataCap allocation, customers are required to provide data processing experience.

Use of automated tools: Tools such as CID Checker and Retrievability Bot will be used to automatically monitor and verify the integrity and retrievability of stored data.

New Customer Policy and Tolerance: New customers will be subject to stricter scrutiny than existing customers. We need to clearly know who they are, their purpose of obtaining DC, and the interest groups behind them. They will need to provide evidence of their ability to responsibly load data onto the Filecoin network. This may include proof of previous small-scale data coming online.

In the ST-DC allocator pathway, the DC received by new customers in the first round will not exceed 250 TiB. Before allocating the second round, we will comprehensively and thoroughly check their first-round allocation plan.

We can accept that customers do not understand the technology, and we are willing to provide them with technical assistance, but we cannot accept abuse of DC or obtaining DC through deception. Therefore, if such a dispute arises, we will immediately stop issuing DC.

Describe your process for handling disputes. Highlight response times, transparency, and accountability mechanisms.

At ST-DC Allocator, we will establish a GitHub repository specifically for submitting dispute issues. In GitHub, we will publish a complete audit trail of dispute decisions and accept suggestions from the governance team and the community.

Submit and track disputes on GitHub: All disputes are systematically tracked and logged on GitHub. This approach ensures transparency as the wider community can understand the details of the dispute and our response.

Discuss the dispute on Slack or Zoom: If the dispute escalates and community discussion on Slack becomes heated, we will invite the governance team to join, convene a Zoom meeting, and handle the dispute as soon as possible.

Response time: We will respond to questions and handle disputes within 24 hours.

Detail how you will announce updates to tooling, pathway guidelines, parameters, and process alterations.

All updates to our tools, guides, and processes will be discussed in our GitHub issues and public Slack. We adhere to the principles of decentralization and community self-governance. We do not make significant changes unilaterally. Instead, we initiate proposals to allow community members to comment and express their opinions, particularly valuing input from the governance team. Once the issue is approved by the community, we proceed with the updates.

How long will you allow the community to provide feedback before implementing changes?

Community Interaction Channels:
GitHub issues
Slack: we will make announcements in the fil-plus and fil-fips channels.
Slack (DM): for any private communication, you can contact LISA.

We will adhere to the principle of community self-governance and respect the opinions of the community. We will publish change notifications on GitHub, Slack, and Twitter, allowing a minimum of 15 days for community feedback.

For significant changes, we will provide more time for discussion, with a minimum of 60 days.

Not all changes will be implemented: the larger the change, the more weight we place on the opinions of the community and the governance team. If there is significant opposition, we will not proceed with the change.

Regarding security, how will you structure and secure the on-chain notary address? If you will utilize a multisig, how will it be structured? Who will have administrative & signatory rights?

We use a Ledger to create and protect the address.

Ownership: LISA, Xianjun Lai (Legal Representative).
Signing authority: Xianjun Lai (Legal Representative), Yanjie Wang (Technical Lead), LISA.

The Ledger mnemonic phrase is kept by Xianjun Lai (Legal Representative). Signing requires the participation of all three parties: LISA assesses the customer's background, Yanjie Wang evaluates the customer's data (allocation plan, CIDs, retrieval), and after both LISA and Yanjie Wang agree to sign, the signing request is submitted to Xianjun Lai (Legal Representative) for the actual signing.

Will you deploy smart contracts for program or policy procedures? If so, how will you track and fund them?

We are inclined to provide the systems we have developed to customers free of charge. Whether to open-source them to the community requires discussion among team members. If we utilize open-source tools developed by the FIL+ team, we will open-source our developments and updates based on those FIL+ tools back to the community.

Monetization

While the Filecoin Foundation and PL will continue to make investments into developing the program and open-sourcing tools, we are also striving to expand and encourage high levels of service and professionalism through these new Notary Allocator pathways. These pathways require increasingly complex tooling and auditing platforms, and we understand that Notaries (and the teams and organizations responsible) are making investments into building effective systems.

It is reasonable for teams building services in this marketplace to include monetization structures. Our primary guiding principles in this regard are transparency and equity. We require these monetization pathways to be clear, consistent, and auditable.

Outline your monetization models for the services you provide as a notary allocator pathway.

ST-DC Allocator is dedicated to assisting in the storage of more valuable data on Filecoin, with the ultimate goal of establishing Filecoin as a storage cornerstone in the metaverse. As a result, we are inclined to provide DC services to high-quality customers.

To achieve this, we may implement relatively low fees that cover the operational costs of the ST-DC Allocator team. Based on our considerations, we may set the pricing at 0.8 FIL/T.

In addition, we have developed discounts specifically for premium customers, with discounts stacking up to a maximum of 60%:

  1. Extremely high-quality enterprise data (leading enterprises with a history of more than 10 years and a real data volume exceeding 2 PiB): 30% discount
  2. Comparatively high-quality enterprise data (established for more than 3 years, with over 50 employees and a real data volume exceeding 500T): 20% discount
  3. Public datasets: 10% discount
  4. Collaboration with storage providers (SPs) distributed across 3 continents: 10% discount
  5. Customers finding high-quality SPs in specific regions where SPs are less active: 10% discount
  6. Collaboration with SPs that enable fast and long-term retrieval: 10% discount

Consequently, the highest discount can reach up to 60%, with most discounts ranging from 30% to 50%. On average, the fee pricing could be approximately 0.45 FIL/T.

Describe your organization's structure, such as the legal entity and other business & market ventures.

Website: https://yunos.io/
Promotional activities: https://v.qq.com/x/page/s3219a9ooi1.html https://drive.google.com/file/d/1xQDkNcckyinYfkLuOCWF2med4dTfAHzR/view?usp=sharing
Interviews: https://v.qq.com/x/page/j3017ziwzri.html https://v.qq.com/x/page/t3219mnjjvt.html

We have been deeply involved with IPFS and Filecoin for 7 years. Since 2015, our team has been following and promoting IPFS. In October 2017, in order to better promote IPFS and Filecoin in China, we established the IPFS China Community (ipfs.cn), which now has more than 100,000 followers. From 2017 to 2021, guided by the original intention of "propagating for IPFS and giving voice to Filecoin", our team held IPFS and Filecoin themed promotional events in Shenzhen, Guangzhou, Beijing, Hangzhou, Hefei, Changsha, Wuhan, and elsewhere. From 2022 to 2023, we served as a Filecoin notary: we actively participated in the governance of the FIL+ community, attended notary meetings, and expressed our opinions on Slack and GitHub proposals. We also participated in E-FIL+ and recommended a state-owned enterprise to join the Filecoin network, while continuing our outreach work. We have offices in China and Singapore, and in the past two years we have actively organized conferences and brought enterprise customers into FIL+. In 2020, our team actively participated in the Filecoin testnet and achieved strong results in Space Race 1 and Space Race 2. On October 15, 2020, the Filecoin mainnet was officially launched, and we provide mining technical support for the mining pools FILPool and BPool, which serve Filecoin investors. Our team has been involved in the storage industry for more than 12 years and has rich experience in data storage, cluster architecture construction, and emergency handling. The storage products we have developed are now used in cooperation with the multinational giant China Unicom and serve nearly one million people. We not only have advantages in storage technology but are also committed to developing the IPFS and Filecoin ecosystem. To help fans understand IPFS & Filecoin faster, we created the IPFS China Community (ipfs.cn). To help teams with ideas develop the ecosystem, we established the IPFS Ecological Fund for investing in applications that benefit the ecosystem. In line with Filecoin's economic model, and in order to increase the liquidity of FIL, we participated in the investment and incubation of the DeFIL platform.

Where will accounting for fees be maintained?

We have a mature financial team, and cost accounting will be carried out in a dedicated and transparent financial management system to ensure that all transactions are recorded and traceable. We welcome the governance team and the community to review and supervise these records.

If you've received DataCap allocation privileges before, please link to prior notary applications.

https://github.com/filecoin-project/notary-governance/issues/764 https://github.com/filecoin-project/notary-governance/issues/429

How are you connected to the Filecoin ecosystem? Describe your (or your organization's) Filecoin relationships, investments, or ownership.

As one of the largest Storage Providers (SPs) in the Filecoin network, we have made investments in IPFSCN, DEFIL, Lingdong, Wan'anxin, Distributed Capital, and Luckyhash. We have provided detailed evidence of our investments in our Notary Node application #764.

How are you estimating your client demand and pathway usage? Do you have existing clients and an onboarding funnel?

We have provided our contact email on our official website; customers can reach us at stcloud@yunos.io. We have offices in both China and Singapore, so customers can also contact us through offline channels. Additionally, we promote our information on social media, partner networks, and online advertisements. For onboarding to the ST-DC allocator, our primary channel is: https://github.com/stcouldlisa/IPFSSTDC/issues/new/choose Regarding the signing of contracts, personal identification documents, utility bills, bank statements, and other relevant documents, we will discuss and communicate through Slack and email. However, significant decisions will be synchronized on GitHub.