NateWebb03 / FilTestRepo

A test repository for allocator application automation

Test app 1059 #1061

Open NateWebb03 opened 5 months ago

NateWebb03 commented 5 months ago

Notary Allocator Pathway Name:

Protocan

Organization:

Protocan Labs

Allocator's On-chain address:

f172ej7cryhfuymdypv3euuwkeh5rxkhbqdyv7d7a

Country of Operation:

Singapore

Region(s) of operation:

Africa, Asia minus GCR, Greater China, Europe, Oceania, Japan, North America, South America, Other

Type of allocator: What is your overall diligence process? Automated (programmatic), Market-based, or Manual (human-in-the-loop at some phase). Initial allocations to these pathways will be capped.

Manual

Amount of DataCap Requested for allocator for 12 months:

100PiB

Is your allocator providing a unique, new, or diverse pathway to DataCap? How does this allocator differentiate itself from other applicants, new or existing?

The development of the Filecoin network requires storing more useful and valuable data, which means transitioning from storing cold data to storing hot data. With the continued development of the retrieval network, Filecoin has gained the capability to store valuable, useful, and highly retrievable data. To achieve this goal, we have adopted the following strategies and measures:

• Uniqueness and novelty: We store data from public datasets on the Filecoin network, such as scientific research data and astronomical data. We seek out datasets that are unique and novel to avoid duplicating storage of the same data, which would waste resources, and we explore entirely new public dataset websites and store their data on the Filecoin network.

• Diversity: We actively seek enterprise data and store it on the Filecoin network. We are committed to storing enterprise data from various industries on the Filecoin network, including architecture design, media advertising, schools, AI intelligence, healthcare, online education, and more.

• High inclusivity: We are actively developing storage systems for individuals to help more people join the Filecoin network. Our LDNs are well documented and managed by colleagues who can quickly identify whether a client is new or existing and assess their social reputation. We customize solutions for clients based on their specific circumstances.

As a member of the Filecoin community, I acknowledge that I must adhere to the Community Code of Conduct, as well as other End User License Agreements for accessing various tools and services, such as GitHub and Slack. Additionally, I will adhere to all local and regional laws and regulations that may relate to my role as a business partner, organization, notary, or other operating entity. * You can read the Filecoin Code of Conduct here: https://github.com/filecoin-project/community/blob/master/CODE_OF_CONDUCT.md

Acknowledgment: Acknowledge

Client Diligence Section:

This section pertains to client diligence processes.

Who are your target clients?

Individuals learning about Filecoin, Small-scale developers or data owners, Enterprise Data Clients

Describe in as much detail as possible how you will perform due diligence on clients.

Any client who wishes to obtain DataCap must fill out an application form and undergo a review process. We firmly believe that all clients must go through a rigorous process to qualify. Here are the three principles we follow in our due diligence investigations:

• Information retention and traceability: We store all information on GitHub, Slack, and email. Publicly available information such as addresses, companies, industries, and social media will be kept on GitHub for the governance team to access at any time. Sensitive information such as identification cards and contact numbers will be retained via email for immediate contact in case of disputes.

• Reputation for existing clients, KYB for new clients: For existing clients, we prioritize community reputation, focusing on their historical reputation, retrievability, luck value, and computational maintenance (i.e., the technical capabilities of their team). If a client has a good community reputation, we will continue to cooperate with them; clients who demonstrate dishonesty will be blacklisted. New clients are subject to detailed KYB authentication: they are required to answer a series of questions, and we get to know them thoroughly before establishing trust.

• Dispute handling: In case of disputes, whether with new or existing clients, we immediately halt the distribution of DataCap and form a five-member investigation team to conduct an inquiry. If necessary, we may involve the governance team in the investigation or discuss it during the notary council meeting.

Below, I will outline the information we collect when conducting due diligence investigations on clients.
For individual clients, KYC will be verified through the following means:
• Name
• Nationality
• School
• Occupation and work address
• Identification card
• Social media accounts (Facebook/LinkedIn/Twitter)
• Contact information for local offices
• Whether the types of data they store align with their personal and professional activities

For corporate clients, KYB will be verified through the following means:
• Business license, including establishment date, address, industry, and taxpayer identification number
• Website
• Social media accounts
• Corporate legal representative information: ID number
• Authorized signatory information: ID card and photo identification
• Evidence that the authorized signatory has the authority to act on behalf of the applying entity
• Emails sent from the company's domain name to confirm the application is from the company
• Checks on the company's operational status and any litigation records through channels such as corporate searches, to determine its industry reputation
• Company's business direction and revenue
• Company's goals and main business activities

Our team will have dedicated personnel to verify and assess the above information.

Please specify how many questions you'll ask, and provide a brief overview of the questions.

Company:
• Introduce your company
• Website
• How many employees does your company have?
• On-chain address for the first allocation
• What is your industry?
• What type of data do you want to store?
• Why does your company store data on the Filecoin network?
• How do you verify your identity?
• Can you send us emails using your business domain email?

Data:
• Total amount of DataCap being requested
• Expected size of a single dataset (one copy)
• Weekly allocation of DataCap requested
• How much data does your company have?
• Can you provide data examples?
• Has this dataset been stored on Filecoin before?
• How long do you plan to store it?
• What is the expected retrieval frequency for this data?
• Where is your data currently stored?
• Where do you want your data to be stored?
• How many backups do you want of your data?
• Confirm that this is a public dataset that can be retrieved by anyone on the network

Technology:
• How many members are on your technical team?
• Please list the storage providers you cooperate with
• How will the data be prepared? Please include tooling used and technical details
• How will you be distributing your data to storage providers?
• Do you need our help? Please list your needs in detail
• Can you confirm that you will follow the Fil+ guidelines?

Will you use a 3rd-party "Know your client" (KYC) service?

KYC services are a common due diligence mechanism that involves partnering with reliable third parties to provide comprehensive and reliable customer background checks and verification. We will make full use of third-party services to verify the identity and compliance of our clients, collaborating with reputable KYC service providers to ensure the accuracy and credibility of identity verification and compliance assessments.

For clients in China, we will primarily rely on the following three channels for inquiries:
• The National Enterprise Credit Information Publicity System: an official website operated by the Chinese government, providing public information such as company registration details, credit ratings, and administrative penalties.
• The Administration for Industry and Commerce: local departments are also important sources of KYC information about companies. Through their websites or offline service windows, we can access details such as company registration information, business scope, and legal representatives.
• Professional business search websites such as Qichacha (https://www.qcc.com/), Tianyancha (https://www.tianyancha.com/), and Qixinbao (https://www.qixin.com/).

For clients in other countries, such as the United States and Singapore, we will primarily rely on the following three avenues to verify their KYC information:
• Business registries, such as the U.S. business registries and the Accounting and Corporate Regulatory Authority (ACRA) in Singapore. These government-operated official websites provide information about company registration, business licenses, tax registrations, and relevant permits.
• Regulatory bodies, such as the U.S. Securities and Exchange Commission (SEC) and the Singapore Exchange (SGX). For companies that are listed or engaged in securities business, the SEC provides publicly accessible databases containing registration documents, annual reports, insider trading records, and more.
• Business credit reporting agencies: virtually every country has business credit reporting agencies (such as Experian and DP Information Group) that provide business credit reports and KYC services. These reports include the company's credit rating, financial status, and operational risks.

Additionally, we will utilize the following three avenues to gather information about our clients. First, we will use professional address verification tools such as https://seon.io/ to detect whether a client is using a VPN. Second, for clients applying for e-filing, we will communicate with Kevin-z and utilize tools like Synaps (https://efilplus.synaps.me/signup) and Diro (https://diro.io/). Third, we are actively researching bot systems and AC bots; these systems should be open source, and we plan to integrate and further enhance the bot to make it more intelligent.

Can any client apply to your pathway, or will you be closed to only your own internal clients? (eg: bizdev or self-referral)

We are open to all clients, internal or external, who can apply for our services through various channels: by contacting our Business Development department (bizdev) or through self-referral. Dedicated personnel will handle the application process, and we will respond to applications within 24 hours. All applicants, regardless of status, are treated equally and must submit their applications on our GitHub issues platform. This platform is publicly accessible, and the Fil+ governance team and the Filecoin Foundation can review the issues at any time.

How do you plan to track the rate at which DataCap is being distributed to your clients?

We will establish our own GitHub Filecoin project group, where clients will fill out the application form, and we will closely monitor clients' progress on GitHub. The bot is open source, and we will integrate it into our GitHub project, respecting its existing strategy for monitoring each client's remaining available quota. The distribution to clients can also be viewed at https://datacapstats.io/. Only when the remaining available quota falls below 25% will the bot trigger a request for the next round of quota allocation, and we will only consider granting the next batch of DataCap when the bot triggers the request.
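The 25% refill trigger described above can be sketched as follows. This is a minimal illustration, not the actual bot: the function name and example numbers are hypothetical, and in practice the open-source bot computes remaining quota from on-chain allocation state.

```python
REFILL_THRESHOLD = 0.25  # request the next tranche when remaining quota drops below 25%

def should_request_next_tranche(granted_bytes: int, used_bytes: int) -> bool:
    """Return True when the client's remaining DataCap falls below 25%
    of the most recent tranche, mirroring the bot's trigger rule."""
    if granted_bytes <= 0:
        return False
    remaining = granted_bytes - used_bytes
    return remaining / granted_bytes < REFILL_THRESHOLD

# Example: 100 units granted, 80 already spent -> 20% left -> trigger
print(should_request_next_tranche(100, 80))  # True
print(should_request_next_tranche(100, 50))  # False
```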

Data Diligence

This section will cover the types of data that you expect to notarize.

As a reminder: the Filecoin Plus program defines quality data as all content that meets local regulatory requirements AND:
• the data owner wants to see it on the network, including private/encrypted data,
• or it is open and retrievable,
• or it demonstrates proof of concept or utility of the network, such as efforts to improve onboarding.

As an operating entity in the Filecoin Community, you are required to follow all local & regional regulations relating to any data, digital and otherwise. This may include PII and data deletion requirements, as well as the storing, transmitting, or accessing of data.

Acknowledgement: Acknowledge

What type(s) of data would be applicable for your pathway?

Public Open Dataset (Research/Non-Profit), Public Open Commercial/Enterprise, Private Commercial/Enterprise, Private Non-Profit/Social Impact

How will you verify a client's data ownership? Will you use 3rd-party KYB (know your business) service to verify enterprise clients?

To verify the ownership of a client's data, we will take a series of measures to confirm the client's claims regarding the data source. KYB verification, such as Diro (https://diro.io/), is just one of the methods we will use. The specific measures are:
• GitHub application form: clients must complete the GitHub application form, and we will review the client's information against it.
• Data examples and capacity proof: clients must provide an adequate number of data examples as well as proof of data capacity. A minimum of 1 TiB of data examples is required.
• Signing of a cooperation agreement: we require clients to explicitly declare their ownership of the data in the contract or other legal documents.
• Data authorization and licensing: before establishing a cooperative relationship, we will request relevant data authorization and licensing documents. These may include business licenses, contracts with official seals, recorded videos, etc. They should clearly specify the scope, purpose, and time limits of the client's authorization for us to use their data, with particular emphasis on data sovereignty.
• Use of enterprise email for document submission: the signing of the cooperation agreement and the submission of data authorization documents must be done through the client's enterprise email to ensure the authenticity and legality of the documents.
• Business information verification: we will use professional business credit systems, commercial registration bureaus, and other enterprise inquiry websites to verify the client's information.
• KYB system authentication: we will utilize KYB systems like Diro (https://diro.io/) for client authentication and verification.

Through these measures, we will validate the client's data ownership and protect their legitimate rights and data security.

How will you ensure the data meets local & regional legal requirements?

To ensure compliance with local and regional legal requirements, we take the following measures:
1. Legal compliance team: a dedicated team monitors our data processing activities, stays updated on relevant regulations and legal changes, and develops corresponding policies and procedures to keep those activities compliant.
2. Data classification and labeling: we classify and label data to differentiate types and apply protection measures appropriate to their sensitivity. This includes distinguishing personally identifiable information, sensitive business data, and other legally protected data from general business data, and setting specific processing and storage requirements accordingly.
3. Data protection policies and control measures: we have established clear data protection policies and implemented appropriate technical and organizational measures to safeguard the confidentiality, integrity, and availability of data, including access controls, encryption, data backups, and disaster recovery.
4. Risk assessment and compliance review: we conduct regular risk assessments and compliance reviews, assessing the risks associated with data collection, storage, transmission, and processing, and taking corrective measures where necessary.
5. Compliance training and awareness: we train employees on data protection principles, privacy regulations, and security best practices to ensure they understand and comply with local and regional legal requirements.

In summary, we ensure compliance through a legal compliance team, data classification and labeling, data protection policies and controls, risk assessments and compliance reviews, and compliance training. We are committed to safeguarding the security and compliance of client data and adhering to applicable laws and regulations.

What types of data preparation will you support or require?

1. If the client has their own technical team, we ask them to use CAR files as the final encapsulated data type. We do not limit the size of the CAR files; a single file only needs to be smaller than the maximum sector size of 32 GiB or 64 GiB.
2. If the client does not have a technical team, or their technical solution is not mature enough, we will provide a tool for packaging CAR files. We have a mature CAR file packaging and conversion tool that can easily package source files into CAR files and generate the corresponding metadata files for deal-making and later data queries.
3. For small application-type data, such as NFT image data, we provide a platform service with automated processing. Clients upload source data through the service interface, and our service automatically packages and distributes the data to SPs. It also provides an asynchronous interface for clients to obtain real-time task progress and details about the encapsulated and stored data.
4. We have offices in mainland China, Hong Kong, and Singapore. If clients need it, we can send technical personnel to the client's office to assist.

What tools or methodology will you use to sample and verify the data aligns with your pathway?

We will use filplus-backend to manage LDNs, https://filecoin.tools/ to query deal data, and tools such as lassie fetch, boost retrieve, lotus client, and HTTP retrieval to fetch data. Manual inspection of data is essential. Based on the detailed information provided by the client when applying for DataCap, we will spot-check a certain proportion of deals at each quota issuance stage after the initial stage, comparing the actual data against the data type declared by the client. We will retain each round of sampled data until the end of the LDN; comparing samples across stages helps identify whether the same data has been superficially modified to generate different piece CIDs, i.e., whether files are being reused. We have ample experience identifying sector filling and abuse, and have seen various abuse methods, such as padding with junk data, filler images, and inflated MP4s. If abuse is found, we will stop the next batch of quota and reduce the unused DataCap amount.
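The spot-check bookkeeping described above can be sketched as follows. This is a hypothetical illustration: the record shapes and function names are assumptions, and in practice deal records would come from chain explorers and the retained sample archives. The idea is to draw a reproducible sample each tranche and flag piece CIDs that reappear across sampling rounds, a sign of possible data reuse.

```python
import random

def sample_deals(deals: list[dict], fraction: float = 0.1, seed: int = 0) -> list[dict]:
    """Pick a reproducible random sample of deals for manual inspection."""
    rng = random.Random(seed)
    k = max(1, int(len(deals) * fraction))
    return rng.sample(deals, k)

def find_repeated_pieces(rounds: list[set[str]]) -> set[str]:
    """Return piece CIDs that appear in more than one sampling round,
    which would warrant a closer look for disguised duplicates."""
    seen: set[str] = set()
    repeated: set[str] = set()
    for round_cids in rounds:
        repeated |= seen & round_cids
        seen |= round_cids
    return repeated
```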

Data Distribution

This section covers deal-making and data distribution.

As a reminder, the Filecoin Plus program currently defines distributed onboarding as multiple physical locations AND multiple storage provider entities to serve client requirements.

Recommended Minimum: 3 locations, 4 to 5 storage providers, 5 copies

How many replicas will you require to meet programmatic requirements for distribution?

5+

What geographic or regional distribution will you require?

We have served 80+ SPs, distributed across mainland China, Hong Kong, Singapore, Japan, the United States, and other regions. Decentralized storage and backup contribute to the security of data storage. We require SPs to be located on three different continents and in five different countries, and the proportion held by any single SP cannot exceed 25%. We exclude no areas; all regions are treated equally.
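The geographic rules above can be expressed as a simple check. This is a minimal sketch under assumed record fields ({"id", "continent", "country", "bytes"}); real continent/country data would come from the client's SP distribution plan and our verification.

```python
def check_geography(sps: list[dict]) -> list[str]:
    """Return a list of rule violations; an empty list means compliant.
    Rules: >= 3 continents, >= 5 countries, no SP above 25% of the data."""
    violations = []
    total = sum(sp["bytes"] for sp in sps)
    if len({sp["continent"] for sp in sps}) < 3:
        violations.append("fewer than 3 continents")
    if len({sp["country"] for sp in sps}) < 5:
        violations.append("fewer than 5 countries")
    for sp in sps:
        if total and sp["bytes"] / total > 0.25:
            violations.append(f'{sp["id"]} holds more than 25% of the data')
    return violations
```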

How many Storage Provider owner/operators will you require to meet programmatic requirements for distribution?

5+

Do you require equal percentage distribution for your clients to their chosen SPs? Will you require preliminary SP distribution plans from the client before allocating any DataCap?

Clients must provide a distribution plan, and the following four distribution bottom lines must be adhered to during the packaging process:
a. The maximum amount of data stored with a single SP cannot exceed 25%.
b. The data duplication rate cannot exceed 20%.
c. The retrieval rate must be greater than 60%.
d. At least 4 copies of the data must be backed up.

We will not force every SP to receive the same percentage, because SPs can offer very different amounts of space: some can provide 10 PiB+ for packaging while others may have less than 1 PiB, and some SPs may stop sealing partway through. Requiring the same percentage for every SP is therefore impractical. Before allocating any DataCap, we require clients to provide a preliminary SP allocation plan, and we will investigate case by case whether these SPs meet the requirements, including geographic location and whether they belong to different institutional entities. We require clients to strictly abide by the distribution plan they provide, and we will check it carefully; if a client's actual cooperation differs from their commitment, we will stop distribution until the client provides an explanation. Finally, if the packaging space gap between SPs is relatively large, we will classify the SPs according to their capabilities to ensure that the client's data can be allocated in roughly equal percentages within each class.
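The four bottom lines above can be evaluated as a single compliance check. This is a sketch, not production tooling: the metric names are assumptions, and in practice the numbers would come from the CID checker bot and retrieval testing.

```python
def check_bottom_lines(max_sp_share: float, dup_rate: float,
                       retrieval_rate: float, replicas: int) -> dict[str, bool]:
    """Evaluate the four distribution rules; True means satisfied."""
    return {
        "max_sp_share <= 25%": max_sp_share <= 0.25,
        "duplication <= 20%": dup_rate <= 0.20,
        "retrieval > 60%": retrieval_rate > 0.60,
        "replicas >= 4": replicas >= 4,
    }

def compliant(report: dict[str, bool]) -> bool:
    """A client passes only when every rule holds."""
    return all(report.values())
```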

What tooling will you use to verify client deal-making distribution?

https://datacapstats.io/, https://filecoin.tools/, and the CID checker bot can be used to verify client deal-making distribution. The baseline is that deal distribution complies with community norms; on top of this, we pay close attention to whether clients' actual deal data shows any abuse.

How will clients meet SP distribution requirements?

These are our solutions:
1. Mail hard disks. Store the processed CAR files on hard disks and mail them offline to the SP. This method requires negotiation with the SP in advance; currently, most SPs in China support it. If disks need to reach SPs in other countries, distribution can go through the dedicated network room in the second option below.
2. Dedicated network download. We host a batch of storage servers in a Hong Kong data center with 1000 Mbps uplink and downlink bandwidth, and provide file services through an NGINX file server. We can offer users file transfer services; we will charge a service fee to cover operating costs.
3. For some large public datasets, we do secondary development based on https://github.com/karust/gogetcrawl. After downloading a batch of data, it automatically packages it and generates CAR files. As long as the same download parameters are set, the same CAR files with the same piece CID can be downloaded and generated, which ensures that different SPs can download and encapsulate the same file. This method only requires the SP to have good download bandwidth; we recommend at least 500 Mbps.
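The determinism claim in option 3 can be illustrated as follows: if two SPs crawl with identical parameters and the packing step is deterministic (e.g., chunks are ordered canonically), they produce byte-identical archives and hence the same piece CID. This is a toy sketch; sha256 stands in for the real CommP (piece commitment) computation, and the packing function is hypothetical.

```python
import hashlib

def pack(chunks: list[bytes]) -> bytes:
    """Deterministically assemble downloaded chunks in canonical (sorted) order."""
    return b"".join(sorted(chunks))

def pseudo_piece_cid(archive: bytes) -> str:
    """Stand-in for the piece CID: identical bytes yield identical digests."""
    return hashlib.sha256(archive).hexdigest()

# Two SPs download the same chunks, possibly in a different order:
sp_a = pack([b"page1", b"page2", b"page3"])
sp_b = pack([b"page3", b"page1", b"page2"])
print(pseudo_piece_cid(sp_a) == pseudo_piece_cid(sp_b))  # True
```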

As an allocator, do you support clients that engage in deal-making with SPs utilizing a VPN?

We strictly require clients to truthfully report their SPs' physical locations and any interest relationships. We require SPs to be distributed across three continents to meet the requirements of decentralization and diversity, but this is not a mandatory precondition: if the SPs a client works with span only two continents, we expect the client to report this truthfully rather than faking locations through a VPN.

DataCap Allocation Strategy

In this section, you will explain your client DataCap allocation strategy.

Keep in mind the program principle of Limited Trust Over Time. Parties, such as clients, start with a limited amount of trust and power. Additional trust and power need to be earned over time through good-faith execution of their responsibilities and transparency of their actions.

Will you use standardized DataCap allocations to clients?

No, client specific

Allocation Tranche Schedule to clients:

Will you use programmatic or software based allocations?

Yes, standardized and software based

What tooling will you use to construct messages and send allocations to clients?

We prefer to use https://filplus.fil.org/#/ to send quota signature messages, provided it can specify a particular notary ID. https://filplus-registry.netlify.app/ also looks good. We can also issue quota by adding a dedicated signing method on Lotus.

Describe the process for granting additional DataCap to previously verified clients.

We will initiate subsequent allocation requests based on community standards, monitoring the remaining DataCap through the https://github.com/filecoin-project/filplus-ssa-bot bot. When it falls below 25%, the next round of allocation requests will be triggered.

Tooling & Bookkeeping

This program relies on many software tools in order to function. The Filecoin Foundation and PL have invested in many different elements of this end-to-end process, and will continue to make those tools open-sourced. Our goal is to increase adoption, and we will balance customization with efficiency.

This section will cover the various UX/UI tools for your pathway. You should think high-level (GitHub repo architecture) as well as tactical (specific bots and API endpoints).

Describe in as much detail as possible the tools used for: • client discoverability & applications • due diligence & investigation • bookkeeping • on-chain message construction • client deal-making behavior • tracking overall allocator health • dispute discussion & resolution • community updates & comms

Will you use open-source tooling from the Fil+ team?

Yes. We will use open-source tools such as the AC bot, the CID checker, and https://datacapstats.io/; these tools help track clients' DataCap usage in real time. GitHub will be used for bookkeeping, and Tencent Sheets for collaboration within our group.

Where will you keep your records for bookkeeping? How will you maintain transparency in your allocation decisions?

• Public platforms (GitHub): we will use public platforms to record and display public information for community transparency and ongoing auditing. This may include summaries of allocation decisions, overall allocation plans, allocation timelines, and more. We will strive to keep this information updated and accessible in a timely manner for community members to review and supervise.
• Private communication channels (email, Slack, Telegram): for sensitive or restricted-access information, we will use private channels to communicate with relevant parties. These channels provide higher levels of privacy and security, ensuring that sensitive information is not accessed by unauthorized individuals. For example, we may interact with the Fil+ governance team via email and provide necessary information as requested to address disputes or conduct audits.
• Information content: we will determine which information should be made public based on the requirements and guidance of the Fil+ governance team, providing partial information as needed. Not all due diligence and client information needs to be public, so we will disclose selectively. This balances the need for transparency with protecting client privacy.
• Requesting more information: if needed, relevant parties can request more information from us through appropriate channels, by emailing our contact address, submitting request forms, or communicating directly with our team. We will provide the necessary information and meet auditing and dispute-resolution requirements within reasonable bounds.

In summary, we will record public information on public platforms and share sensitive information with relevant parties through private channels. We will determine the scope and manner of disclosure based on the requirements and guidance of the Fil+ governance team, and accept requests for more information through appropriate channels, in order to maintain transparency in allocation decisions while balancing transparency with privacy protection.

Risk Mitigation, Auditing, Compliance

This framework ensures the responsible allocation of DataCap by conducting regular audits, enforcing strict compliance checks, and requiring allocators to maintain transparency and engage with the community. This approach safeguards the ecosystem, deters misuse, and upholds the commitment to a fair and accountable storage marketplace.

In addition to setting their own rules, each notary allocator will be responsible for managing compliance within their own pathway. You will need to audit your own clients, manage interventions (such as removing DataCap from clients and keeping records), and respond to disputes.

Describe your proposed compliance check mechanisms for your own clients.

For LDN management, we will continue to provide an application portal on GitHub and use filplus-backend for management.
We plan to integrate a bot which, being open source, will give us a clear view of CID sharing, retrieval status, data backups, and duplicate data.
We will use https://datacapstats.io/ and filplus-backend to assess the usage, distribution, and proportions of storage providers (SPs) in order to evaluate their trustworthiness.
For both new and existing clients, we believe certain rules must be followed. We will provide comprehensive training, technical support, and code services to new clients, and we will clearly communicate our principles, which include:
a. Each SP's storage share should not exceed 25% of the total data.
b. The data duplication rate should not exceed 20%.
c. The retrieval rate should be greater than 60%.
d. Data backups should be maintained in at least four copies.
e. CID sharing is not allowed.
If clients adhere to these rules, we will guide and coordinate to address any other issues that arise. However, if clients violate fundamental rules such as the ban on CID sharing, we may consider terminating the cooperation.
Honesty is of utmost importance for both new and existing clients. We hope clients communicate sincerely with us so that we can collaboratively resolve any technical issues.

Describe your process for handling disputes. Highlight response times, transparency, and accountability mechanisms.

We aim to handle disputes promptly and constructively.
When a dispute occurs, we will respond within 72 hours. To resolve it, we will first establish the details of the dispute and identify the related parties, its nature, and the conditions involved. Because we keep detailed records of every deal, we have a full view of each case, and the data we record can support our explanation of any dispute.
For disputes involving a client, we will respond within 72 hours and provide a reasonable explanation to the challenger via GitHub/Slack.
For disputes involving our allocator process, we will respond within 48 hours and provide a reasonable explanation to the challenger via email: protocanlabs@outlook.com.

Detail how you will announce updates to tooling, pathway guidelines, parameters, and process alterations.

When we have an update to tooling, pathway guidelines, parameters, or processes, we will announce it publicly via GitHub, our website, and Slack, describing the specific parts being updated and the functionality that will be implemented afterwards. Before a specific update goes live, we will also contact the Fil+ governance team via email to confirm that the update can proceed. After the update is complete, we will promptly inform the clients who have applied to us via Slack or email.

How long will you allow the community to provide feedback before implementing changes?

We allow one month for the community to provide feedback before changes are implemented, giving the community enough time to voice their opinions. We will also post pending updates on our website, GitHub, and Slack, and engage in interactive discussions with the community to ensure the changes are well understood. The community can also send us feedback via email (pangodgroup@outlook.com).
We will balance the opinions of community members with our goals for participating in Filecoin, and aim to make improvements that benefit the Filecoin ecosystem.

Regarding security, how will you structure and secure the on-chain notary address? If you will utilize a multisig, how will it be structured? Who will have administrative & signatory rights?

We will protect the on-chain notary address with a multisig to ensure secure and standardized signing. The multisig process takes place within our company's senior management team via internal OA approval: the operations team submits the allocator request, and the management team approves it, copying the company's audit team. After internal approval, the management team selects two members as the multisig signers who grant the DataCap.

Will you deploy smart contracts for program or policy procedures? If so, how will you track and fund them?

We plan to deploy smart contracts to track DataCap allocation and ensure compliant and fair distribution. The relevant smart-contract code will be stored and made available on our GitHub repository page, as we aim for transparent community participation. We will set aside a dedicated budget for contract development costs and have a dedicated finance team manage the accounts for this.
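As a rough illustration of the allocation tracking such a contract would implement, here is a minimal off-chain sketch. The real contract would target the FVM; the class and method names here are hypothetical and exist only to show the intended bookkeeping.

```python
class DataCapLedger:
    """Minimal sketch of the allocation tracking a smart contract might implement."""

    def __init__(self, total_datacap: int):
        self.remaining = total_datacap           # undistributed DataCap (bytes)
        self.allocations: dict[str, int] = {}    # client address -> total allocated

    def allocate(self, client: str, amount: int) -> None:
        """Record an allocation, refusing over-distribution."""
        if amount <= 0:
            raise ValueError("allocation must be positive")
        if amount > self.remaining:
            raise ValueError("insufficient remaining DataCap")
        self.allocations[client] = self.allocations.get(client, 0) + amount
        self.remaining -= amount
```

Recording every allocation against a fixed total is what makes the distribution auditable: anyone can recompute `remaining` from the allocation history.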

Monetization

While the Filecoin Foundation and PL will continue to make investments into developing the program and open-sourcing tools, we are also striving to expand and encourage high levels of service and professionalism through these new Notary Allocator pathways. These pathways require increasingly complex tooling and auditing platforms, and we understand that Notaries (and the teams and organizations responsible) are making investments into building effective systems.

It is reasonable for teams building services in this marketplace to include monetization structures. Our primary guiding principles in this regard are transparency and equity. We require these monetization pathways to be clear, consistent, and auditable.

Outline your monetization models for the services you provide as a notary allocator pathway.

If clients do not have the data-preparation (DP) capability to prepare the data themselves and need us to provide data preparation services, they pay us a data fee of 10 USDT/TiB. If clients need us to ship hard drives to SPs, we will charge them the courier fee, plus a labour fee of 100 USDT/day. Our fees are set to cover the cost of the services and labour we provide and to ensure that we can maintain a high-quality, sustainable service.
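The fee model above reduces to simple arithmetic. The sketch below uses the rates stated in the text (10 USDT/TiB, 100 USDT/day); the courier fee is a pass-through input, and the function name is illustrative.

```python
def quote_service_fee(data_tib: float, labour_days: int,
                      courier_fee_usdt: float = 0.0,
                      needs_data_prep: bool = True) -> float:
    """Estimate total service fees in USDT under the stated pricing:
    10 USDT/TiB for data preparation, 100 USDT/day labour, plus any
    courier fee for shipping drives to SPs (passed through at cost)."""
    DATA_PREP_RATE = 10.0    # USDT per TiB
    LABOUR_RATE = 100.0      # USDT per day
    total = labour_days * LABOUR_RATE + courier_fee_usdt
    if needs_data_prep:
        total += data_tib * DATA_PREP_RATE
    return total
```

For example, preparing 100 TiB with 2 days of labour and a 50 USDT courier charge would come to 1250 USDT under these rates.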

Describe your organization's structure, such as the legal entity and other business & market ventures.

I'm Tony, Co-founder and VP of Protocan Labs. We are committed to providing professional Internet advertising services for enterprises and institutions of all kinds, and we currently have about 35 employees. Since its establishment in August 2003, the company has followed a business philosophy grounded in honesty, aiming to serve traditional industries, promote innovative development, integrate network marketing, and provide professional services. We adhere to a "customer first" service concept, continually improving our professional level and service quality, and embracing the Internet and innovative technology. As a result, we have built a strong service brand and corporate image and established a good reputation among both the industry and our customers.

Where will accounting for fees be maintained?

A professional audit team within our company will account for the fees. We will publish all records on our website and release the audits regularly. The process: our audit team reviews transaction details and the flow of funds on a quarterly basis, forms an audit opinion, generates a report, and publishes it on our website. This provides a clear audit trail for Fil+ governance and stakeholders, and gives transparency and ease of audit for all monetization processes.

If you've received DataCap allocation privileges before, please link to prior notary applications.

https://github.com/filecoin-project/notary-governance/issues/682

How are you connected to the Filecoin ecosystem? Describe your (or your organization's) Filecoin relationships, investments, or ownership.

We previously served as a notary and successfully onboarded data.

How are you estimating your client demand and pathway usage? Do you have existing clients and an onboarding funnel?

As a V4 notary, we have signed many DataCap allocations and are well aware of clients' current demand for DataCap. Some large organizations need to store corporate cold data, and Filecoin is a good choice for them; using DataCap also saves them a large investment in storage machines. AIGC is currently very popular, and AIGC training requires large amounts of data, so we believe this is also a promising direction. Most customers applying for DataCap are looking for cold storage to save money compared with traditional storage providers.