keyko-io / filecoin-large-clients-onboarding


[DataCap Application] - test reconnecting issues #298

Closed fabriziogianni7 closed 2 years ago

fabriziogianni7 commented 2 years ago

Large Dataset Notary Application

To apply for DataCap to onboard your dataset to Filecoin, please fill out the following.

Core Information

Please respond to the questions below by replacing the text saying "Please answer here". Include as much detail as you can in your answer.

Project details

Share a brief history of your project and organization.

IPFS Korea was founded in 2018, and we participated in both the first- and second-level testnets. We built out mainnet infrastructure and currently operate 27 nodes. Our company is growing into a storage company based on IPFS: we build large-scale server farms, operate storage centers, and provide professional management services. We aim to stand at the forefront of the fourth industrial revolution and fulfill our role in securing Korea's data sovereignty. However, we focus on high-speed sealing, so we are not planning to become a storage provider for DataCap.
Moreover, we are expanding our business into NFTs: we will create and store NFTs (related to K-pop, art, music, etc.), and we want to store that data in Filecoin storage.

What is the primary source of funding for this project?

Parula (an NFT market), the K-pop industry, and an NFT production company.

What other projects/ecosystem stakeholders is this project associated with?

K-pop, NFTs, galleries, IPFS KOREA

Use-case details

Describe the data being stored onto Filecoin

For NFT.Storage
We offer storage for all NFT artwork on Parula (an NFT market), which should be preserved long-term on Filecoin and IPFS so that NFT owners can continually access the media they own. In the future, we may expand this to use cases broader than NFTs (related to IP, K-pop, etc.).

Where was the data in this dataset sourced from?

South Korea

Can you share a sample of the data? A link to a file, an image, a table, etc., are good ways to do this.

Yes, here is a sample:
![image](https://user-images.githubusercontent.com/96154919/146582061-616b8eb4-6787-4d5a-b453-07050c6469d1.png)

Confirm that this is a public dataset that can be retrieved by anyone on the Network (i.e., no specific permissions or access rights are required to view the data).

Yes, this is public data.

What is the expected retrieval frequency for this data?

This type of data appeals to the general public as aesthetic and entertaining content, so we expect retrieval frequency comparable to that of gallery and museum sites. This is why we chose the Filecoin network for storage: we want to retain the data over the long term and serve it to anyone who needs it, whenever they need it.

For how long do you plan to keep this dataset stored on Filecoin?

We consider it a permanent archive, and we will continually update the data stored on Filecoin.

DataCap allocation plan

In which geographies (countries, regions) do you plan on making storage deals?

All around Asia, but mostly in South Korea. 

How will you be distributing your data to storage providers? Is there an offline data transfer process?

We will use online transfer to distribute our data at first; if anything goes wrong, we will fall back to offline transfer. We do have an offline data transfer system.

How do you plan on choosing the storage providers with whom you will be making deals? This should include a plan to ensure the data is retrievable in the future both by you and others.

We will select miners through miner reputation systems such as filrep.io, or through Filecoin blockchain explorers.

How will you be distributing deals across storage providers?

As we have a large amount of data, offline transfer is probably the better fit, but we are familiar with both methods and can use either online or offline transfer.

Do you have the resources/funding to start making deals as soon as you receive DataCap? What support from the community would help you onboard onto Filecoin?

Currently, one NFT market company is waiting for these deals; they have more than 250 artists and NFT artworks. We also have connections with NFT and metaverse platform companies (related to K-pop entertainment and games) that want to start making deals as soon as possible, so we can make deals with those companies too.
large-request[bot] commented 2 years ago

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

fabriziogianni7 commented 2 years ago

Multisig Notary Reconnection Request

Multisig Notary Address

f01403239

Client Address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

Notary Governance Issue

https://github.com/keyko-io/filecoin-notaries-onboarding/issues/370

large-request[bot] commented 2 years ago

fabriziogianni7 Has not the permission to post this comment

fabriziogianni7 commented 2 years ago

fabriziogianni7 Has not the permission to post this comment

Only gov team members or assignees can trigger this flow

fabriziogianni7 commented 2 years ago

Multisig Notary Reconnection Request

Multisig Notary Address

f01403239

Notary Governance Issue

https://github.com/keyko-io/filecoin-notaries-onboarding/issues/370

large-request[bot] commented 2 years ago

DataCap Allocation requested

Multisig Notary address

f01403239

Client address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

DataCap allocation requested

50TiB

large-request[bot] commented 2 years ago

Stats for DataCap Allocation

Multisig Notary address

f01403239

Client address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

DataCap allocation requested

50TiB

Stats

| Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
|---|---|---|---|---|
| 2157 | 9 | 25TiB | 32.72 | 64GiB |
large-request[bot] commented 2 years ago

Datacap allocation triggered after issue reconnection request.

fabriziogianni7 commented 2 years ago

In this case the client needs a DataCap allocation, so the bot triggered the request and the stats comment.

large-request[bot] commented 2 years ago

This is a Issue reconnection request. the client has enough datacap. the bot will post the request when needed. Updating labels.

fabriziogianni7 commented 2 years ago

This is a Issue reconnection request. the client has enough datacap. the bot will post the request when needed. Updating labels.

In this case the client doesn't need a DataCap allocation, so the bot updated the label for the SSA bot to check this issue on its next run. When the client needs DC, the SSA bot will trigger the allocation request.
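The two branches described in these comments boil down to one decision: if the reconnected client still has enough DataCap, the bot only updates labels; otherwise it posts an allocation request. A minimal Python sketch of that logic, where every name and the threshold value are illustrative assumptions, not taken from the actual bot code:

```python
# Hypothetical sketch of the reconnection flow; names and threshold are
# illustrative, not the real large-request bot implementation.

LOW_DATACAP_THRESHOLD = 0.25  # assumed: act when < 25% of last allocation remains


def has_enough_datacap(remaining: int, last_allocation: int) -> bool:
    """Client still has 'enough' DataCap if remaining is above the threshold."""
    return remaining > last_allocation * LOW_DATACAP_THRESHOLD


def handle_reconnection(remaining: int, last_allocation: int) -> str:
    if has_enough_datacap(remaining, last_allocation):
        # Just relabel; the SSA bot will pick the issue up on its next run.
        return "update-labels"
    # Client is running low: post a new DataCap allocation request now.
    return "trigger-allocation-request"
```

Under these assumptions, a client with 64 GiB remaining out of a 25 TiB allocation would be well below the threshold, so a new request would be triggered.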

fabriziogianni7 commented 2 years ago

@galen-mcandrew @dkkapur here is a test issue explaining how the flow would work. Please have a look at the comments and feel free to give feedback.

fabriziogianni7 commented 2 years ago

The DC requested is calculated from the number of allowances and the first allowance. In this case the first allowance was 25 TiB, so this request is 50 TiB; the next will be 100 TiB.
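The doubling rule described here (25 TiB, then 50 TiB, then 100 TiB) amounts to next_request = first_allowance × 2^n, where n is the number of allowances already granted. A small illustrative sketch of that arithmetic, not the bot's actual code:

```python
# Illustrative doubling rule: each subsequent DataCap request doubles,
# starting from the first allowance. Not the real bot implementation.

TIB = 2**40  # one tebibyte in bytes


def next_request(first_allowance: int, num_allowances: int) -> int:
    """DataCap to request next, given the first allowance and how many
    allowances have already been granted."""
    return first_allowance * 2**num_allowances


# First allowance was 25 TiB: after one allowance the next request is
# 50 TiB, after two it is 100 TiB.
```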

large-request[bot] commented 2 years ago

DataCap Allocation requested

Multisig Notary address

f01403239

Client address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

DataCap allocation requested

50TiB

large-request[bot] commented 2 years ago

Stats for DataCap Allocation

Multisig Notary address

f01403239

Client address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

Last two approvers

not found & not found

DataCap allocation requested

50TiB

Stats

| Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
|---|---|---|---|---|
| 2157 | 9 | 25TiB | 32.72 | 68719476736 (64 GiB) |
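Note that this stats comment reports Remaining DC as a raw byte count, whereas the earlier one showed 64GiB; the two values are the same. A quick conversion check:

```python
GIB = 2**30  # one gibibyte in bytes

raw_remaining = 68719476736  # Remaining DC as reported, in raw bytes
remaining_gib = raw_remaining // GIB  # 64, i.e. the 64GiB shown earlier
```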
fabriziogianni7 commented 2 years ago

DataCap Allocation requested

Multisig Notary address

f01403239

Client address

f3r5wyvf4nhgbuk72wmvheju5uhur4ifatnorqpkzans2gwup6ehxvzmbie2mtmshdggukaxyekzmgttcqsraa

DataCap allocation requested

50TiB

This is the SSA bot running. It is requesting 50 TiB because it looks at the number of allowances (just one for now), but in a real case it would be 100 TiB.

github-actions[bot] commented 2 years ago

This application has not seen any responses in the last 20 days, so for now it is being closed. Please feel free to re-open if this is relevant, or start a new application for DataCap anytime. Thank you!