fidlabs / Enterprise-Data-Pathway


[DataCap Application] <Akave, INC> <2024-09-23T17:03:47.891Z> #71

Open martapiekarska opened 1 month ago

martapiekarska commented 1 month ago

Version

2024-09-23T17:03:47.891Z

DataCap Applicant

@stef-sf

Data Owner Name

Akave, INC

Data Owner Industry

IT & Technology Services

Website

akave.ai

Social Media Handle

@akavenetwork

Social Media Type

Twitter

What is your role related to the dataset

Data onramp entity that provides data onboarding services to multiple clients

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

50TiB

Number of replicas to store

2

Weekly allocation of DataCap requested

50TiB

On-chain address for first allocation

f1j44hd5rxindqftcnp2einqvei4wjrlk5mcgutey

Data Type of Application

Private Commercial/Enterprise

Identifier

Share a brief history of your project and organization

Akave emerged from the need to solve the data management problems users face when taking back control of their data assets. Akave is a protocol that provides businesses and users with a decentralized storage layer and data management tools for efficiently managing on-chain data lakes.

Is this project associated with other projects/ecosystem stakeholders?

Yes

If answered yes, what are the other projects/ecosystem stakeholders

Private datasets

Where was the data currently stored in this dataset sourced from

AWS Cloud, Google Cloud, Azure Cloud, My Own Storage Infra, Other

If you answered "Other" in the previous question, enter the details here

Customer on-premises or colo infrastructure

If you are a data preparer, what is your location (Country/Region)?

United States

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

Data will be ingested over S3 and converted into CAR files by the Akave stack.
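
For illustration only (the Akave preparation tooling itself is not described further in this application), the sketch below shows one plausible version of that flow under stated assumptions: an object is pulled from S3 with the AWS SDK for Go and then packed into a CAR file by shelling out to the go-car CLI's `car create` command. The bucket, key, and file names are placeholders, and the `-f` output flag should be checked against the installed go-car version.

```go
// Sketch only: pulls one object from S3 and packs it into a CAR file.
// Assumes AWS credentials in the environment and the go-car CLI
// (github.com/ipld/go-car/cmd/car) on PATH; bucket/key are placeholders.
package main

import (
	"context"
	"io"
	"log"
	"os"
	"os/exec"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	ctx := context.Background()

	// Load AWS region and credentials from the environment.
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// Stream the customer object out of S3 (placeholder bucket/key).
	obj, err := client.GetObject(ctx, &s3.GetObjectInput{
		Bucket: aws.String("customer-bucket"),
		Key:    aws.String("dataset/part-000.tar"),
	})
	if err != nil {
		log.Fatal(err)
	}
	defer obj.Body.Close()

	// Spool the object to a local file before CAR packing.
	tmp, err := os.Create("part-000.tar")
	if err != nil {
		log.Fatal(err)
	}
	if _, err := io.Copy(tmp, obj.Body); err != nil {
		log.Fatal(err)
	}
	tmp.Close()

	// Pack the file into a CAR with the go-car CLI ("car create").
	cmd := exec.Command("car", "create", "-f", "part-000.car", "part-000.tar")
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}
```

In practice a piece commitment (commP) is also computed for each CAR before deals are proposed.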

If you are not preparing the data, who will prepare the data? (Provide name and business)

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

No, these should be unique private customer datasets.

Please share a sample of the data

They are private.

Confirm that this is a public dataset that can be retrieved by anyone on the Network

No

If you chose not to confirm, what was the reason

It is not a public dataset.

What is the expected retrieval frequency for this data

Monthly

For how long do you plan to keep this dataset stored on Filecoin

2 to 3 years

In which geographies do you plan on making storage deals

North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Lotus built-in data transfer

How did you find your storage providers

Slack

If you answered "Others" in the previous question, what is the tool or platform you used

Please list the provider IDs and location of the storage providers you will be working with.

f010446 (Dallas, Texas)

How do you plan to make deals to your storage providers

Boost client
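
As a rough sketch of that deal-making step, the snippet below drives the Boost client from a script. Only the provider ID (f010446) comes from this application; the commP, payload CID, sizes, duration, and the exact `boost deal` flag names are assumptions to be verified against the installed Boost version.

```go
// Sketch only: proposes one verified deal through the Boost CLI.
// Flag names follow Boost's `boost deal` command as commonly documented,
// but should be checked against the installed version; commP, payload CID,
// sizes, and duration are placeholder values.
package main

import (
	"log"
	"os"
	"os/exec"
)

func main() {
	cmd := exec.Command("boost", "deal",
		"--provider=f010446",                           // SP listed in this application
		"--http-url=https://example.com/part-000.car",  // where the SP fetches the CAR (placeholder)
		"--commp=baga6ea4seaq...",                       // placeholder piece commitment
		"--payload-cid=bafybeig...",                     // placeholder root CID
		"--piece-size=34359738368",                      // 32 GiB padded piece (example)
		"--car-size=33000000000",                        // raw CAR size in bytes (example)
		"--verified=true",                               // spend DataCap on this deal
		"--duration=1051200",                            // roughly one year in epochs (example)
	)
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}
```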

If you answered "Others/custom tool" in the previous question, enter the details here

Can you confirm that you will follow the Fil+ guideline

Yes

datacap-bot[bot] commented 1 month ago

Application is waiting for allocator review

stef-sf commented 1 month ago

acknowledged

datacap-bot[bot] commented 1 month ago

KYC has been requested. Please complete KYC at https://kyc.allocator.tech/?owner=fidlabs&repo=Enterprise-Data-Pathway&client=f1j44hd5rxindqftcnp2einqvei4wjrlk5mcgutey&issue=71

martplo commented 1 month ago

@stef-sf Thank you for applying.

The required steps are:
- GitHub user KYC check via Gitcoin Passport
- KYB form (LINK)

Once complete, you will be eligible for 50 TiB of DataCap.

datacap-bot[bot] commented 1 month ago

KYC completed for client address f1j44hd5rxindqftcnp2einqvei4wjrlk5mcgutey with Optimism address 0xd47c3B723633F0c811C4054b1Bfd980ca44E0b89 and passport score 31.