gimims opened this issue 1 month ago
Application is waiting for allocator review
Hope to get your help and support
Hi, @gimims.
Thank you for applying.
I would like to ask you some questions for DC allocation.
In previous LDNs, several applicants requested DC for the EMPIAR dataset, and it has already been stored on the Filecoin network. Is there a specific reason you are requesting a DC allocation for it as well?
Have you or your organization ever applied for DC for EMPIAR in a previous LDN?
Do you have any experience applying for DC with other datasets in previous LDNs? If so, please share the application links.
In the SP list you provided, f03053817 is just a FIL address, and the remaining 11 SPs are all located in Hong Kong. This does not meet the Fil+ distribution rules. DC will not be allocated unless SPs in other regions are included. Do you understand?
Thank you for your reply.
The USGS data is continuously growing, with a total of 15 PiB. With 10 replicas, we could apply for 150 PiB of DC. Currently, no one has fully stored this dataset on the Filecoin network.
We have not applied for DataCap from anyone else; we can assure you that we are applying only to you.
Although we have no experience with DC, we have been working with CC sectors for three years. We hope to use DC to reduce the number of machines required. We have borrowed FIL from a lending platform, and the data is already prepared. We hope to receive your assistance.
We will add f03064819, located in South Korea.
Sincerely, we hope to receive your help.
Total DataCap requested
3PiB
Expected weekly DataCap usage rate
300TiB
DataCap Amount - First Tranche
100TiB
Client address
f1q4ywjzcyw7osmkcwqmr5asaf5sl2bh6zvn7umci
f1q4ywjzcyw7osmkcwqmr5asaf5sl2bh6zvn7umci
100TiB
8c6b7784-e046-4e64-82bf-f6cb59ded5cc
Application is ready to sign
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacea4exbotcrdux3x52i5e7prgu3obqv4nqdpl76lotkw45bb7rypm2
Address
f1q4ywjzcyw7osmkcwqmr5asaf5sl2bh6zvn7umci
Datacap Allocated
100TiB
Signer Address
f1grjkkw3p5hw3vx5gonvppkkzpcgmu4xnwfm7sli
Id
8c6b7784-e046-4e64-82bf-f6cb59ded5cc
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacea4exbotcrdux3x52i5e7prgu3obqv4nqdpl76lotkw45bb7rypm2
Application is Granted
Application is in Refill
checker:manualTrigger
No active deals found for this client.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
Looks like the allocation has been used up; waiting for the data to update.
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report.
Version
1
DataCap Applicant
DataVault Solutions
Project ID
DataVault Solutions-004
Data Owner Name
USGS
Data Owner Country/Region
United States
Data Owner Industry
Life Science / Healthcare
Website
https://www.usgs.gov/
Social Media Handle
https://www.usgs.gov/
Social Media Type
Slack
What is your role related to the dataset
Data Preparer
Total amount of DataCap being requested
3PiB
Expected size of single dataset (one copy)
500TiB
Number of replicas to store
6
Weekly allocation of DataCap requested
300TiB
On-chain address for first allocation
f1q4ywjzcyw7osmkcwqmr5asaf5sl2bh6zvn7umci
Data Type of Application
Public, Open Dataset (Research/Non-Profit)
Custom multisig
Identifier
No response
Share a brief history of your project and organization
Is this project associated with other projects/ecosystem stakeholders?
No
If answered yes, what are the other projects/ecosystem stakeholders
No response
Describe the data being stored onto Filecoin
Where was the data currently stored in this dataset sourced from
AWS Cloud
If you answered "Other" in the previous question, enter the details here
No response
If you are a data preparer. What is your location (Country/Region)
None
If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.
No response
If you are not preparing the data, who will prepare the data? (Provide name and business)
No response
Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
No response
Please share a sample of the data
Confirm that this is a public dataset that can be retrieved by anyone on the Network
If you chose not to confirm, what was the reason
No response
What is the expected retrieval frequency for this data
Yearly
For how long do you plan to keep this dataset stored on Filecoin
2 to 3 years
In which geographies do you plan on making storage deals
Greater China, North America, Europe
How will you be distributing your data to storage providers
Cloud storage (i.e. S3), HTTP or FTP server, Shipping hard drives
How did you find your storage providers
Slack, Filmine
If you answered "Others" in the previous question, what is the tool or platform you used
No response
Please list the provider IDs and location of the storage providers you will be working with.
How do you plan to make deals to your storage providers
Boost client, Singularity
If you answered "Others/custom tool" in the previous question, enter the details here
No response
Can you confirm that you will follow the Fil+ guideline
Yes