Closed nicelove666 closed 8 months ago
Thanks for your request! Everything looks good. :ok_hand:
A Governance Team member will review the information provided and get back to you soon.
The bot on #2204 seems to have a bug: it did not trigger the next round of signature requests. We submitted the cooperating SPs in detail on #2204. The following are the SPs this LDN will cooperate with; we have listed the SPs, locations, and entities in detail. We look forward to your early approval and review of the on-chain transactions. Thank you for your trust and hard work. We believe that together Filecoin will get better and better.
Provider | Location | SP Entity or Individual |
---|---|---|
f02841613 | HK | Coffeecould |
f02831201 | GuangDong | Juwu Mine |
f02831202 | GuangDong | Juwu Mine |
f02816081 | Singapore | KRAL |
f02816095 | Singapore | KRAL |
f02824157 | BeiJing | zhongchuangyun |
f02824140 | BeiJing | zhongchuangyun |
f02223170 | tianyou | avn |
f02199203 | Inner Mongolia | Richard |
f02760664 | Inner Mongolia | Richard |
Give me a proper index file of your deals, and give me two EU and USA miners with no VPN. Show me retrieval on those unsealed copies. Show that your data is correct; I'd love to support.
But NOT like this, the current form.
First, we met in a video conference. Maybe you forgot that I am an American. I have been in China recently, and the SPs I cooperate with are mainly in Asia, but this does not mean that we have no foreign SPs. We also have a team in Singapore, and I am contacting FF to meet. During LabWeek, I also went to Türkiye.
Second, if you care about the SPs we work with, I'd be happy to tell you, but they won't start now: f01422327 (Japan), f02229545 (Los Angeles), f02252024 (United States).
Third, next week or around December 15th, a European SP will start. It is a new SP; I do not know the SP until it is started. When the SP is established, we will disclose it in advance.
Fourth, now that the AC robot is online, leave everything to the intelligent AC robot, and we will meet its requirements.
I hope you can push it forward @kevzak @Filplus-govteam @Sunnyiscoming @Kevin-FF-USA @galen-mcandrew
@nicelove666 Business names and contact information please. I will check for VPN use.
All information is public and we have submitted it. In addition, please use a professional and recognized website to check, such as https://seon.io/. There are many similar testing websites. I hope you can make your testing website public so that it has credibility. I also hope that your testing tool will produce the same results as these recognized testing websites.
Hello, per https://github.com/filecoin-project/notary-governance/issues/922, Open, Public Dataset applicants must complete the following Fil+ registration form to identify themselves as the applicant. Please also add the contact information of the SP entities you are working with to store copies of the data.
This information will be reviewed by Fil+ Governance team to confirm validity and then the application will be allowed to move forward for additional notary review.
All information is public and we have submitted it.
Can you show me in here with contact information? Love to perform due diligence.
We submitted it, hope to see your progress @Sunnyiscoming
Is there any update here? @Sunnyiscoming
@nicelove666 - where are the 10 SPs onboarding these 10 copies? We see 3 SPs listed on your registration form.
Provider | Entity | Location | VPN | Contact |
---|---|---|---|---|
f02841613 | coffeecloud | HK | no | ted |
f02831201 | Juwu Mine | GuangDong | no | Jon |
f02831202 | Juwu Mine | GuangDong | no | Jon |
f02824157 | zhongchuangyun | BeiJing | no | lisa |
f02824140 | zhongchuangyun | BeiJing | no | lisa |
Hello @Filplus-govteam, we filled in the SPs according to the requirements of the registration form. However, only 5 SPs can be entered on the form.
In order to show our cooperating SPs, we have listed them in detail in the application form. We hope you can push us forward.
The bot on #2204 seems to have a bug: it did not trigger the next round of signature requests. We submitted the cooperating SPs in detail on #2204. The following are the SPs this LDN will cooperate with; we have listed the SPs, locations, and entities in detail. We look forward to your early approval and review of the on-chain transactions. Thank you for your trust and hard work. We believe that together Filecoin will get better and better.
Provider | Location | SP Entity or Individual |
---|---|---|
f02841613 | HK | Coffeecould |
f02831201 | GuangDong | Juwu Mine |
f02831202 | GuangDong | Juwu Mine |
f02816081 | Singapore | KRAL |
f02816095 | Singapore | KRAL |
f02824157 | BeiJing | zhongchuangyun |
f02824140 | BeiJing | zhongchuangyun |
f02223170 | tianyou | avn |
f02199203 | Inner Mongolia | Richard |
f02760664 | Inner Mongolia | Richard |
This shows 6 SPs. You said 10 copies, who is storing all the copies?
Also, can you show proof of the 1.5PiB dataset from CommonCrawl? Which dataset?
@Filplus-govteam Why do you count these as 6 SPs rather than 10? Is the counting unit for SPs "company" or "node"? There are 10 nodes here from 6 companies.
I hope we can set a clear rule about how the number of SPs is counted, whether as companies or as nodes. Then we can open an issue, and everyone will abide by this rule.
https://commoncrawl.org/ This dataset has at least 4PiB of data. With 10 backups, we can apply for a total of 40PiB. If I calculated it wrong, please tell me. Thank you.
Got it, so you are storing 10 copies across 6 companies.
Per guidelines, no more than one copy per miner ID and no more than 30% per company. Thanks
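The guideline stated above (no more than one copy per miner ID, no more than 30% of copies per company) can be sketched as a quick self-check. This is an illustrative snippet, not an official Fil+ tool; the `check_distribution` function and its rule encoding are assumptions of mine, while the miner IDs and entities come from the SP table earlier in this thread.

```python
def check_distribution(copies_per_miner, miner_to_company, max_company_share=0.30):
    """Return a list of guideline violations for a proposed replica layout."""
    violations = []
    total_copies = sum(copies_per_miner.values())

    # Rule 1: no more than one copy per miner ID.
    for miner, n in copies_per_miner.items():
        if n > 1:
            violations.append(f"{miner} stores {n} copies (max 1 per miner ID)")

    # Rule 2: no company may hold more than 30% of all copies.
    per_company = {}
    for miner, n in copies_per_miner.items():
        company = miner_to_company[miner]
        per_company[company] = per_company.get(company, 0) + n
    for company, n in per_company.items():
        if n / total_copies > max_company_share:
            violations.append(
                f"{company} holds {n}/{total_copies} copies "
                f"({n / total_copies:.0%} > {max_company_share:.0%})"
            )
    return violations

# Layout from the table in this thread: 10 miner IDs across 6 entities.
miners = {
    "f02841613": "Coffeecloud", "f02831201": "Juwu Mine", "f02831202": "Juwu Mine",
    "f02816081": "KRAL", "f02816095": "KRAL",
    "f02824157": "zhongchuangyun", "f02824140": "zhongchuangyun",
    "f02223170": "avn", "f02199203": "Richard", "f02760664": "Richard",
}
copies = {m: 1 for m in miners}
print(check_distribution(copies, miners))  # → []
```

With one copy per miner and at most 2 of 10 copies (20%) per entity, this layout passes both checks.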
https://commoncrawl.org/ This dataset has at least 4PiB of data. With 10 backups, we can apply for a total of 40PiB. If I calculated it wrong, please tell me. Thank you.
Yes, exactly, it's a very big dataset. So which portion of the 4PiB are you storing? For your data sample you posted their website.
What specific dataset are you storing that is 1.5PiB?
Thank you for taking the time to communicate with me. I appreciate the opportunity to have further discussions with you, such as notary meetings or offline conferences. We have brought in over 100P of DC for Filecoin, and I will provide detailed information in the V5 application.
Now, if possible, I kindly request your assistance in advancing this LDN. Thank you for your hard work, and I wish you a wonderful weekend in advance.
We have stored the downloaded data under #2204; you can view it at any time.
To save you time, we can upload the data to http://send.datasetcreators.com at any time. In fact, we have uploaded the data there multiple times for everyone to see, but links on that site are only valid for 7 days.
But it doesn't matter; I can still upload it for you if you need it.
You have stored what in #2204? There are no details there, and there are no dataset details here.
What 14PiB did you store in #2204? What is being stored in #2287 that is different? How do we know? How can we see?
Just asking you to add more detail as to which applications include which portions of CommonCrawl. Otherwise there is no record of anything you have stored/will store to look back on
Well, I understand. We will reply to your questions in detail. I hope this is a pleasant communication.
We did secondary development based on https://github.com/karust/gogetcrawl. After downloading a batch of data, it automatically splits it, packages it into tar archives, and converts them into CAR files. As long as the same download parameters are set, the same data can be downloaded and CAR files with the same PieceCID can be generated, which ensures multi-node backup of the files. Parameters pointing to different parts of the dataset are set when starting the program to download and generate the CAR files. Currently, all CommonCrawl datasets are as follows:
- CC-MAIN-2023-MAR-MAY-OCT
- CC-MAIN-2022-23-SEP-NOV-JAN (2204)
- CC-MAIN-2022-MAY-JUN-AUG (2204)
- CC-MAIN-2021-22-OCT-NOV-JAN (2204)
- CC-MAIN-2021-JUN-JUL-SEP (2204)
- CC-MAIN-2021-FEB-APR-MAY (2204)
- CC-MAIN-2020-21-OCT-NOV-JAN
- CC-MAIN-2020-JUL-AUG-SEP
- CC-MAIN-2020-FEB-MAR-MAY
- CC-MAIN-2019-20-NOV-DEC-JAN
- CC-MAIN-2019-AUG-SEP-OCT
- CC-MAIN-2019-MAY-JUN-JUL
- CC-MAIN-2019-FEB-MAR-APR
- CC-MAIN-2018-19-NOV-DEC-JAN
- CC-MAIN-2018-AUG-SEP-OCT
- CC-MAIN-2018-MAY-JUN-JUL
- CC-MAIN-2018-FEB-MAR-APR
- CC-MAIN-2018-JAN
- CC-MAIN-2017-18-NOV-DEC-JAN
- CC-MAIN-2017-AUG-SEP-OCT
- CC-MAIN-2017-MAY-JUN-JUL
- CC-MAIN-2017-FEB-MAR-APR-HOSTGRAPH
Taking LDN #2204 as an example, the downloaded and packaged datasets include CC-MAIN-2022-23-SEP-NOV-JAN, CC-MAIN-2022-MAY-JUN-AUG, CC-MAIN-2021-22-OCT-NOV-JAN, CC-MAIN-2021-JUN-JUL-SEP, and CC-MAIN-2021-FEB-APR-MAY, totaling about 1.6 PiB. After each CAR file is generated, a corresponding metadata file is generated; its file name records the dataset name and starting position. Based on the metadata files, you can know which part of the total dataset is stored by each LDN, for example 1-2204@CC-MAIN-2023-06.csv.
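The determinism claim above (same download parameters produce CAR files with the same PieceCID) can be illustrated with a minimal sketch. Note the hedge: real Filecoin PieceCIDs (commP) are computed with fr32 padding and a truncated SHA-256 merkle tree over the padded piece, not plain SHA-256; the `chunk_digests` helper below is a hypothetical stand-in that only shows why fixed split parameters make the packaged output reproducible across nodes.

```python
import hashlib

def chunk_digests(data: bytes, chunk_size: int, offset: int = 0):
    """Split data[offset:] into fixed-size chunks and hash each chunk."""
    digests = []
    for start in range(offset, len(data), chunk_size):
        digests.append(hashlib.sha256(data[start:start + chunk_size]).hexdigest())
    return digests

# Non-repeating stand-in bytes for a slice of a CC-MAIN archive.
dataset = b"".join(i.to_bytes(4, "big") for i in range(64_000))

# Two independent "downloads" with the same parameters yield identical
# digests, so the packaged pieces match across nodes.
run_a = chunk_digests(dataset, chunk_size=4096)
run_b = chunk_digests(dataset, chunk_size=4096)
assert run_a == run_b

# Different start parameters point at a different part of the dataset.
run_c = chunk_digests(dataset, chunk_size=4096, offset=4096)
assert run_a != run_c
```

The same idea scales to the metadata files described above: the file name only needs to record the dataset name, chunk size, and starting position for any node to reproduce an identical piece.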
This is the kind of information that is valuable to see about a dataset, thank you for sharing
Thanks for your approval; hope to see updates.
@Sunnyiscoming
Total DataCap requested
15PiB
Expected weekly DataCap usage rate
1PiB
Client address
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
f02049625
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
512TiB
2ed85c7e-7373-4149-bfc2-8a302c37215b
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzacedwz56khl7b4zohrsed6jib3cvuea32gkzjp6oyevus42273rfc4i
Address
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
Datacap Allocated
512.00TiB
Signer Address
f1jvvltduw35u6inn5tr4nfualyd42bh3vjtylgci
Id
2ed85c7e-7373-4149-bfc2-8a302c37215b
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedwz56khl7b4zohrsed6jib3cvuea32gkzjp6oyevus42273rfc4i
Support for the first round.
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacedbjha6esfcwt5wv2yihvyu7gk5fajvq4xmyg5ysgu67ffnuzzsw4
Address
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
Datacap Allocated
512.00TiB
Signer Address
f1mdk7s2vntzm6hu35yuo6vjubtrpfnb2awhgvrri
Id
2ed85c7e-7373-4149-bfc2-8a302c37215b
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedbjha6esfcwt5wv2yihvyu7gk5fajvq4xmyg5ysgu67ffnuzzsw4
The quota has been used up, but the robot did not trigger the next round of signatures. Please help us. @clriesco
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard.
@Filplus-govteam As we can clearly see from the evidence provided above:
STOP allowing abuse once and for all. Close this LDN, note the abusive notaries (again), and listen to me when I say that @nicelove666 is not adhering to the rules, just like in all of their previous LDNs.
@Kevin-FF-USA @galen-mcandrew @simonkim0515
Please publish your website
Randomly insulting the person who brought 100P of DC to Filecoin is not a wise choice.
f02049625
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
512TiB
b1b5b89c-5c70-41d5-8ce2-cebf4d6f5017
checker:manualTrigger
✔️ Storage provider distribution looks healthy.
✔️ Data replication looks healthy.
✔️ No CID sharing has been observed.
[^1]: To manually trigger this report, add a comment with text checker:manualTrigger
[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger
[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...
Click here to view the CID Checker report. Click here to view the Retrieval Dashboard.
LGTM, will support this round.
Your Datacap Allocation Request has been proposed by the Notary
bafy2bzaceds2wqnavozrypzsfods5cdvovd5wzw3havw7xoe22l2qmkbunzje
Address
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
Datacap Allocated
512.00TiB
Signer Address
f1foiomqlmoshpuxm6aie4xysffqezkjnokgwcecq
Id
b1b5b89c-5c70-41d5-8ce2-cebf4d6f5017
You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceds2wqnavozrypzsfods5cdvovd5wzw3havw7xoe22l2qmkbunzje
Report shows the case is healthy, willing to support this round
Your Datacap Allocation Request has been approved by the Notary
bafy2bzacea3zmtgqglw5l5r7qieej2pi642se7g2cqx2cy4hgf4sknwyxibh6
Address
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
Datacap Allocated
512.00TiB
Signer Address
f1c5non5yf35avgcpsqvxu4yj54yyvxorwyjochqq
Id
b1b5b89c-5c70-41d5-8ce2-cebf4d6f5017
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacea3zmtgqglw5l5r7qieej2pi642se7g2cqx2cy4hgf4sknwyxibh6
f02049625
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
512TiB
0cb05f47-d20e-4251-8eba-a5b4c162d1e2
Data Owner Name
Commoncrawl
What is your role related to the dataset
Data Preparer
Data Owner Country/Region
United States
Data Owner Industry
Life Science / Healthcare
Website
https://commoncrawl.org/
Social Media
Total amount of DataCap being requested
15PiB
Expected size of single dataset (one copy)
1.5PiB
Number of replicas to store
10
Weekly allocation of DataCap requested
1PiB
On-chain address for first allocation
f1ht5xh5qtccibzvozb5li43cdhaivheuhy2fje3i
Data Type of Application
Public, Open Dataset (Research/Non-Profit)
Custom multisig
Identifier
No response
Share a brief history of your project and organization
Is this project associated with other projects/ecosystem stakeholders?
Yes
If answered yes, what are the other projects/ecosystem stakeholders
Describe the data being stored onto Filecoin
Where was the data currently stored in this dataset sourced from
AWS Cloud
If you answered "Other" in the previous question, enter the details here
No response
If you are a data preparer. What is your location (Country/Region)
United States
If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?
If you are not preparing the data, who will prepare the data? (Provide name and business)
No response
Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
Please share a sample of the data
Confirm that this is a public dataset that can be retrieved by anyone on the Network
If you chose not to confirm, what was the reason
No response
What is the expected retrieval frequency for this data
Yearly
For how long do you plan to keep this dataset stored on Filecoin
2 to 3 years
In which geographies do you plan on making storage deals
Greater China, Asia other than Greater China, Africa, North America, Europe
How will you be distributing your data to storage providers
Cloud storage (i.e. S3), Shipping hard drives, Lotus built-in data transfer
How do you plan to choose storage providers
Slack, Filmine, Partners
If you answered "Others" in the previous question, what is the tool or platform you plan to use
No response
If you already have a list of storage providers to work with, fill out their names and provider IDs below
How do you plan to make deals to your storage providers
Boost client, Lotus client
If you answered "Others/custom tool" in the previous question, enter the details here
No response
Can you confirm that you will follow the Fil+ guideline
Yes