filecoin-project / filecoin-plus-large-datasets

Hub for client applications for DataCap at a large scale

[DataCap Application] Radio telescope - National Radio Astronomy Observatory #2045

Closed nicelove666 closed 11 months ago

nicelove666 commented 1 year ago

Data Owner Name

National Radio Astronomy Observatory

What is your role related to the dataset

Data Preparer

Data Owner Country/Region

United States

Data Owner Industry

Life Science / Healthcare

Website

https://data.nrao.edu/portal/#/

Social Media

https://data.nrao.edu/portal/#/

Total amount of DataCap being requested

27PiB

Expected size of single dataset (one copy)

1P

Number of replicas to store

10

Weekly allocation of DataCap requested

1PiB

On-chain address for first allocation

f1y5mkyvzsfxsapuecbbs4hrrmio2te6ajdqpgedq

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

Identifier

No response

Share a brief history of your project and organization

Founded in 1956, NRAO provides the most advanced radio telescope facilities and information to the international scientific community. Currently, https://data.nrao.edu/portal/#/ has stored 4.3PB of data.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Founded in 1956, NRAO provides the most advanced radio telescope facilities and information to the international scientific community. Currently, https://data.nrao.edu/portal/#/ has stored 4.3PB of data.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

How do you plan to prepare the dataset

IPFS, lotus, singularity

If you answered "other/custom tool" in the previous question, enter the details here

No response

Please share a sample of the data

We counted the data on this website: 500,800 records in total, with a total capacity of 4.2P.

https://docs.google.com/spreadsheets/d/1F26TunJBid_6SqMYOQscSpm3793y4xYu/edit?usp=share_link&ouid=109823390606932719085&rtpof=true&sd=true

Confirm that this is a public dataset that can be retrieved by anyone on the Network

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, IPFS, Shipping hard drives, Lotus built-in data transfer

How do you plan to choose storage providers

Slack, Filmine, Big Data Exchange

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

f02182867,
f0427989,
f02182798,
f02204960,
f02182743,
f02182802,
f02182902,
f02105219,
f0427989,
f02145020,
f021255,
f02125861,
f02181415,
f02145020,
f021255

How do you plan to make deals to your storage providers

Boost client, Lotus client, Droplet client, Singularity

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

zcfil commented 1 year ago

checker:manualTrigger

filplus-checker-app[bot] commented 1 year ago

DataCap and CID Checker Report Summary[^1]

Retrieval Statistics

Storage Provider Distribution

⚠️ 2 storage providers sealed too much duplicate data - f02363308: 46.02%, f02217602: 22.82%

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients[^3]

⚠️ CID sharing has been observed. (Top 3)

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger

[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

Full report

Click here to view the CID Checker report. Click here to view the Retrieval Dashboard. Click here to view the Retrieval report.

zcfil commented 1 year ago

Storage Provider Distribution ⚠️ 2 storage providers sealed too much duplicate data - f02363308: 46.02%, f02217602: 22.82%

Can you explain? Are there plans to remediate this?

nicelove666 commented 1 year ago

Hi, dear notary, thank you for your question. We have explained this issue several times. A technician shipped the same hard drives twice, which caused the duplicate data. As soon as we discovered it, we told the SP to stop sealing, and we have not cooperated with that SP since, so the duplicate data will not increase. Thank you.

zcfil commented 1 year ago

Hi, dear notary, thank you for your question. We have explained this issue several times. A technician shipped the same hard drives twice, which caused the duplicate data. As soon as we discovered it, we told the SP to stop sealing, and we have not cooperated with that SP since, so the duplicate data will not increase. Thank you.

That's not the question. The issue is that too much duplicate data is being stored, over 20%. Is there a plan to fix this at a later stage?

nicelove666 commented 1 year ago

After communicating with them, the two SPs have paused sealing for the time being. If they continue sealing, we hope to keep working with them and to send them different data, which will resolve the duplicate-data problem.

zcfil commented 1 year ago

Okay, looking forward to seeing this return to health.

Having reviewed the past comments, I will support this round of allocation.

zcfil commented 1 year ago

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedshz3iaj6frduvujmb3a6i4b5q4vgtotb6cr46o4rxhnvkuvnwum

Address

f1y5mkyvzsfxsapuecbbs4hrrmio2te6ajdqpgedq

Datacap Allocated

2.00PiB

Signer Address

f1cjzbiy5xd4ehera4wmbz63pd5ku4oo7g52cldga

Id

443282f1-5a84-4010-b624-387267c960c5

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedshz3iaj6frduvujmb3a6i4b5q4vgtotb6cr46o4rxhnvkuvnwum

github-actions[bot] commented 1 year ago

This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.

-- Commented by Stale Bot.

nicelove666 commented 1 year ago

please keep it open, thanks

large-datacap-requests[bot] commented 12 months ago

DataCap Allocation requested

Request number 14

Multisig Notary address

f02049625

Client address

f1y5mkyvzsfxsapuecbbs4hrrmio2te6ajdqpgedq

DataCap allocation requested

2PiB

Id

23542cfc-5d9c-48fd-9138-4e22abd4bf76

cryptoAmandaL commented 12 months ago

Hi @kevzak I have found that this application has been involved in several irregular activities in several aspects.

  1. The amount of DataCap applied for and the amount actually allocated do not match. From the discussion in this thread, the applicant initially applied for 15P, but after communicating with Sunnyiscoming the allocation was increased to 27P for reasons unknown. Under the current LDN application rules, such a large DC allocation should not have been granted here.

  2. The same two notaries repeatedly signed for this applicant. In the screenshot below, the notaries Darleen and Peng Kai simultaneously signed two applications. This clearly violates the LDN notary-signing rules.

(Screenshot: 2023-10-30 14:53:52)

  3. The SP IDs listed in the application form do not match the SP IDs actually sealing. I have listed the top 10 SP IDs by actual storage volume; only one of them appears in the application form. Additionally, of the 15 IDs the applicant submitted, only 4 have actually stored data. In my view, this is deceptive behavior.

(Screenshot: WeChatWorkScreenshot_b719c1bf-bfda-448b-864c-a1705909b7db)

  4. Given the above violations, I have doubts about what this applicant has actually sealed. If the applicant wishes to clear their name, please upload the document with the CID code to http://send.datasetcreators.com.

Once again, I believe this post involves fraudulent activities and should be closed.

kevzak commented 11 months ago

checker:manualTrigger

kevzak commented 11 months ago

This application was overlooked as a very large Dataset. Anything above 15PiB needs to follow E-Fil+ pathway. Closing until client completes KYC and provides clear SP list

filplus-checker-app[bot] commented 11 months ago

DataCap and CID Checker Report Summary[^1]

Storage Provider Distribution

⚠️ 1 storage providers sealed too much duplicate data - f02217602: 22.82%

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients[^3]

⚠️ CID sharing has been observed. (Top 3)

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

[^2]: Deals from those addresses are combined into this report as they are specified with checker:manualTrigger

[^3]: To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

Full report

Click here to view the CID Checker report. Click here to view the Retrieval Dashboard.

herrehesse commented 11 months ago

Was 12PiB of DataCap stolen here by @nicelove666 again? @kevzak can you investigate?

nicelove666 commented 11 months ago

Now, the maximum DataCap that can be requested is no longer 15 PiB. If you wanted to apply for this dataset again later, why did you apply for 15 PiB this time instead of the total amount of all remaining data?

We originally applied for 6 copies of 5PiB (#1947 to #1952), but those applications were closed because they were not used. Later we decided to reapply. We initially applied for 15PiB and said we would apply for another LDN after using it up. Then sunnyiscoming said that an LDN application can exceed 15PiB, so we applied for 27PiB.

As for why it is 27PiB, we gave a detailed explanation in application #1947: we counted the data on this website, a total of 500,800 records with a total capacity of 4.2P.

Since applications #1947, #1948, and #1949 used 15PiB, 42 - 15 = 27PiB remains.

The entire series amounts to 42PiB of data (4.2PiB per copy × the 10 replicas requested), and it is very reasonable for a single SP to store 2PiB.
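The sizing arithmetic above can be sanity-checked with a short script. The figures are taken from this application (one copy ≈ 4.2 PiB, 10 replicas, 15 PiB already consumed by applications #1947 to #1949); the variable names are illustrative only:

```python
# Sanity-check of the DataCap sizing described in this application.
copy_size_pib = 4.2      # one copy of the dataset, per the sample count
replicas = 10            # number of replicas requested in the form
already_used_pib = 15    # consumed by applications #1947-#1949

total_pib = copy_size_pib * replicas          # whole series: 42 PiB
remaining_pib = total_pib - already_used_pib  # DataCap still needed

print(f"total: {total_pib:.0f} PiB, remaining: {remaining_pib:.0f} PiB")
```

This reproduces the 42 - 15 = 27PiB figure the applicant requested.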

At the same time, according to my observation, sunnyiscoming has said on multiple applications that a single LDN can exceed 15PiB.