maidsafe / safe_network

Autonomi combines the spare capacity of everyday devices to form a new, autonomous data and communications layer of the Internet
http://autonomi.com

bug: uploading to the newly reset network many chunks have been carried over from the previous network #2111

Open happybeing opened 1 month ago

happybeing commented 1 month ago

I'm re-uploading lots of files that I previously uploaded, and after the recent reset most or all of their chunks are still there from the previous network.

maqi commented 1 month ago

chunks are still there from the previous network.

The nodes of the previous network that are owned by us will continue to run for a couple more days to allow some statistical data to be collected. Also, some other node owners may have forgotten to reset and are still on the previous network. In that case, if your client tries to connect to the previous network, there is a chance the connection will be established and some chunks will be fetched back, as sketched below. That in itself shall not be considered an issue.

However, if you mean your client is connected to the new network but can fetch content that belongs to the previous network, then that could be an issue.

Could you clarify the details of this reported issue? Thanks.
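
For context, a minimal sketch (not the safe CLI's actual code; the SAFE_PEERS name and the address format shown are assumptions) of why the configured bootstrap peers decide which network ends up answering a fetch:

```rust
use std::env;

fn main() {
    // The client joins whichever network its bootstrap peers belong to.
    // If this list still points at old-network nodes, lookups can be
    // answered by the old network even after the reset.
    let peers = env::var("SAFE_PEERS").unwrap_or_default();

    for addr in peers.split(',').filter(|s| !s.is_empty()) {
        // Each entry is a libp2p multiaddr, e.g.
        // /ip4/<ip>/udp/<port>/quic-v1/p2p/<peer-id>.
        // Comparing these against the published new-network contacts is the
        // quickest way to rule out "client dialled the old network".
        println!("bootstrap peer: {addr}");
    }
}
```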

happybeing commented 1 month ago

However, if you mean your client is connected to the new network but can fetch content that belongs to the previous network, then that could be an issue.

Yes, this is what I mean. I've seen this effect both with my client built for the new network and with safe files upload using safe version 0.95.0 (per safe -V). @aatonnomicc also reported this when he was uploading BegBlag to the new network.

maqi commented 1 month ago

It looks like some node owners did an upgrade instead of a reset this time. That upgrades the node binary and lets it connect to the new network while carrying its old records over into the new network.
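
To illustrate the difference, here is a conceptual sketch (directory names and paths are illustrative, not the node's real layout): an upgrade keeps the node's persisted record store while a reset wipes it, which is exactly what carries old chunks into the new network.

```rust
use std::fs;
use std::io;
use std::path::Path;

// Conceptual sketch only: "upgrade" keeps the persisted records, "reset"
// wipes them, so an upgraded node keeps serving chunks stored under the
// old network.
fn prepare_node_data(data_dir: &Path, reset: bool) -> io::Result<()> {
    let record_store = data_dir.join("record_store");

    if reset {
        // Reset: drop everything the node stored, then start empty on the
        // new network.
        if record_store.exists() {
            fs::remove_dir_all(&record_store)?;
        }
    }
    // Upgrade (reset == false): the directory is left untouched, so every
    // old record is still there, and because chunk addresses are derived
    // from content they remain valid addresses on the new network too.
    fs::create_dir_all(&record_store)
}

fn main() -> io::Result<()> {
    prepare_node_data(Path::new("/tmp/safenode-demo"), false)
}
```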

rid-dim commented 1 month ago
Logging to directory: "/home/riddim/.local/share/safe/client/logs/log_2024-09-16_16-01-48"
safe client built with git version: 08b0a49 / stable / 08b0a49 / 2024-09-09
Instantiating a SAFE client...
Connecting to the network with 25 peers
🔗 Connected to the Network
Chunking 1 files...
"BigBuckBunny_320x180.mp4" will be made public and linkable
Splitting and uploading "BigBuckBunny_320x180.mp4" into 125 chunks
**************************************
*          Uploaded Files            *
**************************************
Uploaded "BigBuckBunny_320x180.mp4" to address b0c551356ff21b023ddb01add3fa74a8eb7db3a6a940722989e98c611507ae4c
Among 125 chunks, found 61 already existed in network, uploaded the leftover 64 chunks in 8 minutes 2 seconds
**************************************
*          Payment Details           *
**************************************
Made payment of NanoTokens(74) for 64 chunks
Made payment of NanoTokens(74) for royalties fees
New wallet balance: 0.000001060
Completed with Ok(()) of execute "Files(Upload { file_path: \"BigBuckBunny_320x180.mp4\", batch_size: 16, make_data_public: true, retry_strategy: Quick })"

Half the file already existed on the current network - hard to believe someone uploaded half of the chunks for me in advance.

Oh - stupid me ... @maqi already diagnosed it ... but somehow this still feels strange, doesn't it? Should this really be possible? ... that data was not paid for on this network ...
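
A minimal sketch of why this shows up as "already existed": chunk addresses are derived from the chunk's content, so identical bytes map to identical addresses on any network. SHA3-256 via the sha3 crate is used below as a stand-in for the network's actual address derivation.

```rust
use sha3::{Digest, Sha3_256};

// Stand-in for the network's content-derived chunk addressing.
fn chunk_address(chunk: &[u8]) -> Vec<u8> {
    Sha3_256::digest(chunk).to_vec()
}

fn main() {
    let chunk = b"the same bytes as before the reset";

    // Identical content produces the identical address before and after the
    // reset, so any node that kept its old records reports the chunk as
    // "already existed in network" and no payment is made for it.
    assert_eq!(chunk_address(chunk), chunk_address(chunk));
    println!("address: {:02x?}", chunk_address(chunk));
}
```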

rid-dim commented 1 month ago

But @maqi, you say this like it would be expected behavior ... isn't this highly problematic ...?

I could self-encrypt my data and just start a specialized uploader node to upload it instead of paying for uploads… (or an uploader process that mimics a node for just enough seconds to do the uploads) ...or I could generate data, upload it to the network, shut my node down again - and repeat ...

Bringing data into the network for free opens the gates to gaming the system and to attacks ...?

maqi commented 1 month ago

it would be expected behavior

I don't mean it's expected behaviour, I'm just trying to explain what happened :)

just start a specialized uploader node to upload it instead of paying for uploads

That node must be close to the address of the data being generated (see the sketch below).

Meanwhile, I have raised this as a concern and we are discussing the options to prevent that.
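
A rough sketch of that closeness constraint (the IDs and the distance function are illustrative, not the node's real types): under Kademlia-style routing a record is held by the nodes whose IDs are closest to the record's address in XOR space, so a single self-run uploader node with a random ID would rarely be in the close group for the chunks it wants to inject.

```rust
// Compare XOR distances between node IDs and a chunk address; lower wins.
fn xor_distance(a: &[u8; 32], b: &[u8; 32]) -> [u8; 32] {
    let mut out = [0u8; 32];
    for i in 0..32 {
        out[i] = a[i] ^ b[i];
    }
    out
}

fn main() {
    let node_id = [0x11u8; 32];    // hypothetical uploader node ID
    let chunk_addr = [0xEEu8; 32]; // hypothetical chunk address
    let other_node = [0xEFu8; 32]; // a node much closer to the chunk

    // The closer node is the one expected to hold the chunk, not the
    // uploader's own node, so "run a node and store your own data for free"
    // only works for the small fraction of chunks that happen to land near
    // the node's ID.
    assert!(xor_distance(&other_node, &chunk_addr) < xor_distance(&node_id, &chunk_addr));
    println!("distance check passed");
}
```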