fcorbelli / zpaqfranz

Deduplicating archiver with encryption and paranoid-level tests. Swiss army knife for the serious backup and disaster recovery manager. Ransomware neutralizer. Win/Linux/Unix
MIT License

9802: ERR kind 123 + Some questions #14

Closed aybli closed 2 years ago

aybli commented 2 years ago

Hello :)

When I enter...

zpaqfranz.exe a test_????.zpaq *.jpg -index test_0000.zpaq

...and then add another version, with zpaqfranz I get this error:

zpaqfranz.exe a test_????.zpaq *.jpg -index test_0000.zpaq
zpaqfranz v54.11-experimental (HW BLAKE3), SFX64 v52.15, compiled Dec 28 2021
9802: ERR <test_????.zpaq> kind 123

When using zpaq 7.15, it works fine.

I also have some questions... I'm currently testing if zpaq works for my scenario.

zpaq and zpaqfranz restore the correct modified times, but with -list they report times that seem to be UTC?

In my test I have an index file locally and multi-part versions stored remotely. The first version is 18 MiB. Is it possible to access individual files inside the archive without downloading the full archive? I have it mounted via rclone vfs.

When I list "test_0001.zpaq" which is 18 MiB from the remote it downloads the whole file before showing the list. When I list "test_0002.zpaq" which is just 9 MiB, it downloaded some 65 MiB even though all versions combined are just 29 MiB. Is this a flaw in rclone/remote or is this a limitation of zpaq?
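(Editorial aside: the UTC-vs-local question above comes down to a time-zone conversion at display time. A minimal Python sketch, not zpaq's actual code — the decimal YYYYMMDDHHMMSS date format is an assumption about how the archive stores timestamps:)

```python
from datetime import datetime, timezone

def to_local(utc_str):
    """Convert an archive timestamp stored as UTC into local wall-clock time.

    Assumes the decimal YYYYMMDDHHMMSS layout; illustration only.
    """
    utc = datetime.strptime(utc_str, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)
    return utc.astimezone()  # convert to the time zone of the listing machine

# Usage: render a stored timestamp in local time
print(to_local("20211228120000").strftime("%Y-%m-%d %H:%M %Z"))
```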

fcorbelli commented 2 years ago

1) On index files: I'll check, thanks for the report.
2) On list times: it is a feature :) In fact the standard OS conversion is used. I'll check if it is possible to change this behavior.
3) On rclone: I do not know. For remote backup I use plain rsync with --append, plus a secondary check (for file integrity). This is overall the fastest way to send backups "somewhere". In my use case a local backup is updated (on another medium), then sent remotely via rsync; or alternatively a Dropbox share (!) for automatic upload. If I understand right, you do not store the backup locally but access it "directly" via rclone; I would have to install that software, I simply do not know how it works.

Short version: I'll check and work on these issues as soon as possible

fcorbelli commented 2 years ago

Update on 1) (-index): the problem is in a new zpaqfranz feature, "getpasswordifany".
When you open an encrypted file WITHOUT a -key, zpaqfranz will detect this and ask for the password from the keyboard.

=> some fixes needed
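(Editorial aside: the detection being described could look roughly like the sketch below. All names and the magic-tag heuristic are hypothetical, not zpaqfranz's real source — the point is that probing a not-yet-existing multipart name like test_????.zpaq can mis-trigger the prompt:)

```python
import getpass

PLAINTEXT_TAG = b"7kSt"  # placeholder magic; the real detection logic is an assumption

def looks_encrypted(path):
    """Guess encryption by the absence of a recognizable plaintext tag."""
    with open(path, "rb") as f:
        head = f.read(64)
    return PLAINTEXT_TAG not in head

def get_password_if_any(path, key=None):
    """If no -key was given but the archive looks encrypted, ask the keyboard."""
    if key is not None:
        return key
    if looks_encrypted(path):
        return getpass.getpass("Archive seems encrypted, password: ")
    return None
```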

aybli commented 2 years ago
  1. Great, I hope you find the fix :)
  2. I see, not a big deal but would be nice.
  3. I agree, a local backup that syncs to a remote would be better.

Thanks for the fast response!

fcorbelli commented 2 years ago

On Windows 64 bit please check the attached pre-release

1) Should fix the "do-not-check-passwords-on-multipart-archive" problem
2) Should show just about everything in local time; the new -utc switch goes back to the default behaviour
3) You can find some examples in the wiki
Here is just an example (on Windows) of rsync to a remote server via an RSA key

rsync -e "c:\cloud\ssh -i /cygdrive/c/cloud/backup_franco"      -I  --append --partial -r --progress -vv --chmod=a=rwx,Da+x --delete /cygdrive/k/test/cloud franco@something.francocorbelli.com:/home/franco/writehere


Please let me know if this fixes the issues.
Thank you

12f.zip

fcorbelli commented 2 years ago

If you have ssh access to the remote server, you can run a script like this on Windows (from the c:\cloud folder) to launch a remote script (/root/script/checkfranco.sh)

SETLOCAL
SET CWRSYNCHOME=c:\cloud

REM the key is here (arrrgghhhhh!!!!)
SET PATH=%CWRSYNCHOME%\BIN;%PATH%
c:\cloud\ssh -i c:\cloud\francosshkey franco@theserver.francocorbelli.com /root/script/checkfranco.sh

Or a more "direct" approach to check local "k:\test\cloud\provona.zpaq" vs the remote one /home/franco/writehere/provona.zpaq

c:\cloud\zpaqfranz sum k:\test\cloud\provona.zpaq -xxh3 -noeta -pakka
c:\cloud\ssh -i c:\cloud\francosshkey franco@theserver.francocorbelli.com /usr/local/bin/zpaqfranz sum /home/franco/writehere/provona.zpaq -xxh3 -noeta -pakka 
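(Editorial aside: the same local-vs-remote comparison can be scripted portably. A hedged Python sketch using hashlib.blake2b as a stand-in for XXH3, since the xxhash module is third-party; paths and function names are illustrative:)

```python
import hashlib

def file_hash(path, chunk=1 << 20):
    """Hash a file in 1 MiB chunks so multi-GB archives never load into RAM."""
    h = hashlib.blake2b()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def same_archive(local_path, remote_path):
    """Compare a local copy against a remote copy (e.g. a mounted share)."""
    return file_hash(local_path) == file_hash(remote_path)
```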

Hope this can help (example scripts; please check and adapt them as needed)

fcorbelli commented 2 years ago

If you have an SSD/NVMe and more than one file (like a bunch of _???? parts) you can run the sum command with -all to get a multithreaded hash
Typical speed is ~500 MB/s with an SSD, over 2 GB/s with NVMe
Personally I do NOT use multipart volumes unless really, really needed (a first huge local backup, with only incrementals sent to a remote site)
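(Editorial aside: the effect of -all, hashing one volume per worker thread, can be sketched with a thread pool. This is an illustration of the technique, not zpaqfranz's implementation; blake2b again stands in for XXH3:)

```python
import glob
import hashlib
from concurrent.futures import ThreadPoolExecutor

def hash_one(path):
    """Hash a single volume in streaming chunks."""
    h = hashlib.blake2b()
    with open(path, "rb") as f:
        while block := f.read(1 << 20):
            h.update(block)
    return path, h.hexdigest()

def hash_volumes(pattern):
    """Hash every multipart volume concurrently, like 'sum test_????.zpaq -all'."""
    paths = sorted(glob.glob(pattern))
    with ThreadPoolExecutor() as pool:  # default worker count caps the parallelism
        return dict(pool.map(hash_one, paths))
```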

aybli commented 2 years ago
  1. Can confirm, it now works as expected.
  2. Thank you!
  3. Thanks for the tips, I will have to see if I can get this to work but it seems like the better solution.

Thanks for the super-fast update!