No matter how many times I repair the database, or just delete and recreate it, this error pops up.
What version are you using?
duplicati-latest(experimental) 2.0.1.7-1
You could try the newest version, 2.0.1.14, from https://github.com/duplicati/duplicati/releases
But I don't know if this will fix your issue...
Probably same problem as #1678 and #1695
"Detected file entries with not associated blocks" appears in 2.0.1.14 consistantly for restores on Windows 7
Steps:
Replicating again:
I got this error message now with 2.0.1.20 on Win7 64-bit, too. Remote storage is an SMB share. There's one file that Duplicati always complains about: "Found 1 files that are missing from the remote storage, please run repair".
Is there anything I can do apart from deleting the whole remote storage and recreating the backup?
@huste repair isn't working?
@agrajaghh No, sorry, I should have written that. I can do this endlessly: run, delete, repair, run -> same error; run, recreate, run -> same error. It's always a file that Duplicati complains about which is not in the remote store. I wonder why Duplicati thinks it should be there. I have a profiling log file, but it's 1.7 GB in size.
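A 1.7 GB profiling log can still be searched for just the relevant messages; a minimal sketch, with the log filename as a placeholder:

```
# Filter a huge profiling log down to the messages about missing remote files;
# the filename here stands in for the actual --log-file path.
grep -E "Missing file|registering a missing remote file" duplicati-profiling.log > missing-files.txt
```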
Use the command line with the affected parameter to identify which backup set this file belongs to. Then delete that backup set and repair the database in the GUI.
@mnaiman Sorry, I don't get it. I know which backup job is affected, and I know which file Duplicati thinks should be in the remote location. When you say "delete backup set and repair in GUI", what do you mean? When I delete the backup job, I can't repair, as the job isn't there any more. Or do you mean something different by "backup set"?
@huste have a look at this comment: https://github.com/duplicati/duplicati/issues/1122#issuecomment-53686217
@agrajaghh Thanks for the pointer. I'll try when I am at work again.
I think I am not doing it right. Here's the excerpt from the log file:
2016-09-12 09:27:22Z - Error: Remote file referenced as duplicati-b5a2c9f7bb91d479188169436482bfe14.dblock.zip, but not found in list, registering a missing remote file
2016-09-12 09:34:13Z - Warning: Missing file: duplicati-b5a2c9f7bb91d479188169436482bfe14.dblock.zip
2016-09-12 09:34:13Z - Error: Found 1 files that are missing from the remote storage, please run repair
Here's what I ran afterwards:
C:\Progs\duplicati>Duplicati.CommandLine.exe affected z:\stefan\operaMail duplicati-b5a2c9f7bb91d479188169436482bfe14.dblock.zip
A total of 0 file(s) are affected (use --verbose to see filenames)
Found 0 related log messages (use --verbose to see the data)
Z:\stefan\operaMail is the target directory of the failing backup job.
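When `affected` comes back empty, later versions also ship `list-broken-files` and `purge-broken-files`; a hedged sketch, with the target URL as a placeholder (availability and required options depend on your Duplicati version):

```
REM List the filesets and files that reference missing remote volumes
REM (the URL below is a placeholder for the job's real target):
Duplicati.CommandLine.exe list-broken-files "file://Z:\duplicati-target"
REM If losing those entries is acceptable, purge them so backups can continue:
Duplicati.CommandLine.exe purge-broken-files "file://Z:\duplicati-target"
```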
Still there in 2.0.1.24
I have the same error on 2.0.1.31. I had uploaded files to B2, removed the source folder from the backup, added a new source path, and removed the files from B2. Pressing run backup gives me 3 files missing. I tried reinstalling Duplicati, same thing. Database repair does not help.
@bithost can you please describe your steps more exactly? I couldn't reproduce this yet.
@agrajaghh, sure. PC: virtual Windows 7 x64 Pro N, running on an ESXi host. Duplicati backup details:
Duplicati was started with administrator permissions (Run as Administrator). If I ran it normally, it wasn't able to back up files from the Samba share.
General:
Destination:
Source Data:
Schedule:
Options:
So here's what I did:
Error message:
Fatal error System.Exception: Found 10 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
What I tried to do:
If you need more info, I can give you TeamViewer access to the affected machine.
Thanks, Tadas
@bithost Not sure I understand the problem correctly.
When you remove the files in step (3) you really should get the error in step (5), as the files really are missing.
Whenever things on the remote side look different from what Duplicati expects, you get an error saying "please run repair". It does not run repair automatically, because if something is changing files on the remote side, you really need to figure out what it is and stop it.
Since you are getting a different number of "files missing" later, I suspect you have more than one backup, and they are somehow connected to the same local database. Is this possible? Maybe another machine with the same user syncs the database to the roaming profile?
For your "uninstall" step (3), I assume you mean "ran steps 1-5 and got same error"?
When you remove the files in step (3) you really should get the error in step (5), as the files really are missing.
@kenkendk Nope, no error was given in Duplicati when I removed the files on B2 storage.
Since you are getting a different number of "files missing" later, I suspect you have more than one backup, and they are somehow connected to the same local database. Is this possible? Maybe another machine with the same user syncs the database to the roaming profile?
I checked Process Explorer; only one Duplicati instance is running. I tried rebooting the PC, same error. Now I'm constantly getting the 10-files-missing error. And no, there is no other PC configured to back up to that B2 storage account.
For your "uninstall" step (3), I assume you mean "ran steps 1-5 and got same error"?
Yes. I even tried to create a backup set with a different name in Duplicati, and I tried to remove the FullSQL folder on B2 and recreate it with a different name, same thing.
In the logs I see "Please run repair". Log messages after the repair completed:
`Nov 10, 2016 11:02 AM: Failed while executing "Backup" with id: 1
System.Exception: Found 10 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
at Duplicati.Library.Main.Controller.<Backup>c__AnonStorey0.<>m__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
Nov 10, 2016 11:06 AM: Reporting error gave error
System.ObjectDisposedException: Cannot write to a closed TextWriter.
at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
at Duplicati.Server.WebServer.RESTHandler.DoProcess(RequestInfo info, String method, String module, String key)
Nov 10, 2016 11:06 AM: Request for http://127.0.0.1:8200/api/v1/serverstate/?lasteventid=9&longpoll=true&duration=299s gave error
System.Threading.ThreadAbortException: Thread was being aborted.
at System.Threading.WaitHandle.WaitOneNative(SafeHandle waitableSafeHandle, UInt32 millisecondsTimeout, Boolean hasThreadAffinity, Boolean exitContext)
at System.Threading.WaitHandle.InternalWaitOne(SafeHandle waitableSafeHandle, Int64 millisecondsTimeout, Boolean hasThreadAffinity, Boolean exitContext)
at Duplicati.Server.EventPollNotify.Wait(Int64 eventId, Int32 milliseconds)
at Duplicati.Server.WebServer.RESTMethods.RequestInfo.LongPollCheck(EventPollNotify poller, Int64& id, Boolean& isError)
at Duplicati.Server.WebServer.RESTMethods.ServerState.GET(String key, RequestInfo info)
at Duplicati.Server.WebServer.RESTHandler.DoProcess(RequestInfo info, String method, String module, String key)
Nov 10, 2016 11:12 AM: Failed while executing "Backup" with id: 1
System.Exception: Found 10 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
at Duplicati.Library.Main.Controller.<Backup>c__AnonStorey0.<>m__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)`
How can I completely remove Duplicati and all of its databases? I thought removing the folders from ProgramData and Roaming would make it clean and forget about this problem of missing files.
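For a truly clean slate on Windows, something along these lines may work (a hedged sketch for a per-user install; stop Duplicati first, and since default data locations vary by install type, verify these paths contain only Duplicati data before deleting anything):

```
REM Remove per-user Duplicati data, including the job databases:
rmdir /s /q "%LOCALAPPDATA%\Duplicati"
rmdir /s /q "%APPDATA%\Duplicati"
```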
I ran my first backup (using the web interface on a headless CentOS system) and got the "2 files missing on remote, please repair" message, but I see no way in the web interface to run a repair on the backup job. And without docs on the correct command syntax, I'm dead in the water, groping in the dark.
Here's my log.
Fatal error System.Exception: Found 2 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Options options, Duplicati.Library.Main.Database.LocalDatabase database, IBackendWriter log) <0x41dd4b70 + 0x0067c> in <filename unknown>:0
at Duplicati.Library.Main.Operation.BackupHandler.PostBackupVerification () <0x41ec1f90 + 0x001a2> in <filename unknown>:0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, IFilter filter) <0x41db4080 + 0x016f0> in <filename unknown>:0
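On a headless system, repair can be run from the CLI; a hedged sketch, with the storage URL and passphrase as placeholders (a job's real values can be taken from the web UI's Export -> As Command-line):

```
# Run repair for one backup job from the shell; URL and passphrase below
# are placeholders, not real values.
duplicati-cli repair "webdavs://example.com/backups/job" --passphrase="..."
```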
After having faced this issue, I did a bit of digging. Here's what I think is happening.
Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList
is run before backing up. As the local database contains entries for non-existent remote dblock files, it notifies the user of the missing dblocks and aborts. To fix the issue, remove the remote dindex files whose remote dblock files are missing, then repair the database (make sure to back up the dindex files you delete, just in case). The problem with this approach is that a dindex filename does not match the filename of the dblock it references. Thus, what you can do is run the following query on your broken local database:
SELECT R1.Name as IndexFileName, R2.Name as BlockFileName
FROM Remotevolume R1, Remotevolume R2, IndexBlockLink as L
WHERE R1.ID = L.IndexVolumeID AND L.BlockVolumeID = R2.ID
This will return the dindex <-> dblock filename associations, so you can look up the dblocks that Duplicati's logs list as missing and delete their associated remote dindex files.
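One hedged way to run that query, assuming the sqlite3 CLI is installed; the database path and filename are placeholders (the real path is shown on the job's Database page in the UI):

```
# Map each remote dindex file to the dblock file it describes.
sqlite3 "$HOME/.config/Duplicati/ABCDEFGHIJ.sqlite" \
  "SELECT R1.Name AS IndexFileName, R2.Name AS BlockFileName
   FROM Remotevolume R1, Remotevolume R2, IndexBlockLink L
   WHERE R1.ID = L.IndexVolumeID AND L.BlockVolumeID = R2.ID;"
```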
July 2017: Duplicati will not back up anything using the latest version. This error persists and is the only result I ever get on Windows.
In my experience it looks like Duplicati deletes index files, but does not for some reason remember that it has done so. I have been getting the "Found 1 files that are missing from the remote storage, please run repair" error every few weeks or so in a daily SFTP backup job.
I even tried to watch what is going on on the remote server, using inotify to monitor how the files come and go. Here is one example of a sequence (times in UTC):
[2017-06-19 05:01:06] CREATE duplicati-b06772e4cf293424e99a385a6ac4e4b47.dblock.zip.aes
[2017-06-19 05:01:06] CREATE duplicati-i81d708633edd4fce872a384c0aea4853.dindex.zip.aes
[2017-06-19 05:01:29] CREATE duplicati-20170619T045810Z.dlist.zip.aes
[2017-06-19 05:14:37] CREATE duplicati-b604a9e921fb34d30bcbbf9e864caa0e9.dblock.zip.aes
[2017-06-19 05:14:37] CREATE duplicati-i60e87f49bd764ba4958ecf0eeea6af9a.dindex.zip.aes
[2017-06-19 05:15:12] DELETE duplicati-b604a9e921fb34d30bcbbf9e864caa0e9.dblock.zip.aes
[2017-06-19 05:15:12] DELETE duplicati-i60e87f49bd764ba4958ecf0eeea6af9a.dindex.zip.aes
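For anyone wanting to reproduce this kind of trace, a hedged sketch of the monitoring setup (assuming inotify-tools is installed on the storage server; the watched directory is a placeholder):

```
# Print a timestamped line for every file created or deleted in the
# backup target directory.
inotifywait -m -e create,delete,moved_to,moved_from \
  --timefmt '%F %T' --format '[%T] %e %w%f' /srv/backups/duplicati/
```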
On the Duplicati side, I got these messages (times now in CEST, which is UTC+2):
19. jun. 2017 10:00: Message
Fatal error
Duplicati.Library.Interface.UserInformationException: Found 1 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
19. jun. 2017 10:00: Message
Found 1 files that are missing from the remote storage, please run repair
19. jun. 2017 10:00: Message
Missing file: duplicati-i60e87f49bd764ba4958ecf0eeea6af9a.dindex.zip.aes
19. jun. 2017 10:00: Message
removing file listed as Temporary: duplicati-i1ffc66dda9614592b98f150098cbef50.dindex.zip.aes
19. jun. 2017 10:00: Message
removing file listed as Temporary: duplicati-b308a93d25ea94132b3696f30038712a8.dblock.zip.aes
19. jun. 2017 10:00: Message
removing file listed as Deleting: duplicati-b604a9e921fb34d30bcbbf9e864caa0e9.dblock.zip.aes
19. jun. 2017 7:15: Message
Compacting because there are 1 fully deletable volume(s)
19. jun. 2017 7:14: Message
No remote filesets were deleted
So duplicati-i60e87f49bd764ba4958ecf0eeea6af9a.dindex.zip.aes was created, then deleted several seconds later, but the engine still expected it to be there?
In my case, repairing the database works, reinstating the file.
@matejart Great detective work! Does this happen only when the "compact" has run?
It seems very strange that it would upload a file, and then delete the same file afterwards...
It appears to delete both the dblock and the dindex file it just created. It is supposed to delete the older ones.
@kenkendk so far it appears that this happens during the compacting. Here is another example from two days ago (from oldest to newest):
31. jul. 2017 10:00: Message
removing file listed as Temporary: duplicati-bdd5fab61af5a46e58eb5230977a54cf4.dblock.zip.aes
31. jul. 2017 10:00: Message
removing file listed as Temporary: duplicati-i9143305d4e29498cb16c48e3d106c4ca.dindex.zip.aes
31. jul. 2017 10:01: Message
No remote filesets were deleted
31. jul. 2017 10:01: Message
Compacting because there are 1 fully deletable volume(s)
[...]
1. avg. 2017 10:00: Message
Missing file: duplicati-i2064be9259b9440494316e4a1e8a4a56.dindex.zip.aes
And this is what happened on the backup server (timestamps shifted by -2 hours):
[2017-07-31 08:00:57] CREATE duplicati-bd16ef8e2ed434835bf9c5a7a39d0edb1.dblock.zip.aes
[2017-07-31 08:00:57] CREATE duplicati-i2064be9259b9440494316e4a1e8a4a56.dindex.zip.aes
[2017-07-31 08:01:27] DELETE duplicati-bd16ef8e2ed434835bf9c5a7a39d0edb1.dblock.zip.aes
[2017-07-31 08:01:27] DELETE duplicati-i2064be9259b9440494316e4a1e8a4a56.dindex.zip.aes
The funny thing is, the "remote" log lists only the creations, but not the deletions:
31. jul. 2017 10:00: put duplicati-bd16ef8e2ed434835bf9c5a7a39d0edb1.dblock.zip.aes
{"Size":1085,"Hash":"oq4ciLiq3oIkiNuoBG58LUDXtlCrwWNn0EerpWUG03g="}
31. jul. 2017 10:00: put duplicati-i2064be9259b9440494316e4a1e8a4a56.dindex.zip.aes
{"Size":925,"Hash":"1+pbavTUjYUr6V2RE/XDcteYge/fip3JPw5dycDHXx4="}
1. avg. 2017 10:00: list
At some point, this issue caused a file to go missing that was somehow essential for every operation, so any backup or restore attempt failed. This forced me to start a new backup with the same settings as before. I am now more than 20 incremental backup runs in, and the issue has not recurred.
So it must have been something weird in the files that were stored in the backup.
I had the same issue: I had the wrong bucket name and key. Delete the entire backup, start over, double-check the bucket name, and copy a fresh B2 application key! Hope this helps someone. Here's the screenshot: https://prnt.sc/h0rh4s1
Same problem here with Duplicati 2.0.1.73 and Backup on Amazon Cloud Drive.
A couple of days ago I reinstalled my PC (Windows 7 x64), installed Duplicati, and imported my backup profiles. However, the same problem described in this thread appears for all my backup profiles. I use HubiC cloud storage. Is there anything I can do to help the developers debug this issue?
I'm seeing this as well. I've been trying to figure out my problem by posting to the forums at https://forum.duplicati.com/t/continually-needing-repair/6400/17
Tonight I finally got enough logging to show the file being deleted and why: it's the compacting that deletes the file, and the file is looked for later. My backup store is Nextcloud, so deleted files only go into the recycle bin; I can restore them from there, and the backups complete just fine after that.
I'm using Duplicati - 2.0.4.5_beta_2018-11-28 on Linux and seeing the same behavior on Windows.
This seems like a pretty critical bug to have in a backup system.
I'm seeing the exact same thing as @jpschewe. I'm running a few Duplicati instances on family computers: two Windows 10 laptops, my own Linux server, and an iMac. All back up to the same WebDAV-enabled ownCloud/Nextcloud account at Stack (TransIP). The funny thing is, only on the iMac does this seem to happen a lot.
I checked whether I could find the missing files on the remote location, but they were gone. Until I read @jpschewe's post I hadn't thought about the recycle bin. So I searched for the missing files in the recycle bin on the remote storage, and there they are. I can't remember the exact Duplicati version on that iMac right now, since it's my uncle's, but I do remember updating it to the latest beta at the beginning of this year. So it's either v2.0.5.1_beta_2020-01-18, or it could still be v2.0.4.23_beta_2019-07-14. I'd have to check and get back on this.
This issue has been mentioned on Duplicati. There might be relevant details there:
https://forum.duplicati.com/t/found-4-files-that-are-missing-from-the-remote-storage/3643/7
This old multi-party issue is hard to pull apart, but some recent posters might be seeing this issue:
403 error during compact forgot a dindex file deletion, getting Missing file error next run. #4129
I put quite a bit of information in the forum post Continually needing repair, where @jpschewe got far enough to ring bells and @D43m0n helped a bit more. The original post here is really old and the problem is kind of generic, but here are some other comments that remind me of the issue I cited:
After having faced this issue, I did a bit of digging. Here's what I think is happening.
- A dindex file is created remotely
- Its associated dblock file is not created remotely (interrupted?)
In my experience it looks like Duplicati deletes index files, but does not for some reason remember that it has done so.
so far it appears that this happens during the compacting.
Tonight I finally got enough logging to show the file being deleted and why. It's the compacting that is deleting the file and the file is being looked for later.
So I searched for the missing files in the recycle bin on the remote storage and there they are.
I won't repeat the whole linked forum post, but it gives ways to check whether you hit the linked issue, which looks like error handling during a compact causing Duplicati to roll back its DB transaction, thereby forgetting it had deleted some dindex files (the deletes are verifiable through logs and maybe recycle/trash bins).
This issue has been mentioned on Duplicati. There might be relevant details there:
https://forum.duplicati.com/t/found-4-files-that-are-missing-from-the-remote-storage/3643/8
I'm at the family member's iMac at the moment; the Duplicati version is 2.0.5.1_beta_2020-01-18.
I restored the missing files from the recycle bin at Stack, let Duplicati run, and presto! No more complaints about missing files, and a successful backup report was created. Backup retention is set to keep 100 backups; I haven't set any other specific or advanced options like "compact". So if a maximum number of backups implies compacting, then to me this confirms there is an issue where Duplicati deletes "redundant" or "unused" files during that cleanup but then forgets that it deleted them.
What would be a workaround? Setting a different retention scheme? Removing the storage limit altogether? I can try, of course, but I'm not always in a position to visit the family member and spend a day experimenting, especially since in a single day the backup doesn't see enough changes to trigger a cleanup against that 100-backup maximum. If someone with knowledge of the code or experience can give me a heads-up, much appreciated 👍
What would be a workaround?
Please look at the links given earlier for background. Without logs or the DB, it's not 100% clear that your issue is the known (but not yet fixed) bug involving a suspected transaction rollback due to a get error during compact. I linked other forum reports where a different error may have been the trigger.
If it's that bug, avoiding compact or avoiding errors might help. The option --no-auto-compact=true can prevent the usual (potential) compact, and keeping all versions means there is never a need to compact, BUT eventually a lot of space will be used and things will slow down. As a compromise, you could delete versions and then do a manual Compact now occasionally. If your network or destination occasionally shows RetryAttempts in the job's Complete log, you could try to fix that or increase --number-of-retries.
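As a concrete illustration of that advice, a hedged sketch of a command-line run with those options (the target URL and source path are placeholders; both options exist in Duplicati 2, but check your version's help output):

```
# Skip automatic compacting and ride out transient destination errors,
# which are the suspected triggers for the rollback described above.
duplicati-cli backup "webdavs://example.com/backups/job" /home/user/data \
  --no-auto-compact=true --number-of-retries=10
```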
Thanks @ts678. I've decided to run Duplicati on my own MacBook for the coming period with a maximum of 5 backups to keep; my theory is that a cleanup/compact should then happen often, and since I work on this MBP mostly daily, the source changes a lot. Doing this on my own machine should induce the same behavior, and if it can't be reproduced, it's hard to debug. It's also easier on my own MBP to gather logs and search through them 😄
I've been running Duplicati on my own MacBook for a short month now with only 5 backups to keep, but guess what... Now, on my own device I don't see the errors again. Go figure. Like making a dentist appointment for your aching tooth: while traveling to the appointment, the pain goes away...
Now, on my own device I don't see the errors again
If you want to encourage errors, reduce --number-of-retries. While running that way, I found that a file delete error can cause the same problem. I wrote up a load of other cases, including some from others.
403 error during compact forgot a dindex file deletion, getting Missing file error next run. #4129 should probably be renamed to mention delete errors too; even though my new case was a 403, some of the others were not.
I have this issue with duplicati-bin-2.0.5.111 and the Tardigrade backend. I have no logs, because trying to view the logs gives an SQLite error.
Fix missing file error caused by interrupted compact: before the dindex files are deleted on the remote, the state in the database is updated to Deleting. Previously, only dblock files were handled this way. This is fixed in 2.0.7.100_canary_2023-12-27.
Fix missing file error caused by interrupted compact, thanks @Jojo-1000 and @warwickmm
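For anyone verifying the fix, one hedged way to peek at the volume states the patch description refers to (assuming the sqlite3 CLI; the database path and filename are placeholders, and the state names are taken from the description above):

```
# List remote volumes not in the settled 'Verified' state; after the fix,
# a dindex being removed by an interrupted compact should show up here as
# 'Deleting' rather than being forgotten and later reported missing.
sqlite3 "$HOME/.config/Duplicati/ABCDEFGHIJ.sqlite" \
  "SELECT Name, Type, State FROM Remotevolume WHERE State <> 'Verified';"
```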
Some users will likely wish to wait for the next beta, but there's probably no point in posting more reports against releases without this fix. Let me put out a last call for any comments (especially if you've chosen to test the canary). If nothing more comes in, the issue will close.
This issue is stale because it has been open for 15 days with no activity.
This issue was closed because it has been inactive for 15 days since being marked as stale.
I'm on Arch Linux, here is the error I saw in the logs: