arsenetar / dupeguru

Find duplicate files
https://dupeguru.voltaicideas.net
GNU General Public License v3.0
5.48k stars 418 forks

sqlite3.OperationalError: database is locked #1172

Open bmartinmd opened 1 year ago

bmartinmd commented 1 year ago

Error message

Application Name: dupeGuru
Version: 4.3.1
Python: 3.10.12
Operating System: Linux-6.2.0-35-generic-x86_64-with-glibc2.35

Traceback (most recent call last):
  File "/usr/share/dupeguru/hscommon/gui/progress_window.py", line 111, in pulse
    self._finish_func(self.jobid)
  File "/usr/share/dupeguru/core/app.py", line 300, in _job_completed
    fs.filesdb.commit()
  File "/usr/share/dupeguru/core/fs.py", line 176, in commit
    self.conn.commit()
sqlite3.OperationalError: database is locked

This error occurs after the contents scan completes, while I am preparing to mark and then delete duplicate files.
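For reference, the same lock can be reproduced outside dupeGuru with the standard-library sqlite3 module (a minimal illustrative sketch, not dupeGuru's own code): a second connection that tries to write while another connection holds the write lock fails exactly like the traceback above.

```python
import os
import sqlite3
import tempfile

# Two connections to the same database file. While "writer" holds the
# write lock, a write on "other" raises OperationalError: database is locked.
path = os.path.join(tempfile.mkdtemp(), "files.db")

writer = sqlite3.connect(path, timeout=0)  # timeout=0 -> fail immediately
other = sqlite3.connect(path, timeout=0)

writer.execute("CREATE TABLE files (path TEXT, md5 BLOB)")
writer.commit()

writer.execute("BEGIN IMMEDIATE")  # take the write lock and hold it
writer.execute("INSERT INTO files VALUES ('/a', x'00')")

err = ""
try:
    other.execute("INSERT INTO files VALUES ('/b', x'01')")
    other.commit()
except sqlite3.OperationalError as exc:
    err = str(exc)

print(err)  # database is locked
writer.commit()
```

Ten instances all writing to the same cache database file will hit this as soon as their commits overlap.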

Steps to reproduce the behavior:

Open 10 separate instances of dupeGuru with the shell command:

sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru & sudo dupeguru &

For each instance, set dupeGuru to scan folders containing specific file types located on a ZFS array configured as 24x16TB disk drives in a draid2 array with one spare. The array contains over 200,000,000 files. Use a contents scan.

Expected behavior No error messages.



Additional context: I run multiple instances because dupeGuru does not appear to take advantage of multiple cores for parallel operations.

glubsy commented 1 year ago

From what I recall, dupeguru is not designed to be instantiated more than once. You could try using shelve instead of sqlite (check settings) but I doubt this will help much.
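Another thing that might be worth trying (my own suggestion, not a dupeGuru setting): SQLite itself can be made much more tolerant of concurrent access with WAL mode and a busy timeout, so that a connection waits for the lock instead of failing immediately. A minimal sketch with plain sqlite3:

```python
import os
import sqlite3
import tempfile

# Sketch only: these are standard SQLite pragmas, not dupeGuru options.
path = os.path.join(tempfile.mkdtemp(), "files.db")
conn = sqlite3.connect(path, timeout=30)  # wait up to 30 s for a lock

# WAL mode lets readers proceed while one writer commits.
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
# busy_timeout (ms) makes SQLite retry instead of raising immediately.
conn.execute("PRAGMA busy_timeout=30000")

print(mode)  # wal
```

This would not remove the fundamental single-writer limit, so ten instances committing at once could still time out eventually.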

FredWahl commented 10 months ago

Previous versions of dupeguru could be instantiated more than once. I think a solution would be to add ProcessID as a column in the database.
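A hypothetical sketch of that schema change (the table and column names here are made up for illustration, not dupeGuru's actual fs.py schema): each instance tags the cache rows it writes with its own PID and only reads back its own rows.

```python
import os
import sqlite3

# Hypothetical schema: tag each cached hash row with the writing process's PID
# so concurrent instances keep their cache entries separate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (path TEXT, md5 BLOB, pid INTEGER)")
conn.execute(
    "INSERT INTO files VALUES (?, ?, ?)",
    ("/data/a.bin", b"\x00", os.getpid()),
)

# Each instance filters on its own PID.
rows = conn.execute(
    "SELECT path FROM files WHERE pid = ?", (os.getpid(),)
).fetchall()
print(rows)  # [('/data/a.bin',)]
```

One caveat worth noting: SQLite's write lock covers the whole database file, so a pid column alone would keep instances' rows distinct but would not by itself remove the lock contention that produces this error.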