As the channel I added is way too big, I wanted to delete it.
Unfortunately, TubeSync seems to constantly time out. How can I increase the timeout? Or is there a manual way to delete a source?
Currently, editing or deleting a channel has no choice but to recursively check the media associated with a source one item at a time, which is why it's slow. For example, if you change the date range or quality requirements, each media item needs to be individually checked to make sure it's still valid; for deletion, each item's thumbnail has to be checked and removed, and the media server notified that the item is gone. There is some ongoing work to make this more sensible in #94, but that is now pending replacing the worker library entirely.
You can manually delete the source by UUID in the database, which would do what you want, and then delete any media items in the database which are linked to the source UUID. This would leave any downloaded thumbnails or media kicking about on disk for you to sort out manually, though.
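For example, something like the following on the SQLite console (a rough sketch only: the table and column names are assumptions based on TubeSync being a Django app named "sync", and the database path depends on your setup):
$ sqlite3 /config/tubesync.db "DELETE FROM sync_media WHERE source_id = 'source-uuid-here';"
$ sqlite3 /config/tubesync.db "DELETE FROM sync_source WHERE uuid = 'source-uuid-here';"
Delete the media rows first so nothing is left referencing the source.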
If you want to add large channels, I would also strongly advise using PostgreSQL as the back-end if you aren't already.
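The back-end is switched with a connection string in an environment variable on the container, along the lines of the following (a sketch: the variable name is covered in the database documentation linked further down, and the user, password, host and database name here are placeholders):
$ docker run ... -e DATABASE_CONNECTION=postgresql://tubesync:password@your-db-host:5432/tubesync ...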
Pending the major refactoring work, I've just pushed a couple of quick helper commands to delete massive channels on the command line. Wait for the image to build, then pull :latest.
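For example (image name taken from the project README):
$ docker pull ghcr.io/meeb/tubesync:latest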
Then you should be able to use something like:
$ docker exec -ti tubesync python3 /app/manage.py list-sources
Note the source UUID and then use:
$ docker exec -ti tubesync python3 /app/manage.py delete-source --source source-uuid-here
It has no confirmation, so make sure you get the UUID right. It'll take some time to delete extremely large sources.
Wow, thank you for the very quick response. That is incredible.
And it worked. I pulled the latest image, rebuilt the container, and then could delete the source. It ran for almost 10 minutes. I assume that did not delete the thumbnails, no?
Is there an easy way to migrate from the standard DB to PostgreSQL?
That should have deleted the thumbnails and media and cleaned everything up properly.
As for migrating, there's a guide here:
https://github.com/meeb/tubesync/blob/main/docs/other-database-backends.md
Basically, you're just using Django's built-in "dumpdata" to dump from SQLite into JSON, swapping over the back-end database connection, and then using "loaddata" to import the JSON dump into Postgres. Keep the SQLite database backed up though, as you might well encounter issues.
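In outline, the steps look something like this (a sketch only: dumpdata and loaddata are standard Django management commands, but the exact file paths and steps are in the guide above):
$ docker exec tubesync python3 /app/manage.py dumpdata > tubesync-dump.json
Switch DATABASE_CONNECTION over to the new database, restart the container so the schema gets migrated, put the dump somewhere the container can read it (such as a mounted volume), and then:
$ docker exec tubesync python3 /app/manage.py loaddata /config/tubesync-dump.json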
Yes, I just saw at the end of the messages that it did delete the thumbnails; I missed that at first. I managed to migrate the DB to MySQL, which I already had running in another Docker container. The SQLite DB was already 1.2GB, so no wonder it was struggling. The migration worked quite well, although the user needs superuser permission on the server to successfully load the data, which is quite strange. And I am amazed: the performance is now like night and day. Previously the system was quite slow and took a few seconds to load any page, while now it shows pages almost immediately. Thanks again for the very quick and perfect support.
No problem. SQLite works really well... until the database gets quite busy, at which point you get write locks and it starts to trudge a bit. MySQL can have issues of its own because it doesn't handle very long-running database connections very well, or at all, so if after a day you get "MySQL server has gone away" errors, please open an issue with the details.
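If that does happen, one quick thing to check (a hedged suggestion; the container name here is a placeholder for however you run MySQL) is the server's idle-connection timeout, which is what usually drops long-lived connections:
$ docker exec -ti your-mysql-container mysql -u root -p -e "SHOW VARIABLES LIKE 'wait_timeout';"
Idle connections older than wait_timeout seconds are closed by the server, which then surfaces as the "gone away" error on the next query.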
MatthK closed this 3 years ago.