Closed noisydeadlines closed 1 year ago
I suspect this is happening because of a timeout; it was reported earlier and self-resolved without me doing anything.
Hi! I tried exporting it again twice, I still get the same empty file every time.
I can confirm this issue was happening with v0.4.3 and is still not working with v0.4.4. I was trying to export on the BookWyrm Social instance.
I can confirm I'm getting the exact same thing too. Tried to export twice and spoke with @mouse-reeve who pointed to this issue
Same here, and it's annoying. The only reason I wanted this is to migrate to another server: the one I'm on, https://bookwyrm.social/, isn't being taken care of and is down more than it's up for many people. I've been hitting issues for more than 9 months. I can't edit a group list some of us made: it works when I'm not logged in, but as soon as I log in it breaks with an nginx bad gateway, which usually happens when the service is unresponsive or the container is low on memory. I got tired of waiting for the admin to reply to us; I even offered to let me ssh in so I could fix it myself (I've been a systems engineer for 24 years), and I sent screenshots 2-3 times to help troubleshoot. I waited 9 months because I know this is volunteer work and not a job, but here's the point: I tried to get the CSV to migrate to another instance, and it's empty :( I am on Arch GNU/Linux and tried with qutebrowser, firefox, and chromium.
@r3k2 -- I just wanted to acknowledge how frustrated and unheard it sounds like you're feeling with this. It sounds like you've been impacted by bugs that interfere with basic usability, and you aren't getting a response that makes you feel like the problems you're encountering are being acknowledged and taken seriously. I imagine it's just as frustrating for everyone who's been reporting and +1'ing these as well!
When I get bug reports that I have trouble replicating and am not sure how to fix, I often find it overwhelming and I'm not always sure how to get outside help in fixing them. As a result, there are some bugs, like the ones you've encountered, that I've been aware of but pretty stuck about how to address, and as a result they languish in the issue tracker. The scale and demands of the project have gone way up and my capacity to fix bugs and bring in support from others hasn't kept pace. I find that really daunting and a little scary. So given that, it's super generous of you to have offered your expertise to help fix it, and I apologize for missing that you made that offer; although I try my best, I still fail to notice or forget things.
I believe that the source of this specific bug is that the server is overloaded and the query is timing out and returning an empty csv. I have been focusing (with some greatly appreciated help) for the last couple weeks to try to improve the performance situation overall, which is related but doesn't directly address this specific thing not working. I'd be really open to suggestions on how a data export can be compiled in a way that is robust to server load and timeouts.
It will also need some expanding to be an effective migration utility, but that's very doable.
I'm also frustrated by this bug, and I'm wondering if the solution is to make the export asynchronous. I have two examples in mind which might be useful as patterns:
LibraryThing: when I exported from there to import to Bookwyrm, clicking the "Export all books" button didn't immediately give me a link; instead it showed some kind of "in progress" feedback. Trying it now, I got a blue bar that says "nnn books processed", updating every 100 books. It took a few minutes to do my whole library, and then was replaced with a link in the format https://www.librarything.com/download_export_file.php?uniqueId=HexadecimalIDHere . Presumably that hex ID is a lookup in a table that connects to an actual file; clicking the link gets me a file with my username in the file name.
Allen Coral Atlas: I actually maintain the downloads system for this one. Downloads can take a couple of hours to assemble in the worst case, so we pre-package common requests and then have a fully asynchronous system for everything else. It goes like this:
The ACA version is probably overkill here! But something like what LibraryThing does, perhaps also generating a notification that contains the link, seems like it could work.
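To make the LibraryThing-style flow concrete, here is a minimal, framework-free sketch of the state machine it implies: kick off an export job, process it in chunks (updating a "nnn books processed" counter), and hand back an opaque hex ID for the download link when it finishes. All names here (`ExportJob`, `process_chunk`, and so on) are illustrative and not part of BookWyrm's codebase.

```python
import secrets

class ExportJob:
    """Sketch of an asynchronous export job with progress and a download token."""

    def __init__(self, total_books):
        self.total_books = total_books
        self.processed = 0
        self.download_id = None  # set only once the export is complete

    def process_chunk(self, size=100):
        """Process up to `size` books; return True while work remains."""
        self.processed = min(self.processed + size, self.total_books)
        if self.processed == self.total_books:
            # An opaque hex token that a download URL could look up in a table
            # mapping tokens to finished files, like LibraryThing's uniqueId
            self.download_id = secrets.token_hex(16)
        return self.processed < self.total_books

job = ExportJob(total_books=363)
while job.process_chunk():
    pass  # a real UI would refresh the "nnn books processed" bar here

print(job.processed)          # 363
print(len(job.download_id))   # 32 hex characters
```

In a web app the loop would live in a background worker, with the request handler only creating the job and a status endpoint reporting `processed` until `download_id` appears.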
Hi, I'm @eldang 's spouse and I told him that I would take a crack at fixing this bug as a Christmas gift. Luckily, I do a lot of work on Django sites, so it's like... actually realistic of me to offer this.
I haven't done a lot of digging here yet, so I might be way off-base, but something like https://pypi.org/project/django-import-export-celery/ might go a long way toward helping make this happen.
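As a rough illustration of what moving the export into a background task could look like: the function below builds the whole CSV outside the request cycle and returns it as text, which a worker could then store and link to. The `@app.task` decorator is shown only as a comment so the snippet runs without Celery installed; the column names and function name are assumptions, not BookWyrm's actual schema.

```python
import csv
import io

# @app.task  # in real Celery code this would run in a worker process
def export_user_books(rows):
    """Build the full CSV in one go, off the request/response cycle."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["title", "author", "rating"])  # assumed columns
    for row in rows:
        writer.writerow(row)
    return buffer.getvalue()

csv_text = export_user_books([("Dune", "Frank Herbert", 5)])
print(csv_text.splitlines()[0])  # title,author,rating
```

The point of the pattern is that a slow query can no longer time out the HTTP response, because the response only enqueues the task and the CSV is fetched later.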
I think what I'm likely to do is start with something tagged as "good first bug" to get my feet wet in this repo, too.
So... hi! Hope I can help!
Thank you! Help on this ticket would be super appreciated, and I agree that using celery seems like the right approach. I'd be happy to work with you and provide whatever help and explanations of weird codebase things I can.
@nein09, any luck with `django-import-export-celery`? I'd love to learn about how you're approaching this challenge.
@todrobbins I decided to start with what looks like a smaller piece of work first (https://github.com/bookwyrm-social/bookwyrm/issues/1678), and life has been getting in the way of even that, so I'm not quite there yet. Sadly, that means there isn't much to report so far, but I haven't forgotten about it.
Hi! As a note on possible causes here: I was able to get an export from my own bookwyrm instance just fine, but the export I got from bookwyrm.social was blank.
I made some interesting progress on this today. I've been playing with our gunicorn setup (since I'm pretty sure it's responsible for many of the loading delays on bookwyrm.social recently), and I bumped the timeout up to 600 seconds to test this (also increasing the timeout on the nginx side).
This prevents the timeout, but there's still a problem — psycopg2 reports that the disk is full:
And indeed, running `watch -n1 df -h` while a CSV export is running shows that it quickly eats up all the remaining space on the server (which is promptly returned when the query fails).
Using the `log_temp_files` Postgres setting, it seems that this is from creating many temporary files of around 200MB each, rather than a single enormous temporary file.
It looks to me like what's happening is that one of the queries in the `for book in books:` loop in `csv_row_generator` is making a 200MB temp file, and the results of the query are being kept around until the generator is fully read (maybe? I don't have a strong picture of how generators work, nor the Django ORM).
While building this export in Celery is clearly the best long-term solution, I do think that there's potential to rewrite the export code to fix this in the short term. I'm not sure why a generator was initially used (probably to save memory by avoiding storing the entire CSV in memory?), but unrolling it into a loop that builds the CSV in memory, rather than keeping the Query objects around and converting them to CSV one line at a time, would probably get exports working on bookwyrm.social for the moment.
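A hedged sketch of that "unrolling" idea: evaluate everything up front and build the CSV in memory, instead of a generator that holds live querysets open for the whole life of the streaming response. The function and field names here are illustrative only, not BookWyrm's actual export code.

```python
import csv
import io

def build_export_csv(books):
    """Build the entire CSV eagerly and return it as a string."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["title", "isbn", "shelf"])  # assumed columns
    for book in books:
        # In real Django code, wrapping each queryset in list(...) before this
        # loop would force it to evaluate immediately, letting Postgres drop
        # its temp files instead of keeping them open per row
        writer.writerow([book["title"], book["isbn"], book["shelf"]])
    return buffer.getvalue()

sample = [{"title": "Dune", "isbn": "9780441013593", "shelf": "read"}]
print(build_export_csv(sample).splitlines()[0])  # title,isbn,shelf
```

The trade-off is holding the whole CSV in application memory, which is presumably what the original generator was avoiding, but for per-user book lists that should be far smaller than the 200MB-per-query temp files described above.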
Looks like #2713 didn't have exactly the effect I was hoping for: somehow the same disk space problem is triggered. Not sure if it's that I don't understand when Django `QuerySet`s are cleaned up, when Postgres temp files are cleaned up, or something else. Will continue investigating.
The fact that we're seeing a 500 error rather than an empty file is an improvement, I think!
Yeah, that's true :)
My guess is that what's happening is the `books` query is trying to materialize tables that are larger than `work_mem` (there is a `Materialize` node in the `EXPLAIN` output), and those thus get written to disk. Looking at the `EXPLAIN` output for the books query:
books = (
    models.Edition.viewer_aware_objects(request.user)
    .filter(
        Q(shelves__user=request.user)
        | Q(readthrough__user=request.user)
        | Q(review__user=request.user)
        | Q(comment__user=request.user)
        | Q(quotation__user=request.user)
    )
    .distinct()
)
It seems like the culprit and solution will likely be similar to #2725 / #2726, although presumably more complicated, since we'll need to figure out how to handle duplicates. If we were writing raw SQL it would be pretty easy to use CTEs to select all five of those as individual queries and then `DISTINCT` them at the end, but it looks like Django can't generate CTEs without using libraries like django-cte, which I'm not totally sure is worth the effort versus making five queries and then sticking them in a `set` or something.
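The "five queries then a set" alternative can be sketched in plain Python: run each relation's query separately and deduplicate edition IDs application-side, so Postgres never has to materialize the combined OR join. The helper name is hypothetical; the commented queryset shows roughly what each input iterable would be in real Django code.

```python
def collect_edition_ids(per_relation_results):
    """Deduplicate edition IDs gathered from several independent queries.

    per_relation_results: one iterable of edition IDs per relation
    (shelves, readthrough, review, comment, quotation).
    """
    seen = set()
    for ids in per_relation_results:
        seen.update(ids)  # the set replaces SQL's DISTINCT
    return sorted(seen)

# In real Django code each input would be something like:
#   models.Edition.objects.filter(review__user=user).values_list("id", flat=True)
ids = collect_edition_ids([[1, 2], [2, 3], [3], [], [1, 4]])
print(ids)  # [1, 2, 3, 4]
```

Each of the five queries then stays simple enough to fit in `work_mem`, at the cost of a second round trip per relation and an extra `id__in` query to fetch the deduplicated editions.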
Just FYI, #2741 hasn't been deployed to bookwyrm.social yet, so CSV exports will still fail there until that's deployed — I'll comment in this issue when it is, so that anyone following this will get a notification.
This should now be fixed on bookwyrm.social! I was able to export my own CSV history, albeit a small one. Please open a new issue if you have any problems with the CSV export.
@Strubbl @todrobbins FYI
Thank you for your work on this, @WesleyAC! I'll test a bookwyrm.social export right now.
@WesleyAC Hooray! Thank you for fixing this. Export from bookwyrm.social just worked smoothly (and fairly quickly) for me, and it looks like importing that to books.theunseen.city (which is on v0.6.0) is also working.
Export worked perfectly and pretty quickly (~3s/43.1 KB/363 rows):
I can confirm it worked for me too, thank you so much!
**Describe the bug**
When using the CSV data export the resulting file is empty

**To Reproduce**
Steps to reproduce the behavior:

**Expected behavior**
CSV with data from the books

**Screenshots**

**Instance**
BookWyrm Social

**Desktop (please complete the following information):**