theoparis closed this issue 3 years ago
Seems like the solution is to use:

    let data = ""
    const stream = {
        write(chunk) {
            data += chunk.toString()
        }
    };

instead of using res.write(), and then in the .export().then handler simply use res.send(data).
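For context, here is a minimal, self-contained sketch of how that buffering workaround could sit in an Express route. The route path and setup are my assumptions for illustration, not the actual acebase-server source; only the stream object mirrors the snippet above.

```js
// Sketch of the buffering workaround: collect the whole export in memory,
// then answer with a single res.send() so headers are only sent once.
const express = require('express');
const { AceBase } = require('acebase');

const app = express();
const db = new AceBase('mydb'); // example database name

app.get('/export/*', async (req, res) => {
    let data = '';
    const stream = {
        write(chunk) {
            data += chunk.toString(); // accumulate every exported chunk
        }
    };
    try {
        await db.ref(req.params[0]).export(stream, { format: 'json' });
        res.setHeader('Content-Type', 'application/json');
        res.send(data); // single write, so the status code can still be changed before this
    } catch (err) {
        res.statusCode = 500; // safe here: nothing has been written to the response yet
        res.send(err.message);
    }
});

app.listen(3000);
```

The obvious trade-off is that the entire export is held in memory before anything reaches the client.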
I'm making a fork with updated dependencies and the latest Node.js in the Dockerfile. It also fixes this issue. I can open a PR as well.
Thanks, I'll take a look at it. Your proposed fix does not stream the output though, which is problematic when exporting large nodes!
I've found what might be causing this. If there is an error while streaming the output, it does res.statusCode = 500 in ref.export's catch handler, which is obviously not allowed because the response has already started streaming data (so the headers are already sent). The error triggering this issue is probably a timeout while getting all the data of a large node (or the entire db?). How large is the data you are trying to export?
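To make the constraint concrete, here is a tiny standalone demo of the underlying Node behaviour (plain Node http, not the acebase-server code): once the response has started streaming, the status line and headers are already on the wire, so any later attempt to change them fails.

```js
// Demo of the "headers already sent" constraint in plain Node.
const http = require('http');

http.createServer((req, res) => {
    res.write('streaming some data...'); // first write flushes "200 OK" + headers
    try {
        res.writeHead(500); // what an error handler effectively tries to do afterwards
    } catch (err) {
        console.error(err.code); // ERR_HTTP_HEADERS_SENT: cannot write headers after they are sent
        res.end();
    }
}).listen(3000);
```

In an Express handler the symptom is the familiar "Cannot set headers after they are sent to the client" error; a common guard is to check res.headersSent before touching the status code.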
I've fixed it by changing the stream to:

    const stream = {
        write(chunk) {
            res.write(chunk);
        }
    };

Commit will follow later today.
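For anyone patching locally before upgrading, a rough sketch of how that streaming wrapper fits into a route handler, keeping the output streamed and guarding the error path with res.headersSent (the route shape and the guard are my assumptions, not necessarily the exact shipped code):

```js
// Sketch: forward each export chunk straight to the response, no buffering,
// and only change the status code if nothing has been sent yet.
const express = require('express');
const { AceBase } = require('acebase');

const app = express();
const db = new AceBase('mydb');

app.get('/export/*', (req, res) => {
    const stream = {
        write(chunk) {
            res.write(chunk); // stream each chunk as it arrives
        }
    };
    db.ref(req.params[0]).export(stream, { format: 'json' })
        .then(() => res.end())
        .catch(err => {
            if (!res.headersSent) {
                res.statusCode = 500;
                res.send(err.message);
            } else {
                res.end(); // response already committed; just terminate the stream
            }
        });
});

app.listen(3000);
```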
Committed and published in acebase-server v1.2.3.
Hi there. Thanks for making acebase, it's amazing. I just ran into an issue when I click the "You can export the value of this node to json" button in the admin UI. This occurs in the acebase-server file on line 2162. Everything else, such as logging in and navigating, seems to work fine.
Also, not sure if this is a super useful idea, but it would be nice if acebase-server could be used as middleware, so you could mount it in an existing Express app instead of it creating a new one and listening on a different port.
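To illustrate what I mean, here is a purely hypothetical sketch; createAceBaseRouter does not exist in acebase-server today, it just stands in for the kind of export the middleware idea would need:

```js
// Hypothetical middleware-style usage: mount the AceBase routes on an existing
// Express app instead of letting acebase-server own its own server and port.
const express = require('express');

// Stand-in for a (non-existent) factory that would return an express.Router
// carrying the AceBase REST, auth and admin-UI routes.
function createAceBaseRouter(/* options */) {
    const router = express.Router();
    router.get('/info', (req, res) => res.json({ status: 'placeholder' }));
    return router;
}

const app = express(); // the application that already exists
app.get('/health', (req, res) => res.send('ok'));

app.use('/acebase', createAceBaseRouter()); // same app, same port, no second listener

app.listen(3000);
```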