stoiven closed this issue 2 years ago
Hi! Are you sure this works as expected when you restore the dump with mysql manually? pynonymizer really isn't doing much other than piping the dumpfile into the mysql binary.
Normally, a broken pipe on the mysql CLI relates to the server's ability to handle large statements or a lot of data at once, e.g. `max_allowed_packet` or similar settings.
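As a quick check, you can inspect (and, if needed, raise) that server setting. This is a sketch: the host, user, and the 1G value below are placeholder assumptions, not recommendations.

```shell
# Show the current limit (connection details are hypothetical):
mysql -h db.example.com -u myuser -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"

# To raise it, set a larger value in the server's my.cnf and restart, e.g.:
# [mysqld]
# max_allowed_packet=1G
```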
Ah, you're right! It was definitely the input: the dump was actually a PostgreSQL custom-format database dump (binary), so it couldn't be read properly. Sorry about that!
Additionally, I don't see an option to read from a custom-format dump. I've tried renaming it with a .gz extension, but it still outputs the same error as above. The file itself is a custom-format archive, and I'm not sure whether this is a pynonymizer limitation or whether there are special flags I need to pass through?
pynonymizer is written for logical (plain SQL) database dumps, which it pipes through the CLI runner (for postgres, psql). If custom-format archives are something we could be supporting, I'd be interested to hear a feature request!
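A quick way to tell the two formats apart before feeding a file to pynonymizer: pg_dump custom-format archives begin with the magic bytes `PGDMP`, while logical dumps are plain SQL text. The helper name below is hypothetical, just a sketch of the check:

```python
def is_custom_format(path):
    """Return True if the file looks like a pg_dump custom-format archive.

    Custom-format archives begin with the magic bytes b"PGDMP"; plain
    logical dumps are SQL text, which is what pynonymizer expects.
    """
    with open(path, "rb") as f:
        return f.read(5) == b"PGDMP"
```

If the check is True, `pg_restore -f plain.sql dump.custom` can convert the archive back to a plain SQL dump that pynonymizer can pipe through psql.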
**Describe the bug** Running the pynonymizer app, it automatically crashes if we use a big file, e.g. >15GB.
**To Reproduce** Running the program with a large file will output the below:
**Expected behavior** Should act as normal. This works fine with a file that's <5GB.