Closed jlevy44 closed 5 years ago
Did you check out: https://stackoverflow.com/questions/51562221/python-multiprocessing-overflowerrorcannot-serialize-a-bytes-object-larger-t
Consider modifying your code so that each process shares less data. For example, if you have static data or a static model, you can expose it as a module-level global visible to all workers and send each worker only the small, task-specific data it needs.
Closing this for now, but feel free to re-open if needed.
Seems like I'm having problems pickling these large files for running multiprocessing jobs:
OverflowError: cannot serialize a bytes object larger than 4 GiB
Any idea what I should do here, or should I just run in series? In that case, would MP be deactivated? I'll check the code.
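For context on where the error comes from (as discussed in the Stack Overflow link above): pickle protocols up to 2 cannot serialize a bytes object larger than 4 GiB, while protocol 4 (Python 3.4+) lifts that limit. A small illustration, using a tiny payload as a stand-in for the large one:

```python
import pickle

# Stand-in for a large bytes payload; protocols <= 2 cap serialized
# bytes objects at 4 GiB, protocol 4 removes that limit.
data = b"x" * 100
blob = pickle.dumps(data, protocol=4)
restored = pickle.loads(blob)
```

Whether multiprocessing itself uses protocol 4 depends on the Python version, which is why restructuring the code to share less data (as suggested above) is often the more robust fix.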