Closed: shon-otmazgin closed this issue 4 years ago
I hadn't thought of that. I'll work on adding `dumps` and `loads` when I have some time available next week.
GREAT!! Here is a workaround:

```python
import io

with io.BytesIO() as f:
    compress_pickle.dump(obj, f, compression=compression, set_default_extension=False)
    f.seek(0)
    # now f can be written to a remote FS
```

However, it only works for `lzma`, `gzip`, and `bz2` compression, and does not support `None` or `pickle`.
@shon-otmazgin, I've released v1.1, which implements `dumps` and `loads`, and also adds support for any file-like object (for example an `io.BytesIO` or a communication pipe). You can upgrade and try it out.
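For context, the file-like support means the whole round-trip can happen in memory. Here is a stdlib-only sketch of the equivalent behavior (gzip + pickle) that such a call performs, with `obj` standing in for any hypothetical picklable payload; it is not compress_pickle's actual implementation:

```python
import gzip
import io
import pickle

# Hypothetical payload; stands in for any picklable object.
obj = {"a": [1, 2, 3], "b": "text"}

# Write: roughly what dumping with compression="gzip" into a BytesIO does.
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
    pickle.dump(obj, gz)

# Read back from the same in-memory buffer.
buf.seek(0)
with gzip.GzipFile(fileobj=buf, mode="rb") as gz:
    restored = pickle.load(gz)

assert restored == obj
```

The buffer's contents (`buf.getvalue()`) are plain bytes, so they can be handed to any storage backend that accepts a byte stream.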
@lucianopaz THANKS!! I referred to the docs:

```python
>>> from compress_pickle import dumps, loads
>>> obj = " ".join(["content"] * 100)
>>> b = dumps(obj, compression="gzip")
>>> b
b'\x1f\x8b\x08\x00H;\xc9]\x02\xffj`\x99\x1a\xcc\x00\x01=\xfe\xc9\xf9y%\xa9y%\nT\xa2\xa7\xe8\x01\x00\x00\x00\xff\xff\x03\x00\x9e\x98\xd6$^\x00\x00\x00'
>>> loads(b)
'content content content content content content content content content content'
```
Is the `compression` parameter optional for `loads`? Can I `dumps` with any type of compression package and `loads` it without specifying the compression package?
No, that's a typo in the docs; I'll fix it now. The compression is only inferred when you `dump` to a path-like object, and it is done by trusting the path's extension. On the other hand, you can `dumps` and `loads` using any compression protocol, as long as you explicitly set the compression that was used.
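In other words, an in-memory blob carries no file extension to infer from, so the reader must name the compression explicitly. A stdlib sketch of that symmetry (bz2/lzma + pickle; the dict names here are illustrative, not compress_pickle's API):

```python
import bz2
import lzma
import pickle

obj = ["payload"] * 3

# Like dumps(obj, compression=...): pickle, then compress, all in memory.
blobs = {
    "bz2": bz2.compress(pickle.dumps(obj)),
    "lzma": lzma.compress(pickle.dumps(obj)),
}

# A bytes blob has no extension, so the reader must be told which
# decompressor to use -- mirroring loads(b, compression=...).
decompress = {"bz2": bz2.decompress, "lzma": lzma.decompress}
for name, blob in blobs.items():
    assert pickle.loads(decompress[name](blob)) == obj
```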
@lucianopaz All compression packages work as expected, this is a great enhancement!! I didn't test the file-like object with `dump` and `load`, because `dumps` and `loads` handle it flawlessly.
Hello, the package is very useful. However, if you want to use file storage other than the local FS, such as GridFS or any remote FS, you need `dumps` and `loads` functionality for in-memory compression.