Closed: Fangyh09 closed this issue 4 years ago
Sorry, can you explain in more detail what happened?
Please refer to https://github.com/Fangyh09/Image2LMDB. I solved the other small problems.
--- old reply --- I fixed it by using

import pyarrow as pa

def loads_pyarrow(buf):
    """
    Args:
        buf: the output of `dumps`.
    """
    return pa.deserialize(buf)
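For context, pa.deserialize pairs with pa.serialize, which produces the buffer that loads_pyarrow reads back; note that this pyarrow serialization API has since been deprecated in newer pyarrow releases. A minimal stdlib sketch of the same round trip, using pickle as a stand-in for pa.serialize/pa.deserialize (the function names here are illustrative, not from the repo):

```python
import pickle

def dumps_pickle(obj):
    # Stand-in for the thread's `dumps` (pa.serialize(obj).to_buffer()):
    # turn an arbitrary Python object into bytes for storage in LMDB.
    return pickle.dumps(obj)

def loads_pickle(buf):
    # Inverse of dumps_pickle, mirroring loads_pyarrow above.
    return pickle.loads(buf)

record = {"label": 3, "image": b"\x89PNG\r\n..."}
buf = dumps_pickle(record)
restored = loads_pickle(buf)
assert restored == record
```

Whatever serializer is used, the only contract is that loads is the exact inverse of dumps, so the two must come from the same library version.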
Thanks for the code! It's awesome.
Glad to see you resolved the issue yourself : )
Packing separate images into a single LMDB file helps when disk I/O is the bottleneck. If one day you find that CPU utilization becomes the bottleneck, you should take a look at https://github.com/NVIDIA/DALI
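The packing idea can be sketched with a stdlib key-value store; this is only an illustration of the pattern, assuming dbm as a stand-in for lmdb (the real lmdb API opens an environment and reads inside a transaction via env.begin()/txn.get()):

```python
import dbm.dumb
import os
import pickle
import tempfile

# Pack many small records into one database file instead of one
# file per image; dbm here stands in for lmdb.
tmpdir = tempfile.mkdtemp()
db_path = os.path.join(tmpdir, "images")

# Write phase: one key per image.
with dbm.dumb.open(db_path, "c") as db:
    for i in range(3):
        db[f"img-{i:04d}".encode()] = pickle.dumps(
            {"label": i, "data": bytes([i]) * 8}
        )

# Read phase: a Dataset __getitem__ would perform exactly this lookup.
with dbm.dumb.open(db_path, "r") as db:
    sample = pickle.loads(db[b"img-0001"])

assert sample["label"] == 1
```

The win is that the OS opens one file once, instead of paying open/stat/seek costs for every image on every epoch.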
Thanks a lot 👍
Hi, it seems I got the same problem too. Could you tell me how to solve it? I don't know where to use the function loads_pyarrow(). Thanks.
@dreamcontinue Hi, you can try this https://github.com/Fangyh09/Image2LMDB.
Thank you for your reply. I solved it, but found that the I/O speed is still quite slow TAT
In the ideal case, LMDB should provide much faster I/O than reading the original JPEG files individually. Can you share more detailed information?
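A quick way to see where the time goes is a toy comparison of per-file reads versus one packed file. This is a self-contained sketch, not a real benchmark (everything here lands in the page cache; on a spinning disk or networked filesystem the gap is far larger):

```python
import os
import tempfile
import time

# Reading N tiny files pays a per-file open/close cost that a single
# packed file avoids.
tmp = tempfile.mkdtemp()
payload = os.urandom(4096)
N = 200

# Layout 1: one file per record.
for i in range(N):
    with open(os.path.join(tmp, f"{i}.bin"), "wb") as f:
        f.write(payload)

# Layout 2: all records concatenated into one file.
packed_path = os.path.join(tmp, "packed.bin")
with open(packed_path, "wb") as f:
    f.write(payload * N)

t0 = time.perf_counter()
from_files = []
for i in range(N):
    with open(os.path.join(tmp, f"{i}.bin"), "rb") as f:
        from_files.append(f.read())
t_files = time.perf_counter() - t0

t0 = time.perf_counter()
with open(packed_path, "rb") as f:
    from_packed = [f.read(len(payload)) for _ in range(N)]
t_packed = time.perf_counter() - t0

# Both layouts yield identical data; only the access cost differs.
assert from_files == from_packed
```

If LMDB reads are still slow, the usual suspects are decoding (JPEG decompression on the CPU) or deserialization rather than the disk itself.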
@dreamcontinue Yes, I also find that.
I have solved the other problems. Please refer to https://github.com/Fangyh09/Image2LMDB instead of:
def loads_pyarrow(buf):
    """
    Args:
        buf: the output of `dumps`.
    """
    return pa.deserialize(buf)
msgpack==0.5.6