Closed: yukaribbba closed this issue 1 week ago
Thanks for reporting this issue. Indeed, Sentinel-1 data is very heavy, and I have to use small chunk sizes when working with it on my laptop, for example:
DASK_ARRAY__CHUNK_SIZE=32MB my_s1_script.py
See more on how to tweak performance here: https://satpy.readthedocs.io/en/stable/faq.html#why-is-satpy-slow-on-my-powerful-machine
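For reference, the same limit can also be set from inside a script via dask's config API instead of the environment variable. A minimal sketch (the 32 MiB value just mirrors the command above; it must be set before the arrays are created):

```python
import dask
import dask.array as da

# Same effect as DASK_ARRAY__CHUNK_SIZE=32MB on the command line:
# cap the default chunk size dask picks when creating new arrays.
dask.config.set({"array.chunk-size": "32MiB"})

# Arrays created afterwards get smaller chunks, trading more (but
# cheaper) tasks for a lower peak memory footprint.
x = da.zeros((10000, 10000), dtype="float64")
print(x.chunksize)
```

The setting only affects arrays built after the call, so it should run before `Scene.load()` or any other data loading.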
Thanks for the hint, and it does work for me. I usually set DASK_ARRAY__CHUNK_SIZE
to 2200*2200 (about 68MB), which gives me satisfactory performance with most other readers, except this one. I wish we had more specific tips for adjusting the value depending on the reader. That'd be better...
I agree, we should give more information when we can. Feel free to create a PR about it if you want.
OK, that's on my todo list. For now I'll just close this.
Describe the bug
Nearly all of my 64GB of memory is consumed, and it still asks for more.
To Reproduce
Actual results
Environment Info: