Open rdcarr2 opened 1 month ago
Hey there!
I haven't seen this issue before. I think the underlying rechunk would need to happen on the SARAH data, not on the cutout. A wild guess: you could try passing the chunk argument to the prepare(...) function, similarly to what is shown in the example here:
https://atlite.readthedocs.io/en/latest/examples/create_cutout_SARAH.html#Specifying-the-cutout
Version Checks (indicate both or one)
[X] I have confirmed this bug exists on the latest release of Atlite.
[ ] I have confirmed this bug exists on the current master branch of Atlite.

Issue Description
Hey everyone, I'm trying to create SARAH cutouts for later use in producing PyPSA-Eur networks for different climate years, but keep getting the same error message after running cutout.prepare():

"ValueError: dimension lat on 0th function argument to apply_ufunc with dask='parallelized' consists of multiple chunks, but is also a core dimension. To fix, either rechunk into a single array chunk along this dimension, i.e., .chunk(dict(lat=-1)), or pass allow_rechunk=True in dask_gufunc_kwargs but beware that this may significantly increase memory usage."

I've tried the rechunking method suggested, but keep getting the same error regardless. I'm stuck now and not sure what to do, and ChatGPT is sending me in circles. Has anyone experienced this issue and managed to find a solution?
Reproducible Example
Expected Behavior
Installed Versions