Status: Closed (by GoogleCodeExporter, 9 years ago)
When you use the netCDF variable object like a NumPy array, Python performs
the slicing operation on it every time you use it inside the loop. Each slice
creates a fresh copy of the data, hence the slowness and the memory usage.
I don't see this as a bug, but I could be wrong. I also don't see that there is
much I can do about it. Reading the data out of the variable once, before the
loop, seems like the right solution to this problem.
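A minimal sketch of the suggested fix, using a hypothetical stand-in class in place of a real netCDF4 Variable (the variable, data, and loop here are illustrative, not the reporter's actual code):

```python
import numpy as np

# Stand-in for a netCDF4 Variable: every __getitem__ reads the data
# anew and returns a fresh in-memory copy, as the real library does.
class LazyVariable:
    def __init__(self, data):
        self._data = data

    def __getitem__(self, key):
        return self._data[key].copy()  # fresh copy on every slice

var = LazyVariable(np.arange(1000.0))

# Slow pattern: slicing `var` inside the loop copies the data each pass.
total_slow = 0.0
for i in range(100):
    total_slow += var[:].sum()  # new array allocated every iteration

# Suggested fix: read the data out of the variable once, before the loop.
data = var[:]  # single copy
total_fast = 0.0
for i in range(100):
    total_fast += data.sum()  # reuses the one array

assert total_slow == total_fast
```

Both loops compute the same result; the second simply avoids re-reading and re-copying the data on every iteration.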
Original comment by whitaker.jeffrey@gmail.com
on 5 Jun 2013 at 11:38
I see, thank you for the info. In either case, it seems to me that the memory
usage should be more or less constant. But what I see is that the memory usage
increases linearly with the number of iterations. See the attached PDF, based
on the results of running repro.py.
The red dots show the memory usage per iteration when the Variable instance is
sliced before the loop. The blue-ish dots show the memory usage when the
Variable instance is used directly in the expression. That suggests some
memory isn't being released, don't you think?
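One way to check for this kind of per-iteration growth from within Python (this is not the attached repro.py, which is not shown here, but a hedged sketch using the standard-library tracemalloc module, with a deliberately leaky list standing in for whatever retains memory):

```python
import tracemalloc

# Hypothetical stand-in for per-iteration work: a list that keeps
# growing mimics memory that is never released between iterations.
retained = []

tracemalloc.start()
samples = []
for i in range(5):
    retained.append(bytearray(100_000))  # ~100 kB retained each pass
    current, _peak = tracemalloc.get_traced_memory()
    samples.append(current)
tracemalloc.stop()

# With a genuine leak the per-iteration samples climb steadily,
# matching the linear growth described above; with constant memory
# usage they would plateau after the first iteration.
assert all(later > earlier for earlier, later in zip(samples, samples[1:]))
```

Plotting `samples` against the iteration index would reproduce the kind of linearly rising curve described for the attached PDF.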
Thanks,
Kor
Original comment by tjalli...@gmail.com
on 5 Jun 2013 at 3:48
Attachments:
I guess the NumPy arrays created when you slice the variable within the loop
are not cleaned up by the Python garbage collector (perhaps they do not go out
of scope?). That memory is not under the control of the netCDF module, so
there's nothing I can do about it.
Original comment by whitaker.jeffrey@gmail.com
on 5 Jun 2013 at 4:51
Makes sense, thanks.
Kor
Original comment by tjalli...@gmail.com
on 5 Jun 2013 at 8:56
Original comment by whitaker.jeffrey@gmail.com
on 26 Feb 2014 at 2:04
Original issue reported on code.google.com by
tjalli...@gmail.com
on 5 Jun 2013 at 9:44