Closed by rhysgt 8 months ago
I worry that losing the displacements will break other analyses later on, so you would need to make sure you catch this in all other functions that might use them. Is it worth it, or should we just buy more RAM for doing big maps?
We could make a precision parameter so you can pick what datatype to use for the data arrays; at the moment it uses numpy's defaults, which I think are 64-bit floats and 32-bit ints. This could be linked in with #36
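A minimal sketch of what such a precision parameter could look like (the function name `load_davis_data` and the parsing are placeholders, not DefDAP's actual API):

```python
import numpy as np

def load_davis_data(path, dtype=np.float32):
    # Placeholder parsing; the real loader handles DaVis headers etc.
    data = np.loadtxt(path, skiprows=1)
    # np.loadtxt returns float64 by default; casting to the requested
    # dtype halves the footprint when float32 is chosen
    return data.astype(dtype, copy=False)
```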
The `xc`, `yc`, `xd`, `yd` lists are extraneous since `x_map` and `y_map` hold this data in a structured form anyway. Simply removing these arrays helps a lot. The RDR calculations use `x_map` and `y_map`.
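For illustration, dropping the flat lists once the 2D maps are built might look like this (a sketch assuming the attribute names above live on the map object):

```python
def drop_flat_lists(dic_map):
    # xc/yc/xd/yd duplicate what x_map and y_map already hold,
    # so they can be freed once the 2D maps exist
    for attr in ("xc", "yc", "xd", "yd"):
        if hasattr(dic_map, attr):
            delattr(dic_map, attr)
```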
Memory usage of DefDAP can be quite high. For two EBSD maps (admittedly quite large ones), it occupies ~1.4 GB of memory. For two large HRDIC maps, it's using ~1.8 GB. This makes it tricky to compare lots of strain steps on a machine which doesn't have much RAM, for example.
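A rough audit helper (not part of DefDAP) that sums the memory of every numpy array attached to an object, to see where those GBs actually live:

```python
import numpy as np

def array_memory_report(obj):
    # Walk the object's attributes and report each numpy array's size
    total = 0
    for name, value in vars(obj).items():
        if isinstance(value, np.ndarray):
            print(f"{name}: {value.nbytes / 1e9:.3f} GB")
            total += value.nbytes
    print(f"total: {total / 1e9:.3f} GB")
```

Calling it on a loaded map, e.g. `array_memory_report(dic_map)`, would show which arrays dominate.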
Might be worth changing the `dtype` for some of the data. For example, x coordinates read in from a DaVis DIC map are stored as `float64` even though they are just unsigned integers; these occupy 0.7 GB of memory at the moment, as do the displacements. At the moment, I'm deleting these since I don't actually need them. Perhaps there could be an argument like `keepDisplacements`, which removes the displacements if they are not needed?

Additionally, linking the DIC and EBSD maps increases usage by another 2 GB. When RAM is exhausted, calculations become very slow, of course.
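A sketch of both ideas together; the column layout, the `uint16` choice, and the `keepDisplacements` flag are all assumptions here, not existing DefDAP behaviour:

```python
import numpy as np

def load_davis_map(path, keepDisplacements=True):
    raw = np.loadtxt(path, skiprows=1)     # placeholder parsing
    # Pixel coordinates are integer-valued, so an unsigned integer
    # dtype suffices (uint16 assumes coords fit below 65536):
    # 8 bytes -> 2 bytes per value
    xc = raw[:, 0].astype(np.uint16)
    yc = raw[:, 1].astype(np.uint16)
    if not keepDisplacements:
        return xc, yc                      # displacement columns freed with raw
    xd = raw[:, 2].astype(np.float32)      # float32 is usually enough precision
    yd = raw[:, 3].astype(np.float32)
    return xc, yc, xd, yd
```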