MicAtWork opened this issue 8 months ago
Those certainly are large datasets.
Not a solution to the actual problem... but wondering if you might entertain using reduce() as a workaround. If you feel that a spatially downsampled version of your map might still yield a satisfactory representative stiffness tensor, you could load your data and subset every other point (or some other factor), starting from a map with a lower point density and a larger effective step size than the original.
reduce() will output an EBSD dataset with the number of points reduced by that factor. So... reduce(ebsd, 2) yields a map with every other point from the original. Use a larger factor for greater reduction.
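A minimal sketch of the workaround (the file name and variable names are placeholders, not from the thread):

```matlab
% Load the EBSD map as usual (hypothetical file name)
ebsd = EBSD.load('myLargeMap.ctf');

% Keep every 2nd point along x and y -> roughly a quarter of the data
ebsd_small = reduce(ebsd, 2);

% A larger factor gives a coarser map and an even smaller dataset
ebsd_coarse = reduce(ebsd, 4);
```

The downsampled map can then be fed into the same stiffness-tensor calculation that ran out of memory on the full-resolution data.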
Thanks! That works for now. Another solution was to use a random subset of orientations:
odf_subset = calcOrientations(o, 1000); % 1000 random points, change to any desired number
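For context, a hedged sketch of how such a subset could feed into the tensor averaging. It assumes `o` is an ODF estimated from the map and `C_single` is the single-crystal stiffnessTensor of the phase (hypothetical names); `rotate` and `mean` on MTEX tensors are used here, which may differ from the exact workflow in the thread:

```matlab
% Draw 1000 random orientations from the ODF (change to any desired number)
ori_subset = calcOrientations(o, 1000);

% Rotate the single-crystal tensor into each orientation and take the
% arithmetic (Voigt-style) mean over the subset
C_voigt = mean(rotate(C_single, ori_subset));
```

Because the subset is random, repeating the draw a few times gives a quick check on how stable the averaged tensor is at that sample size.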
I'm trying to calculate the stiffness tensor for some large EBSD files I have. For the files larger than ~690 MB I get "out of memory" in MATLAB. I've tried increasing the memory with setMTEXpref('memory',32000*1024); but it doesn't help.
This is the code I use:
The error message I get:
The other files are less than 470 MB and work fine for the calculation of the stiffness matrix.
Using MTEX version 5.10.2 with MATLAB R2023b Update 4 on Windows 11, on a computer with a 13th-gen Intel i5 and 64 GB RAM, with the Java heap memory set to maximum.
Would it be possible to implement some sort of tall-array calculation for heavy files? Or to split the calculation into subsets of the orientations and then average the results?
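The subset-and-average idea can at least be sketched in user code: a Voigt-style average is a plain arithmetic mean over the rotated single-crystal tensors, so it can be accumulated block by block without ever holding all rotated tensors in memory. This is only a sketch under assumptions — `ori` is the full orientation list, `C_single` the single-crystal stiffnessTensor (hypothetical names), and it presumes MTEX's `rotate`/`sum` work element-wise over orientation lists the way `mean` does:

```matlab
% Chunked Voigt-style average over a large orientation list
chunk = 1e5;                      % points per block; tune to available RAM
n     = length(ori);
C_sum = 0;
for k = 1:chunk:n
  idx   = k : min(k+chunk-1, n);  % current block of orientations
  C_sum = C_sum + sum(rotate(C_single, ori(idx)));
end
C_voigt = C_sum ./ n;             % mean of all rotated tensors
```

Note this only covers the linear (Voigt) bound; a Reuss or Hill average would need the same chunking applied to the inverted compliance tensors as well.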