QCoDeS / Qcodes_loop

Loop and matching dataset that used to be part of qcodes

memory consumption when loading data #18

Open MerlinSmiles opened 8 years ago

MerlinSmiles commented 8 years ago

I have a dataset that consumes 250 MB on disk. When I load it after the measurement has finished, the Python process consumes ~1 GB of memory. Does that somehow make sense? Shouldn't it be much smaller in memory? How can I profile this?

cc: @alexcjohnson @giulioungaretti

alexcjohnson commented 8 years ago

I'm not sure I tried to optimize memory while reading the file... will have to look at it. You can check the size of all the DataArrays in a DataSet with:

sum(a.nbytes for a in data_set.arrays.values())

This might be a nice helper to map in as a @property of the DataSet. Presumably that's the main memory user but we'll have to dig in more if that doesn't explain what you see. I've never done this myself but it looks like there are a number of tools available to try...
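For reference, a minimal sketch of such a helper (assuming, as above, that a DataSet keeps its DataArrays in the arrays dict and that each array exposes numpy's nbytes; the function name is made up here):

```python
def dataset_nbytes(data_set):
    """Total in-memory size, in bytes, of all DataArrays in a DataSet."""
    # Relies on each DataArray exposing numpy's `nbytes` attribute.
    return sum(a.nbytes for a in data_set.arrays.values())

# Usage, assuming `data` is a loaded qcodes DataSet:
# print(f"{dataset_nbytes(data) / 1e6:.1f} MB in memory")
```

The same expression could be exposed as an nbytes @property on DataSet itself, as suggested above.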

MerlinSmiles commented 8 years ago

Yeah, that gives me 916 MB, but I guess it might be due to all those NaN values (QCoDeS/Qcodes_loop#19) in that measurement, as another file has 30 MB on disk and 916 MB in memory. Both were taken with 10000 steps of a step parameter and finished early at some condition.
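For reference, a rough illustration of why NaN padding can explain the gap (the array shape below is made up, not the actual measurement): an in-memory float64 array costs 8 bytes per element regardless of how many entries are NaN, while a sweep that finished early can be much smaller on disk.

```python
import numpy as np

# Hypothetical preallocated sweep array, padded with NaN where the loop
# never ran; memory use depends only on shape and dtype, not on the values.
padded = np.full((10000, 12000), np.nan)
print(padded.nbytes / 1e6)  # ~960 MB in memory, however many entries are NaN
```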

alexcjohnson commented 8 years ago

OK good, so no big memory leak (not here anyway!) but we can leave this open for someone to add a DataSet.nbytes property.

MerlinSmiles commented 8 years ago

Just to add the Slack discussion here too: calling data = qc.load_data(location) several times adds the data to memory without clearing it.
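For reference, one rough way to check this behaviour (a sketch using psutil; the location below is a placeholder path, not a real dataset):

```python
import gc
import psutil
import qcodes as qc

proc = psutil.Process()                  # current Python process
location = "data/2016-09-01/#001_sweep"  # placeholder: point at an existing dataset

# If loaded data were released between iterations, RSS should stay roughly flat;
# steadily growing RSS would match the behaviour reported above.
for i in range(5):
    data = qc.load_data(location)
    del data
    gc.collect()
    print(f"iteration {i}: RSS = {proc.memory_info().rss / 1e6:.0f} MB")
```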

giulioungaretti commented 8 years ago

Isn't the last comment from @MerlinSmiles actually describing a bug?

MerlinSmiles commented 8 years ago

@giulioungaretti Yes, and it earns me a pizza :) P1?

giulioungaretti commented 8 years ago

@MerlinSmiles Hard to reproduce the behaviour you see on OS X/Linux. Weird. Will try on Windows. As soon as I am done with the documentation boilerplate, I will remove the bug from this issue and create a new issue!