Closed: yvesauad closed this issue 4 years ago.
Difficult to know how you are putting the data in RAM, etc. At some point we may need help from @marceltence on such memory-related issues, but first please let me know how you make the acquisition, i.e., whether you use only numpy arrays or go through Swift data items.
For now I am simply appending my camera acquisitions, and at the end I create various data items in which I perform calibrations for my several plots. RAM usage is simply my big list getting appended to over and over.
I explicitly kill my array after plotting my data using del biglist, and at least that reclaims all the memory after plotting. The problem is when I run out of memory during acquisition.
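For reference, a minimal sketch of the append-then-delete pattern described above; grab_frame is a hypothetical stand-in for the camera read:

```python
import numpy as np

def acquire(n_frames, grab_frame):
    biglist = []
    for _ in range(n_frames):
        biglist.append(grab_frame())  # hypothetical camera read, one 1024x256 frame per call
    data = np.stack(biglist)          # one contiguous block for calibration and plotting
    del biglist                       # drop the reference; CPython frees it once nothing else points to it
    return data
```

Note that np.stack copies, so peak usage briefly holds both the list and the stacked array before del runs.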
Also please note this is an issue with my computer: I have only 4 GB in my notebook, and in fact I see 2 GB of usage after a very fine acquisition.
I am playing around with some async functions. The idea is to create a DataItem at the beginning and keep appending spectrum by spectrum (but not storing all of them); see the sketch below.
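A minimal sketch of that idea, assuming the Nion Swift scripting facade (library.create_data_item_from_data and DataItem.set_data are the calls I believe exist, but check your API version); grab_frame is again a hypothetical awaitable camera read:

```python
import numpy as np

async def acquire_live(api, grab_frame, n_spectra, shape=(1024, 256)):
    # Keep a single running sum instead of a list of every frame.
    accum = np.zeros(shape, dtype=np.float64)
    data_item = api.library.create_data_item_from_data(accum)
    for i in range(n_spectra):
        accum += await grab_frame()          # hypothetical async camera read
        data_item.set_data(accum / (i + 1))  # refresh the live display with the running average
```

Memory stays constant at one frame plus one accumulator, no matter how many spectra are acquired.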
This is nice now: live display, and I am not storing insane amounts of data.
What is the solution? No urgent answer needed..., just curious!
No solution in fact hahaha, I went to a computer with more memory 🥇
That's a good solution ;)
RAM usage is increasing far beyond what is acceptable. In a scan of 0.1 nm steps from 575 to 590 nm with 20 averages, we accumulate at least 3000 frames of 1024x256 points. At the end, 2-3 GB of RAM are consumed, and generating my plot or aborting the acquisition doesn't seem to reduce it. Of course this is probably linked to the way I am storing data: I have an array to which I append point by point. Another motivation to do an async grab and display data live; then I wouldn't need to store all of the frames.
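A quick back-of-envelope check of those numbers (assuming float32 frames):

```python
steps = round((590 - 575) / 0.1)    # 150 scan positions
frames = steps * 20                 # 3000 frames at 20 averages each
gb = frames * 1024 * 256 * 4 / 1e9  # float32 = 4 bytes per value
print(f"{gb:.1f} GB")               # ~3.1 GB, consistent with the 2-3 GB observed
```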