Open · MaximilianHoffmann opened this issue 4 months ago
Actually... I think it's alright. It seems like there was again an issue with file transfer locking up some files:
```python
%%timeit
# (assumes `from siffpy import SiffReader` and `from datetime import datetime`
#  were run in an earlier cell)
p = '/mnt/fast/Data_Imaging/img/20240626_fly2_1.siff'
sr = SiffReader(p)
sr.time_lims = [
    datetime.fromtimestamp(float(x) / 1e9)
    for x in sr.get_time([sr.all_frames[0], sr.all_frames[-1]], 'epoch')
]
```

```
760 ms ± 9.16 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
```
It's not super fast, though: this is a 50 GB file on a local SSD, and if you had 100 files like this it would take a substantial amount of time, right?
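If I do end up needing this for ~100 files regularly, one way to amortize the cost (just a sketch, not something in SiffPy: the cache file name, JSON layout, and `get_time_lims` helper are made up here, and the `from siffpy import SiffReader` import path is an assumption; only `SiffReader`, `get_time`, and `all_frames` are taken from the snippet above) would be to read each file's first and last timestamps once and cache them:

```python
# Sketch: cache each file's recording start/end so the ~760 ms open cost
# is only paid the first time a file is seen. Everything about the cache
# (file name, JSON layout) is made up for illustration.
import json
from datetime import datetime
from pathlib import Path

from siffpy import SiffReader  # assumed import path

CACHE = Path('siff_time_lims.json')  # hypothetical cache location


def get_time_lims(path: str) -> tuple[datetime, datetime]:
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    if path not in cache:
        sr = SiffReader(path)
        first, last = sr.get_time([sr.all_frames[0], sr.all_frames[-1]], 'epoch')
        cache[path] = [float(first), float(last)]  # epoch nanoseconds
        CACHE.write_text(json.dumps(cache))
    start_ns, end_ns = cache[path]
    return (datetime.fromtimestamp(start_ns / 1e9),
            datetime.fromtimestamp(end_ns / 1e9))
```

After the first pass over a file, getting its limits is just a JSON lookup instead of a full open.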
Hmmm, I think that's the fastest way with what's currently implemented in SiffPy. The slowest part is opening the file, so maybe what we can consider is a quick scan that doesn't do as much importing, just to find the first and last timestamps? The problem is that you have to scan each IFD sequentially to find out where the end is... Here are some lazy benchmarks with a file I happened to have open, to demonstrate where the latencies are coming from. Maybe I can also memoize the `all_frames` property to speed things up.
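For reference, a minimal sketch of the memoization idea (not SiffPy's actual implementation; the `Reader` class and its fake frame scan are stand-ins), assuming `all_frames` doesn't change once the file is open:

```python
# Minimal sketch of memoizing an expensive property (not SiffPy's code).
# functools.cached_property computes the value on first access and stores
# it on the instance, so later accesses are cheap attribute lookups.
from functools import cached_property


class Reader:
    """Stand-in for a file reader with an expensive frame-index scan."""

    def __init__(self, path: str):
        self.path = path

    @cached_property
    def all_frames(self) -> list[int]:
        # Pretend this walks every IFD in the file; it runs only once
        # per instance, then the result is reused.
        return list(range(100_000))
```

The trade-off is that a cached value has to be discarded or invalidated if the underlying file can change while the reader is open.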
Right now I am getting the recording start and end like:
but it's kind of slow. Is there a faster way?