Closed: GolgiWhillikers closed this issue 1 year ago
Hi @GolgiWhillikers , thanks for reaching out! A few questions for you:
Hi, thanks for the quick reply.
1) Here's a subset of the dataset I've been working with for testing - the last 100 frames of the pre-acquisition and the first 100 frames of the post-acquisition.
Hopefully you can access this: https://drive.google.com/file/d/1Up8VOfJm4Zq4nH07db2OnabKRLCKm4Sp/view?usp=sharing
2) I have tried first frame, last frame, and both projection styles with no success.
3) Adding a filter did more or less solve the problem, which would be consistent with an SNR issue. I tried both 5-pixel median and mean filters, and in both cases the drift correction performed much better. It's not perfect, but it's close and probably good enough. I'm open to other filter suggestions if there is something better to try. I actually ended up using the NanoJ-Core plugin here (since I could run it on an open image for troubleshooting), but I did confirm that I get the same results with Fast4DReg. For this dataset, where there is little drift during the actual acquisitions and the majority occurs between them, it may be simpler to calculate the drift with NanoJ on 2 frames - projections of the pre and post - and then apply the shift to the entire post dataset.
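The two-frame approach described above (estimate drift once from pre/post projections, then apply it to the whole post stack) can be sketched with plain NumPy. This is a generic FFT cross-correlation sketch under my own assumptions - integer-pixel shifts, circular boundaries, and array names I made up - not the actual NanoJ or Fast4DReg implementation:

```python
import numpy as np

def estimate_drift(ref, mov):
    """Integer (dy, dx) drift of `mov` relative to `ref`, from the peak
    of the FFT-based cross-correlation (mean-subtracted to suppress the
    large constant background)."""
    ref = ref - ref.mean()
    mov = mov - mov.mean()
    corr = np.fft.ifft2(np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    size = np.array(corr.shape)
    peak[peak > size // 2] -= size[peak > size // 2]  # wrap to signed shifts
    return peak

# Synthetic demo: a "post" stack that is the "pre" projection shifted by (5, 2)
rng = np.random.default_rng(0)
pre_proj = rng.normal(100.0, 10.0, (128, 128))                 # projection of pre
post = np.stack([np.roll(pre_proj, (5, 2), axis=(0, 1))] * 3)  # post frames
drift = estimate_drift(pre_proj, post.mean(axis=0))            # 2-frame estimate
registered = np.roll(post, (-drift[0], -drift[1]), axis=(1, 2))  # shift whole stack
```

The key point is that the shift is estimated only once, on the two projections, and then applied uniformly - which is valid here precisely because the drift within each acquisition is minimal.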
Hi, thanks for sharing the files. I got a pretty good registration using a 2D Gaussian blur filter (4 px) - by eye, it's nearly perfect, and it seems better than with the median or mean filters. So it's perhaps worth trying on your end! I hope this helps! Cheers, Guillaume
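For what it's worth, the effect of a Gaussian pre-filter on a noisy pair can be reproduced with SciPy. This is a sketch under my own assumptions (synthetic low-SNR images, sigma = 4 px as suggested above, circular boundaries so the shift is exact) and not the plugin's code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_drift(ref, mov):
    # Peak of the mean-subtracted FFT cross-correlation gives the drift.
    ref = ref - ref.mean()
    mov = mov - mov.mean()
    corr = np.fft.ifft2(np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    size = np.array(corr.shape)
    peak[peak > size // 2] -= size[peak > size // 2]
    return peak

# Synthetic low-SNR pair: smooth structure buried in noise, with the "post"
# frame circularly shifted by (6, -3) pixels and an independent noise draw.
rng = np.random.default_rng(1)
scene = 100 + 300 * gaussian_filter(rng.normal(size=(128, 128)), 3, mode="wrap")
pre = scene + rng.normal(0, 20, scene.shape)
post = np.roll(scene, (6, -3), axis=(0, 1)) + rng.normal(0, 20, scene.shape)

# Blur both frames (sigma = 4 px) before estimating the drift; the blur
# suppresses the per-pixel noise much more than the smooth structure.
sigma = 4
drift = estimate_drift(gaussian_filter(pre, sigma, mode="wrap"),
                       gaussian_filter(post, sigma, mode="wrap"))
```

The design choice mirrors the discussion above: the filter trades a little spatial resolution for a large SNR gain in the correlation, which is what rescues the peak when the raw images are as flat as these.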
Hello-
I am trying to use Fast4DReg to correct drift in a 2D GCaMP imaging experiment, where I have several thousand frames imaged both before and after a manipulation. Drift within the pre and post acquisitions is pretty minimal, but there is a visually apparent shift of several pixels between the pre and post acquisitions. Trying to correct this shift with Fast4DReg doesn't seem to work - even after correction the shift is still there: it is estimated at ~0.03 px in x and y, whereas measured by hand it's more like 5+ pixels.
I get similar results when using just the NanoJ-Core plugin to drift-correct the same data, which is why I suspect this may be a limitation of the cross-correlation algorithm on this dataset. The images are definitely low signal-to-noise (maximum intensities of ~120-130 on a minimum of ~100), which the paper notes can cause the algorithm to perform poorly. However, even if I collapse the data to 2 frames with max intensity projections of the pre and post and try to correct across those with Fast4DReg, the drift is not corrected; similarly, if I turn on time averaging (say 100 frames) and correct the time series, the jump persists (and it should fall between averaged frames, so it is not simply averaged out).

StackReg handles this dataset fine with either rigid body or translation, but it is quite slow, whereas Fast4DReg is comparatively fast (8 minutes instead of a few hours for 6K frames), so I'd like to use it. I am struggling to understand whether my data are simply a poor fit for this particular algorithm, or whether there is a data-preparation step I am missing or could change.
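The two preparation steps mentioned above - max intensity projection and time averaging - are easy to do outside the plugin if you want to inspect exactly what the algorithm sees. A NumPy sketch (the array shapes, names, and the divisible-bin assumption are mine):

```python
import numpy as np

def bin_time(stack, k):
    """Average a (T, Y, X) stack in non-overlapping bins of k frames.
    Assumes T is divisible by k."""
    t, y, x = stack.shape
    return stack.reshape(t // k, k, y, x).mean(axis=1)

# Synthetic (T, Y, X) time series standing in for one acquisition
rng = np.random.default_rng(2)
stack = rng.normal(100.0, 10.0, (200, 64, 64))

binned = bin_time(stack, 100)   # e.g. 100-frame time averaging -> 2 frames
max_proj = stack.max(axis=0)    # max intensity projection -> single frame
```

Averaging reduces uncorrelated noise by roughly sqrt(k) per bin, whereas a max projection of very noisy frames tends to pick out the noise peaks, which may be one reason the projection route can still fail on low-SNR data.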
Thanks
Image: the signal-to-noise ratio is low
Image: the shift between pre and post is ~5-8 pixels (mostly in Y)
Image: the calculated shift is only ~0.03 pixels