NeuroDataDesign / fngs-archive

Apache License 2.0

Due 2/20 #121

Closed ebridge2 closed 7 years ago

ebridge2 commented 7 years ago

@ebridge2 Commitment:

Stretch:

@ewalke31

http://neurodatadesign.github.io/fngs/ewok_weekly/stc/Slice_Timing_Correction.html

@02agarwalt

gkiar commented 7 years ago

@ebridge2 I suggest making the following your goals for M and R:

ebridge2 commented 7 years ago

sounds good!

ebridge2 commented 7 years ago

How's that look? Yeah, I think this issue is definitely better for this week than the JE stuff; there were a lot of moving parts in actually getting that done by next Monday. Good call :)

gkiar commented 7 years ago

Much better! One minor improvement: before all of that text, state the deliverable for each task in 5 words or less. Thanks!!

ebridge2 commented 7 years ago

Note: I apply the frequency filter BEFORE doing the remainder of nuisance correction, so that segmentation- and nuisance-correction-related estimation errors do not impact the power of our frequency filtering. Least-squares correction imposes a small error, so if we were to apply it before the FFT, propagated error could impact the quality of our FFT; at least that's the logic I am going with.
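The ordering argued for above can be sketched as follows. This is illustrative pseudocode for the idea, not the actual fngs implementation; the function names and the shape of the nuisance design are assumptions.

```python
# Sketch of the proposed ordering: highpass-filter the voxel timeseries
# FIRST, then regress out nuisance signals, so that least-squares
# estimation error cannot leak into the frequency-domain step.
import numpy as np

def naive_highpass(ts, tr=2.0, cutoff_hz=0.01):
    """Zero out FFT bins below cutoff_hz and invert (timepoints x voxels)."""
    freqs = np.fft.rfftfreq(ts.shape[0], d=tr)
    spec = np.fft.rfft(ts, axis=0)
    spec[freqs < cutoff_hz] = 0
    return np.fft.irfft(spec, n=ts.shape[0], axis=0)

def regress_nuisance(ts, nuisance):
    """Remove the least-squares fit of nuisance regressors (with intercept)."""
    X = np.column_stack([np.ones(ts.shape[0]), nuisance])
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)
    return ts - X @ beta

rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 5))    # 200 TRs x 5 voxels (toy data)
nuis = rng.standard_normal((200, 2))  # 2 toy nuisance regressors
cleaned = regress_nuisance(naive_highpass(ts), nuis)
```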

ebridge2 commented 7 years ago

Things to note: highpass with nuisance does a LOT better. However, it seems as though I could do better (see the real-data top 2 plots; there is something weird going on in the highpass filtering where it is leaving the first timepoint untouched), so I am obviously doing something slightly wrong in that regard. Overall it seems promising, though.

gkiar commented 7 years ago
ebridge2 commented 7 years ago

The issue was the original Thursday deliverable, which I believe we agreed would be moved to this week instead of last (originally, it was something about talking with Leo, and then reading a paper and deciding a strategy with him for interpreting task data), so I put it here as the second item in my todo list. Should I just leave it when I pivot next time?

02agarwalt commented 7 years ago

Summary:

My goal this week was to evaluate the performance of 3 registration combos:
1) FLIRT + LDDMM
2) NDRegAffine + LDDMM
3) NDRegAffine + FNIRT

The first 2 combos were dreadful for discriminability because the only way to run them realistically was to heavily downsample the data. The 3rd combo failed entirely: the transforms produced by NDReg are not compatible with standard fMRI registration techniques, since NDReg doesn't register the anatomical image to the reference brain space. Kwame hasn't been able to provide a solution.

NDReg is clearly not suitable for fMRI processing. The best registration combo is FLIRT + FNIRT, so I think we should keep them. I'm not sure what to pursue next since this "chapter" of registration seems to be complete. Perhaps something involving epi_reg?

ebridge2 commented 7 years ago

Summary:

My goal for this week was to evaluate the viability of a combined approach of quadratic regression and frequency filtering to remove low-frequency, slow drifts from our timeseries. This approach should theoretically be more favorable than compcor (our current strategy), particularly for task fMRI, as it is hypothesized that regressing best-fit components from the white matter out of gray matter can remove signal related to the hemodynamic response.

Our simulations show that our algorithms generally work in the cases we expect (low-frequency drift added, quadratic function added, particularly when the stimulus-related sinusoid has a frequency much greater than our cutoff and the noise is far below it). We can also see that when the noise is close to our cutoff, our model performs relatively poorly; this shouldn't be much of a problem, however, as a cutoff around 0.01 Hz is relatively generous, given that low-frequency drift is often in the 0.001 - 0.005 Hz range.

In our real data, we can see first and foremost that our first time point is relatively messed up, which is the result of our naive highpass filter not being able to compare the first time point to anything. We can also see that our new nuisance correction strategy unanimously improves our discriminability; in particular, it significantly improves our spectral discriminability (and our ranked correlational discriminability improves quite significantly as well).
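A toy version of the simulation logic described above (illustrative only, not the fngs code): a stimulus sinusoid well above the 0.01 Hz cutoff, plus a slow drift in the 0.001 - 0.005 Hz range, recovered by the naive fft -> cutoff -> ifft highpass. The specific frequencies are chosen to land on FFT bins so the separation is clean.

```python
import numpy as np

tr = 2.0                                     # seconds per volume
n = 300                                      # 600 s total
t = np.arange(n) * tr
signal = np.sin(2 * np.pi * 0.05 * t)        # stimulus at 0.05 Hz >> cutoff
drift = 2.0 * np.sin(2 * np.pi * 0.005 * t)  # slow drift at 0.005 Hz
ts = signal + drift

# Naive highpass: fft -> zero bins below cutoff -> ifft.
freqs = np.fft.rfftfreq(n, d=tr)
spec = np.fft.rfft(ts)
spec[freqs < 0.01] = 0
filtered = np.fft.irfft(spec, n=n)

r = np.corrcoef(filtered, signal)[0, 1]      # near-perfect recovery here
```

Because both components sit exactly on FFT bins, this toy case recovers the stimulus almost exactly; off-bin drift would leak into higher bins, which is one reason the naive filter misbehaves on real data.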

Future investigations will hinge on looking into better highpass techniques (the current one is a very naive algorithm consisting of fft -> cutoff -> ifft), such as applying specific box filters or Butterworth filters, and seeing if we can reduce the drift particularly in the early timesteps (note that aside from the first timestep being completely bogus, a low-frequency drift of relatively low amplitude appears to remain in the first 50 timesteps).

For this week, I have interviews Tuesday - Friday, but I will collect the existing discriminability code for spectral and correlational computation into an R package, and include a tutorial similar to the markdowns we have been working on for the last few weeks (a methods_markdown that is basically the same as the spectral_vs_corr_discr.html page, but shown in the context of the package itself). I am also meeting with Leo in the AM and will have notes from our meeting that could be a Thursday deliverable.

gkiar commented 7 years ago

@ebridge2 yes leave it next time, please. As we discussed offline :)

@02agarwalt cool, so now let's look at epi_reg for fMRI to T1/MPRAGE registration. This will replace the aforementioned FLIRT only, not the FLIRT run prior to FNIRT for T1-to-MNI registration.

@ebridge2 cool. Reversing the time series will resolve first-timepoint issues; also, applying a proper filter via convolution with some premade package may have already considered t=0 filtering and done some clever fix. GL on interviews!
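The reversal trick suggested above can be sketched explicitly: filter forward, reverse, filter again, reverse back. This cancels the filter's phase distortion and spreads the startup transient symmetrically across both ends; scipy's premade `filtfilt` packages the same idea plus edge padding. A sketch under those assumptions, not the fngs code:

```python
import numpy as np
from scipy.signal import butter, lfilter, filtfilt

tr = 2.0
n = 300
t = np.arange(n) * tr
ts = np.sin(2 * np.pi * 0.05 * t) + 2.0 * np.sin(2 * np.pi * 0.005 * t)

b, a = butter(4, 0.01, btype="highpass", fs=1.0 / tr)

# Manual forward-backward pass: filter, reverse, filter, reverse.
fwd_bwd = lfilter(b, a, lfilter(b, a, ts)[::-1])[::-1]

# Library version of the same idea, with edge padding on top.
premade = filtfilt(b, a, ts)
```

Away from the edges the two agree closely; the padding in `filtfilt` is what tames the remaining transient at the very first and last timepoints.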