Closed: tsalo closed this 4 years ago
I'm not entirely sure how this is possible, but this fork still passes the integration test!
What do you mean by dropping the squared motion parameters? From the original ICA-AROMA algorithm: "We defined a 72RP model, including the standard 24RP model (6 standard RPs, their derivatives, and the quadratic terms of these 12 RPs), as well as a single time-point backward and a forward shifted version of the 24RP model to reflect possible non-linear effects. Next, we calculated the maximum absolute correlation of each IC's time-course with each of the 72 RPs (squared RPs were correlated with squared IC time-series). We used a robust correlation, by calculating the mean correlation over 1000 random selections of 90% of the points in a time-series."
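For anyone following along, here is a minimal NumPy sketch of the model construction and robust correlation described in that excerpt. The names (`rp6`, `ic_ts`, `rp_model`) and the exact ordering of the mean/max reductions are assumptions for illustration, not the package's actual implementation; the "squared RPs correlated with squared IC time-series" detail is what the rest of this thread is about.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_72rp_model(rp6):
    """Assemble the 72RP model described above (illustrative names and shapes).

    rp6 : (n_timepoints, 6) array of the standard realignment parameters.
    """
    # 12RP: the 6 RPs plus their temporal derivatives (first row padded with 0).
    deriv = np.vstack((np.zeros((1, rp6.shape[1])), np.diff(rp6, axis=0)))
    rp12 = np.hstack((rp6, deriv))
    # 24RP: add the quadratic terms of those 12 RPs.
    rp24 = np.hstack((rp12, rp12 ** 2))
    # 72RP: add single time-point backward- and forward-shifted copies of the 24RP model.
    rp24_bw = np.roll(rp24, 1, axis=0)
    rp24_bw[0, :] = 0
    rp24_fw = np.roll(rp24, -1, axis=0)
    rp24_fw[-1, :] = 0
    return np.hstack((rp24, rp24_bw, rp24_fw))

def robust_max_correlation(ic_ts, rp_model, n_splits=1000, frac=0.9):
    """Mean correlation over random 90% subsamples, then the max absolute value."""
    n = ic_ts.shape[0]
    n_keep = int(frac * n)
    corrs = np.empty((n_splits, rp_model.shape[1]))
    for i in range(n_splits):
        idx = rng.choice(n, size=n_keep, replace=False)
        for col in range(rp_model.shape[1]):
            corrs[i, col] = np.corrcoef(ic_ts[idx], rp_model[idx, col])[0, 1]
    return np.max(np.abs(corrs.mean(axis=0)))
```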
In https://github.com/maartenmennes/ICA-AROMA/commit/7b000665b45ea49b749bfe89be065f50f4921523#diff-5f3554c3cc435b1caa80a92f299d6728e62963ec1b378dd4030b39da254300aaL235, ICA-AROMA did some refactoring that removed the squared motion parameters and the squared derivatives from the model, but the same refactor also added a later step that squares the model. The problem for me was that the commit left behind a comment mentioning the squared motion parameters without any corresponding code, so I added what I thought made sense (i.e., squaring the parameters when building the model). However, I now think this applies the squaring twice, so I believe I introduced a bug.
What's really weird to me is that my bug doesn't seem to affect the results of our tests.
I think the key question is: does the existing step that squares the model accomplish what the squaring I added was meant to do?
If so, then I introduced a bug by adding the extra squaring, in which case we should remove it.
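To make the suspected double squaring concrete, here is a tiny numeric sketch (the variable names are made up for illustration): if the squared parameters are placed in the model up front and the later step then squares the model again, the quadratic regressors end up as fourth powers rather than squares.

```python
import numpy as np

rp = np.arange(1.0, 6.0)                       # toy realignment parameter time series

# What I added: put the squared parameters into the model directly.
model_with_squares = np.column_stack((rp, rp ** 2))

# What the existing later step already does: square the whole model (and the mix).
squared_again = model_with_squares ** 2

print(squared_again[:, 1])                     # rp ** 4, not the intended rp ** 2
```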
Doesn't the IC time series (mix) also have to be squared when computing the cross-correlation? In addition, the squares of rp12_1fw and rp12_1bw would also have to be computed and included in rp_model: https://github.com/Brainhack-Donostia/ica-aroma-org/blob/4d41f891fea87142185d03e358b7418dde85aad9/aroma/features.py#L51
But, essentially, yes, you are right: the current code applies the squaring twice to the instantaneous regressors. So I would define rp_model = np.hstack((rp12, rp12_1fw, rp12_1bw))
in https://github.com/Brainhack-Donostia/ica-aroma-org/blob/4d41f891fea87142185d03e358b7418dde85aad9/aroma/features.py#L64
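For reference, a minimal sketch of that fix under these assumptions: rp_model keeps only the unsquared regressors, and the existing correlation step squares both rp_model and mix once, so the quadratic correlations are computed exactly once. The shapes and the cross_correlation helper below are illustrative stand-ins, not the actual code in features.py.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_ics = 200, 10

# Stand-ins with plausible shapes (values are random, names follow the linked code).
mix = rng.standard_normal((n_timepoints, n_ics))   # IC time series
rp12 = rng.standard_normal((n_timepoints, 12))     # 6 RPs + derivatives
rp12_1fw = np.roll(rp12, -1, axis=0)               # forward-shifted copy
rp12_1bw = np.roll(rp12, 1, axis=0)                # backward-shifted copy

def cross_correlation(a, b):
    """Illustrative column-wise correlation between the columns of a and b."""
    a_std = (a - a.mean(axis=0)) / a.std(axis=0)
    b_std = (b - b.mean(axis=0)) / b.std(axis=0)
    return (a_std.T @ b_std) / a.shape[0]

# The fix: keep rp_model unsquared (36 columns).
rp_model = np.hstack((rp12, rp12_1fw, rp12_1bw))

# Linear correlations, then quadratic correlations where both sides are squared
# once ("squared RPs were correlated with squared IC time-series").
corr_linear = cross_correlation(mix, rp_model)
corr_quadratic = cross_correlation(mix ** 2, rp_model ** 2)
corrs = np.hstack((corr_linear, corr_quadratic))   # each IC vs. the full 72RP model
```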
That sounds right. I'll move forward with this fix then. Thank you!
Closes #28.
Changes proposed in this pull request: