Hi, just a follow-up on this issue. I'd really appreciate it if you could help with this!
Hi, this code wasn't actually the code we used to get the betas files in the Hugging Face repo. The betas we released were created by coauthor Reese Kneeland, as described in the MindEye2 paper. This script was what we were initially using before Reese provided updated betas, which explains the commented-out part of the code saying "YOU SHOULD EXCLUDE SHARED1000 FROM THIS" and why we switched to Reese's betas. Note that the betas obtained from this script and the ones provided by Reese led to nearly the same results, though. I'd recommend you just take the data directly from the Natural Scenes Dataset: https://natural-scenes-dataset.s3.amazonaws.com/index.html
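For example, a minimal sketch of pulling one session of betas straight from that public S3 bucket might look like the following. The bucket name comes from the URL above, but the exact key layout and the unsigned-access assumption are guesses based on the NSD documentation, so adjust as needed:

```python
# Hypothetical sketch: download one session of NSD betas from the public bucket.
# The key path below is an assumption and may need adjusting to the actual layout.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

bucket = "natural-scenes-dataset"
key = ("nsddata_betas/ppdata/subj01/func1pt8mm/"
       "betas_fithrf_GLMdenoise_RR/betas_session01.nii.gz")  # assumed key

s3.download_file(bucket, key, "betas_session01.nii.gz")
```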
@PaulScotti Actually, I have the same issue. Could you elaborate more on the data processing procedure?
I used `betas_fithrf_GLMdenoise_RR` with `func1pt8mm` and ensured that the Shared1000 data is not included in the calculation of the mean and std of each session, but I failed to reproduce the betas you provided.
I have succeeded in reproducing MindEye1's data, but the same approach does not seem to work for MindEye2.
Update: I resolved the problem. In MindEye2, it seems that the mean and std values are calculated over the entire training dataset (not session-wise).
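For reference, here is a minimal sketch of the two normalization schemes being compared. This is an illustration only, not the repo's actual code; the array names are made up (`betas` is assumed to be trials x voxels, `session_ids` marks which session each trial came from), and whether Shared1000 trials are excluded from the statistics is discussed above and omitted here for brevity:

```python
import numpy as np

def renorm_sessionwise(betas, session_ids):
    """Z-score each voxel within each session separately
    (the approach that reproduced MindEye1's data per this thread)."""
    out = np.empty_like(betas, dtype=np.float32)
    for s in np.unique(session_ids):
        mask = session_ids == s
        mu = betas[mask].mean(axis=0)
        sd = betas[mask].std(axis=0)
        out[mask] = (betas[mask] - mu) / (sd + 1e-8)
    return out

def renorm_global(betas):
    """Z-score each voxel using mean/std over the whole training set
    (what appears to match the MindEye2 betas)."""
    mu = betas.mean(axis=0)
    sd = betas.std(axis=0)
    return ((betas - mu) / (sd + 1e-8)).astype(np.float32)
```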
Ah, yes that would explain it. Sorry I forgot that there was this difference!
I tried to modify `dataset_creation.ipynb` to reproduce the data preprocessing. I found that everything else (`changemind`, `isold`, `iscorrect`, etc.) matches the dataset you provided. However, the `betas_all` I obtained is different from the file you provided, i.e. `betas_all_subj01_fp32_renorm.hdf5`. Below is the procedure I used to get `betas_all` in my script.
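(The actual script is not shown in this excerpt. Purely as an illustration of the kind of pipeline being discussed, and not the poster's or the repo's code, here is a hypothetical sketch that loads per-session betas, concatenates them, renormalizes globally as described above, and writes an fp32 HDF5 file. Every path, filename, and dataset name below is assumed.)

```python
# Hypothetical illustration only -- not the poster's script or the repo's code.
# Assumes per-session betas were already downloaded as NIfTI files and that a
# 3D boolean voxel mask is available; all names below are made up.
import glob
import h5py
import nibabel as nib
import numpy as np

session_files = sorted(glob.glob("betas_session*.nii.gz"))  # assumed filenames
voxel_mask = np.load("voxel_mask.npy")                      # assumed mask file

sessions = []
for f in session_files:
    vol = nib.load(f).get_fdata()   # (X, Y, Z, trials); raw NSD betas may need
                                    # rescaling, omitted here
    trials = vol[voxel_mask].T      # -> (trials, n_voxels)
    sessions.append(trials.astype(np.float32))

betas_all = np.concatenate(sessions, axis=0)

# Per the discussion above, normalize with mean/std over the whole training set
mu, sd = betas_all.mean(axis=0), betas_all.std(axis=0)
betas_all = ((betas_all - mu) / (sd + 1e-8)).astype(np.float32)

with h5py.File("betas_all_subj01_fp32_renorm.hdf5", "w") as f:
    f.create_dataset("betas", data=betas_all)  # dataset name is assumed
```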