If we use 'write_to_afni' to export a pattern with multiple sub-bricks, as in the example below:
subj = init_subj(exp_name,subjid);
subj = load_afni_mask(subj,'Wholebrain',mask);
subj = load_afni_pattern(subj,'pattern_name','Wholebrain',{'data_file_name+tlrc'});
vox = get_mat(subj,'pattern','pattern_name');
disp(size(vox))
> 69391 3
kk = [vox,vox,vox,vox]; % size will be 69391 x 12
subj = initset_object(subj,'pattern','repeated_patterns',kk,'masked_by','Wholebrain');
write_to_afni(subj,'pattern','repeated_patterns','atlas/MNI_Glasser_HCP_2019_v1.0/MNI152_mask+tlrc',...
'overwrite_if_exist',true,'output_filename','repeated_patterns_data','view','+tlrc');
the following errors are reported:
++ 3dcalc: AFNI version=AFNI_22.3.03 (Oct 13 2022) [64-bit]
++ Authored by: A cast of thousands
ERROR: selector index 99 is out of range 0..0
ERROR: bad sub-brick selector [0..99]
FATAL ERROR: can't open dataset EPI_T1_mask+tlrc[0..99]
Program compile date = Oct 13 2022
Error in BrikLoad : HEAD file allvolszeros_EPI_T1_mask+tlrc.HEAD not found
Error using zeroify_write_afni>load_sample_brik (line 186)
error in BrikLoad -BrikLoad: allvolszeros_EPI_T1_mask+tlrc.HEAD not found
Error in zeroify_write_afni (line 63)
[V head] = load_sample_brik(allvolszeros_sample_brik_name,allvols);
Error in write_to_afni>try_zeroify (line 412)
zeroify_write_afni(allvols,sample_filename,cur_filename,zeroify_args);
Error in write_to_afni (line 151)
try_zeroify(subj,objtype,objin,sample_filename,args,runTRs);
Error in permutation_searchlight>permutation_sl (line 45)
write_to_afni(subj,'pattern',['perm' cond_names{icond}],'EPI_T1_mask+tlrc',...
Error in permutation_searchlight (line 18)
permutation_sl(exp,subjects{i},period,analysis,condition,perm_times,filenames)
The problem could be fixed with the following changes:
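As a separate workaround (not necessarily the fix intended above), the zeroify step in 'write_to_afni' can be bypassed by writing the multi-sub-brick pattern directly with BrikLoad/WriteBrik from the AFNI Matlab library. This is only a sketch: the object names follow the example above, and the header fields adjusted below (DATASET_RANK, BRICK_TYPES, etc.) are assumptions that may need tuning for your dataset.

```matlab
% Hypothetical workaround: write the pattern with WriteBrik instead of
% write_to_afni, avoiding the zeroify/3dcalc sub-brick selector error.

% Load the mask to get a template header and the in-mask voxel locations.
[err, Vmask, Info] = BrikLoad('EPI_T1_mask+tlrc');
if err, error('BrikLoad failed on the mask dataset'); end

kk = get_mat(subj,'pattern','repeated_patterns');  % nVox x nSubbricks
nsub = size(kk,2);
idx = find(Vmask);                                 % in-mask voxel indices

% Scatter each column of the pattern back into a full 3-D volume.
M = zeros([size(Vmask) nsub]);
for i = 1:nsub
  vol = zeros(size(Vmask));
  vol(idx) = kk(:,i);
  M(:,:,:,i) = vol;
end

% Adjust the template header for the new number of sub-bricks
% (field choices are assumptions; 3 = float sub-brick type).
Info.DATASET_RANK(2)  = nsub;
Info.BRICK_TYPES      = repmat(3, 1, nsub);
Info.BRICK_STATS      = [];
Info.BRICK_FLOAT_FACS = [];
Info.BRICK_LABS       = [];
Info.IDCODE_STRING    = '';

Opt.Prefix    = 'repeated_patterns_data';
Opt.View      = '+tlrc';
Opt.OverWrite = 'y';
[err, ErrMessage] = WriteBrik(M, Info, Opt);
if err, error(ErrMessage); end
```

This sidesteps the failing 'allvolszeros' sample-brik machinery entirely, at the cost of managing the AFNI header yourself.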