brissend closed this issue 2 years ago
Thanks for the report, it seems to be a follow-up of #8. Could you share a small example of a file.func.gii that causes the problem? Is the time dimension meant to be the second one?
Yes the issue is very similar to #8. An example file can be accessed here https://www.dropbox.com/s/erkrcxtwhs0cru6/file.func.gii?dl=0. The time dimension is the 2nd one.
Hello, sorry, just looking at this now. I am a bit confused by your data: doesn't it contain 461 time series of 10242 time points each?
>> f
f =
struct with fields:
cdata: [10242x461 single]
>> f.private.data{1}.attributes
ans =
struct with fields:
ArrayIndexingOrder: 'RowMajorOrder'
DataType: 'NIFTI_TYPE_FLOAT32'
Encoding: 'GZipBase64Binary'
Endian: 'LittleEndian'
ExternalFileName: ''
ExternalFileOffset: '0'
Intent: 'NIFTI_INTENT_TIME_SERIES'
Dim: 10242
If so, shouldn't you use ts = ts((ndiscard+1):end,:); instead?
The file is definitely supposed to be 10242 vertices x 461 TRs. I noticed that as well but assumed it was correct, as that file is simply renamed output from fmriprep. The output space is fsaverage5, which has 10,242 vertices per hemisphere. Is this potentially an issue on their end? Regardless, your code appears to correctly identify the time dimension, as n in subsasgn is a vector of length 461.
Thanks, I think you are right. Rereading the specifications for time-series file, it says:
The Time-Series file contains one or more DataArrays with Intent set to NIFTI_INTENT_TIME_SERIES and DataType set to NIFTI_TYPE_FLOAT32. Dimensionality is one with the first dimension set to the number of nodes.
What sent me in the wrong direction is that the TR is stored as metadata of each DataArray:
>> f.private.data{1}.metadata
ans =
struct with fields:
name: 'TimeStep'
value: '650.000000'
According to the specs, it should be in the file metadata and not in each DataArray's metadata:
TimeStep – Included in the file metadata of a data file that contains NIFTI_INTENT_TIME_SERIES DataArrays. TimeStep provides TR (repetition time). In a NIFTI volume file, this value is the “slice_duration” parameter.
In your case, it means that you want to remove the first 8 DataArrays (and not shorten them, as I initially thought). The difficulty with your code is that, at the level of subsasgn, there is no easy way to know which of the DataArrays are being removed, so that the attributes, metadata and space properties can be reassigned appropriately. In your case the metadata are the same for all time points, so it is less of an issue, but it is still difficult to have a generic implementation.
The following is not recommended because it uses the private accessor but it should do what you want:
ndiscard = 8;                     % number of non-steady-state volumes to drop
f = gifti('file.func.gii');
f.private.data(1:ndiscard) = []; % remove the first 8 DataArrays entirely
save(f,'newfile.func.gii');      % remaining DataArrays keep their metadata
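If it helps, here is a quick sanity check on the result. This is only a sketch assuming the 10242x461 file from this thread; the sizes will differ for other data:

g = gifti('newfile.func.gii');
size(g.cdata)                  % should now be 10242x453 (461 - 8 volumes)
g.private.data{1}.metadata     % TimeStep metadata should still be present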
Sorry for the delayed response. I tested out the provided code and it appears to do the trick. Thank you for providing a solution!
I am attempting to load a gifti timeseries, remove non-steady-state volumes, and then save the resulting data array. I'm getting the following error:
This error appears to stem from line 130 of subsasgn (if numel(n) == size(A,2)), which means that any processing that changes the timeseries length will lead to an error. I realize I can create a new gifti (e.g. fout = gifti(ts)). However, I then lose all metadata from the original file, which creates issues at later steps of my analysis pipeline.
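For concreteness, a minimal sketch of the failing pattern, assuming the 10242x461 file discussed above (variable names are illustrative, not from the toolbox):

ndiscard = 8;                       % non-steady-state volumes to drop
f  = gifti('file.func.gii');        % f.cdata is 10242x461
ts = f.cdata(:, (ndiscard+1):end);  % 10242x453 after discarding
f.cdata = ts;                       % errors: subsasgn rejects the changed length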