cassianobecker / tgcn2


Infer mat_size from DataLoader #17

Closed semihcanturk closed 5 years ago

semihcanturk commented 5 years ago

mat_size (59412 or 149) should be retrieved from the dataloader before calling the constructor of NetGCN.

cassianobecker commented 5 years ago

This is still MAGIC-NUMBERING (inside loaders()): size_dict = { 'aparc': 148, 'dense': 59412 }. Plus, it requires the caller to specify which parcellation it is in the dictionary... Also, for example, this will not work when there is pooling/graph coarsening.

A good implementation would be to create a member function of HcpDataset in which the user would just call get_dimensions() and get a list of applicable dimensions (right now, with no coarsening, it is a one-element list containing mat_size). The value of this first element must be retrieved from the data (i.e., ts inside HcpDataset)... and not MAGICIANIZED.
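A minimal sketch of that interface (hypothetical: the class body, ts layout, and method name are assumptions based on the discussion, not the repository's actual code):

```python
import numpy as np


class HcpDataset:
    """Simplified stand-in for the project's dataset class (hypothetical)."""

    def __init__(self, ts):
        # ts: a subject's time-series array of shape (num_regions, num_timepoints)
        self.ts = ts

    def get_dimensions(self):
        # Infer the spatial dimension from the data itself instead of a
        # magic-number dictionary; with no coarsening this is a one-element list.
        return [self.ts.shape[0]]


# usage: the model-size value comes from the loaded data, not a hard-coded constant
ds = HcpDataset(ts=np.zeros((148, 1200)))
mat_size = ds.get_dimensions()[0]  # -> 148
```

With graph coarsening, get_dimensions() could return one entry per pooling level, which is why a list is returned rather than a single integer.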

cassianobecker commented 5 years ago

Also, about putting imageglobals.logger = set_logger('Nibabel', settings['LOGGING']['nibabel_logging_level']) in data_torch.py: this belongs more closely to data.py, which works in nibabel territory... I will update it when I address issue #11...
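For context, a hedged sketch of what that logger setup does (set_logger is a project helper whose body is assumed here; the assignment to nibabel's imageglobals.logger is shown only as a comment so the sketch stays self-contained):

```python
import logging


def set_logger(name, level):
    # Hypothetical reconstruction of the project's set_logger helper:
    # fetch a named logger and pin it to the requested level.
    logger = logging.getLogger(name)
    logger.setLevel(level)
    return logger


# In data.py (the module that touches nibabel), the call from the thread
# would then read, with settings supplying the level:
#   imageglobals.logger = set_logger('Nibabel', settings['LOGGING']['nibabel_logging_level'])
nib_logger = set_logger('Nibabel', logging.WARNING)
```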

semihcanturk commented 5 years ago

Not magicianizing requires calling get_index on the dataloader, but at the time we create the model there is no data accessible to the dataloader, so we'd need the first batch to determine information about the data; however, we wouldn't want to fetch the first batch before we call train().

cassianobecker commented 5 years ago

Can't we get it by asking the HcpDataset directly, before any batching (the DataLoader might hold a reference to it)?

semihcanturk commented 5 years ago

Should be fixed now: the first subject's data is used as a reference point to infer the size.

cassianobecker commented 5 years ago

Implemented as the function data_shape in HcpDataset.
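The resolution described above, a data_shape method that infers dimensions from the first subject, might look roughly like this (a sketch under stated assumptions: the constructor signature, the load_ts helper, and the dummy array shape are all hypothetical stand-ins for the real nibabel-backed loading code):

```python
import numpy as np


class HcpDataset:
    """Simplified stand-in; the real class loads subject data lazily (hypothetical)."""

    def __init__(self, subjects):
        self.subjects = subjects

    def load_ts(self, subject):
        # Placeholder for the real loading routine (nibabel, in data.py);
        # returns a (num_regions, num_timepoints) array. Small dummy data here.
        return np.zeros((148, 1200))

    def data_shape(self):
        # Use the first subject's data as a reference point to infer the size,
        # so mat_size never needs to be hard-coded before the model is built.
        ts = self.load_ts(self.subjects[0])
        return ts.shape


ds = HcpDataset(subjects=['100206'])
mat_size, num_timepoints = ds.data_shape()
```

Because data_shape lives on the Dataset rather than the DataLoader, the model constructor can query it before any batch is fetched, which resolves the ordering concern raised earlier in the thread.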