This is a fairly large change in how the variables are reduced: the `pretrain_vars` and `finetune_vars` arguments are no longer used. I am now using a more flexible approach. To reproduce the previous behavior, the mapping is:
| old | new |
| --- | --- |
| `pretrain_vars = 'both'` | `reduce_temp_pretrn = 0, reduce_flow_pretrn = 0` |
| `pretrain_vars = 'flow'` | `reduce_temp_pretrn = 1, reduce_flow_pretrn = 0` |
| `finetune_vars = 'both'` | `reduce_temp_trn = 0, reduce_flow_trn = 0` |
| `finetune_vars = 'flow'` | `reduce_temp_trn = 1, reduce_flow_trn = 0` |
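For anyone migrating existing call sites, the mapping above could be captured in a small helper. This is only a sketch: `legacy_to_new` is a hypothetical function name, and it assumes the only legacy values were `'both'` and `'flow'` (the flag names come from the table).

```python
def legacy_to_new(pretrain_vars='both', finetune_vars='both'):
    """Translate the removed string arguments into the new reduce flags.

    'both' keeps both variables (no reduction); 'flow' drops temperature,
    so the corresponding reduce_temp_* flag is set to 1.
    """
    return {
        'reduce_temp_pretrn': 0 if pretrain_vars == 'both' else 1,
        'reduce_flow_pretrn': 0,
        'reduce_temp_trn': 0 if finetune_vars == 'both' else 1,
        'reduce_flow_trn': 0,
    }

# Old call: model(pretrain_vars='flow', finetune_vars='both')
print(legacy_to_new('flow', 'both'))
```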
This update is therefore not backwards-compatible. I could leave the old arguments in place, but I don't want to keep dragging them along.