As I'm working on the training with the observation data (#11), it is making more and more sense to me to only have one set of output weights. In Xiaowei's original code, he has a set of weights for the output of the pre-training and a set of weights for the output of the fine-tuning.
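Roughly, that structure looks something like this (a toy sketch of my reading of it; the layer choices and names here are mine, not the actual code):

```python
import tensorflow as tf

class RGCNToy(tf.keras.Model):
    # Toy sketch: two separate output heads, selected by a constructor flag.
    def __init__(self, hidden_size=20, n_out=2, pretrain=True):
        super().__init__()
        self.pretrain = pretrain
        self.recurrent = tf.keras.layers.LSTM(hidden_size, return_sequences=True)
        self.out_pretrain = tf.keras.layers.Dense(n_out)  # used when pretrain=True
        self.out_finetune = tf.keras.layers.Dense(n_out)  # used when pretrain=False

    def call(self, inputs):
        h = self.recurrent(inputs)
        return self.out_pretrain(h) if self.pretrain else self.out_finetune(h)
```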
I think it makes more sense (and it is also easier/quicker for training) to have just one set of output weights:
In the pre-training, nothing would change. In the fine-tuning, we'd mask out the variables we aren't making predictions for. In this case, we'd mask out all of the flow predictions so the loss wouldn't adjust weights based on them.
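For concreteness, the masking could live in a custom loss, along these lines (a sketch assuming tf.keras and a NaN-as-missing convention; `masked_rmse` is just an illustrative name):

```python
import tensorflow as tf

def masked_rmse(y_true, y_pred):
    # Treat NaNs in y_true as masked entries (e.g., every flow value
    # during temperature-only fine-tuning).
    is_nan = tf.math.is_nan(y_true)
    mask = tf.cast(tf.math.logical_not(is_nan), tf.float32)
    y_true = tf.where(is_nan, tf.zeros_like(y_true), y_true)
    sq_err = tf.square(y_true - y_pred) * mask
    # Average only over the unmasked entries.
    n_unmasked = tf.maximum(tf.reduce_sum(mask), 1.0)
    return tf.sqrt(tf.reduce_sum(sq_err) / n_unmasked)
```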
I like the idea of simplifying and suspect it would work better. But I'd also be interested to learn Xiaowei's take on this.
Yeah, it would be good to talk to Xiaowei. To start with, ideally I could just do it as he did, but that is seeming pretty complicated. As far as I know, there is no easy way to do the pre-training and fine-tuning the way I have it set up. To switch between `pretrain=True` and `pretrain=False`, I make a whole new `rgcn` object, which randomly initializes the weights. So I would have to do something like:
```python
# pretrain
model_pre = rgcn(..., pretrain=True)
model_pre.compile(...)
model_pre.fit(...)
# save the pretrained weights (tf.keras models have save_weights)
model_pre.save_weights(weight_file)

# finetune
model_finetune = rgcn(..., pretrain=False)
# load the saved weights into the fresh model (tf.keras models have load_weights)
model_finetune.load_weights(weight_file)
model_finetune.compile(...)
model_finetune.fit(...)
```
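One caveat with that handoff (my own note, assuming the weights are saved in HDF5 format): if the pretrain and finetune `rgcn` objects don't have identical structures because the output heads differ, tf.keras can match layers by name and skip the ones that don't line up:

```python
# transfer only the layers whose names and shapes match
model_finetune.load_weights(weight_file, by_name=True, skip_mismatch=True)
```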
If we cut out that extra set of weights, it would be more like:
```python
# pretrain
model = rgcn(...)
model.compile(...)
model.fit(x, y_pre)

# finetune: pass the mask via sample weights (or fold it into a
# custom masked loss; see the sketch below)
model.fit(x, y_obs, sample_weight=mask)
```
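And to convince myself the mask-in-the-loss idea actually trains, here's a self-contained toy version using the `masked_rmse` sketch above (a plain Dense layer standing in for `rgcn`; everything here is illustrative):

```python
import numpy as np
import tensorflow as tf

# Dense stand-in for the rgcn model: 2 input features -> 2 targets (temp, flow).
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(2,))])
model.compile(optimizer="adam", loss=masked_rmse)

x = np.random.rand(64, 2).astype("float32")
y_pre = np.random.rand(64, 2).astype("float32")   # process-model targets, fully observed

y_obs = np.random.rand(64, 2).astype("float32")   # observations
y_obs[:, 1] = np.nan                              # mask out the flow column

model.fit(x, y_pre, epochs=1, verbose=0)   # pretrain: nothing masked
model.fit(x, y_obs, epochs=1, verbose=0)   # finetune: flow ignored by the loss
```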
> I like the idea of simplifying and suspect it would work better.

So yeah, much simpler. And I too think it would work better, so maybe I'll just try that.

> I'd also be interested to learn Xiaowei's take on this.

I too would be interested to run this by him and see what his original thought process was.