lululxvi / deepxde

A library for scientific machine learning and physics-informed learning
https://deepxde.readthedocs.io

MFNN training #94

Closed jethromoses closed 4 years ago

jethromoses commented 4 years ago

Do you train the low-fidelity neural network first and then the high-fidelity neural network in MFNN, or are they trained simultaneously?

lululxvi commented 4 years ago

I train them simultaneously. But I believe they can also be trained one by one.

jethromoses commented 4 years ago

I'm a little confused about how you train them simultaneously.

For example, in cokriging I first construct the low-fidelity model. Then, to construct the difference model (basically the correction between the low- and high-fidelity predictions), I first make low-fidelity predictions at the high-fidelity sample points (if I don't already have sampled low-fidelity values there), and only then can I build the difference model. This gives me the high-fidelity prediction values from cokriging.

What is the equivalent of this when you train the MFNN simultaneously? Do you only train the high-fidelity part of the MFNN once you have both the low- and high-fidelity output values at the sample input points?

lululxvi commented 4 years ago

The total loss is the sum of two losses: the loss on the low-fidelity data (which depends only on the low-fidelity network) and the loss on the high-fidelity data (which depends on both the low- and high-fidelity networks). We simply train the whole network using this total loss.
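To make the simultaneous training concrete, here is a minimal sketch of the idea in PyTorch. This is not DeepXDE's implementation: the toy data, network sizes, and the simple `net_hi([x, net_lo(x)])` composition are placeholders chosen only to show how one total loss updates both networks at once.

```python
import torch

# Hypothetical toy data: many cheap low-fidelity samples, few expensive
# high-fidelity samples (functions and sizes are made up for illustration).
x_lo = torch.linspace(0, 1, 100).unsqueeze(1)
y_lo = torch.sin(8 * x_lo)                       # biased low-fidelity model
x_hi = torch.linspace(0, 1, 8).unsqueeze(1)
y_hi = torch.sin(8 * x_hi) + 0.3 * x_hi          # accurate high-fidelity model

# Low-fidelity network: x -> y_lo(x).
net_lo = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(), torch.nn.Linear(20, 1)
)
# High-fidelity network: (x, y_lo(x)) -> y_hi(x).
net_hi = torch.nn.Sequential(
    torch.nn.Linear(2, 20), torch.nn.Tanh(), torch.nn.Linear(20, 1)
)

# One optimizer over all parameters, so both networks are updated every step.
opt = torch.optim.Adam(
    list(net_lo.parameters()) + list(net_hi.parameters()), lr=1e-3
)
mse = torch.nn.MSELoss()

for step in range(5000):
    opt.zero_grad()
    # Low-fidelity loss: depends on net_lo only.
    loss_lo = mse(net_lo(x_lo), y_lo)
    # High-fidelity loss: the prediction is built from net_lo's output at the
    # high-fidelity points, so this term backpropagates through both networks.
    y_lo_at_hi = net_lo(x_hi)
    loss_hi = mse(net_hi(torch.cat([x_hi, y_lo_at_hi], dim=1)), y_hi)
    # Total loss = sum of the two; the whole composite network trains on it.
    (loss_lo + loss_hi).backward()
    opt.step()
```

Because the high-fidelity term backpropagates through `net_lo`, the low-fidelity network is shaped by both datasets at once; that is the practical difference from the sequential low-then-difference recipe in cokriging.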

For the multi-fidelity GP, I don't fit the low-fidelity model first and then the high-fidelity one either. I construct the joint covariance matrix and optimize all of the hyperparameters together.
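For reference, the standard way to do such a joint fit is the autoregressive (Kennedy–O'Hagan-style) multi-fidelity GP, where the high fidelity is modeled as a scaled low-fidelity GP plus a discrepancy GP and a single covariance matrix couples both datasets. This is a generic sketch of that formulation, not necessarily the exact model used here:

```latex
% f_h(x) = \rho f_l(x) + \delta(x), with f_l ~ GP(0, k_l) and \delta ~ GP(0, k_d).
% Stacking the observations [y_l; y_h] gives the joint covariance
\[
K =
\begin{pmatrix}
k_l(X_l, X_l) & \rho\, k_l(X_l, X_h) \\
\rho\, k_l(X_h, X_l) & \rho^{2}\, k_l(X_h, X_h) + k_d(X_h, X_h)
\end{pmatrix},
\]
% and \rho together with the kernel hyperparameters of k_l and k_d are fit in one
% shot by maximizing the log marginal likelihood of [y_l; y_h] under N(0, K + noise).
```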

jethromoses commented 4 years ago

Thanks for your input.