modern-fortran / neural-fortran

A parallel framework for deep learning
MIT License

Introduce a separate `network % compile()` step #176

Open milancurcic opened 4 months ago

milancurcic commented 4 months ago

...to add the optimizer, loss, and, eventually, metrics to the network.

These are only needed for training. For that reason, we don't require them at model creation (good!), but without a separate network % compile() step we have to set them in network % train(), in network % update() (for the optimizer), or in network % backward() (for the loss). This complicates the code and makes it harder to understand and maintain.

subroutine compile(self, optimizer, loss)
  class(network), intent(inout) :: self
  class(optimizer_base_type), intent(in), optional :: optimizer
  class(loss_type), intent(in), optional :: loss
  ...
end subroutine compile
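
A minimal sketch of what the body might do, assuming the network type carries allocatable optimizer and loss components (an assumption, not a settled design): copy whichever arguments are present onto the network so later methods can use them without optional-argument checks.

subroutine compile(self, optimizer, loss)
  class(network), intent(inout) :: self
  class(optimizer_base_type), intent(in), optional :: optimizer
  class(loss_type), intent(in), optional :: loss
  ! Store the optimizer on the network, replacing any previous one.
  if (present(optimizer)) then
    if (allocated(self % optimizer)) deallocate(self % optimizer)
    allocate(self % optimizer, source=optimizer)
  end if
  ! Store the loss on the network, replacing any previous one.
  if (present(loss)) then
    if (allocated(self % loss)) deallocate(self % loss)
    allocate(self % loss, source=loss)
  end if
end subroutine compile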

Then we wouldn't need any special bookkeeping in the train(), update(), or backward() methods to check whether the loss or optimizer has been set.
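
For reference, a hedged sketch of how user code could look with a compile step; the constructors used here (sgd, quadratic) follow the current nf API but are assumptions for illustration and may differ:

program compile_example
  use nf, only: dense, input, network, quadratic, sgd
  implicit none
  type(network) :: net

  net = network([input(784), dense(64), dense(10)])

  ! Proposed: attach the optimizer and loss once, up front, instead of
  ! passing them to train(), update(), or backward() later. compile() is
  ! the method proposed in this issue and does not exist yet.
  call net % compile(optimizer=sgd(learning_rate=0.01), loss=quadratic())

  ! train(), update(), and backward() would then read these from the
  ! network instead of taking them as optional arguments.
end program compile_example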