Closed: jvdp1 closed this 3 months ago
I added example use of this to examples/dense_mnist.f90.
One challenge I found is that with the multiple-metrics variant we can't mix and match functions defined in nf_metrics with those in nf_loss (e.g. metrics=[mse(), corr()] is not allowed because all elements of an array constructor must have the same type).
I think there may be a workaround by first staging the metrics as class(metric_type), allocatable :: metrics(:), but I couldn't quite figure it out.
Even if it can't be done, it's a minor limitation IMO.
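To make the limitation concrete, here is a minimal sketch, assuming mse comes from nf_loss and corr from nf_metrics with different parent types; the declarations shown are illustrative, not the actual module interfaces:

```fortran
program metrics_limitation
  use nf_loss, only: mse
  use nf_metrics, only: corr
  implicit none

  ! Not allowed: mse() and corr() have different dynamic types, and the
  ! elements of an array constructor must all have the same type:
  !
  !   res = net % evaluate(x, y, metrics=[mse(), corr()])  ! rejected by compiler
  !
  ! A bare polymorphic array doesn't help either: an allocatable
  ! class(metric_type) array still has a single dynamic type shared by
  ! all of its elements, so it cannot hold an mse alongside a corr.

end program metrics_limitation
```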
I'll add some tests too.
@jvdp1 actually, the challenge I mentioned above extends even to passing multiple metrics of the same parent type. Can you show me one example of invoking net % evaluate_batch_1d_metrics()?
I think once we add one example of passing multiple metrics (say, in a test program), we can merge this PR.
> One challenge I found is that if we use the multiple metrics variant, we can't mix and match functions defined in nf_metrics and those in nf_loss (e.g. metrics=[mse(), corr()] is not allowed because an array constructor must have all types the same).
Indeed, you are right. The same problem also happens with multiple metrics of the same parent type. So it seems that evaluate_batch_1d_metrics is not possible and should be removed.
A possible solution could be the one proposed by Brad, but it sounds to me a bit too complex for what we want to achieve.
Yes, I think this is the same approach that we use to pass an array of heterogeneous layers to a network. It's also overkill for the user to do this. Let's remove the multi-metrics variant then, at least until we hear anyone really needing this.
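For the record, the container pattern referred to here (the same approach neural-fortran uses for heterogeneous layers) would look roughly like the sketch below; metric_container is a hypothetical name and metric_type is assumed to be the common abstract parent:

```fortran
! Hypothetical wrapper type: each element owns a polymorphic component,
! so different elements can carry different dynamic types.
type :: metric_container
  class(metric_type), allocatable :: m
end type metric_container

type(metric_container), allocatable :: metrics(:)

! A heterogeneous "array" of metrics, one wrapper per metric:
metrics = [metric_container(mse()), metric_container(corr())]
```

The extra wrapping at every call site is exactly the ergonomic cost that makes this feel like overkill for a metrics argument.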
Thank you!
Related to #179