Closed confusedmatrix closed 7 months ago
Attention: 1 line in your changes is missing coverage. Please review.
Comparison is base (`d0a80ee`) 50.85% compared to head (`b341182`) 50.85%. Report is 8 commits behind head on main.
| Files | Patch % | Lines |
|---|---|---|
| pvnet/models/base_model.py | 0.00% | 1 Missing :warning: |
:umbrella: View full report in Codecov by Sentry.
@dfulu this looks good to go?
If I'm correct, I think that `common_metrics` returns scalar values for the RMSE and MAE. `mse_each_step` is expected to be an array of MSE values for each forecast step, and the same goes for `mae_each_step`. So I think there is still some work to do to fix this, but this is about the inputs? Also, perhaps a test for this would be a good way to check it?
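For context, the distinction being discussed can be sketched as follows: the scalar metrics average over both the batch and all forecast steps, while the `*_each_step` variants should return one value per forecast step. This is an illustrative sketch only, not the actual `common_metrics` implementation from ocf-ml-metrics; the function name and array shapes here are assumptions.

```python
import numpy as np

def metrics_sketch(y_hat: np.ndarray, y: np.ndarray) -> dict:
    """Illustrative only: y_hat and y have shape (batch, forecast_steps)."""
    err = y_hat - y
    return {
        # Scalars: averaged over both batch and forecast steps
        "mae": np.abs(err).mean(),
        "mse": (err ** 2).mean(),
        # Arrays: one value per forecast step (averaged over batch only)
        "mae_each_step": np.abs(err).mean(axis=0),
        "mse_each_step": (err ** 2).mean(axis=0),
    }
```

A test along these lines would catch a scalar being logged where a per-step array is expected, simply by asserting on the shapes of the returned values.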
Yep - the PR in ocf-ml-metrics (here: https://github.com/openclimatefix/ocf-ml-metrics/pull/13) updates the common_metrics function to handle scalar and array return types + includes tests for both cases
Thanks @confusedmatrix, I merged your changes to ocf-ml-metrics, so once the build and pip release are done we should update the requirements here, and that should be it.
Also @peterdudfield, we probably should have a full end-to-end test for the PVNet training loop, but I'd say let's do that in a separate issue.
I agree, yeah, put that in a different issue.
Pull Request
Description
This PR fixes a typo in a parameter name when calling `common_metrics` in the validation step during training of PVNet. Additionally, as `common_metrics` expects `np.ndarray`s as input, the input predictions and target values are cast from torch Tensors to `np.ndarray`s.

This PR works in conjunction with https://github.com/openclimatefix/ocf-ml-metrics/pull/13 to provide correct logging of validation metrics during training with multiple forecast horizons.
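The cast described above can be sketched roughly as below. The variable names and shapes are illustrative stand-ins, not the actual code from `base_model.py`; the key pattern is detaching from the autograd graph and moving to CPU before converting.

```python
import numpy as np
import torch

# Dummy stand-ins for the validation-step outputs (shapes are illustrative)
y_hat = torch.rand(4, 16)  # predictions: (batch, forecast_steps)
y = torch.rand(4, 16)      # targets

# common_metrics expects np.ndarrays, so detach from the graph,
# move to CPU, and convert the tensors before calling it
y_hat_np = y_hat.detach().cpu().numpy()
y_np = y.detach().cpu().numpy()
```

The `.detach().cpu()` step matters: calling `.numpy()` directly on a tensor that requires grad, or one living on a GPU device, raises an error.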
How Has This Been Tested?
Tested by training locally over pre-made data batches.
If your changes affect data processing, have you plotted any changes? i.e. have you done a quick sanity check?
Checklist: