Closed: sadamov closed this PR 4 weeks ago.
While you are doing this, I thought of something else with the config: shouldn't `num_forcing_features` be derived as the length of the `dataset.var_names` list in the config (https://github.com/mllam/neural-lam/blob/main/neural_lam/data_config.yaml#L57), rather than being defined separately?
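A minimal sketch of that idea, assuming the parsed form of a `data_config.yaml` with a `dataset.var_names` list (the variable names below are illustrative, not the real config contents):

```python
# Parsed form of a hypothetical data_config.yaml (names are illustrative)
config = {
    "dataset": {
        "var_names": ["pres_0g", "t_2m", "u_10m"],
    }
}

# Derive the count from the list rather than storing it as a separate field
num_state_vars = len(config["dataset"]["var_names"])
print(num_state_vars)  # 3
```

Deriving the count this way removes the risk of the list and the separately stored number drifting out of sync.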
Hmm, the `dataset.var_names` are of the state, not of the forcings, though. In the future, when our yaml file covers state, forcings and boundaries, I suggest something along these lines: https://github.com/mllam/neural-lam/blob/feature_dataset_yaml/neural_lam/data_config.yaml
```python
self.grid_dim = (
    2 * self.config_loader.num_data_vars("state")
    + grid_static_dim
    + self.config_loader.num_data_vars("forcing")
    * self.config_loader.forcing.window
)
```
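With illustrative numbers plugged in (the counts below are hypothetical, not taken from the real config), the formula above works out as follows; the factor of 2 is assumed to come from feeding two input time steps of the state:

```python
# Hypothetical counts, standing in for the config_loader lookups
num_state_vars = 17      # config_loader.num_data_vars("state")
grid_static_dim = 4      # static per-grid-cell features
num_forcing_vars = 5     # config_loader.num_data_vars("forcing")
forcing_window = 3       # config_loader.forcing.window

grid_dim = (
    2 * num_state_vars               # two input time steps of the state
    + grid_static_dim
    + num_forcing_vars * forcing_window
)
print(grid_dim)  # 53
```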
Note that currently the forcings are calculated partly in the dataset and partly loaded from .npy files.
> Hmm, the `dataset.var_names` are of the state, not of the forcings, though.
Ah yes, of course you're right! But isn't `num_forcing_features == 16` too large then? I thought the forcing features were only radiation etc.
Also, I really like your suggestion about introducing a `num_data_vars` representation somewhere. Should I start a separate issue for this so we can discuss it in more detail?
The forcings should be

```python
GRID_FORCING_DIM = 5 * 3 + 1  # 5 feat. for 3 time-step window + 1 batch-static
```

(from the original constants.py), which gives 16.
`num_data_vars` is already implemented in the Config class: https://github.com/mllam/neural-lam/blob/5b71be3c68d815e0e376ee651c14f09d801f86de/neural_lam/config.py#L51
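For context, a minimal sketch of what such a `num_data_vars` helper can look like; this is an illustration of the idea, not the actual implementation at the linked line:

```python
class Config:
    """Toy stand-in for neural_lam's Config class (illustrative only)."""

    def __init__(self, values):
        # values: parsed data_config.yaml as nested dicts
        self.values = values

    def num_data_vars(self, category):
        # Count the variables listed under a category, e.g. "state" or "forcing"
        return len(self.values[category]["var_names"])


# Usage with a made-up config
cfg = Config({"state": {"var_names": ["t2m", "u10", "v10"]}})
print(cfg.num_data_vars("state"))  # 3
```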
The `coords_projection` problem was fixed by merging #17, so I merged main in here so that we don't see that change.
I did not add these explicitly to the changelog, as they are only fixes to #23.
Summary
https://github.com/mllam/neural-lam/pull/31 introduced three minor bugs that are fixed with this PR:
- `data_config.yaml`
- `var_leads_metrics_watch`