[x] Instead of using a one-hot encoding of NWP step, it might be better to use `nwp_init_time_utc_fourier` where the other queries have `time_fourier_t0`.
[x] #171
[x] #147
[x] All NWP variables
[x] #172
[ ] #173
[x] #176
[ ] Give the model the absolute PV capacity
[ ] Weight the GSP loss by the PV capacity of the GSP?
[ ] GSP ID of pixels & PV systems
[ ] #175
[ ] #170
[ ] #168
[ ] Reduce weight of PV loss
[ ] Increase learning rate?
[ ] Compute `nanmean` and `nanstd` for all NWP variables.
[ ] Learnable ID for Satellite, PV and GSP queries.
[ ] Pad with learnt vectors, not zeros.
[ ] Separate NWP query elements for different locations
[ ] Downsample NWP using little CNN
[ ] Try removing the RNN again.
[ ] #162
[ ] Don't include the GSP history in the query for `time_transformer`. We're already giving the GSP history to the `SatelliteTransformer`, and it may confuse the model to have the GSP history go to zero for the forecast horizons.
[ ] Don't give the GSP history in any of the main queries. Instead, give a separate input to the `time_transformer` which includes the GSP history, and a marker to say "this is history".
[ ] Plot attention matrix
[ ] Don't give PV history in any of the queries. It's already in the RNN.
[ ] #167
[ ] #102
[ ] #144
[ ] #120
[ ] #156
[ ] #118 (although first wait to see if Peter gets Passiv data into production)
[ ] If the model is too big to run in production, then convert to TorchScript
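The Fourier time encoding mentioned in the first item could be sketched roughly as below. This is a minimal sketch; the function name, the number of frequencies, and the assumption that times are pre-normalised to roughly [-1, 1] are all illustrative, not the repo's actual implementation:

```python
import numpy as np

def fourier_time_features(t: np.ndarray, n_freqs: int = 4) -> np.ndarray:
    """Encode normalised times as sin/cos features at doubling frequencies.

    t: shape (n,), values roughly in [-1, 1].
    Returns an array of shape (n, 2 * n_freqs).
    """
    freqs = 2.0 ** np.arange(n_freqs)  # 1, 2, 4, 8 cycles over the window
    angles = np.pi * t[:, None] * freqs[None, :]
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

t = np.linspace(-1, 1, 5)
features = fourier_time_features(t)
```

The same function could serve both `nwp_init_time_utc_fourier` and `time_fourier_t0`, just fed different time coordinates.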
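Computing `nanmean` and `nanstd` per NWP variable might look like the following. The `(time, channel, y, x)` axis layout is an assumption; the point is to reduce over every axis except the channel axis so NaNs (e.g. missing init times) don't poison the statistics:

```python
import numpy as np

# Fake NWP batch: (time, channel, y, x), with a missing value.
rng = np.random.default_rng(42)
nwp = rng.normal(size=(10, 3, 4, 4))
nwp[0, 0, 0, 0] = np.nan

# Reduce over every axis except the channel axis, ignoring NaNs.
means = np.nanmean(nwp, axis=(0, 2, 3))
stds = np.nanstd(nwp, axis=(0, 2, 3))

# Normalise each channel to roughly zero mean, unit variance.
normalised = (nwp - means[None, :, None, None]) / stds[None, :, None, None]
```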
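Weighting the GSP loss by each GSP's PV capacity (one of the open items) could be a capacity-weighted MSE. Normalising the weights by their mean, so the loss stays on the same scale as plain MSE, is an assumption about how we'd want to do it:

```python
import numpy as np

def capacity_weighted_mse(pred, target, capacity_mw):
    """MSE where each GSP's squared error is weighted by its PV capacity.

    Weights are normalised by their mean so the overall loss scale is
    comparable to an unweighted MSE.
    """
    weights = capacity_mw / capacity_mw.mean()
    return np.mean(weights * (pred - target) ** 2)

pred = np.array([0.5, 0.7, 0.2])
target = np.array([0.4, 0.8, 0.2])
capacity = np.array([10.0, 100.0, 50.0])
loss = capacity_weighted_mse(pred, target, capacity)
```

With equal capacities this reduces exactly to plain MSE, which makes it easy to A/B against the current loss.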
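The "learnable ID" and "pad with learnt vectors" items could both be handled with `nn.Parameter`. A minimal PyTorch sketch, where the class name, modality names, and initialisation scale are assumptions:

```python
import torch
from torch import nn

class QueryIDs(nn.Module):
    """Learnt ID vectors per modality, plus a learnt padding vector."""

    def __init__(self, d_model: int = 32):
        super().__init__()
        # One learnt ID per query type, added to that modality's query elements.
        self.ids = nn.ParameterDict({
            name: nn.Parameter(torch.randn(d_model) * 0.02)
            for name in ("satellite", "pv", "gsp")
        })
        # Learnt vector used in place of zero-padding for missing elements.
        self.pad = nn.Parameter(torch.randn(d_model) * 0.02)

    def forward(self, query: torch.Tensor, name: str, pad_mask: torch.Tensor):
        # query: (batch, seq, d_model); pad_mask: (batch, seq), True = padding.
        out = query + self.ids[name]
        return torch.where(pad_mask[..., None], self.pad.expand_as(out), out)
```

Because both the IDs and the padding vector are parameters, the model can learn to distinguish "no data here" from "data that happens to be zero".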