The current default data-generation distribution has unbounded support for individual loads.
This makes it challenging to run verification tasks on the same distribution (or on its support) as was used for training, since any finite bound chosen for verification can be exceeded by training samples.
Alternatives could be:
- use uniform white noise instead of log-normal --> simple, and the support is bounded by construction
- use truncated log-normal noise --> stays approximately log-normal, but we would potentially need to introduce another parameter for the truncation bound
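The truncated log-normal option could be sketched via rejection sampling, which keeps the log-normal shape below the cutoff without any distributional math. The function name, parameters, and the `upper` truncation bound here are illustrative assumptions, not part of the existing data-generation code:

```python
import numpy as np

def sample_truncated_lognormal(mu, sigma, upper, size, rng=None):
    """Sample log-normal values truncated to (0, upper].

    Hypothetical sketch: repeatedly draw log-normal samples and keep
    only those at or below `upper` (rejection sampling), so the
    accepted values follow the log-normal density renormalized to the
    bounded support.
    """
    rng = np.random.default_rng(rng)
    out = np.empty(0)
    while out.size < size:
        draw = rng.lognormal(mean=mu, sigma=sigma, size=size)
        out = np.concatenate([out, draw[draw <= upper]])
    return out[:size]

loads = sample_truncated_lognormal(mu=0.0, sigma=0.5, upper=3.0, size=1000)
```

Note that rejection sampling gets slow if `upper` cuts off most of the mass; an inverse-CDF approach would avoid that, at the cost of an extra dependency.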