This pull request includes new components for amortized point estimation, namely:

- `quantile_loss()`, to simultaneously train multiple quantile levels
- `QuantileActivation`, a `tf.keras` Layer to add at the end of a neural architecture for quantile estimation; it exploits the inductive bias that quantiles are ordered according to their levels
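The loss behind simultaneous training of several quantile levels is the standard pinball (quantile) loss, averaged over all levels. A minimal numpy sketch of that idea (the actual `quantile_loss()` implementation in this PR may differ in signature and reduction):

```python
import numpy as np

def pinball_loss(y_true, y_pred, quantile_levels):
    """Mean pinball (quantile) loss over several quantile levels.

    y_true: targets, shape (batch,)
    y_pred: one predicted quantile per level, shape (batch, num_levels)
    quantile_levels: e.g. [0.1, 0.5, 0.9]
    """
    tau = np.asarray(quantile_levels)        # shape (num_levels,)
    error = y_true[:, None] - y_pred         # broadcast to (batch, num_levels)
    # Asymmetric penalty: underprediction weighted by tau,
    # overprediction by (1 - tau).
    return np.mean(np.maximum(tau * error, (tau - 1.0) * error))
```

For `tau = 0.9`, underpredicting by 1 costs 0.9 while overpredicting by 1 costs only 0.1, which is what pushes the network output toward the 90% quantile.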
It also includes a related API addition, namely selecting the `quantile_loss()` conveniently by specifying `quantile_levels`, much like `norm_ord` does for the `norm_diff()` loss, e.g. `quantile_levels=[0.1, 0.5, 0.9]`.
Finally, there is an updated tutorial notebook that covers some theory and showcases the above, together with parametric bootstrapping and a simulation-based calibration test.
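The ordering constraint that `QuantileActivation` encodes can be sketched as follows: predict the lowest quantile freely and add strictly positive (softplus) increments on top, so the outputs are non-decreasing by construction. This is only an illustration of the inductive bias; the layer in this PR may use a different parameterization:

```python
import numpy as np

def softplus(x):
    # Numerically plain softplus; always > 0, so increments are positive.
    return np.log1p(np.exp(x))

def ordered_quantiles(raw):
    """Map unconstrained network outputs (batch, num_levels)
    to non-decreasing quantile predictions.

    Column 0 is the lowest quantile; each further column adds a
    positive increment, guaranteeing q_1 <= q_2 <= ... <= q_K.
    """
    base = raw[:, :1]
    increments = softplus(raw[:, 1:])
    return np.concatenate([base, base + np.cumsum(increments, axis=1)], axis=1)
```

Because the monotonicity holds for any input, the network never has to learn to avoid quantile crossing, which is exactly the inductive bias mentioned above.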