For students it would be nice to have some kind of measure of model complexity when we discuss overfitting.
However, simple parameter counts do not work, because for many models (e.g. random forests) it is not clear what to count as a parameter.
Generalized degrees of freedom have been proposed as an alternative: they essentially measure how much the fitted values vary under small perturbations of the data.
For a few models there are closed-form solutions, but for many there are not. Gao and Jojic used a Monte Carlo estimate (with epsilon = 1e-5, averaged over Monte Carlo draws) for neural networks.
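A minimal sketch of how such a Monte Carlo estimate could look, not the authors' actual implementation: perturb the response vector in a random direction, refit, and average the resulting directional sensitivities (a Hutchinson-style divergence estimate). The `fit_predict` interface and the OLS sanity check are my own assumptions for illustration; for ordinary least squares the estimate should land near the true parameter count.

```python
import numpy as np

def monte_carlo_gdf(fit_predict, y, eps=1e-5, n_draws=100, seed=None):
    """Monte Carlo estimate of generalized degrees of freedom.

    fit_predict: hypothetical callable that takes a response vector,
    refits the model, and returns in-sample predictions.
    """
    rng = np.random.default_rng(seed)
    base = fit_predict(y)
    total = 0.0
    for _ in range(n_draws):
        # Random Gaussian direction; finite-difference directional derivative.
        delta = rng.standard_normal(len(y))
        total += delta @ (fit_predict(y + eps * delta) - base) / eps
    return total / n_draws

# Sanity check on OLS, where degrees of freedom equal the parameter count p.
rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n)

def ols_fit_predict(y_vec):
    beta, *_ = np.linalg.lstsq(X, y_vec, rcond=None)
    return X @ beta

gdf = monte_carlo_gdf(ols_fit_predict, y, n_draws=200, seed=1)
print(round(gdf, 2))  # should be close to p = 3
```

For a random forest or a neural network, `fit_predict` would retrain (or fine-tune) the model on the perturbed responses, which is what makes the estimate expensive in practice.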