Closed · johnnybonney closed this issue 2 years ago
The criteria are different and not comparable.
With the moment-based approach, you are trying to match the moments themselves. So you get 0 if and only if the moments are matched perfectly.
With the regression approach you are trying to minimize the sum of squared residuals. Even if you have specified the conditional mean perfectly, you still won't get 0 unless the variance of the residuals is 0.
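A small numeric sketch of the difference (all numbers here are made up, not from `ivmte`): even when the conditional mean is specified perfectly, the sum of squared residuals stays around N times the residual variance, whereas a perfectly matched set of moments gives a criterion of exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000
x = rng.normal(size=N)
y = 2.0 + 3.0 * x + rng.normal(size=N)  # true conditional mean: 2 + 3x

# Regression criterion: SSR evaluated at the TRUE conditional mean.
# It is still roughly N * Var(residual) = 5000, nowhere near 0.
ssr = np.sum((y - (2.0 + 3.0 * x)) ** 2)

# Moment criterion: if the moments are matched exactly, it is exactly 0.
target = np.array([y.mean(), (x * y).mean()])
fitted = target.copy()  # a perfect match, by construction
l1_gap = np.abs(fitted - target).sum()

print(round(ssr), l1_gap)
```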
As for increasing with N, my guess is that @jkcshea coded it in such a way that he dropped the 1/N factor in the criterion function. So it's not converging anywhere with N, which is why you're seeing something about 5x as large when you make N 5x as large.

@jkcshea it probably makes more sense to use the normalized criterion. Potentially this could also contribute to computational stability (although that seems unlikely). Do you have time to change it? It should be a quick change, I think, but I know you are busy, so feel free to defer.
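The effect of the missing 1/N factor can be seen in a toy sketch (generic least squares, not the `ivmte` internals): the raw SSR evaluated at the truth scales roughly linearly with N, while SSR/N settles near the residual variance.

```python
import numpy as np

rng = np.random.default_rng(1)

def ssr_at_truth(N, sigma=1.0):
    """SSR at the true conditional mean for a sample of size N."""
    eps = rng.normal(scale=sigma, size=N)
    return np.sum(eps ** 2)

ssr_5k = ssr_at_truth(5_000)
ssr_25k = ssr_at_truth(25_000)

print(ssr_25k / ssr_5k)                   # roughly 5x, mirroring the reported criteria
print(ssr_5k / 5_000, ssr_25k / 25_000)   # both near sigma^2 = 1 after normalizing
```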
Sorry for the confusion. We had agreed several times in #198 to normalize the criterion. While the function does normalize internally, I was still reporting the raw SSR as the criterion. This is now corrected, and the reported criterion should no longer vary with the sample size!
Easy fix, thanks!
I have been getting very high values of the minimum criterion when using the direct MTR approach.
I recall that, when I specify the IV-like moments myself, the minimum criterion represents an L1 measure of how close we can get to matching the moments (given the specified constraints on the MTRs). However, when I use the direct MTR approach, I've been getting minimum criteria in the tens of thousands, even for a binary outcome with no covariates.
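For reference, the L1 criterion I have in mind can be written as a small linear program: minimize the sum of absolute deviations between the implied and target moments, subject to constraints on the MTR coefficients. A toy sketch (the matrix `Gamma`, target `beta`, and the [0, 1] bounds are all hypothetical, not taken from `ivmte`):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical moment equations: match beta with Gamma @ theta,
# where the MTR coefficients theta are constrained to [0, 1].
Gamma = np.array([[1.0, 0.5],
                  [0.2, 1.0]])
beta = np.array([2.0, 0.1])

# L1 criterion as an LP: min sum(s)  s.t.  |Gamma @ theta - beta| <= s
# Decision variables: [theta_1, theta_2, s_1, s_2].
c = np.array([0.0, 0.0, 1.0, 1.0])
A_ub = np.vstack([
    np.hstack([Gamma, -np.eye(2)]),    #  (Gamma @ theta - beta) <= s
    np.hstack([-Gamma, -np.eye(2)]),   # -(Gamma @ theta - beta) <= s
])
b_ub = np.concatenate([beta, -beta])
bounds = [(0, 1), (0, 1), (0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.fun)  # minimum L1 deviation; positive because beta[0] = 2 is unreachable
```

Here the first target moment cannot be matched under the bounds (the largest reachable value is 1.5), so the minimum criterion is strictly positive; it is zero exactly when every moment can be matched.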
It must be that either (i) a different definition of the minimum criterion is used for the direct MTR method that I'm not familiar with, (ii) I am goofing up somewhere, or (iii) `ivmte` is doing something weird.

Here is an example:
This yields a minimum criterion of `2367.948` and bounds of `[-0.09251298, 0.2420311]`.

In addition, if I increase to `N = 50000` and rerun the code, I get a minimum criterion of `11887.48` and bounds of `[-0.2021548, 0.2500426]`. This pattern holds in general: the minimum criterion scales linearly with N, and it seems to change the bounds non-trivially. (While I would not expect increasing the size of the data set to improve the fit of the moments per se, it seems odd that the measure of fit gets worse for higher N.)