Closed: roberto-amaralsantos closed this issue 1 year ago.
Hi,
Thanks for your note. If you can provide a minimal working example that reproduces the error, that would make it much easier for us to debug. Thanks!
J
On Thu, Oct 26, 2023 at 3:06 AM Roberto Amaral-Santos wrote:
Hi,
Thanks a lot for your package! I am having some trouble running HonestDiD::createSensitivityResults_relativeMagnitudes() on my data. When I run it I get the following error:
Error in { : task 1 failed - "ℹ In index: 599. Caused by error in svd(): ! a dimension is zero"
I am having a hard time understanding what this error means and how to inspect my code/regression for issues. Any guidance on what to do would be great! Below is example code of what I am running (let me know if you need a working example to dig into this issue):
parallel_trends_levels_class_year <- fixest::feols(
    deforestation ~ i(year, class_dummy, ref = 2018) | class + year,
    cluster = "city_id",
    data = reg_data %>% filter(year > 2007)
)
betahat <- summary(parallel_trends_levels_class_year)$coefficients
sigma <- summary(parallel_trends_levels_class_year)$cov.scaled
parallel_trends_levels_class_year_sensitivity <- HonestDiD::createSensitivityResults_relativeMagnitudes(
    betahat = betahat,
    sigma = sigma,
    numPrePeriods = 10,
    numPostPeriods = 4,
    Mbarvec = 0.5
)
@roberto-amaralsantos Just scale the outcome. It works if you, e.g., put it in millions.
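For instance, a sketch reusing the variable names from the snippet above (the 1e6 divisor is only an illustration of "put it in millions"):

reg_data_scaled <- reg_data %>% dplyr::mutate(deforestation = deforestation / 1e6)
parallel_trends_levels_class_year <- fixest::feols(
    deforestation ~ i(year, class_dummy, ref = 2018) | class + year,
    cluster = "city_id",
    data = reg_data_scaled %>% dplyr::filter(year > 2007)
)

betahat and sigma then come out in the rescaled units (scaled by 1e-6 and 1e-12, respectively), so the sensitivity bounds just need to be read in millions.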
@mcaceresb, thanks! It does work when I do that. Do you know why this is the case? Is keeping the outcome in its current measurement unit giving the command a hard time inverting a matrix or something? Thanks!
@roberto-amaralsantos Yes; I don't think it's overflowing. I suspect the same as you and reckon it's running into numerical precision issues because the inverse has all very small values. It's hard to tell exactly where the issue is popping up without debugging.
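I can't say this is the exact spot without tracing it, but here is an illustration of how scale alone can produce the exact error message reported above: if some internal step keeps only singular values above an absolute cutoff (the cutoff below is hypothetical), a badly scaled matrix can leave nothing behind, and the next decomposition sees a zero-dimension matrix:

m <- diag(4) * 1e-20                     # perfectly invertible, just tiny
s <- svd(m)
keep <- s$d > 1e-10                      # an absolute tolerance keeps no columns
m_reduced <- s$u[, keep, drop = FALSE]   # a 4 x 0 matrix
svd(m_reduced)                           # Error in svd(): a dimension is zero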
@jonathandroth LMK if you're happy to call it a day or if I should investigate exactly where the fault is.
@mcaceresb, thanks a lot! Just to let you know how I ran into this (in case it is useful somehow): I am working with county-level deforestation data. It has lots of 0s, so one of the things I am doing is following Chen & Roth and normalizing the ATE in levels.
Roberto -- thanks for letting us know about this issue, and for using my paper with Kevin :)
@mcaceresb should we maybe add some logic around where we take the inverse of the covariance? Something along the lines of the sketch below:
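A hypothetical sketch of such a guard (illustrative only; not the code that actually went into the package):

scale_factor <- sqrt(mean(diag(sigma)))              # typical magnitude of sigma
sigma_inv <- solve(sigma / scale_factor^2) / scale_factor^2
# Mathematically identical to solve(sigma), but intermediate quantities stay
# near unit scale, which avoids underflow/overflow in tolerance checks.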
LMK what you think.
@jonathandroth I actually don't know where the error was exactly. It doesn't seem to be in any code written directly here (i.e., it's internal to another function). There is actually no error if the svd or inversion is done directly.
It sounds like I should find out where exactly it's failing; otherwise I couldn't implement any rescaling checks.
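For locating it, a generic R approach (assuming the internal loop can be run sequentially, so the error isn't swallowed by a parallel worker):

options(error = recover)   # drops into an interactive browser at the error site
traceback()                # or, run right after the error, prints the call stack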
Got it, thanks! I'd say if you can find it quickly, it's worth adding the check, but if it's going to be costly to do, then I would skip it, since we have not seen this error come up frequently. (BTW, my guess would be it's coming from the calculation of truncated normal critical values, i.e. the call to mvtnorm)
J
Continued in #48
@roberto-amaralsantos FYI this should be fixed now and ought to work without scaling.
@mcaceresb Thanks! I will update the version of the package I am running.
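For reference, reinstalling from GitHub should pick up the fix (assuming the remotes package is available):

remotes::install_github("asheshrambachan/HonestDiD")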