Yeah, it's almost surely because the dataset is too large to compute analytic standard errors. I would try things in this order:

1. Use bootstrapped standard errors (`bootstrap = TRUE`); see `?did2s` for details, and the sketch below. This might not work though, as the code could admittedly be more efficient.
2. Do it as a for loop, being careful not to use too much memory, and it will work.
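A minimal sketch of option 1, assuming a parcel-level panel; every object and column name here is a placeholder rather than something from the actual data:

```r
library(did2s)
library(fixest)  # i() is used in the second-stage formula

# Placeholder names throughout (`parcels`, `outcome`, `treated`, `parcel_id`, `year`).
est <- did2s(
  data         = parcels,
  yname        = "outcome",
  first_stage  = ~ 0 | parcel_id + year,     # unit and time fixed effects
  second_stage = ~ i(treated, ref = FALSE),  # static treatment effect
  treatment    = "treated",
  cluster_var  = "parcel_id",
  bootstrap    = TRUE,                       # resample clusters for the SEs
  n_bootstraps = 250
)
fixest::etable(est)
```

The bootstrap sidesteps the analytic variance computation that is running out of memory.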
Thanks, Kyle. I'll try these workarounds. Would running this analysis "as is" on a cluster be another option? Under those conditions, memory wouldn't be an issue. Or, am I not understanding the crux of this issue?
That is certainly an option too! Yeah, the problem is that `make_V` can't store a matrix big enough for the fixed effects, even in sparse form, so more memory can fix that problem.

I do think it's worth considering whether you want to include parcel fixed effects. A slightly higher level of aggregation (e.g., street fixed effects) is theoretically more appealing anyway, as a parcel fixed effect is not consistent with fixed T.
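If that route is appealing, the change is just in the first-stage formula; a placeholder sketch, with a hypothetical `street_id` column standing in for the coarser fixed effect:

```r
library(did2s)
library(fixest)

# Placeholder names; a street-level fixed effect has far fewer levels than a
# parcel-level one, so the fixed-effect matrix that make_V builds is much smaller.
est_street <- did2s(
  data         = parcels,
  yname        = "outcome",
  first_stage  = ~ 0 | street_id + year,     # street FE instead of parcel FE
  second_stage = ~ i(treated, ref = FALSE),
  treatment    = "treated",
  cluster_var  = "street_id"
)
```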
Awesome! I'll try my hand at running the analysis on a cluster before trying the other workarounds. And thanks for the advice on the parcel fixed effects. The variable name is a bit misleading: the units are actually street networks associated with each parcel in my dataset, so "networkid" would likely be a better name.
To update: I tried running the analysis "as is" on a cluster. However, I got the following error: `address (nil), cause 'memory not mapped'`. I am wondering whether there is some built-in stopping rule within the code that prevents the analysis from proceeding past a certain matrix size. Alternatively, this could be a cluster-specific issue. That said, do you have any insights here?
I am trying to run a standard static model:
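Roughly of the following form; the data set and column names here are placeholders rather than my actual variable names:

```r
library(did2s)
library(fixest)

# Placeholder names; structure follows the standard did2s static specification.
static <- did2s(
  data         = parcels,
  yname        = "outcome",
  first_stage  = ~ 0 | parcel_id + year,     # parcel and year fixed effects
  second_stage = ~ i(treated, ref = FALSE),  # static treatment effect
  treatment    = "treated",
  cluster_var  = "parcel_id"                 # default analytic standard errors
)
```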
However, I get the following error:
On its face, it seems like my dataset is too large. Any advice on how to address this error would be appreciated.