Hi everyone, hi Chris,
Lukas and I have a question about the regression discontinuity design methodology (the 'rdd' package), which we are attempting to use in our thesis.
We came across the idea while looking for a methodology to help us understand an interruption/treatment/event in the regression line for the systemic risk performance of our units (banks) over time (over the euro crisis). Ideally, we would like an output such as this: Source/more info: personal.lse.ac.uk/iyengarr/MEI_Event_studies.ppt
We have installed the 'rdd' package and experimented with the 'Imbens-Kalyanaraman Optimal Bandwidth Calculation' using the following line of code (note: we had originally misspelled the function name as 'IKbandwith'):
IKbandwidth(allvars2$Days, allvars2$SRISK, cutpoint = NULL, verbose = FALSE, kernel = "triangular")
This did not yield the hoped-for results, so if anyone knows how to help, we would be very grateful for assistance in applying this method.
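In case it helps anyone replying, here is a sketch of what we think the intended workflow looks like, assuming 'allvars2' contains a running variable 'Days' (days relative to the event) and the outcome 'SRISK'. The cutpoint value is an assumption and would need to be replaced with the actual day index of the event; 'IKbandwidth' and 'RDestimate' are functions in the 'rdd' package:

```r
library(rdd)

# Assumed: the event of interest occurs at Days == 0.
# Replace with the actual day index of the event in our data.
cutpoint <- 0

# Imbens-Kalyanaraman optimal bandwidth around the cutpoint
# (note the spelling: IKbandwidth, not IKbandwith)
bw <- IKbandwidth(allvars2$Days, allvars2$SRISK,
                  cutpoint = cutpoint, kernel = "triangular")

# Local linear RDD estimate of the jump in SRISK at the cutpoint
est <- RDestimate(SRISK ~ Days, data = allvars2,
                  cutpoint = cutpoint, bw = bw, kernel = "triangular")

summary(est)
plot(est)  # fitted regression lines on either side of the discontinuity
```

Leaving cutpoint = NULL, as in our original call, does not tell the function where the discontinuity is, which may be part of why we saw no useful output.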
Github repo for project: https://github.com/laurencehendry/SRISK_Thesis/blob/master/Gather