dstanley4 / apaTables

Development version of apaTables R package. Current stable version is on the CRAN.
http://dstanley4.github.io/apaTables/

beta on apa.reg.table #16

Closed eric-kruger closed 5 years ago

eric-kruger commented 5 years ago

Hi,

In the examples for the apa.reg.table function there are columns for beta. However, when I run the function, no beta column is produced (see example output below). Did you remove the beta column in the latest version of apaTables? The function produces no errors. Thanks

> apaTables::apa.reg.table(models[[1]],models[[2]],models[[3]],filename = file.name)

Regression results using TUG as the criterion

   Predictor       b      b_95%_CI sr2  sr2_95%_CI             Fit        Difference
 (Intercept) 10.57** [7.74, 13.39]                                                  
         Age    0.02 [-0.03, 0.08] .02 [-.06, .09]                                  
  GenderMale    0.30 [-1.13, 1.73] .00 [-.03, .04]                                  
   Education   -0.27 [-0.77, 0.23] .03 [-.07, .13]                                  
                                                         R2 = .043                  
                                                   95% CI[.00,.15]                  

 (Intercept)   4.78*  [0.39, 9.16]                                                  
         Age    0.02 [-0.03, 0.07] .02 [-.05, .08]                                  
  GenderMale    0.12 [-1.17, 1.41] .00 [-.01, .01]                                  
   Education   -0.36 [-0.81, 0.09] .05 [-.06, .16]                                  
         TSK  0.16**  [0.06, 0.26] .21 [-.00, .41]                                  
                                                        R2 = .249* Delta R2 = .206**
                                                   95% CI[.00,.40] 95% CI[-.00, .41]

 (Intercept)   4.98*  [0.43, 9.52]                                                  
         Age    0.02 [-0.03, 0.07] .02 [-.05, .08]                                  
  GenderMale    0.09 [-1.22, 1.40] .00 [-.01, .01]                                  
   Education   -0.37 [-0.83, 0.09] .05 [-.06, .17]                                  
         TSK  0.16**  [0.06, 0.26] .20 [-.00, .41]                                  
   `I Total`   -0.72 [-4.33, 2.88] .00 [-.03, .03]                                  
                                                        R2 = .252*   Delta R2 = .003
                                                   95% CI[.00,.38] 95% CI[-.03, .03]

Note. A significant b-weight indicates the semi-partial correlation is also significant.
b represents unstandardized regression weights. 
sr2 represents the semi-partial correlation squared.
Square brackets are used to enclose the lower and upper limits of a confidence interval.
* indicates p < .05. ** indicates p < .01.
arhot commented 5 years ago

I ran into the same issue: the beta is dropped for all variables if any of the predictors is a factor (like gender in your model). I don't know if that's by design or if it's a recent change.

dstanley4 commented 5 years ago

This behaviour currently occurs by design. When any predictor is a factor, beta-weights are not displayed, because it's not clear to me how best to calculate beta-weights in those circumstances.

eric-kruger commented 5 years ago

I am not sure I understand. Correct me if I am wrong, but isn't the beta weight just the standardized regression coefficient (where the DV and IV are both divided by their respective SDs)? So for a factor that is converted into a binary variable, the interpretation is the change, in standard deviations of the DV, as you move from 0 to 1 on the IV. In the example output I provided above, this would be going from Female (0) to Male (1).
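The shortcut described above can be sketched as follows. This is a minimal illustration using the built-in mtcars data (not the TUG model from the issue), where am is already a 0/1 dummy variable:

```r
# Unstandardized model: one continuous and one 0/1 predictor
fit_raw <- lm(mpg ~ wt + am, data = mtcars)

# Standardize the criterion and every predictor, including the dummy
z <- data.frame(mpg = scale(mtcars$mpg),
                wt  = scale(mtcars$wt),
                am  = scale(mtcars$am))
fit_std <- lm(mpg ~ wt + am, data = z)

coef(fit_std)  # intercept is essentially zero; the slopes are the beta-weights
```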

dstanley4 commented 5 years ago

My understanding is that it's a bit more complicated. The shortcut you outline is merely a quick way to model the scenario where all predictors and the criterion are converted to z-scores. The regression (without a constant) is then performed on those scores, and the resulting coefficients are the beta-weights. When all predictors are continuous, this relation holds and the shortcut works. However, it's not clear to me how the situation should be treated when factors are involved, particularly given the way R handles factors within the lm command.
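The all-continuous equivalence described above can be checked directly. A minimal sketch with mtcars (an assumption, not the model from the issue): the no-constant regression on z-scores reproduces beta = b * sd(x) / sd(y):

```r
# Raw-scale regression with two continuous predictors
fit_raw <- lm(mpg ~ wt + hp, data = mtcars)

# Same regression on z-scores, with the constant suppressed (~ 0 + ...)
fit_z <- lm(scale(mpg) ~ 0 + scale(wt) + scale(hp), data = mtcars)

coef(fit_z)  # beta-weights from the z-score regression

# The shortcut: rescale the raw slopes by the SD ratio; these agree with coef(fit_z)
coef(fit_raw)[-1] * sapply(mtcars[c("wt", "hp")], sd) / sd(mtcars$mpg)
```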

eric-kruger commented 5 years ago

Thanks for your explanation. I trust that you have looked into this in considerably more detail than I have. Thanks for the awesome package, btw.

eric-kruger commented 5 years ago

I wanted to follow up on my previous comment with a quick test to recreate SPSS's beta coefficients (if you regard SPSS as some sort of standard). These can be recreated by applying scale() to all variables in the linear model, including the dummy-coded variables. The constant (intercept) goes to zero in R, and SPSS does not report an intercept for beta coefficients. For factors with more than two levels in R, I would just make sure to dummy code the variables manually if you want to recreate the standardized beta coefficients.
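The recipe above might be sketched like this for a multi-level factor. This is a hypothetical illustration using mtcars (not the TUG data), with model.matrix() doing the manual dummy coding before everything is scaled:

```r
mt <- mtcars
mt$cyl <- factor(mt$cyl)  # a 3-level factor

# Expand the factor into 0/1 dummy columns and drop the intercept column
X <- model.matrix(~ wt + cyl, data = mt)[, -1]

# Standardize the criterion and all predictors, dummies included
z <- as.data.frame(scale(cbind(mpg = mt$mpg, X)))

fit_std <- lm(mpg ~ ., data = z)
coef(fit_std)  # intercept is essentially zero; slopes are SPSS-style betas
```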