easystats / parameters

:bar_chart: Computation and processing of models' parameters
https://easystats.github.io/parameters/
GNU General Public License v3.0

`effects="all"` drops columns #975

Closed vincentarelbundock closed 3 months ago

vincentarelbundock commented 4 months ago

In some cases, setting effects="all" has the unintended consequence of dropping some statistics. This is one example I was able to construct this morning, but I believe there are other cases.

Notice that z disappears in the second call:

library(stats)
library(parameters)
ts1 <- rnorm(200, mean = 10, sd = 3) |> ts()
test <- Box.test(ts1, lag = 5, type = "Box-Pierce", fitdf = 2)

parameters::model_parameters(test)

    Box-Pierce test

    Parameter1 |    z | df |     p
    ------------------------------
    ts1        | 3.73 |  3 | 0.292

parameters::model_parameters(test, effects = "all")

    Warning: This 'htest' method is not (yet?) supported.
      Returning 'parameters::model_parameters(model)'.

    Box-Pierce test

    Parameter1 | df |     p
    -----------------------
    ts1        |  3 | 0.292

strengejacke commented 4 months ago

Thanks! This is most likely due to partial matching:

options(
  warnPartialMatchArgs = TRUE,
  warnPartialMatchAttr = TRUE,
  warnPartialMatchDollar = TRUE
)

library(stats)
library(parameters)
ts1 <- rnorm(200, mean = 10, sd = 3) |> ts()
test <- Box.test(ts1, lag = 5, type = "Box-Pierce", fitdf = 2)

parameters::model_parameters(test, effects = "all")
#> Warning in model_parameters.htest(test, effects = "all"): partial argument
#> match of 'effects' to 'effectsize_type'
#> Warning: This 'htest' method is not (yet?) supported.
#>   Returning 'parameters::model_parameters(model)'.
#> Box-Pierce test
#> 
#> Parameter1 | df |     p
#> -----------------------
#> ts1        |  3 | 0.066

Created on 2024-06-03 with reprex v2.1.0

The htest method of model_parameters() has an effectsize_type argument. Passing effects partially matches that argument, so effectsize_type receives a value that is invalid and/or not supported by that specific htest.
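To see the mechanism in isolation: R silently matches an argument name that is a unique prefix of a formal argument. A minimal sketch with a hypothetical function (not part of parameters):

```r
# Hypothetical function whose formal argument name mirrors the one
# in model_parameters.htest().
f <- function(x, effectsize_type = NULL) {
  effectsize_type
}

# "effects" is a unique prefix of "effectsize_type", so R's partial
# argument matching silently binds it to that formal argument.
result <- f(1, effects = "all")
result
#> [1] "all"
```

This is why checking `...` for effects after the fact cannot work: by the time the function body runs, the value has already been bound to effectsize_type.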

Maybe we should check the dots for effects, and then drop that argument?

strengejacke commented 4 months ago

Ok, that won't work, because effects is already bound to effectsize_type at call time. Instead, I now check whether effectsize_type holds a valid effects option, and if so, silently ignore it (i.e. treat it as NULL).

vincentarelbundock commented 4 months ago

Thanks for looking into this.

Hmmm, honestly, I don't have a great sense of the interactions between the different code elements here. I don't feel competent to propose a solution right now...

mattansb commented 3 months ago

@strengejacke maybe we do need to rename the effectsize_type argument to avoid partial matching after all (https://github.com/easystats/effectsize/issues/639)...

What would be a good name for this argument? (Reminder: it is passed on to effectsize(..., type), which is only a valid argument for anova objects, htest objects, and BFBayesFactor objects, and ignored for all others.)

strengejacke commented 3 months ago

The good thing is that nothing unexpected happens: if you provide effects = "all", effectsize_type gets the value "all", but no effect-size code is run, because we check for valid options first:

  # is valid effect size?
  if (!all(effectsize_type %in% c("eta", "omega", "epsilon", "f", "f2"))) {
    return(params)
  }

So it's really only the warning about partial matching. For htest objects, we didn't have that check for valid options; I've now added it in the PR, so this issue is resolved.
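The guard for the htest method presumably mirrors the anova check above. A sketch under assumptions (the option list and variable names here are hypothetical; the actual PR may differ):

```r
# Sketch: silently ignore effectsize_type values that are not
# supported for htest objects (hypothetical list of valid options).
valid_htest_types <- c("cohens_g", "cramers_v", "phi")

# Value picked up via partial matching of effects = "all":
effectsize_type <- "all"

if (!is.null(effectsize_type) &&
    !all(effectsize_type %in% valid_htest_types)) {
  effectsize_type <- NULL  # treat as NULL, as described above
}
```

With the invalid value neutralized, the method falls through to the plain parameter table instead of attempting (and warning about) an unsupported effect size.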

Still, I think only renaming the argument would resolve the partial-matching warning itself.

vincentarelbundock commented 3 months ago

Thanks for looking into this, I appreciate it!