Closed: marcio-resende closed this issue 2 years ago
It looks like the strange magnitude of the GxE effects is due to the environmental variance being set to a very small number (1e-06) by default instead of zero as intended. I missed changing the defaults for two of the trait types. This shouldn't have any real effect on the simulation, but it does look strange.
The underlying model is described in the traits vignette. Below is some code, following on from your example above, that shows what happens behind the scenes and how the GxE effects are used.
# Create phenotypes with no residual error for two locations
# Default functionality is equivalent to p=runif(1)
Loc1 = setPheno(F1, varE=0, p=0.25, onlyPheno=TRUE)
Loc2 = setPheno(F1, varE=0, p=0.75, onlyPheno=TRUE)
cor(Loc1, Loc2)
# Reconstruct Loc1 yield manually
ManLoc1 = F1@gv + F1@gxe[[1]]*qnorm(0.25, sd=sqrt(SP$traits[[1]]@envVar))
cor(Loc1, ManLoc1)
# Examine underlying function calls
# The above calculation takes place within calcPheno, which is nested in setPheno
setPheno
AlphaSimR:::calcPheno
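The reconstruction in the code above boils down to pheno = gv + gxe * qnorm(p, sd = sqrt(envVar)). Here is a minimal, self-contained sketch of that arithmetic in Python with made-up numbers; env_var, gv, and gxe are stand-ins for SP$traits[[1]]@envVar, F1@gv, and F1@gxe[[1]], and this is not AlphaSimR code.

```python
# Sketch of: pheno = gv + gxe * qnorm(p, sd = sqrt(envVar)), no residual error.
# All values below are invented for illustration.
from math import sqrt
from statistics import NormalDist

env_var = 1.0                    # environmental variance (envVar stand-in)
gv = [1.2, -0.4, 0.7]            # genetic values, one per individual
gxe = [0.5, -0.2, 0.1]           # GxE slopes, one per individual

def loc_pheno(p):
    # Environmental covariate: the p-th quantile of N(0, env_var),
    # i.e. qnorm(p, sd = sqrt(env_var)) in R
    env = NormalDist(0.0, sqrt(env_var)).inv_cdf(p)
    # Phenotype = genetic value + GxE slope * environmental covariate
    return [g + s * env for g, s in zip(gv, gxe)]

# p = 0.5 gives env = 0, so the phenotype equals the genetic value;
# p = 0.25 and p = 0.75 shift it by equal and opposite amounts.
loc1 = loc_pheno(0.25)
loc2 = loc_pheno(0.75)
```

Note that the GxE slopes are on the scale of the environmental covariate, not the phenotype, which is why the stored gxe values look out of scale with gv and pheno.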
I don't think this is necessarily an issue, but I am trying to understand the gxe values saved in pop@gxe when a trait is simulated with addTraitADG. See the example below. The values stored in F1@gxe are on a different scale than the values of F1@pheno and F1@gv. I tried different values of the environmental p-value (p) when setting the phenotype, but that only leads to small changes.
Any thoughts?