inlabru-org / inlabru

inlabru
https://inlabru-org.github.io/inlabru/

Unexpected scaling estimates from joint model #126

Closed ghost closed 2 years ago

ghost commented 2 years ago

Hi, I'm trying to better understand how estimating a beta scaling constant works in INLA/inlabru in the context of a joint likelihood model, where one likelihood is a Cox process and the other is e.g. lognormal, as in a marked point process model.

Where

Y_ij1 = β_01 + f_s(s_ij)

and

Y_ij2 = β_02 + β_s f_s(s_ij).

Y_1 is the latent field for the point process, and Y_2 is the latent field for another 'mark' variable recorded at each point. β_01 and β_02 are separate intercepts, f_s(s_ij) is a GMRF spatial effect, and β_s is a scaling constant. I've simulated two datasets, to which I've fit an inlabru model which I think is structured as above.

In the first dataset the marks are in an unknown proportion to the latent field of the Cox process. In the second dataset the marks are in that same (unknown) proportion divided by 2.

I was expecting to retrieve different estimates for β_s from each model. However, the estimates for β_s are similar in the two models, as is every other component of the model.

What I'm confused about is that both models make broadly reasonable predictions for their own datasets, even though their parameter estimates are similar. Is there some parameter estimated within INLA which I can't find?

Many thanks

I'm using inlabru_2.3.1.9000 and INLA_21.11.01.

Below is the code I'm running:

``` r
library(ggplot2)
library(sp)
library(raster)
library(rgeos)
library(rgdal)
library(INLA)
library(inlabru)
library(cowplot)

min.x <- -50
max.x <- 50
min.y <- -50
max.y <- 50
n.samples <- 250

# some projection
my.proj <- "+proj=laea +lon_0=50.625 +lat_0=42.1915846 +datum=WGS84 +units=km +no_defs"

# simulate some points
my.points <- data.frame(x = c(rnorm(n.samples, mean = 0, sd = 10)),
                        y = c(rnorm(n.samples, mean = 0, sd = 10)))
my.points <- SpatialPoints(coords = cbind(my.points$x, my.points$y),
                           proj4string = CRS(my.proj))

# create empty raster
my.range <- seq(from = min.x, to = max.x, by = 10)
my.grid <- raster(nrow = length(seq(from = min.x, to = max.x, by = 10)),
                  ncol = length(seq(from = min.y, to = max.y, by = 10)),
                  ext = extent(min.x, max.x, min.y, max.y),
                  crs = my.proj)

# assign values of raster based on number of points within each raster cell
my.grid[my.grid] <- 0
tab <- table(cellFromXY(my.grid, my.points))
my.grid[as.numeric(names(tab))] <- tab
rm(tab)
my.grid <- as(my.grid, "SpatialPixelsDataFrame")
names(my.grid) <- "marks"

# assign values of 'marks' based on number of points within each raster cell
my.points$marks <- sp::over(my.points, my.grid["marks"])[, 1]

# as above, but multiply marks by some scaling constant
marks.scale <- 0.5
my.points$marks.scale <- sp::over(my.points, my.grid["marks"])[, 1] * marks.scale

hist(my.points$marks)
hist(my.points$marks.scale)

cowplot::plot_grid(
  ggplot() + gg(my.grid, aes(fill = marks)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
  ggplot() + gg(my.points, aes(colour = marks)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
  ggplot() + gg(my.points, aes(colour = marks.scale)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
  ncol = 1
)
```

``` r
# define mesh and ipoints
my.bound <- as(extent(my.grid), "SpatialPolygons")
proj4string(my.bound) <- my.proj

mesh <- INLA::inla.mesh.2d( # SB
  loc = my.points,
  offset = c(50, 25),
  max.edge = c(10, 20),
  cutoff = 5,
  crs = CRS(my.proj)
)

matern <- INLA::inla.spde2.pcmatern(mesh,
                                    prior.range = c(100, 0.01),
                                    prior.sigma = c(1, 0.01),
                                    constr = FALSE)

ips <- ipoints(samplers = my.bound, domain = mesh)

my.points$id <- 1:nrow(my.points)

cmp <- ~
  -1 +
  Inter.point(1) +
  Inter.mark(1) +
  spat(main = coordinates, model = matern) +
  spat.copy(main = coordinates, copy = "spat", fixed = FALSE)

lik.cp <- like("cp",
               formula = coordinates ~ Inter.point + spat,
               data = my.points,
               ips = ips,
               include = c("Inter.point", "spat"))

lik.marks <- like("lognormal",
                  formula = marks ~ Inter.mark + spat.copy,
                  data = my.points,
                  include = c("Inter.mark", "spat.copy"))

lik.marks.scale <- like("lognormal",
                        formula = marks.scale ~ Inter.mark + spat.copy,
                        data = my.points,
                        include = c("Inter.mark", "spat.copy"))
```
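As I understand it, the two linear predictors being fitted by this setup are, on the log scale,

log λ(s) = Inter.point + spat(s) for the point process intensity, and

E[log(marks)] = Inter.mark + β_s · spat(s) for the lognormal marks,

where β_s is the scaling constant from the equations above, reported by INLA as the hyperparameter "Beta for spat.copy".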

``` r
# fit models
fit.joint <- bru(components = cmp,
                 lik.cp, lik.marks,
                 options = list(bru_verbose = 3, bru_max_iter = 10))

fit.joint.scale <- bru(components = cmp,
                       lik.cp, lik.marks.scale,
                       options = list(bru_verbose = 3, bru_max_iter = 10))

# parameters look similar?
fit.joint$summary.fixed
fit.joint.scale$summary.fixed

fit.joint$summary.hyperpar
fit.joint.scale$summary.hyperpar
```

``` r
# self predict total number of points
pred.sum <- predict(fit.joint, ips, ~ sum(weight * exp(Inter.point + spat)),
                    include = c("Inter.point", "spat"))

pred.sum.scale <- predict(fit.joint.scale, ips, ~ sum(weight * exp(Inter.point + spat)),
                          include = c("Inter.point", "spat"))

# result
rbind(cbind(data.frame(model = "standard", observed = nrow(my.points)), pred.sum),
      cbind(data.frame(model = "scale", observed = nrow(my.points)), pred.sum.scale))

# predictions look reasonable
```
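(For what it's worth, my understanding of this prediction is that `ips` carries one integration weight per integration point, so `sum(weight * exp(Inter.point + spat))` is the usual quadrature approximation Σ_k w_k exp(η(s_k)) ≈ ∫ exp(η(s)) ds, i.e. the expected number of points.)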

``` r
# self predict marks
pred.mark <- predict(fit.joint, my.points, ~ exp(Inter.mark + spat.copy),
                     include = c("Inter.mark", "spat.copy"))

pred.mark.scale <- predict(fit.joint.scale, my.points, ~ exp(Inter.mark + spat.copy),
                           include = c("Inter.mark", "spat.copy"))

plot(pred.mark$marks, pred.mark$mean)
plot(pred.mark.scale$marks.scale, pred.mark.scale$mean)
```

``` r
# self predict total marks
pred.sum.mark <- predict(fit.joint, ips,
                         ~ sum((weight * exp(Inter.point + spat)) * exp(Inter.mark + spat.copy)),
                         include = c("Inter.point", "spat", "Inter.mark", "spat.copy"))

pred.sum.mark.scale <- predict(fit.joint.scale, ips,
                               ~ sum((weight * exp(Inter.point + spat)) * exp(Inter.mark + spat.copy)),
                               include = c("Inter.point", "spat", "Inter.mark", "spat.copy"))

# result
rbind(cbind(data.frame(model = "standard", observed = sum(my.points$marks)), pred.sum.mark),
      cbind(data.frame(model = "scale", observed = sum(my.points$marks.scale)), pred.sum.mark.scale))

# predictions look reasonable
```

``` r
# beta is similar?
fit.joint$summary.hyperpar["Beta for spat.copy", ]
fit.joint.scale$summary.hyperpar["Beta for spat.copy", ]

beta.df <- rbind(
  cbind(data.frame(model = "standard",
                   data.frame(fit.joint$marginals.hyperpar$`Beta for spat.copy`))),
  cbind(data.frame(model = "scale",
                   data.frame(fit.joint.scale$marginals.hyperpar$`Beta for spat.copy`)))
)

cowplot::plot_grid(
  ggplot() + geom_point(data = subset(beta.df, model == "standard"), aes(x = x, y = y)) +
    labs(title = "Beta standard") + scale_x_continuous(limits = range(beta.df$x)),
  ggplot() + geom_point(data = subset(beta.df, model == "scale"), aes(x = x, y = y)) +
    labs(title = "Beta scale model") + scale_x_continuous(limits = range(beta.df$x)),
  ncol = 1
)

# range is similar?
fit.joint$summary.hyperpar["Range for spat", ]
fit.joint.scale$summary.hyperpar["Range for spat", ]

range.df <- rbind(
  cbind(data.frame(model = "standard",
                   data.frame(fit.joint$marginals.hyperpar$`Range for spat`))),
  cbind(data.frame(model = "scale",
                   data.frame(fit.joint.scale$marginals.hyperpar$`Range for spat`)))
)

cowplot::plot_grid(
  ggplot() + geom_point(data = subset(range.df, model == "standard"), aes(x = x, y = y)) +
    labs(title = "Range spat standard") + scale_x_continuous(limits = range(range.df$x)),
  ggplot() + geom_point(data = subset(range.df, model == "scale"), aes(x = x, y = y)) +
    labs(title = "Range spat scale model") + scale_x_continuous(limits = range(range.df$x)),
  ncol = 1
)

range.sd.df <- rbind(
  cbind(data.frame(model = "standard",
                   data.frame(fit.joint$marginals.hyperpar$`Stdev for spat`))),
  cbind(data.frame(model = "scale",
                   data.frame(fit.joint.scale$marginals.hyperpar$`Stdev for spat`)))
)

cowplot::plot_grid(
  ggplot() + geom_point(data = subset(range.sd.df, model == "standard"), aes(x = x, y = y)) +
    labs(title = "Stdev spat standard") + scale_x_continuous(limits = range(range.sd.df$x)),
  ggplot() + geom_point(data = subset(range.sd.df, model == "scale"), aes(x = x, y = y)) +
    labs(title = "Stdev spat scale model") + scale_x_continuous(limits = range(range.sd.df$x)),
  ncol = 1
)

# precision for the lognormal observations
prec.ln.df <- rbind(
  cbind(data.frame(model = "standard",
                   data.frame(fit.joint$marginals.hyperpar$`Precision for the lognormal observations[2]`))),
  cbind(data.frame(model = "scale",
                   data.frame(fit.joint.scale$marginals.hyperpar$`Precision for the lognormal observations[2]`)))
)

cowplot::plot_grid(
  ggplot() + geom_point(data = subset(prec.ln.df, model == "standard"), aes(x = x, y = y)) +
    labs(title = "prec ln standard") + scale_x_continuous(limits = range(prec.ln.df$x)),
  ggplot() + geom_point(data = subset(prec.ln.df, model == "scale"), aes(x = x, y = y)) +
    labs(title = "prec ln scale model") + scale_x_continuous(limits = range(prec.ln.df$x)),
  ncol = 1
)
```

finnlindgren commented 2 years ago

Can you rerun this with reprex::reprex() so the results are included?

ghost commented 2 years ago

``` r

.libPaths("F:/R.Library/library")

library(ggplot2)
library(sp)
library(raster)
library(rgeos)
#> rgeos version: 0.5-8, (SVN revision 679)
#> GEOS runtime version: 3.9.1-CAPI-1.14.2
#> Please note that rgeos will be retired by the end of 2023,
#> plan transition to sf functions using GEOS at your earliest convenience.
#> GEOS using OverlayNG
#> Linking to sp version: 1.4-5
#> Polygon checking: TRUE
library(rgdal)
#> Please note that rgdal will be retired by the end of 2023,
#> plan transition to sf/stars/terra functions using GDAL and PROJ
#> at your earliest convenience.
#>
#> rgdal: version: 1.5-27, (SVN revision 1148)
#> Geospatial Data Abstraction Library extensions to R successfully loaded
#> Loaded GDAL runtime: GDAL 3.2.1, released 2020/12/29
#> Path to GDAL shared files: F:/R.Library/library/rgdal/gdal
#> GDAL binary built with GEOS: TRUE
#> Loaded PROJ runtime: Rel. 7.2.1, January 1st, 2021, [PJ_VERSION: 721]
#> Path to PROJ shared files: F:/R.Library/library/rgdal/proj
#> PROJ CDN enabled: FALSE
#> Linking to sp version:1.4-5
#> To mute warnings of possible GDAL/OSR exportToProj4() degradation,
#> use options("rgdal_show_exportToProj4_warnings"="none") before loading sp or rgdal.
#> Overwritten PROJ_LIB was F:/R.Library/library/rgdal/proj
library(INLA)
#> Loading required package: Matrix
#> Loading required package: foreach
#> Loading required package: parallel
#> This is INLA_21.11.01 built 2021-10-31 20:28:40 UTC.
#> - See www.r-inla.org/contact-us for how to get help.
#> - Save 68.6Mb of storage running 'inla.prune()'
library(inlabru)

min.x <- -50
max.x <- 50
min.y <- -50
max.y <- 50
n.samples <- 250

sd.clus <- 10

my.proj <- "+proj=laea +lon_0=50.625 +lat_0=42.1915846 +datum=WGS84 +units=km +no_defs"

################################################################################

# simulate some clustered points
my.points <- data.frame(x = c(rnorm(n.samples, mean = 0, sd = sd.clus)),
                        y = c(rnorm(n.samples, mean = 0, sd = sd.clus)))
my.points <- SpatialPoints(coords = cbind(my.points$x, my.points$y),
                           proj4string = CRS(my.proj))

################################################################################

# create empty raster
my.range <- seq(from = min.x, to = max.x, by = 10)
my.grid <- raster(nrow = length(seq(from = min.x, to = max.x, by = 10)),
                  ncol = length(seq(from = min.y, to = max.y, by = 10)),
                  ext = extent(min.x, max.x, min.y, max.y),
                  crs = my.proj)

# assign values of raster based on number of points within each raster cell
my.grid[my.grid] <- 0
tab <- table(cellFromXY(my.grid, my.points))
my.grid[as.numeric(names(tab))] <- tab
rm(tab)
my.grid <- as(my.grid, "SpatialPixelsDataFrame")
names(my.grid) <- "counts"

# assign values of 'marks' based on number of points within each raster cell
my.points$counts <- sp::over(my.points, my.grid["counts"])[, 1]

# as above, but multiply counts by some scaling constant
counts.scale <- 0.5
my.points$counts.scale <- sp::over(my.points, my.grid["counts"])[, 1] * counts.scale

hist(my.points$counts)
```


![](https://i.imgur.com/dwFFq5b.png)

``` r
hist(my.points$counts.scale)

cowplot::plot_grid(ggplot() + gg(my.grid, aes(fill = counts)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
                   ggplot() + gg(my.points, aes(colour = counts)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
                   ggplot() + gg(my.points, aes(colour = counts.scale)) + coord_cartesian(xlim = c(min.x, max.x), ylim = c(min.y, max.y)),
                   ncol = 1)

########################################################################
#define mesh and ipoints

my.bound <- as(extent(my.grid), "SpatialPolygons")
proj4string(my.bound) <- my.proj

mesh <- INLA::inla.mesh.2d( # SB
  loc = my.points,
  offset = c(50, 25),
  max.edge = c(10, 20),
  cutoff = 5,
  crs = CRS(my.proj)
)

matern <- INLA::inla.spde2.pcmatern(mesh, 
                                    prior.range = c(100, 0.01), 
                                    prior.sigma = c(1, 0.01), 
                                    constr = FALSE)

ips <- ipoints(samplers = my.bound, domain = mesh)

my.points$id <- 1:nrow(my.points)

cmp <- ~
  -1 +
  Inter.point(1) +
  Inter.mark(1) +
  spat(main = coordinates, model = matern) +
  spat.copy(main = coordinates, copy = "spat", fixed = FALSE)

#######

lik.cp <- like("cp",
               formula = coordinates ~ Inter.point + spat,  
               data = my.points,
               ips = ips,
               include = c("Inter.point", "spat")
)

#######

lik.counts <- like("lognormal", 
                   formula = counts ~ Inter.mark + spat.copy,   
                   data = my.points,
                   include = c("Inter.mark", "spat.copy")
)

lik.counts.scale <- like("lognormal", 
                         formula = counts.scale ~ Inter.mark + spat.copy,   
                         data = my.points,
                         include = c("Inter.mark", "spat.copy")
)

#####################
#fit model 

fit.joint <- bru(components = cmp,
                 lik.cp, lik.counts,
                 options = list(bru_verbose = 3,
                                bru_max_iter = 10,
                                control.inla = list(int.strategy = "eb",
                                                    h = 0.005),
                                control.compute = list(waic = FALSE),
                                verbose = FALSE))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

#> iinla: Iteration 1 [max:10]
#> iinla: Iteration 2 [max:10]
#> iinla: Iteration 3 [max:10]
#> iinla: Max deviation from previous: 4.32% of SD [stop if: <1%]
#> iinla: Iteration 4 [max:10]
#> iinla: Max deviation from previous: 4.26% of SD [stop if: <1%]
#> iinla: Iteration 5 [max:10]
#> iinla: Max deviation from previous: 2.82% of SD [stop if: <1%]
#> iinla: Iteration 7 [max:10]
#> iinla: Max deviation from previous: 3.44% of SD [stop if: <1%]
#> iinla: Iteration 8 [max:10]
#> iinla: Max deviation from previous: 3.07% of SD [stop if: <1%]
#> iinla: Iteration 9 [max:10]
#> iinla: Max deviation from previous: 0.722% of SD [stop if: <1%]
#> iinla: Convergence criterion met, running final INLA integration with known theta mode.
#> iinla: Iteration 10 [max:10]

fit.joint.scale <- bru(components = cmp,
                      lik.cp, lik.counts.scale,
                      options = list(bru_verbose = 3,
                                     bru_max_iter = 10,
                                     control.inla = list(int.strategy = "eb",
                                                         h = 0.005),
                                     control.compute = list(waic = FALSE),
                                     verbose = FALSE))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

#> iinla: Iteration 1 [max:10]
#> iinla: Iteration 2 [max:10]
#> iinla: Max deviation from previous: 4.09% of SD [stop if: <1%]
#> iinla: Iteration 3 [max:10]
#> iinla: Max deviation from previous: 3.09% of SD [stop if: <1%]
#> iinla: Iteration 4 [max:10]
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

#> iinla: Max deviation from previous: 5.01% of SD [stop if: <1%]
#> iinla: Iteration 6 [max:10]
#> iinla: Max deviation from previous: 1.79% of SD [stop if: <1%]
#> iinla: Iteration 7 [max:10]
#> iinla: Max deviation from previous: 1.67% of SD [stop if: <1%]
#> iinla: Iteration 8 [max:10]
#> iinla: Max deviation from previous: 1.75% of SD [stop if: <1%]
#> iinla: Iteration 9 [max:10]
#> iinla: Max deviation from previous: 3.52% of SD [stop if: <1%]
#> iinla: Maximum iterations reached, running final INLA integration.
#> iinla: Iteration 10 [max:10]
########################################################################
#looks similar?

fit.joint$summary.fixed
#>                  mean       sd 0.025quant  0.5quant 0.975quant      mode
#> Inter.point -8.755504 1.607418 -11.929826 -8.749244 -5.6190108 -8.736650
#> Inter.mark  -2.960691 1.310038  -5.547659 -2.955621 -0.4043825 -2.945417
#>                      kld
#> Inter.point 8.176091e-07
#> Inter.mark  8.205493e-07
fit.joint.scale$summary.fixed
#>                  mean       sd 0.025quant  0.5quant 0.975quant      mode
#> Inter.point -8.769045 1.616937 -11.962131 -8.762760  -5.613946 -8.750113
#> Inter.mark  -3.666356 1.318121  -6.269258 -3.661263  -1.094249 -3.651015
#>                      kld
#> Inter.point 8.181054e-07
#> Inter.mark  8.217254e-07

fit.joint$summary.hyperpar
#>                                                   mean         sd 0.025quant
#> Precision for the lognormal observations[2]  7.6951011  0.8473413  6.1467853
#> Range for spat                              86.0043984 18.2147854 56.8107428
#> Stdev for spat                               2.5064667  0.4392233  1.7763599
#> Beta for spat.copy                           0.8209215  0.0566044  0.7128266
#>                                               0.5quant  0.975quant      mode
#> Precision for the lognormal observations[2]  7.6554454   9.4792043  7.584279
#> Range for spat                              83.7269068 127.9965514 79.160094
#> Stdev for spat                               2.4603258   3.4933016  2.365016
#> Beta for spat.copy                           0.8195293   0.9355425  0.814536
fit.joint.scale$summary.hyperpar
#>                                                   mean          sd 0.025quant
#> Precision for the lognormal observations[2]  7.6845078  0.82162673  6.1672887
#> Range for spat                              85.9634101 17.99010215 56.9916544
#> Stdev for spat                               2.5153911  0.45890838  1.7524231
#> Beta for spat.copy                           0.8202373  0.05667892  0.7114848
#>                                               0.5quant  0.975quant       mode
#> Precision for the lognormal observations[2]  7.6516811   9.3916085  7.5986784
#> Range for spat                              83.7563689 127.2928799 79.3293554
#> Stdev for spat                               2.4673113   3.5489582  2.3697052
#> Beta for spat.copy                           0.8190292   0.9346134  0.8147097

###############################################################################################################

#self predict total number of points
pred.sum <- predict(fit.joint, ips, ~ sum(weight * exp(Inter.point + spat)), 
                        include = c("Inter.point", "spat"))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

pred.sum.scale <- predict(fit.joint.scale, ips, ~ sum(weight * exp(Inter.point + spat)), 
                         include = c("Inter.point", "spat"))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

##result
rbind(cbind(data.frame(model = "standard", observed = nrow(my.points)), pred.sum),
      cbind(data.frame(model = "scale", observed = nrow(my.points)), pred.sum.scale))
#>      model observed     mean       sd   q0.025   median   q0.975     smin
#> 1 standard      250 248.3314 14.57674 222.3089 246.2250 278.0098 212.1707
#> 2    scale      250 251.3163 16.31212 221.5744 250.7591 289.2283 220.2790
#>       smax         cv      var
#> 1 286.0551 0.05869875 212.4814
#> 2 293.1421 0.06490671 266.0852
#predictions look reasonable

####################################################

#self predict total number of marks
pred.sum.mark <- predict(fit.joint, ips, ~ sum((weight * exp(Inter.point + spat)) * exp(Inter.mark + spat.copy)), 
                         include = c("Inter.point", "spat", "Inter.mark", "spat.copy"))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

pred.sum.mark.scale <- predict(fit.joint.scale, ips, ~ sum((weight * exp(Inter.point + spat)) * exp(Inter.mark + spat.copy)), 
                              include = c("Inter.point", "spat", "Inter.mark", "spat.copy"))
#> Warning in showSRID(SRS_string, format = "PROJ", multiline = "NO", prefer_proj
#> = prefer_proj): Discarded ellps unknown in Proj4 definition: +proj=longlat +R=1
#> +no_defs +type=crs

##result
rbind(cbind(data.frame(model = "standard", observed = sum(my.points$counts)), pred.sum.mark),
      cbind(data.frame(model = "scale", observed = sum(my.points$counts.scale)), pred.sum.mark.scale))
#>      model observed     mean       sd   q0.025   median   q0.975     smin
#> 1 standard     4176 4198.251 319.2275 3541.995 4210.596 4838.853 3468.006
#> 2    scale     2088 2125.641 146.8880 1855.563 2126.544 2408.459 1808.780
#>       smax         cv       var
#> 1 5101.183 0.07603820 101906.19
#> 2 2513.096 0.06910289  21576.08
#predictions look reasonable
```

Created on 2021-11-26 by the reprex package (v2.0.1)

ASeatonSpatial commented 2 years ago

Based on the maths at the top shouldn't the model be something like this?

``` r
cmp <- ~
  -1 +
  Inter.point(1) +
  Inter.mark(1) +
  beta_spat(1) +
  spat(main = coordinates, model = matern)

lik.cp <- like("cp",
               formula = coordinates ~ Inter.point + spat,
               data = my.points,
               ips = ips,
               include = c("Inter.point", "spat"))

lik.marks <- like("lognormal",
                  formula = marks ~ Inter.mark + beta_spat * spat,
                  data = my.points,
                  include = c("Inter.mark", "spat", "beta_spat"))

lik.marks.scale <- like("lognormal",
                        formula = marks.scale ~ Inter.mark + beta_spat * spat,
                        data = my.points,
                        include = c("Inter.mark", "spat", "beta_spat"))
```

Maybe I'm just misunderstanding what the copy is doing.

I think this should give beta_spat approximately 1, and Inter.mark should be different, which is what you would expect since the field has been multiplied by 1/2.
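A quick check one could run on the joint fits from the reprex above (reusing the `fit.joint` and `fit.joint.scale` objects defined there):

``` r
# Compare the estimated mark intercepts from the two joint fits;
# only Inter.mark should shift when the marks are rescaled.
c(standard = fit.joint$summary.fixed["Inter.mark", "mean"],
  scale = fit.joint.scale$summary.fixed["Inter.mark", "mean"])
```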

finnlindgren commented 2 years ago

The version from @ASeatonSpatial is an alternative to using "copy" (copy has the scaling parameter "beta_spat" built in), and the total effect on the predictor should be similar for the two approaches. In both cases, the difference in results between the "marks" and "marks.scale" models should be a change in the estimate of Inter.mark: the predictor is on the log scale, so the spatial field scaling is not affected by a global scaling of the observation values. This is also what the results show for Inter.mark; -2.96 - log(2) is -3.65, which matches the estimated -3.67 value for the scaled model.
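A quick arithmetic check, using the posterior means reported in the reprex output above:

``` r
# Halving the marks shifts the lognormal predictor by log(0.5) = -log(2),
# which should show up entirely in the Inter.mark intercept:
-2.960691 - log(2)
#> [1] -3.653838
# close to the -3.666356 estimated for Inter.mark in fit.joint.scale
```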

finnlindgren commented 2 years ago

The precision parameter that is estimated for the lognormal model is the precision on the log-scale, so that's also unaffected by the scaling of the observations.
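A tiny simulation sketch of that point (not from the thread, just to illustrate):

``` r
# The lognormal likelihood models log(y) with Gaussian noise, and
# log(0.5 * y) = log(y) - log(2), so rescaling y shifts the mean of log(y)
# but leaves its variance, and hence the estimated log-scale precision, unchanged.
set.seed(1)
y <- rlnorm(1e4, meanlog = 0, sdlog = 0.5)
c(var(log(y)), var(log(0.5 * y)))
#> both approximately 0.25
```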

ASeatonSpatial commented 2 years ago

Yes, it looks like the intercept is different in the two models. I ran the model I wrote above and also got a different intercept, and it scales the predictions to sensible values:

``` r
pred = predict(fit.joint, ips, ~ exp(Inter.mark + beta_spat * spat))
pred.scale = predict(fit.joint.scale, ips, ~ exp(Inter.mark + beta_spat * spat))

g1 = ggplot() +
  gg(pred, aes(colour = mean)) +
  coord_equal()

g2 = ggplot() +
  gg(pred.scale, aes(colour = mean)) +
  coord_equal()

plot_grid(g1, g2)
```

(attached image: side-by-side maps of the predicted mark surface for the standard and scaled fits)

finnlindgren commented 2 years ago

Yes, my point was that the model results look exactly as should be expected, so I don't think there's a problem with them.

The non-copy version has a non-linear predictor, so it will not be as accurate in the posterior. The posterior is also slightly different because the prior distribution for beta_spat differs from that of the hyperparameter in the copy version.
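For anyone wanting to line the two parameterisations up more closely, the priors live in different places. A rough sketch only, assuming the `hyper` argument is forwarded to the underlying INLA copy model and that the plain `beta_spat(1)` effect is governed by the usual fixed-effect settings; neither has been verified in this thread:

``` r
# Sketch (assumption, not tested here): setting the prior on the copy
# scaling hyperparameter "beta" in the component definition.
cmp.copy <- ~ -1 +
  Inter.point(1) +
  Inter.mark(1) +
  spat(main = coordinates, model = matern) +
  spat.copy(main = coordinates, copy = "spat", fixed = FALSE,
            hyper = list(beta = list(prior = "normal", param = c(1, 10))))

# In the non-copy version, beta_spat(1) is an ordinary linear effect, so its
# Gaussian prior comes from the INLA fixed-effect settings, e.g.
# bru(..., options = list(control.fixed = list(mean = 0, prec = 0.001))).
```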

ghost commented 2 years ago

Ah, I see. Cheers for that, a bit of a lightbulb moment.

Much appreciated. H