BennMacdonald / AGM_RPackage


Sampling Lambda #18

Closed FrankD closed 6 years ago

FrankD commented 6 years ago

@BennMacdonald Sorry, I found another potential issue...

The code block in population_mcmc.R where we sample lambda (when not using tempering) is outside the loop over chains. However, the sampling step is chain-specific:

    lambda.sampling = sampleLambda(lambda[chain,], gpFit[[chain]], x[[chain]],
                                   parameters[chain,], timePoints, auxVars, pick,
                                   chain, chainTemp)

I checked my old code from way back when, and it's correctly inside the loop over chains, so I'm wondering if you moved it, and if so, if you remember why? The way it's implemented currently, it would only sample new lambda values for the last chain...
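
For concreteness, here is a minimal standalone sketch (hypothetical variable names, not the package code) of the effect: because the chain index is re-drawn on every pass through the loop, any chain-specific update that sits after the loop only ever touches whichever chain the final iteration happened to pick.

    # Toy illustration (assumed simplified setup, not AGM_RPackage code)
    set.seed(1)
    chainNum <- 4
    lambda <- matrix(1, nrow = chainNum, ncol = 2)  # one row of lambda values per chain

    for (j in 1:chainNum) {
      chain <- sample(1:chainNum, 1)  # chain index is re-drawn each iteration
      # ... chain-specific inference would happen here ...
    }

    # Outside the loop, 'chain' still holds the index from the final iteration,
    # so this update only ever modifies that one chain's lambda values:
    lambda[chain, ] <- lambda[chain, ] * 0.5

    lambda  # only a single row changes; the other chains are never updated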

BennMacdonald commented 6 years ago

Hi Frank,

I can't remember which population_mcmc.R script we edited when we began getting the code ready for CRAN - whether we used my file or yours, and if we used my file, whether it was the one for tempering or the one for sampling lambda (I always kept two files). I have checked my scripts and I have the following code in my tempering lambda population_mcmc.R file:

    # Chain-specific Inference
    for(j in 1:chainNum) {

      chain = resample(1:chainNum, 1)
      chainParams = parameters[chain,]
      chainTemp = temperatures[chain]
      u = runif(1,0,1)
      ##########################################
      # Sample Lambda
      #if(!options$explicit && runif(1) > 0.99) {
        #pick = resample(auxVars$speciesList, 1)
        #lambda.sampling = sampleLambda(lambda[chain,], gpFit[[chain]], x[[chain]],
        #                               parameters[chain,], timePoints, auxVars, pick,
        #                               chain, chainTemp)
        #lambda[chain,pick] = lambda.sampling$lambda
        #lLchains[chain] = lambda.sampling$lL.new
        #
        #cat(lambda.sampling$lL.old, lambda.sampling$accept, lambda.sampling$lL.new, '\n')
        #
        #if(lambda.sampling$accept) lastMove = 'LambdaAccept'
        #else lastMove = 'LambdaReject'
        #
        #tuning$proposeLambdaTemp[chain,pick] = tuning$proposeLambdaTemp[chain,pick] + 1
        #tuning$acceptLambdaTemp[chain,pick] = tuning$acceptLambdaTemp[chain,pick] + lambda.sampling$accept
        #
        #if(lLchains[chain] > 1e4) browser()
      #}
      ##########################################
      ### The above samples lambda (gamma), the mismatch between the gradient
      ### from the ODEs and the interpolant. Commenting out this section
      ### along with sampleLambda (function) and proposeLambda (function)
      ### will allow for a smoother annealing scheme to be implemented

      # GP Parameter Inference, coupled with latent variable inference

As you can see, although the sampling is commented out, it does remain in the loop over chains. I checked my sampling lambda population_mcmc.R script and I have the following code:

    # Chain-specific Inference
    for(j in 1:chainNum) {

      chain = resample(1:chainNum, 1)
      chainParams = parameters[chain,]
      chainTemp = temperatures[chain]
      u = runif(1,0,1)

      # Sample Lambda
      if(!options$explicit && runif(1) > 0.99) {
        pick = resample(auxVars$speciesList, 1)
        lambda.sampling = sampleLambda(lambda[chain,], gpFit[[chain]], x[[chain]],
                                       parameters[chain,], timePoints, auxVars, pick,
                                       chain, chainTemp)
        lambda[chain,pick] = lambda.sampling$lambda
        lLchains[chain] = lambda.sampling$lL.new

        #cat(lambda.sampling$lL.old, lambda.sampling$accept, lambda.sampling$lL.new, '\n')

        if(lambda.sampling$accept) lastMove = 'LambdaAccept'
        else lastMove = 'LambdaReject'

        tuning$proposeLambdaTemp[chain,pick] = tuning$proposeLambdaTemp[chain,pick] + 1
        tuning$acceptLambdaTemp[chain,pick] = tuning$acceptLambdaTemp[chain,pick] + lambda.sampling$accept

        if(lLchains[chain] > 1e4) browser()
      }

      # GP Parameter Inference, coupled with latent variable inference

Again, it looks like lambda is sampled chain-specifically. I am therefore not sure how it ended up outside the loop in our GitHub version. Are you alright with putting it back in the loop, or would you like me to update it? As with the temperature chain changes, I don't mind doing it, but I need confirmation that your latest update is on GitHub before I download and change the files.

FrankD commented 6 years ago

Strange, but at least that confirms that we both think it should be inside the loop. I'll make the change, since I'll be committing a new version tomorrow anyway.

FrankD commented 6 years ago

Just pushed the fix, should be fine now.