cmlegault / IBMWG

Index Based Methods Working Group

finalize names for IBM functions #19

Closed cmlegault closed 4 years ago

cmlegault commented 4 years ago

Currently design_matrix.R uses the following names for IBMs: ibms <- c("Islope", "Itarget", "true_Skate_CR", "M_CC", "PlanBsmooth", "ExpandSurvey_recentavgrelF", "ExpandSurvey_F%SPR", "ExpandSurvey_stableperiodF", "AIM", "CatchCurve_Fspr", "CatchCurve_stableperiod", "JoeDLM", "Ensemble")

Please note any changes needed to this list so that design_matrix.R and design_matrix.csv can be updated. Thanks!
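For concreteness, here is a rough sketch of how a name vector like this could be crossed with other factors to rebuild design_matrix.csv; the second factor below is only a placeholder, not one of the actual columns in design_matrix.R:

```r
# Hypothetical sketch only: "retro_type" is a placeholder factor, not an
# actual column of design_matrix.csv.
ibms <- c("Islope", "Itarget", "true_Skate_CR", "M_CC", "PlanBsmooth",
          "ExpandSurvey_recentavgrelF", "ExpandSurvey_F%SPR",
          "ExpandSurvey_stableperiodF", "AIM", "CatchCurve_Fspr",
          "CatchCurve_stableperiod", "JoeDLM", "Ensemble")

design_matrix <- expand.grid(IBM = ibms,
                             retro_type = c("none", "M", "catch"),
                             stringsAsFactors = FALSE)

# write.csv(design_matrix, "design_matrix.csv", row.names = FALSE)
```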

rich-bell commented 4 years ago

Thanks Chris,

This may have been settled while I was away last week. Just wanted to check on two aspects:

  1. I just want to confirm that using the recent average F or catch was an option the group wanted to use for expanded biomass. I recall that Jon D used that method simply to write the function and was not intending for it to be part of the factorial. It does not matter to me; I just want to confirm whether it should be included.

  2. John W. - I recall a conversation about using only natural mortality and SPR as proxy Fmsy values for the catch curve method, and not using the stable historical period. It looks like the stable-period function now outputs a relative F that is a harvest rate (C/B), in units of total catch over total biomass from the expanded-biomass method. Would that F be on the same relative scale as the biomass estimate from the catch curve method?

I believe the original reason for not using the historically stable period method with the catch curve was that we were only identifying the years of the stable period, whereas the catch curve analysis estimates biomass in the terminal year, producing a single value of biomass called 'Ac' in the function. The function would need a vector of biomass over the years when biomass was stable to actually work, but if we use a relative F that is no longer a problem.
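To make the units question concrete, here is a toy sketch (all numbers made up, variable names hypothetical) of why the relative F from the stable-period function only makes sense if the biomass it multiplies is on the same scale:

```r
# Toy numbers only, to illustrate the scale/units question.
catch_stable   <- c(900, 950, 1000)      # catch over the stable period
biomass_stable <- c(10000, 9500, 10500)  # expanded-survey biomass, same years
rel_F <- mean(catch_stable / biomass_stable)  # harvest rate, units of C/B

B_cc <- 8000                  # single terminal-year biomass ('Ac') from the catch curve
catch_advice <- rel_F * B_cc  # only meaningful if B_cc is on the same
                              # absolute scale as the expanded-survey biomass
```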

Additionally, and this blends with issue 16:

Currently, as far as I could tell from IBMWG/demonstrations/gavin/wham_mse_functions.R (I might be looking in the wrong place), we have one catch curve function, called M_CC (natural mortality catch curve). M_CC is coded to use either natural mortality or SPR as a proxy Fmsy value. The stable historical period is not currently included in the function, though it could be, depending on the answer to number two above. To run 'CatchCurve_Fspr' from Chris's list above, one would actually run M_CC with method=1, and to run 'M_CC' from Chris's list, one would run M_CC with method=3.

The question arises whether we should rename the function M_CC to 'CatchCurve', as Chris has done in the list above, and treat the natural mortality and SPR options as flavors, just to be consistent.

If we would like to use the stable historical period with the catch curve, we need to confirm that the scale or units of the relative F coming out of the stable.period function (the modified AIM function) are applicable to the biomass estimated in the catch curve. Since the catch curve analysis produces only a single estimate of biomass, not a vector, we cannot input the catch curve biomass into the stable.period function, as Jon D. did for the expanded-biomass method, to ensure the relative F is on the correct scale/units.

Sorry for being so long winded. Just getting caught up with how the code has been modified.

Also, the default value for y$AIM_npts should be one, not ten as currently coded. The input values for the index-based method functions are in multiple locations in Gavin's folder, so I was not sure where to correct it.

Thanks,

Rich

JDeroba commented 4 years ago

My update: ibms <- c("Islope", "Itarget", "true_Skate_CR", "PlanBsmooth", "ExpandSurvey_recentavgrelF", "ExpandSurvey_F%SPR", "ExpandSurvey_stableperiodF", "ExpandSurvey_FequalM", "AIM", "CatchCurve_F%SPR", "CatchCurve_FequalM", "JoeDLM", "Ensemble")

Responding to multiple replies from Rich... I think we agreed to include "ExpandSurvey_recentavgrelF" in the design matrix and test it because, according to JW, "people actually do that", so it would be valuable to have.

I don't think it makes sense to evaluate the catch curve approach with the stable-period exploitation rate. The stable-period approach uses the expanded survey biomass time series to identify a target exploitation rate that would then be applied to a biomass value derived in a completely different manner when using the M_CC function. Thus, we could be using a biomass value derived from surveys that is entirely inconsistent with the value derived in the M_CC function, which doesn't make sense to me, and I doubt anybody would actually do that. Even if we didn't expand the surveys, the logic I consider problematic still holds: for example, would you actually derive an F from a survey that was declining in recent years and apply that F to a biomass value derived in a different way and possibly trending in a different direction? We can discuss further though.

No need to rename the functions to be consistent with Chris's design matrix. I think the design matrix names will link to a "tibble" that contains the correct function names and input values. I'll change the default y$AIM_npts to 1 instead of 10 in the IBM_options script. If we're not going to vary this value as part of the study design, an alternative would be to hard code it somewhere to avoid an accidental user change, but it's not a big deal.
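For example (a sketch only; the list/option structure here is a guess, not the actual IBM_options code), the two alternatives would look something like:

```r
# Option 1: change the default in the options list (structure assumed).
y <- list()
y$AIM_npts <- 1   # was 10

# Option 2: hard code it inside a wrapper so a user can't change it accidentally.
apply_AIM_options <- function(y) {
  y$AIM_npts <- 1  # fixed by study design
  y
}
```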

rich-bell commented 4 years ago

Hello,

That sounds fine to me. Using a relative F developed from the biomass estimate of the expand-biomass function is what raised a red flag for me. That said, we could modify the catch curve function to produce a vector of total mortality (Z) based on all the cohorts, use the estimate of natural mortality to get a vector of F, and then back-calculate biomass from Z, F, and catch with the Baranov catch equation. Once we had that biomass estimate, produced with the catch curve method, we could input it into the stable-period method to produce a proxy Fmsy for the catch curve. I am not saying we need to do it, and considering the time, we should likely just move forward, but it could be done.
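A minimal sketch of that back-calculation, assuming the catch curve can supply a vector of Z by year (all numbers below are made up):

```r
Z     <- c(0.55, 0.60, 0.65)   # total mortality from catch-curve slopes
M     <- 0.25                  # assumed natural mortality
catch <- c(1200, 1100, 1000)   # total catch in the same years

Fmort <- Z - M                                 # fishing mortality
B     <- catch * Z / (Fmort * (1 - exp(-Z)))   # invert Baranov: C = (F/Z)(1 - exp(-Z)) B
# B (a vector) could then be fed to the stable-period function to get a
# proxy Fmsy on a scale consistent with the catch-curve biomass.
```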

Rich

thefaylab commented 4 years ago

All, I'm trying to sort out the scenario specification list using the design matrix and some lookup tibbles that convert specifications to the relevant input object names.

Realized I needed to change the PlanBsmooth function to the internal one, but I can't seem to find the ExpandSurvey_XXXX functions anywhere - the only function I can locate is ExpandSurvey_modified().

It's possible (per @jonathanderoba's comment above) that these are different options for the ExpandSurvey_modified() function specified in IBM.options (thus adding to the charge for changing where these are entered). I think the different methods are chosen via y$expand_method:

  - "ExpandSurvey_F%SPR" is y$expand_method = 1
  - "ExpandSurvey_stableperiodF" is y$expand_method = 2
  - "ExpandSurvey_recentavgrelF" is y$expand_method = 4
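Something like the following is what I have in mind for the lookup (a sketch only, assuming the expand_method coding above is right):

```r
library(tibble)
library(dplyr)

expand_lookup <- tibble(
  ibm_name      = c("ExpandSurvey_F%SPR",
                    "ExpandSurvey_stableperiodF",
                    "ExpandSurvey_recentavgrelF"),
  expand_method = c(1, 2, 4)
)

# e.g. pull the option value for one scenario's IBM name:
expand_lookup %>%
  filter(ibm_name == "ExpandSurvey_stableperiodF") %>%
  pull(expand_method)
```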

Can someone confirm? If this is right I will ensure the info gets passed through OK. Thanks!

gavinfay commented 4 years ago

Oops, realized I was logged in to my lab's account instead of me.

JDeroba commented 4 years ago

Gavin, what you surmised is correct. The four ExpandSurvey_xxxx options in the design matrix are all variants of the ExpandSurvey_modified() function, controlled via y$expand_method. You missed one: y$expand_method = 3 uses F = M. I didn't know how else to type those into the design matrix; I hope it's not problematic. If it's easier, we can make each of the four alternatives a separate function.

JJD

gavinfay commented 4 years ago

Got it. Thanks Jon. No, I don't think there's any need to make separate functions - I've adjusted setup_scenarios.R to be adaptive. Hoping to commit this change soon.