**Closed** — javdrher closed this 7 years ago
Merging #31 into master will increase coverage by 0.01%. The diff coverage is 100%.
```diff
@@            Coverage Diff             @@
##           master      #31      +/-   ##
==========================================
+ Coverage   99.69%   99.71%   +0.01%
==========================================
  Files           8        8
  Lines         660      690      +30
==========================================
+ Hits          658      688      +30
  Misses          2        2
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| GPflowOpt/acquisition.py | `100% <100%> (ø)` | :arrow_up: |
| GPflowOpt/bo.py | `98.24% <100%> (+0.09%)` | :arrow_up: |
Continue to review the full report at Codecov.

Legend:
- Δ = absolute <relative> (impact)
- ø = not affected
- ? = missing data

Powered by Codecov. Last update 7bba551...cf158c3.
I thought about this: rebuilding on a free state change is a broader issue, and it is also tightly coupled with the callback for setting model hyperparameters (#7), so I'll open a new PR that builds on this one.
Adding support for performing Monte Carlo marginalization of the hyperparameters. This integrates quite nicely into the framework. I haven't tested it yet, but it seems you would be able to construct acquisition hierarchies, mixing point estimates with MCMC.
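To illustrate the idea, here is a minimal sketch in plain Python (not GPflowOpt's actual API; all function names are hypothetical): marginalizing over MCMC samples simply averages the per-sample acquisition, and a hybrid hierarchy can combine that with a point-estimate score.

```python
# Hypothetical toy acquisition functions, for illustration only.

def point_acquisition(x, theta):
    # Acquisition under a single hyperparameter point estimate.
    return -(x - theta) ** 2

def mcmc_acquisition(x, theta_samples):
    # Monte Carlo marginalization: average the acquisition over
    # posterior hyperparameter samples.
    return sum(point_acquisition(x, t) for t in theta_samples) / len(theta_samples)

def hybrid_acquisition(x, theta_map, theta_samples):
    # Mixing a point estimate with MCMC in one acquisition hierarchy.
    return point_acquisition(x, theta_map) + mcmc_acquisition(x, theta_samples)
```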
There is one minor twist which has to be addressed: the following code would currently result in errors:
The fix changes the size of the free state, but the copies still maintain the old one. Somehow, I should detect this change, and probably take new copies and replace the operands in the ParamList. I could verify the length of `get_free_state`, but this is not a very robust fix: there are plenty of situations where no change would be detected. I could verify the position of every Param using `get_param_index`, but I noticed this method does not take the fixed/unfixed state into account either (is this actually intended GPflow behaviour?).

Perhaps the easiest way out would be to detect the `_needs_recompile` flag in `_optimize_models` and maintain a flag indicating a change in the likelihood graph of the model. This would also catch changes in transforms, which likewise affect the semantics of the free state. Only changes of prior might introduce some overhead. Any thoughts on this?
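To make the length-based option concrete, here is a minimal sketch in plain Python (a toy model stands in for a GPflow model; only `get_free_state` and `_optimize_models` are names from the discussion, everything else is hypothetical). It shows both how the check works and why it is not robust:

```python
import copy

class ToyModel:
    # Stand-in for a GPflow model; hypothetical, for illustration only.
    def __init__(self, params):
        self.params = dict(params)
        self.fixed = set()

    def get_free_state(self):
        # Fixed parameters drop out of the free state, so fixing or
        # unfixing a Param changes the free state's size.
        return [v for k, v in sorted(self.params.items()) if k not in self.fixed]

class MarginalizedAcquisition:
    def __init__(self, model, n_copies=2):
        self.model = model
        self.copies = [copy.deepcopy(model) for _ in range(n_copies)]
        self._free_size = len(model.get_free_state())

    def _optimize_models(self):
        # Length check: refresh the cached copies when the free state
        # changed size. Not robust: fixing one Param while unfixing
        # another leaves the length unchanged and goes undetected.
        size = len(self.model.get_free_state())
        if size != self._free_size:
            self.copies = [copy.deepcopy(self.model) for _ in self.copies]
            self._free_size = size
```

A `_needs_recompile`-style flag would avoid the blind spot in the comment above, at the cost of also firing on changes (e.g. priors) that do not alter the free state.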