Closed by @matthewfeickert 4 years ago
Sorry for the late reply. You can find the multiprocessing part of the code here. Thanks for the help.
Hi all,

I have two new questions:

As a workaround for our memory leak in SModelS, which could come from the line `model = workspace.model(modifier_settings=msettings)` in

```python
import jsonpatch
import pyhf

llhdSpec = jsonpatch.apply_patch(bkg, patch)
msettings = {'normsys': {'interpcode': 'code4'}, 'histosys': {'interpcode': 'code4p'}}
workspace = pyhf.Workspace(llhdSpec)
for _ in range(10000):
    model = workspace.model(modifier_settings=msettings)
    result = pyhf.infer.hypotest(1., workspace.data(model), model, qtilde=True, return_expected=False)
```

I've tried to instantiate this `model` variable only once and then modify it in place instead of re-instantiating it each time I rescale the signals. So I tried to change the event yields both in `model.spec` and in `workspace`. However, the output of `hypotest` doesn't change after these modifications, so I guess some other attributes of these objects would also need to be modified. Is it even possible to modify the signal yields in place?
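Since the model's tensors are built from the spec at construction time, mutating `model.spec` afterwards would plausibly not propagate, which matches the observation above. A more reliable pattern may be to modify the spec dictionary and rebuild the workspace/model from it. A minimal sketch of that pattern, using plain dict manipulation (the helper name `rescale_signal` and the toy spec fragment are illustrative, not from this thread):

```python
import copy

def rescale_signal(spec, signal_name, factor):
    """Return a copy of a HistFactory-style spec with the yields of the
    sample named `signal_name` scaled by `factor` in every channel."""
    new_spec = copy.deepcopy(spec)
    for channel in new_spec["channels"]:
        for sample in channel["samples"]:
            if sample["name"] == signal_name:
                sample["data"] = [factor * y for y in sample["data"]]
    return new_spec

# Toy fragment: only the "channels" part of a workspace spec is shown.
spec = {
    "channels": [
        {
            "name": "SR",
            "samples": [
                {"name": "signal", "data": [1.0, 2.0], "modifiers": []},
                {"name": "bkg", "data": [5.0, 5.0], "modifiers": []},
            ],
        }
    ]
}

scaled = rescale_signal(spec, "signal", 3.0)
print(scaled["channels"][0]["samples"][0]["data"])  # [3.0, 6.0]
# With a full spec one would then rebuild, e.g.:
#   workspace = pyhf.Workspace(scaled_full_spec)
#   model = workspace.model(modifier_settings=msettings)
```

This still re-instantiates the model per rescaling, so it addresses correctness rather than the memory leak itself, but it avoids depending on in-place mutation taking effect.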
My second question is about the new ATLAS-SUSY-2018-06 analysis. I've tried to use it in our SModelS/pyhf interface, but `hypotest` seems to always return the same expected and observed CLs values. Could this be related to the fact that there are several fixed parameters?
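For debugging the second question, one thing worth checking is which parameters the workspace actually marks as fixed: in the HistFactory JSON layout these appear under `measurements -> config -> parameters` with `"fixed": true`. A minimal sketch that inspects a spec fragment (the helper name and the toy fragment are illustrative):

```python
def fixed_parameters(spec):
    """List the parameter names a workspace spec marks as fixed
    (HistFactory JSON layout: measurements -> config -> parameters)."""
    fixed = []
    for measurement in spec.get("measurements", []):
        for par in measurement["config"].get("parameters", []):
            if par.get("fixed", False):
                fixed.append(par["name"])
    return fixed

# Toy fragment mimicking the measurements section of a workspace spec
spec = {
    "measurements": [
        {
            "name": "meas",
            "config": {
                "poi": "mu",
                "parameters": [
                    {"name": "mu", "bounds": [[0, 10]]},
                    {"name": "lumi", "fixed": True},
                ],
            },
        }
    ]
}

print(fixed_parameters(spec))  # ['lumi']
```

With a built model, `model.config.suggested_fixed()` should expose the same information at the parameter-tensor level, which can help confirm whether the fit is varying the parameters you expect.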
Description
@lukasheinrich has already had some discussions with the SModelS developers (I think mainly @WolfgangWaltenberger) about SModelS using pyhf. We should further pursue these discussions.