When running jobs using real hardware, they may queue for a long time.
During this time, I may purposefully (or accidentally) close my Python session/kernel.
When this happens, I can still obtain my job's results using the `QiskitRuntimeService`. However, that only gives me the job that gets submitted internally by the `POVMSampler`; the actual job object which I need for the `POVMPostProcessor` is a `POVMSamplerJob`, which also contains the appropriate POVM metadata.
We should provide some simple interface for:

- storing the relevant metadata
- recovering the proper job object from the internal job's ID and the stored metadata
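A minimal sketch of what such an interface could look like, assuming a plain JSON file keyed by the internal job's ID (all names here, `POVMJobRecord`, `save`, `load`, and the file layout, are hypothetical illustrations, not the toolbox's actual API):

```python
import json
import tempfile
from pathlib import Path


class POVMJobRecord:
    """Hypothetical helper persisting what is needed to rebuild a POVMSamplerJob."""

    def __init__(self, job_id: str, povm_metadata: dict):
        self.job_id = job_id              # ID of the internally submitted Runtime job
        self.povm_metadata = povm_metadata  # JSON-serializable POVM parameters

    def save(self, path: str) -> None:
        # Store the internal job ID alongside the POVM metadata in one file.
        payload = {"job_id": self.job_id, "povm_metadata": self.povm_metadata}
        Path(path).write_text(json.dumps(payload))

    @classmethod
    def load(cls, path: str) -> "POVMJobRecord":
        # Recover both pieces after the Python session has been closed.
        data = json.loads(Path(path).read_text())
        return cls(data["job_id"], data["povm_metadata"])


# Round-trip example: save before closing the session, load afterwards.
with tempfile.TemporaryDirectory() as tmp:
    f = str(Path(tmp) / "job.json")
    POVMJobRecord("job-1234", {"angles": [0.1, 0.2], "bias": [0.5, 0.5]}).save(f)
    record = POVMJobRecord.load(f)
    # record.job_id could then be handed to QiskitRuntimeService to fetch the
    # raw results, and record.povm_metadata used to rebuild the POVMSamplerJob.
```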
This may touch upon some intricate details regarding serialization of the metadata (which I don't think is possible right now because of the `POVMImplementation` stored inside of it). Maybe we can rethink its contents and instead store data from which the appropriate `POVMImplementation` can be recovered (e.g. the `angles` and `bias`).
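To illustrate the "store only the defining data" idea: instead of serializing the `POVMImplementation` object itself, one could persist just its constructor parameters and rebuild the object on recovery. The dataclass and field names below are hypothetical placeholders for whatever parameters actually define the implementation:

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class PMSimParams:
    """Hypothetical parameter set from which a POVMImplementation could be rebuilt."""

    angles: list  # measurement angles per qubit
    bias: list    # sampling bias over the projective measurements

    def to_json(self) -> str:
        # Only plain, JSON-friendly data is stored -- no live objects.
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, s: str) -> "PMSimParams":
        return cls(**json.loads(s))


# Persist the defining data rather than the implementation object itself...
params = PMSimParams(angles=[0.0, 1.57], bias=[0.4, 0.6])
restored = PMSimParams.from_json(params.to_json())
# ...then pass restored.angles / restored.bias back to the POVMImplementation
# constructor when recovering the job.
```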