tempoCollaboration / OQuPy

A Python package to efficiently simulate non-Markovian open quantum systems with process tensors.
https://oqupy.readthedocs.io
Apache License 2.0

Improve guess_tempo_parameters depending on system parameters #96

Closed piperfw closed 3 months ago

piperfw commented 1 year ago

Currently oqupy.guess_tempo_parameters takes an optional BaseSystem object but does nothing with it.

@gefux Was the idea with this parameter to compute the system-environment commutator and so infer a suitable timestep to avoid Trotter error? A simple thing we can do is look at the eigenvalues and rates of the system Hamiltonian and Lindblad terms, although I understood the Trotter error was normally more important to characterise.

In the meantime we should probably remove the optional argument or add a NotImplemented note or similar. Overall I think guess_tempo_parameters is potentially a very useful function for people starting out with OQuPy.

gefux commented 1 year ago

@piperfw Yes, that was the idea. I wrote this function as a zeroth approximation to a good parameter estimation and thought that future, better approximations might want to make use of information in the system (such as looking at the eigenvalues, as you suggest). I guess that having this optional argument makes sense for future-compatibility, i.e. one can set up the code passing the system, and a future version of OQuPy might actually use that information. But to avoid confusion I agree that adding a warning that it currently doesn't make use of the system information could be good.

piperfw commented 1 year ago

Thanks @gefux, would something along the following lines be appropriate?

```python
import numpy as np

def comm(a, b):
    return a @ b - b @ a

# sample points (trivial for a t-independent system)
sample_times = [0, end_time/2, end_time]
# sample system Hamiltonian, dissipation rates and commutator with bath coupling
HS_samples = [sys.Hamiltonian(t) for t in sample_times]
gamma_samples = [np.max(sys.gammas(t)) for t in sample_times]
HE_comm_samples = [comm(HS, bath.coupling_operator) for HS in HS_samples]
# get maximum eigenvalues/rates (by magnitude)
HS_max = np.max([np.abs(np.linalg.eigvalsh(HS)) for HS in HS_samples])
gamma_max = np.max(gamma_samples)
HE_comm_max = np.max([np.abs(np.linalg.eigvals(HE_comm)) for HE_comm in HE_comm_samples])
```

I would guess the subtle part is knowing the factors needed to extract a suitable dt from these maximum scales, e.g. dt ~ 1/HS_max.

piperfw commented 1 year ago

If HS commutes with HE, then we don't need to worry about Trotter error, so maybe assume a dt sufficient to resolve the system dynamics (in a plot) is suitable.
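A quick numerical check of the commuting case could look like this (`commutes` is a hypothetical helper, not part of OQuPy):

```python
import numpy as np

def commutes(h_sys, coupling_op, tol=1e-12):
    # True if [H_S, H_E] vanishes, in which case there is no Trotter
    # error and dt only needs to resolve the system dynamics.
    comm = h_sys @ coupling_op - coupling_op @ h_sys
    return np.max(np.abs(comm)) < tol

sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])

assert commutes(sigma_z, sigma_z)      # commuting case: no Trotter error
assert not commutes(sigma_x, sigma_z)  # generic spin-boson case
```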

gefux commented 1 year ago

As you write, I think the question is what to do with that information. Although for the case of [HE, HS] = 0 the timestep dt can be arbitrarily large, one might still want to resolve the system dynamics (which is where the eigenvalues of HS would come in). I have been working on the basis that one would use TEMPO when the dynamics is not analytically solvable (as the commuting case would be) and that one is interested in dynamics on the time scale of the environment correlation function. Under those assumptions, only the correlation function determines the timestep.

I appreciate that the function is a bit cryptic (mostly because it avoids recomputing correlations that have already been computed), so I'll roughly explain what it does:

  1. It starts with 11 equally spaced points in time between 0 and the maximum possible correlation time (which is the overall evolution time).
  2. _analyse_correlation() calculates the integral of the correlation function for each interval using the trapezoidal rule. It also computes the integral for each interval divided into two smaller intervals, by adding a timestep in between and applying the trapezoidal rule twice. Then, for each interval, the error is the difference between the two values (normalized by the full integral). The function _analyse_correlation() returns that list of errors and the cumulative integral (also normalized by the full integral).
  3. Then (back in the main function) the times for which the cumulative integral has reached its final value (up to a tolerance) are discarded. This determines dkmax.
  4. Steps 2 and 3 are repeated until the error for every interval is below the tolerance.
  5. Finally, the smallest needed time interval determines dt.

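For illustration, the steps above could be sketched roughly like this (a simplified stand-in for the actual OQuPy code; `correlation` is a placeholder for the bath correlation function):

```python
import numpy as np

def correlation(t):
    # Placeholder bath correlation function (an exponential decay).
    return np.exp(-t)

def analyse_correlation(times):
    # Per-interval trapezoidal error estimate and cumulative integral,
    # both normalized by the full integral (step 2 above).
    coarse, fine = [], []
    for t0, t1 in zip(times[:-1], times[1:]):
        tm = 0.5 * (t0 + t1)
        c0, cm, c1 = correlation(t0), correlation(tm), correlation(t1)
        coarse.append(0.5 * (t1 - t0) * (c0 + c1))
        fine.append(0.25 * (t1 - t0) * (c0 + 2 * cm + c1))
    total = np.sum(fine)
    errors = np.abs(np.array(coarse) - np.array(fine)) / np.abs(total)
    cumulative = np.cumsum(fine) / total
    return errors, cumulative

tol = 1e-4
times = np.linspace(0.0, 10.0, 11)  # step 1: 11 equally spaced points
# Step 3: drop times after the cumulative integral saturates (-> dkmax).
_, cumulative = analyse_correlation(times)
times = times[:np.searchsorted(cumulative, 1.0 - tol) + 2]
# Step 4: bisect offending intervals until every error is below tolerance.
while True:
    errors, _ = analyse_correlation(times)
    if np.all(errors < tol):
        break
    midpoints = 0.5 * (times[:-1] + times[1:])
    times = np.sort(np.concatenate([times, midpoints[errors >= tol]]))
dt = np.min(np.diff(times))  # step 5: the smallest interval sets dt
```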
piperfw commented 1 year ago

Thanks! That is helpful (I had not reviewed _analyse_correlation()).

> I have been working on the basis that one would use TEMPO if the dynamics is not analytically solvable (as the commuting case would be) and that one is interested in dynamics on a time scale of the environment correlation function.

I understand these assumptions but also that the possibility for approximations involving the system were left open via the optional BaseSystem argument. Here's an idea that is loosely analogous to steps you list above; perhaps let me know what you think:

  1. Start with 11 equally spaced points between 0 and end_time
  2. At each point _analyse_evolution() (or _analyse_derivative() etc.) calculates the maximum local error from, e.g., one Euler step versus two steps of half the size
  3. Repeat until every interval has error below the tolerance
  4. The smallest time-interval needed determines dt_sys.

It may also be possible to estimate the error ~ [H_S, H_E] dt**2 at each step.
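The step-halving error estimate in step 2 might look something like the following (a hypothetical sketch using an Euler step of the von Neumann equation, not actual OQuPy code):

```python
import numpy as np

def euler_local_error(h_sys, rho, dt):
    # Local error of one Euler step of d(rho)/dt = -i [H, rho],
    # estimated by comparing a full step with two half steps.
    def step(r, h):
        return r - 1j * h * (h_sys @ r - r @ h_sys)
    full = step(rho, dt)
    half = step(step(rho, dt / 2), dt / 2)
    return np.max(np.abs(full - half))

sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)

# Halve dt until the error estimate drops below the tolerance;
# the result is a candidate dt_sys.
dt_sys, tol = 0.5, 1e-4
while euler_local_error(sigma_x, rho0, dt_sys) > tol:
    dt_sys /= 2
```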

gefux commented 1 year ago

This seems reasonable to me. If one were to redesign / extend the guess_tempo_parameters() function, I suggest organizing it in smaller pieces which each return upper or lower bounds on the parameters, which are then intersected in the main function. For example: currently we calculate the upper bound on dt based on the structure of the correlation function. The change of a Hamiltonian over time for a time-dependent system, as you suggest, would lead to a different upper bound, and the two are then intersected (i.e. one takes the minimum of the two upper bounds). The same applies to any third or fourth reason we might come up with to restrict dt.
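A minimal sketch of that structure (hypothetical function names and placeholder values, not OQuPy API):

```python
def dt_bound_from_correlations():
    # Placeholder: upper bound on dt from the bath correlation function.
    return 0.1

def dt_bound_from_system():
    # Placeholder: upper bound on dt from resolving the system dynamics.
    return 0.04

def guess_dt():
    # Intersecting upper bounds means taking their minimum.
    return min(dt_bound_from_correlations(), dt_bound_from_system())

dt = guess_dt()  # here the system bound is the more restrictive one
```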

piperfw commented 1 year ago

Thanks for the input; I agree that intersection is the logical way to handle multiple parts. I would like to work on this if I find the time.

piperfw commented 6 months ago

I have a branch pr/parameters which calculates a maximum frequency of a system object and sets dt to the Nyquist sampling interval if that is smaller than the dt calculated from the bath correlations.
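For illustration, the frequency/Nyquist idea could look roughly like this (a hypothetical sketch, not the actual pr/parameters code; it assumes the Hamiltonian is given in angular-frequency units):

```python
import numpy as np

def nyquist_dt(h_sys):
    # Fastest system oscillation: the largest eigenvalue gap of H gives
    # f_max = gap / (2*pi); the Nyquist criterion then requires
    # dt <= 1 / (2 * f_max), i.e. at least two samples per period.
    evals = np.linalg.eigvalsh(h_sys)  # ascending eigenvalues
    f_max = (evals[-1] - evals[0]) / (2.0 * np.pi)
    return 1.0 / (2.0 * f_max)

# Example: Delta * sigma_x has eigenvalues +/- Delta.
Delta = 5.0
h_sys = Delta * np.array([[0.0, 1.0], [1.0, 0.0]])
dt_sys = nyquist_dt(h_sys)   # pi / (2 * Delta)
dt_bath = 0.5                # hypothetical dt from the bath correlations
dt = min(dt_sys, dt_bath)    # the smaller dt is chosen
```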

In examples/guessing_parameters.py there are three examples I was using to test the utility of the system frequency calculation. I'll summarise the results here. In the following, 'sys' refers to parameters obtained by passing the system to the guess_tempo_parameters call, and 'nosys' to parameters obtained without it.

A - Spin-boson model based on the one in the Quickstart tutorial

exampleA

B - Pulse Hamiltonian adapted from PT-TEMPO tutorial

exampleB

C - Spin-boson model with a time-dependent Hamiltonian `Delta(t) sigma_x`

exampleC

Overall, I think example C alone probably justifies making this feature available. If the user doesn't pass a system object to guess_tempo_parameters, the calculation is done using only the bath correlation functions. If the user passes a system object, the smaller dt is chosen. A message should inform the user whether the bath or the system was the 'limiting' aspect (i.e. required the smaller dt), along with the usual warning that convergence is not guaranteed and results should be checked manually.

@gefux if you get a chance to look at these results, could you let me know what you think? The code I've added will probably need some tweaking, but the physical idea is there.


gefux commented 3 months ago

Solved by #124