Closed · sujanbharadwaj closed this issue 3 years ago
@sujanbharadwaj
The calculation of the G_l components is achieved through integration of the imaginary-time Green's function according to Eq. (2) in the paper you mention.
I.e., if you want to calculate your G(tau) for multiple values of n_l, you can simply take the following steps: measure G_l once with a sufficiently large n_l, then rebuild G(tau) from the truncated expansion in post-processing, so a single cthyb run is enough.
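For illustration, a minimal numpy/scipy sketch of that reconstruction, based on Eq. (1) of PRB 84, 075145 (the array gl, the file gl.dat, and the helper g_tau_from_gl are placeholder names, not cthyb API):

import numpy as np
from scipy.special import eval_legendre

def g_tau_from_gl(gl, tau, beta, n_l):
    """Rebuild G(tau) from the first n_l Legendre coefficients:
    G(tau) = sum_l sqrt(2l+1)/beta * P_l(x(tau)) * G_l."""
    x = 2.0 * tau / beta - 1.0   # map tau in [0, beta] to x in [-1, 1]
    ls = np.arange(n_l)
    return np.sum(np.sqrt(2 * ls + 1) / beta * eval_legendre(ls, x) * gl[:n_l])

beta, tau = 10.0, 2.5
gl = np.loadtxt('gl.dat')        # placeholder: G_l measured once with large n_l
for n_l in (10, 20, 30, 40):
    print(n_l, g_tau_from_gl(gl, tau, beta, n_l))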
Edit: sorry, I was a few minutes too slow. What @Wentzell writes is of course also a practical solution for exactly what you asked for. The following could still be a bit easier for calculations in practice.
Hi @sujanbharadwaj, indeed the automatic fitting and choice of the number of Legendre polynomials is not yet implemented, and would be nice to have! Since PRB 84, 075145 there has been much more progress in this direction. Please also take a look at https://doi.org/10.1103/PhysRevB.98.035104 and especially at the practical implementation https://doi.org/10.1016/j.cpc.2019.02.006 . This is a freely available Python package.
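As a rough illustration of how that package helps with the choice: the IR singular values s_l bound the magnitude of the expansion coefficients, so counting how many lie above a target accuracy suggests a cutoff. A minimal sketch, assuming the load/sl/dim interface described in the irbasis README (the tolerance value is an arbitrary example):

import irbasis

b = irbasis.load('F', 1000.0)   # fermionic basis, Lambda = beta * omega_max
sl = [b.sl(l) for l in range(b.dim())]
tol = 1e-8                      # example target relative accuracy
n_sig = sum(1 for s in sl if s / sl[0] > tol)
print(n_sig, "of", b.dim(), "basis functions above relative tolerance", tol)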
I think you can skip all these steps of calculating G(tau) for various n_l, and find the optimal n_l for your current numerical precision much more easily: the PRB above shows that the G_l decay exponentially. You can use that property to determine n_l by plotting |G_l| as a function of l on a logarithmic scale (see the plotting code below):
In such a plot the even/odd G_l coefficients decay very fast up to a certain point; the point where the exponential decay stops gives you n_l. Another way to find n_l from noisy data can be found in the Python package mentioned above, in the Jupyter example: https://github.com/SpM-lab/irbasis/blob/master/sample/step_by_step_examples.ipynb . Maybe @shinaoka has a practical idea? So far I have usually done this graphically, by fitting G(tau) with various n_l and determining the point where the noise on G_l sets in. For that you can use the TRIQS function https://triqs.github.io/triqs/unstable/documentation/manual/triqs/gfs/py/tools.html?highlight=fit%20legendre#triqs.gf.tools.fit_legendre . I hope that helps.
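A minimal sketch of that graphical procedure, assuming G_tau is a TRIQS Gf on an imaginary-time mesh and fit_legendre has the (g_t, order) signature from the documentation linked above:

import numpy as np
from triqs.gf.tools import fit_legendre

# Fit with increasing numbers of Legendre coefficients and watch where the
# highest coefficients stop shrinking, i.e. where the noise floor begins.
for order in (20, 40, 60, 80):
    G_l = fit_legendre(G_tau, order=order)
    print(order, np.abs(G_l.data[-1, 0, 0]))  # magnitude of the last coefficient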
Code for plotting:

import numpy as np
import matplotlib.pyplot as plt
from h5 import HDFArchive

# read the Legendre Green function from the archive
# ('results.h5' and 'Gimp_l' are example names)
with HDFArchive('results.h5', 'r') as h5:
    Gl = h5['Gimp_l']

fig, ax1 = plt.subplots(1, 1, figsize=(8, 8))
block = 'up_0'
orb = 0
coeff = np.abs(Gl[block][orb, orb].data[:, 0, 0])
n_coeff = len(coeff)

# even and odd l on a log scale
ax1.semilogy(range(0, n_coeff, 2), coeff[0::2], "o-",
             color='C0', label=r"$G_l$ even", linewidth=1.5)
ax1.semilogy(range(1, n_coeff, 2), coeff[1::2], "x-",
             color='C1', label=r"$G_l$ odd", linewidth=1.5)

ax1.set_xlabel(r"$l$")
ax1.set_ylabel(r"$|G_l|$")
ax1.legend()
When I use the Legendre basis, I simply use as many basis functions as possible, say 200, and adopt all the coefficients. I am always afraid that using too few coefficients causes under-fitting/systematic errors.
This conversation was very helpful and constructive. Thank you all.
Hi, I am trying to determine the number of Legendre coefficients (n_l) systematically. I am trying to reproduce Figure 2 in http://dx.doi.org/10.1103/PhysRevB.84.075145, where the imaginary-time Green's function G(tau) is plotted at four different values of tau as a function of the number of Legendre coefficients, which suggests a systematic way to determine n_l. I believe that n_l needs to be determined this way for each beta value. One (hard) way of reproducing that graph is to run the calculation for each n_l separately, which takes a lot of time (for higher beta values this seems impossible). Is there any way we can calculate G(tau) for different n_l in a single calculation in TRIQS-cthyb?