Closed PaulChern closed 3 years ago
Hi, thanks for proofreading the code!
Indeed, np.einsum('ij, ji->ij', A, B) is equivalent to A * B.T. On my computer the former is a little faster.
np.einsum('ij, jk', A, B) is A @ B.
t = self.get_Delta(iatom) @ Gij_up @ self.get_Delta(jatom) @ Gji_dn is correct. But note that np.trace(A @ B) = np.sum(A * B.T), so using A * B.T with np.sum should be equivalent. This is slower, but we can use the intermediate result to do the orbital-by-orbital decomposition.
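A minimal sketch of the identity above (random matrices used only for illustration): the trace of a matrix product equals the sum over the element-wise product with the transpose, and keeping that intermediate array is what allows a per-element (orbital-by-orbital) decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Full contraction via matrix multiplication
t_full = np.trace(A @ B)

# Equivalent: sum over the element-wise product A * B.T,
# i.e. np.einsum('ij,ji->', A, B)
t_sum = np.sum(A * B.T)
assert np.allclose(t_full, t_sum)

# The intermediate A * B.T keeps the (i, j) resolution, so each entry
# is one contribution to the trace; summing them recovers the total.
contrib = A * B.T
assert np.allclose(contrib.sum(), t_full)
```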
Best regards, HeXu
Dear He,
Thanks a lot for the explanation! That's brilliant.
Best, Peng
Dear Developers, There are places in the code trying to do matrix multiplication with
np.einsum('ij, ji-> ij')
However, it seems that np.einsum('ij,ji->ij', A, B)
is actually the element-wise multiplication of A and B.T. The correct way should be np.einsum('ij,jk', A, B),
which is the matrix multiplication of A and B, or simply to use t = self.get_Delta(iatom) @ Gij_up @ self.get_Delta(jatom) @ Gji_dn,
which is fine as commented out in the code. In my opinion, matrix multiplication instead of element-wise multiplication is what you want to do here. The results can be essentially different.
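A small sketch of the distinction raised above (random matrices for illustration): 'ij,ji->ij' keeps both output indices, so nothing is contracted and the result is the element-wise product A * B.T, while 'ij,jk->ik' sums over the shared index j and gives the matrix product.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# 'ij,ji->ij' keeps both indices in the output: element-wise A * B.T
elementwise = np.einsum('ij,ji->ij', A, B)
assert np.allclose(elementwise, A * B.T)

# 'ij,jk->ik' contracts over the shared index j: matrix multiplication
matmul = np.einsum('ij,jk->ik', A, B)
assert np.allclose(matmul, A @ B)

# In general the two results differ
assert not np.allclose(elementwise, matmul)
```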