Reshaping the vectorized computation was mixing up indices... I'd need to think about why.
```python
def Gijmn(self, s) -> np.ndarray:
    r"""
    .. math::
        G_{ijmn} = \tau_i \tau_j \tau_m \tau_n,
        (i,m) = 1,2,3,
        (j,n) = 1,2.
    """
    a = np.zeros((3, 2, 3, 2))  # .reshape((-1,4))
    I = np.indices((3, 2, 3, 2)).T.reshape((-1, 4))
    tau = self.tau(s)  # (-1,)
    for index in I:
        a[tuple(index)] = np.prod([tau[i] for i in index])
    return a
    # rv = np.product(tau[I], axis=1)  # NB this works because tau is 1D, hence I is treated as an integer mask
    # return rv.reshape((3,2,3,2))  # .round(tbx._PREC)
```
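
For reference, a minimal sketch (not from the repo) of a vectorized equivalent that sidesteps the reshape-ordering pitfall by building the tensor with `np.einsum`; here `tau` is assumed to be a plain 1-D array with at least three components standing in for `self.tau(s)`, and the function name is hypothetical:

```python
import numpy as np

def Gijmn_vectorized(tau: np.ndarray) -> np.ndarray:
    """Hypothetical vectorized equivalent of the explicit loop in Gijmn.

    ``tau`` is assumed to be a 1-D array with at least three components,
    standing in for ``self.tau(s)``.
    """
    # Outer product over the four index axes, i,m = 0..2 and j,n = 0..1,
    # matching a[i, j, m, n] = tau[i] * tau[j] * tau[m] * tau[n].
    return np.einsum('i,j,m,n->ijmn', tau[:3], tau[:2], tau[:3], tau[:2])

# Sanity check against the explicit loop:
tau = np.array([0.3, -0.5, 0.8])
a = np.zeros((3, 2, 3, 2))
for index in np.indices((3, 2, 3, 2)).reshape(4, -1).T:  # index tuples in C order
    a[tuple(index)] = np.prod([tau[i] for i in index])
assert np.allclose(a, Gijmn_vectorized(tau))
```

As far as I can tell, the index mixing in the commented-out version comes from `np.indices((3,2,3,2)).T`: the transpose reverses the axis order, so the flattened index tuples enumerate the grid with the first axis varying fastest, and a plain C-order `reshape((3,2,3,2))` of the products then lands them in the wrong slots. Building the index list as `np.indices((3,2,3,2)).reshape(4, -1).T` keeps C order, and the reshape round-trips correctly.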
I think this is resolved
MLS gives the geometrical portion of the contrast factor, $G_{ijkl}$ ($i,k = 1,2,3$; $j,l = 1,2$), as

$$ G_{ijkl} = (\vec{d}^* \cdot e_i)(\vec{d}^* \cdot e_j)(\vec{d}^* \cdot e_k)(\vec{d}^* \cdot e_l) \quad [1] $$

but the tensor formed by the `Dislocation` interface via lazily computed properties gives different values than a direct, loop-based evaluation of equation 1.
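
For concreteness, a sketch of what I mean by direct evaluation of equation 1; the names `d_star`, `e`, and `G_direct` are placeholders rather than anything from the package:

```python
import numpy as np

def G_direct(d_star: np.ndarray, e: np.ndarray) -> np.ndarray:
    """Evaluate equation 1 by brute force.

    d_star : (3,) unit diffraction vector
    e      : (3, 3) array whose rows are the dislocation-frame basis vectors e_1..e_3
    Returns an array of shape (3, 2, 3, 2), i.e. i,k = 1..3 and j,l = 1..2.
    """
    c = e @ d_star  # direction cosines c[a] = d* . e_a
    G = np.empty((3, 2, 3, 2))
    for i in range(3):
        for j in range(2):
            for k in range(3):
                for l in range(2):
                    G[i, j, k, l] = c[i] * c[j] * c[k] * c[l]
    return G
```

An `np.allclose` comparison between a tensor built this way and the lazily computed one is a quick way to flag the mismatch.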