ambarb opened this issue 4 years ago
@afluerasu will check to see if we can just delete this or if anything is of use for tests or better code.
For now, let's ignore p2 and plan to delete it.
Looks like the normalization is different between the two (normalizing or not by the total intensity in each frame, imgsum). Both schemes are valid and useful at times, and we should definitely keep a flexible way of doing different normalizations of the correlation functions. Keeping correlation and correlation2 as two almost identical copies of the code, however, is certainly not the best implementation... :) I'll look at exactly what we are doing and try to propose a flexible solution.
Bash shell diff on the two files:

```
170,174c168,169
<             S = norm.shape
<             if len(S)>1:
<                 fra_pix[ pxlist] = v[w]/ imgsum[i]/ norm[i,pxlist]
<             else:
<                 fra_pix[ pxlist] = v[w]/ imgsum[i]/ norm[pxlist]
---
>             else:
>                 fra_pix[ pxlist] = v[w]/ norm[pxlist] #-1.0
```
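As a starting point for the flexible scheme @afluerasu mentions, here is a minimal sketch that folds both branches of the diff into one helper, with the imgsum division made optional. The name `normalize_frame_pixels` and the exact argument layout are my own assumptions for illustration, not pyCHX API:

```python
import numpy as np

def normalize_frame_pixels(counts, pxlist, norm, i, imgsum=None):
    # Hypothetical helper, not part of pyCHX: unifies the two normalization
    # schemes seen in the diff above.
    #   counts : photon counts for the ROI pixels of frame i (v[w] above)
    #   pxlist : indices of those pixels in the flattened frame
    #   norm   : per-pixel normalization, 1D (static) or 2D (one row per frame)
    #   i      : frame index
    #   imgsum : optional per-frame total intensities; passing it reproduces
    #            the chx_correlationp2 scheme (divide by imgsum[i] as well)
    vals = np.asarray(counts, dtype=float)
    if imgsum is not None:
        vals = vals / imgsum[i]        # normalize by total frame intensity
    if norm.ndim > 1:                  # time-dependent per-pixel normalization
        return vals / norm[i, pxlist]
    return vals / norm[pxlist]         # static per-pixel normalization
```

A single optional keyword like this would let one code path serve both schemes instead of keeping two near-identical copies of the module.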
From @yugangzhang's google doc on pyCHX, `chx_correlationp` is the parallel computation of g2 using the compressed data format. It looks like `chx_correlationp2` was added to debug g2. Can we delete this module from `.v2._commonspeckle`, OR do we need to integrate parts of this module into `chx_correlationp`, or keep them as tests for `chx_correlationp`?
@yugangzhang or @afluerasu, if this is quick for you to answer, this will save tons of time.
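If parts of `chx_correlationp2` end up as tests, one option is to check the compressed/parallel result against a naive dense-array reference. Below is a minimal sketch of such a reference (my own, not code from either module), using the standard normalized intensity autocorrelation g2(tau) = <I(t)I(t+tau)> / (<I(t)><I(t+tau)>) averaged over the ROI pixels:

```python
import numpy as np

def g2_reference(frames, taus):
    # Naive dense-array g2 for a single ROI, intended only as a test oracle.
    #   frames : (n_frames, n_pixels) array of ROI intensities
    #   taus   : integer lag times in frames, each < n_frames
    g2 = np.empty(len(taus))
    for k, tau in enumerate(taus):
        head = frames[: len(frames) - tau]   # I(t)
        tail = frames[tau:]                  # I(t + tau)
        g2[k] = (head * tail).mean() / (head.mean() * tail.mean())
    return g2
```

Running both implementations on a small synthetic dataset and comparing within floating-point tolerance would show whether p2 differs from p by anything beyond the imgsum normalization.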