gschramm / parallelproj

code for parallel TOF and NONTOF projections

Issue regarding the list-mode negative Poisson log-likelihood function #80

Closed: zjutk closed this issue 1 month ago

zjutk commented 1 month ago

Dear Professor Georg Schramm,

I recently reviewed the Iterative Listmode Algorithm Examples and noticed your expression for the list-mode negative Poisson log-likelihood function:

[screenshot of the list-mode negative Poisson log-likelihood expression from the parallelproj examples]

I would like to confirm with you whether this expression aligns with the Listmode re-formulation of the sinogram-based minimization problem in your paper "Fast and memory-efficient reconstruction of sparse Poisson data in listmode with non-smooth priors with application to time-of-flight PET."

Additionally, I observed that your expression for the list-mode negative Poisson log-likelihood function appears to differ from those mentioned in other literature, such as:

[screenshot of the list-mode log-likelihood expression from Ote et al.]

K. Ote, F. Hashimoto, Y. Onishi, T. Isobe and Y. Ouchi, "List-Mode PET Image Reconstruction Using Deep Image Prior," in IEEE Transactions on Medical Imaging, vol. 42, no. 6, pp. 1822-1834, June 2023, doi: 10.1109/TMI.2023.3239596.

Can these two expressions be considered fundamentally equivalent?

I look forward to your response. Thank you!

Sincerely, Kun Tian

gschramm commented 1 month ago

Hi Kun,

  1. The equation from our paper is the negative Poisson logL using sinogram data: $y$ and $\bar{y}$ are sinograms and the sum runs over all sinogram bins $i$.
  2. The definition of Ote et al. is equivalent except for:
    • a global minus sign (they define the logL, we define the negative logL)
    • they ignore additive contaminations such as scatter and randoms

To see that they are equivalent, consider this:

$$\log L = \sum_i y_i \log \bar{y}_i - \sum_i \bar{y}_i$$

now rewrite $y_i$ as

$$y_i = \sum_{k=1}^{y_i} 1$$

(the counts in sinogram bin "i" are a sum of ones $y_i$ times)

$$\log L = \sum_i \sum_{k=1}^{y_i} \log \bar{y}_i - \sum_i \bar{y}_i$$

the first two sums can be rewritten as the sum over all events (with event index $t$ and corresponding sinogram bin $i_t$)

$$\log L = \sum_{\text{events } t} \log \bar{y}_{i_t} - \sum_i \bar{y}_i$$
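
As a quick numerical sanity check of this counts-to-events rewriting, here is a minimal numpy toy sketch (it does not use the parallelproj API; `ybar`, `y` and `event_bins` are made-up names for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy expected sinogram (ybar) and measured Poisson counts (y) in 5 bins
ybar = rng.uniform(0.5, 3.0, size=5)
y = rng.poisson(ybar)

# sinogram form of the data term: sum_i y_i * log(ybar_i)
data_term_sino = float((y * np.log(ybar)).sum())

# listmode form: repeat bin index i exactly y_i times -> one entry per event t
event_bins = np.repeat(np.arange(y.size), y)
data_term_lm = float(np.log(ybar[event_bins]).sum())

assert np.isclose(data_term_sino, data_term_lm)
```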

If we now insert the forward model (ignoring scatter and randoms)

$$\bar{y}_i = \sum_j a_{ij} x_j$$

we get the expression of Ote et al.
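
To make this fully concrete, here is a hedged end-to-end toy example in plain numpy (again not the parallelproj API; the system matrix `a` and image `x` are made up): it evaluates the sinogram-based negative Poisson logL from our paper with the contamination set to zero, and the listmode logL in the form of Ote et al., and checks that the two only differ by the global sign.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy forward model ybar_i = sum_j a_ij x_j (scatter and randoms ignored)
a = rng.uniform(0.0, 1.0, size=(6, 4))  # made-up system matrix
x = rng.uniform(1.0, 2.0, size=4)       # made-up image
ybar = a @ x

# simulate sinogram counts and expand them into a list of events
y = rng.poisson(ybar)
event_bins = np.repeat(np.arange(y.size), y)

# sinogram-based negative Poisson logL (paper's form, contamination set to 0)
neg_logL_sino = float(ybar.sum() - (y * np.log(ybar)).sum())

# listmode Poisson logL in the form of Ote et al.
logL_lm = float(np.log(ybar[event_bins]).sum() - ybar.sum())

# the two only differ by a global sign
assert np.isclose(neg_logL_sino, -logL_lm)
```

Note that the term $\sum_i \bar{y}_i$ is identical in both formulations and does not depend on the measured events.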