SimonBiggs / npgamma

Using numpy broadcasting to find the gamma index
GNU Affero General Public License v3.0

Our recent experience with npgamma #9

Closed AlainSottiaux closed 7 years ago

AlainSottiaux commented 7 years ago

Dear Simon,

We are trying to compare dose distributions from a TPS against dose from a Monte Carlo simulation, in CT. So we were looking for a 3D gamma comparison tool, in order to compare doses in a quantitative way. A few days ago we found npgamma. There are not many 3D gamma tools available.

We started from your example, just replacing the example DICOM files with our own. It worked immediately! Our reference and evaluation doses don't have the same size and resolution (pixel size). Thank you for your great work.

We made some adjustments. The first was simply to use subplots for dose and gamma viewing, so all plots are visible in the same window. We noticed that the y axis was reversed on slices; we haven't checked yet whether this is specific to our TPS or general to DICOM.

The MC software we use computes dose over the whole CT image, not only inside the patient (we can do the same in the TPS, but we don't care about dose outside the patient, and it can change the gamma passing rate a lot). The gamma calculation ignores voxels with dose lower than the cutoff in the evaluation. Outside the patient we have zero dose in the reference (TPS) but non-zero dose in the evaluation (MC), sometimes higher than the cutoff, so the gamma fails there. Before calculating gamma, we therefore clean the evaluation dose: for every voxel in the evaluation set, we find the closest voxel in the reference (the resolutions differ), and if the reference dose is lower than the cutoff, we set the evaluation voxel to 0 (or any dose below the cutoff). Gamma is then not calculated wherever either the evaluation or the reference dose is below the cutoff. This gives the expected result. Another approach could be to limit the gamma calculation to the inside of a DICOM structure (pydicom and numpy surely provide all the tools needed for that).

We haven't tested yet whether our dose resolution (around 2x2 mm for evaluation, 2.5x2.5 mm for reference, slice thickness of 2 mm for both) gives false negatives for gamma. AFAIK it's recommended to have a resolution 3 times smaller than the gamma distance criterion; interpolation would solve that possible issue. Is your gamma calculation sensitive to poor resolution (similar to or worse than the distance criterion)? Did you consider using multithreading to use multiple cores for the gamma calculation? It would improve calculation speed. In your example, you could also add the calculation of the gamma passing rate.

Thanks again for your work.
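The evaluation-dose cleaning step described above could be sketched as follows (a minimal sketch, not part of npgamma; the function name and grid layout are assumptions), using SciPy's nearest-neighbour grid interpolation to look up the reference dose at each evaluation voxel:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def clean_evaluation(coords_ref, dose_ref, coords_eval, dose_eval, cutoff):
    """Zero evaluation voxels whose nearest reference dose is below cutoff.

    coords_ref / coords_eval are tuples of 1-D axis coordinate arrays
    (z, y, x); dose_ref / dose_eval are the corresponding 3-D dose grids.
    """
    # Nearest-neighbour lookup of the reference dose at arbitrary points;
    # points outside the reference grid are treated as zero dose.
    lookup = RegularGridInterpolator(
        coords_ref, dose_ref, method="nearest",
        bounds_error=False, fill_value=0.0)

    # Flatten the evaluation grid into an (N, 3) array of points.
    grid = np.meshgrid(*coords_eval, indexing="ij")
    points = np.stack([g.ravel() for g in grid], axis=-1)

    ref_at_eval = lookup(points).reshape(dose_eval.shape)

    # Where the reference dose is below the cutoff, zero the evaluation dose
    # so gamma is not computed there.
    return np.where(ref_at_eval < cutoff, 0.0, dose_eval)
```

This keeps the grids at their native resolutions and only uses the reference dose to decide which evaluation voxels to suppress.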

Alain Sottiaux

SimonBiggs commented 7 years ago

Hi Alain,

Thank you very much for your feedback. It is greatly appreciated. I tend to justify which projects I spend time on based upon user interest, so thank you.

With regards to masking dose based on DICOM structures, you might find some code I wrote for determining DVHs useful (it hasn't undergone sufficient testing, so be wary): https://github.com/SimonBiggs/dicom-utilities/blob/master/DVH%20from%20dicom.ipynb

You can also feel free to adjust the following code to increase the lower dose cutoff:

lower_dose_cutoff = np.max(dose_reference) * 0.2

With regards to interpolation, the code actually interpolates by default down to a tenth of the distance threshold. The following parameter controls how finely the grid is interpolated:

distance_step_size = distance_threshold / 10

However, the interpolation doesn't solve the whole problem. A relatively large spacing between input grid points may still produce false negatives, because the interpolation is linear and the true dose may have a steeper gradient near the point in question.
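A toy 1-D example (illustrative only, not npgamma code) shows how linear interpolation between coarse samples can misestimate a steep dose falloff:

```python
import numpy as np

# Suppose the true dose falls off like a Gaussian penumbra,
# but we only have it sampled on a coarse 2 mm grid.
def true_dose(x):
    return 100.0 * np.exp(-x ** 2)

coarse_x = np.array([0.0, 2.0])        # coarse sample positions (mm)
coarse_dose = true_dose(coarse_x)      # doses at those positions

# Linear interpolation at the midpoint vs the true value there.
interp_mid = np.interp(1.0, coarse_x, coarse_dose)   # ~50.9
true_mid = true_dose(1.0)                            # ~36.8

# The linearly interpolated dose misses the true dose by ~14 units,
# which is the kind of error that can flip a gamma result.
print(interp_mid, true_mid)
```

A finer input grid shrinks this error, which is why the recommendation to keep the grid spacing well below the distance criterion still applies even with interpolation.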

With regards to speed, if you have sufficient RAM on your computer you may wish to set the following parameter as so:

max_concurrent_calc_points = np.inf

That will make all points for a given distance be calculated simultaneously. That may provide quite the speed boost (or a memory error if your computer is not up to it).

With regards to multithreading, I had actually considered going one step further. I thought multithreading would help, but not as much as pycuda would (https://mathema.tician.de/software/pycuda/). However, until now no user had been interested in speeding the calculation up further. Multithreading actually won't be too difficult, as I have already laid the foundation for splitting up the calculation in order to be compatible with low-RAM systems.
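Since the calculation is already split into chunks of points for low-RAM systems, distributing those chunks across threads could look something like this (a hypothetical sketch, not npgamma's implementation; the per-chunk function is a numeric placeholder):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def gamma_for_chunk(points):
    # Placeholder for the per-point gamma evaluation; a real
    # implementation would search the evaluation grid here.
    return np.sqrt(points)

def threaded_calc(points, num_threads=4):
    """Split the flat array of calculation points into one chunk per
    thread and evaluate the chunks concurrently."""
    chunks = np.array_split(points, num_threads)
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        results = pool.map(gamma_for_chunk, chunks)
    return np.concatenate(list(results))
```

Threads can help here because heavy NumPy array operations release the GIL; for pure-Python per-point work, processes (or a GPU via pycuda) would be needed instead.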

With regards to the gamma pass rate in my example I have the following line:

np.sum(valid_gamma <= 1) / len(valid_gamma)

That should determine the proportion of gamma points which pass.
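Wrapped up as a small helper (a sketch; `valid_gamma` is taken here to mean the gamma values that were actually computed, i.e. excluding the NaNs at voxels skipped by the dose cutoff):

```python
import numpy as np

def gamma_pass_rate(gamma):
    """Proportion of evaluated voxels with gamma <= 1, ignoring voxels
    that were skipped (NaN) because they fell below the dose cutoff."""
    valid_gamma = gamma[~np.isnan(gamma)]
    return np.sum(valid_gamma <= 1) / len(valid_gamma)
```

For example, an array of [0.5, 1.0, 1.5, NaN] would give a pass rate of 2/3, since the NaN voxel is excluded from the denominator.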

Thanks for the feedback. Cheers, Simon

AlainSottiaux commented 7 years ago

Hi Simon,

Thank you for your answer and suggestions. I hadn't noticed the passing-rate calculation in your example (so I had recalculated it myself).

I tried changing max_concurrent_calc_points:

max_concurrent_calc_points = np.inf

Gamma calculation time goes from approximately 515 s to 435 s (~15% improvement). Memory consumption increases from ~1.2 GB to ~3.5 GB.

We already tried increasing the cutoff dose (up to 30 or 35%), because we have such doses outside the patient in our evaluation dose. It's difficult to evaluate the appropriate cutoff level, which is why we tried to "clean" the evaluation dose instead. Your example using contours looks clear; we will try to use it to restrict the dose comparison to a given volume. We may also try to add the contours and/or the CT to the plots of dose/gamma values.

Best regards,

Alain

SimonBiggs commented 7 years ago

15% improvement in time isn't much. I'll see if I can get multithreading up and running in the near future. I'll keep you up to date.


robmarkcole commented 7 years ago

Hi guys, a piece of code I found useful for DVH analysis is dicompyler-core https://github.com/dicompyler/dicompyler-core

My analysis is here https://github.com/robmarkcole/Useful-python-for-medical-physics/blob/master/Experiments%20in%20ipython%20notebooks/dicompyler-core%20experiments/AXB%20project/Lung%20plan%2030%20exported%2011-10-2016/Case_30%20Lung%20PTV%2031-10-2016.ipynb

Cheers Robin

SimonBiggs commented 7 years ago

That is nice. Thanks Robin.

SimonBiggs commented 7 years ago

Hi @AlainSottiaux,

I've just implemented multithreading. Going from 1 core up to 4 cores achieved an approximately one-third reduction in calculation time on my machine. The new parameter is called "num_threads"; to use it you will need to upgrade to npgamma 0.6.0. See the updated example for how to use it:

http://nbviewer.jupyter.org/github/SimonBiggs/npgamma/blob/master/Module%20usage%203D.ipynb

Cheers, Simon