faustomilletari / VNet

GNU General Public License v3.0

how to train from end to end? #4

Closed. yuyinzhou closed this issue 8 years ago.

yuyinzhou commented 8 years ago

Hello,

How big is your volume data? Are all your volumes the same size, 128x128x64? How many 128x128x64 volumes are you dealing with? I am also doing binary segmentation on a medical CT dataset, but my volumes are 512x512x(200-1000), i.e. the third dimension differs from case to case. So I am wondering whether, if I feed a whole volume to the network, the memory will just blow up.
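For a rough sense of scale (back-of-the-envelope arithmetic of mine, not a figure from the thread):

```python
# Rough input-memory estimate for one full-resolution CT volume (float32).
# Illustrative only: network activations typically take many times this.
voxels = 512 * 512 * 1000          # worst case from the question above
gib = voxels * 4 / 2**30           # 4 bytes per float32 voxel
print(f"{gib:.2f} GiB")            # ~0.98 GiB for the raw input alone
```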

faustomilletari commented 8 years ago

Hello, we could process at most 128x128x128 volumes, one at a time. The code contains the functions needed to resample volumetric data to 1) a common resolution (for example, 1 mm voxels) and 2) a common size (for example, 128x128x128).
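Roughly, those two steps look like the following SimpleITK sketch. This is a minimal illustration under my own assumptions: the function name, defaults, and the file name `case_001.mhd` are made up here, not this repo's exact API, and the output grid is simply anchored at the input origin (any centering or cropping logic is omitted).

```python
import SimpleITK as sitk

def resample_to_reference(path, spacing=(1.0, 1.0, 1.0), size=(128, 128, 128)):
    # Minimal sketch: resample a volume to a fixed isotropic spacing
    # and a fixed grid size. The output grid starts at the input origin,
    # so it covers a region of size*spacing (here 128 mm per side).
    img = sitk.ReadImage(path)
    resampler = sitk.ResampleImageFilter()
    resampler.SetInterpolator(sitk.sitkLinear)  # sitkNearestNeighbor for label maps
    resampler.SetOutputSpacing(spacing)
    resampler.SetSize(size)
    resampler.SetOutputOrigin(img.GetOrigin())
    resampler.SetOutputDirection(img.GetDirection())
    return resampler.Execute(img)

vol = resample_to_reference("case_001.mhd")      # hypothetical file name
print(vol.GetSize(), vol.GetSpacing())           # (128, 128, 128) (1.0, 1.0, 1.0)
```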

Fausto Milletarì Sent from my iPhone


yuyinzhou commented 8 years ago

Thanks!

Sent from my iPhone
