NifTK / NiftyNet

[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
http://niftynet.io
Apache License 2.0

Why are the preprocessing layers deepcopied? #428

Closed: fepegar closed this issue 5 years ago

fepegar commented 5 years ago

I'd like to keep track of the random augmentation parameters applied to each individual sample. Every time the reader is called, the preprocessing layers are deepcopied, so the information I want is never stored on the original layer instances.

Why are these layers deepcopied before being randomised?
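To illustrate what I mean, here is a minimal self-contained sketch of the pattern (the layer and reader names are illustrative, not NiftyNet's actual API): the randomised parameters only ever live on the throwaway copies, so the original layer never sees them.

```python
import copy
import random


class RandomFlipLayer:
    """Illustrative random augmentation layer (hypothetical names)."""

    def __init__(self):
        self.flip_axis = None  # the parameter I would like to inspect later

    def randomise(self):
        self.flip_axis = random.choice([0, 1, 2])

    def __call__(self, image):
        # apply the flip along self.flip_axis (omitted for brevity)
        return image


def read_sample(image, preprocessors):
    # The reader deepcopies the layers before randomising them,
    # so the sampled parameters end up only on the throwaway copies.
    layers = copy.deepcopy(preprocessors)
    for layer in layers:
        layer.randomise()
        image = layer(image)
    return image


flip = RandomFlipLayer()
sample = read_sample("image", [flip])
print(flip.flip_axis)  # still None: the original layer was never randomised
```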

fepegar commented 5 years ago

It seems to work fine without the deepcopy operation.

wyli commented 5 years ago

Hi @fepegar, because that function can be called from multiple threads simultaneously, the deepcopy ensures that each call randomises its own copy of the layers rather than a shared instance.
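A minimal sketch of the concern (illustrative names, not NiftyNet's code): if several threads shared one layer instance, one thread could re-randomise the parameters between another thread's randomise() call and its use of them; deepcopying gives each call a private copy.

```python
import copy
import random
import threading
import time


class RandomRotationLayer:
    """Illustrative augmentation layer shared by several reader threads."""

    def __init__(self):
        self.angle = None

    def randomise(self):
        self.angle = random.uniform(-10, 10)

    def __call__(self, sample_id):
        time.sleep(0.001)  # without a copy, another thread may re-randomise here
        return sample_id, self.angle


shared_layer = RandomRotationLayer()
results = []


def worker(sample_id, use_deepcopy):
    # With deepcopy each thread randomises its own private copy, so concurrent
    # calls cannot overwrite the parameters another thread is about to apply.
    layer = copy.deepcopy(shared_layer) if use_deepcopy else shared_layer
    layer.randomise()
    results.append(layer(sample_id))


threads = [threading.Thread(target=worker, args=(i, True)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # each sample is paired with the angle actually used for it
```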

fepegar commented 5 years ago

Thanks @wyli
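For completeness, one workaround that keeps the deepcopy (and so stays thread-safe) while still recording the applied parameters is to log them from inside randomise() into a shared, lock-protected structure. Module-level state is not duplicated by deepcopy, so every working copy writes to the same log. A minimal sketch under those assumptions (not NiftyNet's API):

```python
import copy
import random
import threading

# Thread-safe log shared by every copy of the layer; each entry records the
# parameters that were actually sampled (hypothetical approach).
_log_lock = threading.Lock()
augmentation_log = []


class RandomScaleLayer:
    def __init__(self):
        self.scale = None

    def randomise(self):
        self.scale = random.uniform(0.9, 1.1)
        with _log_lock:
            augmentation_log.append(
                {'layer': type(self).__name__, 'scale': self.scale})

    def __call__(self, image):
        return image  # scaling omitted for brevity


original = RandomScaleLayer()
working_copy = copy.deepcopy(original)  # stands in for the reader's deepcopy
working_copy.randomise()
print(augmentation_log)  # the parameters survive even though only the copy was randomised
```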