NifTK / NiftyNet

[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
http://niftynet.io
Apache License 2.0

Modify docs about smaller_final_batch_mode #377

Open fepegar opened 5 years ago

fepegar commented 5 years ago

The smaller_final_batch_mode parameter is ignored during training because the dataset is effectively infinite (samples are yielded by a generator), so a smaller final batch never occurs and there is nothing to drop. During inference, users will rarely want to specify whether to use pad or dynamic.
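
For reference, a minimal stand-alone sketch (not NiftyNet code; the `batches` helper and the batch/window sizes are made up for illustration) of why the setting only matters for a finite sampling stream:

```python
# Hypothetical illustration of pad / dynamic / drop semantics.
import itertools

def batches(samples, batch_size, mode="pad"):
    """Group `samples` into batches, handling a smaller final batch per `mode`."""
    batch = []
    for s in samples:
        batch.append(s)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # a smaller final batch only exists for a *finite* iterable
        if mode == "drop":
            return                                            # discard it
        if mode == "pad":
            batch += [batch[-1]] * (batch_size - len(batch))  # pad to full size
        yield batch                                           # "dynamic": smaller batch

# Training: an infinite generator never reaches the final-batch branch,
# so the mode has no observable effect.
training_stream = itertools.count()          # stands in for the sampling generator
first_batches = itertools.islice(batches(training_stream, 4), 3)
print([len(b) for b in first_batches])       # [4, 4, 4] regardless of mode

# Inference: a finite list of 10 windows with batch size 4.
windows = list(range(10))
for mode in ("pad", "dynamic", "drop"):
    print(mode, [len(b) for b in batches(windows, 4, mode)])
# pad     [4, 4, 4]
# dynamic [4, 4, 2]
# drop    [4, 4]
```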

Should this parameter be completely removed from the training config and hardcoded to pad for inference?

@wyli