When we run into errors like:

"EOFError: Ran out of input"
"RuntimeError: unexpected EOF, expected 31322553 more bytes. The file might be corrupted."
"requests.exceptions.SSLError: HTTPSConnectionPool(host='zenodo.org', port=443): Max retries exceeded with url: /record/3518331/files/best_weights_ep143.npz?download=1 (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (ssl.c:1129)')))"

we often fix them with `rm -r ~/.tractseg`. This does make the errors go away, but it forces us to re-download the same pretrained weights (~140 MB each) again for every subject. For a single subject, seven different weight files are downloaded (about 980 MB in total):
```
Loading weights from: /home/w/.tractseg/pretrained_weights_tract_segmentation_v3.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_endings_segmentation_v4.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part1_v2.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part1_v2.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part2_v2.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part3_v2.npz
Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part4_v2.npz
Downloading pretrained weights (~140MB) ...
```
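For now, instead of wiping the whole cache, we could delete only the files that are actually corrupted, so the intact downloads are kept and re-used across subjects. Here is a minimal sketch (the helper name `prune_corrupt_weights` is our own, not part of TractSeg); it assumes the cached weights are ordinary `.npz` zip archives under `~/.tractseg`:

```python
import zipfile
from pathlib import Path

def prune_corrupt_weights(cache_dir: Path = Path.home() / ".tractseg") -> None:
    """Delete only the corrupted .npz weight files instead of the whole cache."""
    for npz_path in cache_dir.glob("*.npz"):
        try:
            # .npz files are zip archives; testzip() checks every member's
            # CRC and returns the first bad one (or None if all are intact).
            with zipfile.ZipFile(npz_path) as archive:
                bad_member = archive.testzip()
            if bad_member is not None:
                raise zipfile.BadZipFile(f"corrupt member: {bad_member}")
        except (zipfile.BadZipFile, OSError, EOFError):
            print(f"Removing corrupted weights file: {npz_path}")
            npz_path.unlink()

if __name__ == "__main__":
    prune_corrupt_weights()
```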
In batch TractSeg processing this consumes significant network traffic, time, and energy. Can the program be optimized to reduce the time and network traffic, or is there a better solution to this problem?
Thank you.
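One workaround we are considering for batch jobs: keep one known-good copy of all seven `.npz` files on local or shared storage, and seed `~/.tractseg` from it before each run, so TractSeg never needs to reach zenodo.org again. A sketch under that assumption (`MIRROR_DIR`, its path, and `seed_weights_cache` are hypothetical names of ours):

```python
import shutil
from pathlib import Path

MIRROR_DIR = Path("/data/tractseg_weights")   # assumed local mirror of the weights
CACHE_DIR = Path.home() / ".tractseg"

def seed_weights_cache() -> None:
    """Copy known-good weight files into the TractSeg cache before a run."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    for src in MIRROR_DIR.glob("*.npz"):
        dst = CACHE_DIR / src.name
        # Copy only if the cached file is missing or its size differs
        # (e.g. a truncated download left over from a failed run).
        if not dst.exists() or dst.stat().st_size != src.stat().st_size:
            shutil.copy2(src, dst)

if __name__ == "__main__":
    seed_weights_cache()
```

With this approach the network is touched only once, when the mirror itself is first populated.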
When we solve the Errors like: "EOFError: Ran out of input" "RuntimeError: unexpected EOF, expected 31322553 more bytes. The file might be corrupted." "_requests.exceptions.SSLError: HTTPSConnectionPool(host='zenodo.org', port=443): Max retries exceeded with url: /record/3518331/files/best_weights_ep143.npz?download=1 (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (ssl.c:1129)')))" We often use----“rm -r ~/.tractseg” This does solve the problem of reporting errors,but we have to download same pretrainedweights(~140MB)_data for different subjects frequently. When we treat one subject, we have to download different_pretrainedweights(~140MB)_data seven times (aboat 980M): __Loading weights from: /home/w/.tractseg/pretrained_weights_tract_segmentation_v3.npz Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_endings_segmentation_v4.npz Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part1_v2.npz Downloading pretrained weights (~140MB) ..
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part1_v2.npz Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part2_v2.npz Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part3_v2.npz Downloading pretrained weights (~140MB) ...
Loading weights from: /home/w/.tractseg/pretrained_weights_peak_regression_part4_v2.npz Downloading pretrained weights (~140MB) ...__
In batch_TractSeg_data processing, this will consume significant network traffic, time and energy. Can you optimize the program for optimal time and network traffic consumption? or can we have a better other solution for the problem. Thank you.