Zyun-Y / DconnNet

Codes for CVPR2023 paper "Directional Connectivity-based Segmentation of Medical Images"

Error #26

Closed: OopsXie closed this issue 9 months ago

OopsXie commented 10 months ago

[screenshot of the error] Hello sir, I can't resolve this error. Please help me.

jzyxn commented 10 months ago

It looks like you don't have the dataset ISIC2018_npy_all_224_320.

OopsXie commented 10 months ago

> It looks like you don't have the dataset ISIC2018_npy_all_224_320.

I downloaded the dataset from the link given on the author's homepage ("The resized data we used and the implementation of training in our paper follows this site"), but the names of the files in it are not consecutive.

jzyxn commented 10 months ago

> Data preparation: We cropped the ISIC 2018 dataset to 224*320 and saved it in npy format, which can be downloaded from Baidu web disk.
>
> link: https://pan.baidu.com/s/1bIVUdzYG_7tuwalbI4Y8Ww
>
> password: c36c

The author put the complete data there; it seems your folder "data/ISIC2018_npy_all_224_320" is not complete.

OopsXie commented 10 months ago


Thank you for your answer. I downloaded my dataset from this Baidu web disk link, and I don't know why it reports missing files after running.

Can you run this dataset, and could you show me your final training results?

jzyxn commented 10 months ago

[screenshot of training results] Training is rather time-consuming (my GPU resources are limited), so I'll just show you the results of one epoch.

If you downloaded the dataset from this Baidu web disk link, you can try configuring the environment as follows: [screenshots of environment setup]

OopsXie commented 10 months ago


Thank you very much. Here's my debugging process:

1. I first downloaded the code for this GitHub project and imported it to the server.
2. I downloaded the ISIC2018_npy_all_224_320 dataset and put it in the data directory.
3. I modified the ISIC script file in the scrip folder.
4. Finally, I ran isic.sh via `bash isic.sh` (which of course installed some missing dependencies).

How did you do this? Did you modify any other files?

jzyxn commented 10 months ago

There are no additional operations if you run it in Colab. And my downloaded dataset indeed contains ISIC_0010344.np. I think you should check the dataset and then configure the environment according to the errors reported.
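Before configuring the environment, it may help to confirm the download itself is complete. A minimal sketch for such a check, assuming the folder name from the error message and the total of 2594 files mentioned later in this thread (neither comes from the repository itself):

```python
import os

# Hypothetical sanity check: count the .npy files actually present
# in the dataset folder and compare against the expected total.
# Folder name and the 2594 figure are assumptions from this thread.
def check_dataset(folder="data/ISIC2018_npy_all_224_320", expected=2594):
    if not os.path.isdir(folder):
        return f"folder not found: {folder}"
    n = sum(1 for f in os.listdir(folder) if f.endswith(".npy"))
    return f"{n}/{expected} .npy files present"

print(check_dataset())
```

If the count on the server differs from the one on Colab, the copied data (not the code) is the likely culprit.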

jzyxn commented 10 months ago


The dataset has 2594 items in all.

jzyxn commented 10 months ago

I downloaded it again; it has no missing items.

OopsXie commented 10 months ago

Thank you very much, I will try it again in Colab.

OopsXie commented 10 months ago


I finished it, thank you. I can run it successfully on Colab, but not on my server, which is strange. Colab: [screenshot of successful run]

My server: still showing missing files. [screenshot of error]

jzyxn commented 10 months ago


Sorry, it's really hard for me to find the reason. I also hit an error when running it in Kaggle, a problem with the apex package, but it seems different from yours.

OopsXie commented 10 months ago


Yes, the apex version issue happened to me as well. Anyway, I really appreciate you helping me and giving me very useful instructions. Thanks again.