gevero / enet_tensorflow

Enet implementation in tensorflow
14 stars 3 forks

process_label function not found #2

Closed khg2478 closed 3 years ago

khg2478 commented 3 years ago

Hello.

Thank you for the tf2 version of enet implementation.

I wanted to test the training code, but I found that there is no "process_label" function, which is called on line 78 of train.py.

I thought that might be the "map_label" function in the utils.py file. Am I right?

Due to restrictions at my company, I tried to train with cityscapes_train.sh, but because of this function issue (or something else) I wasn't able to run the training code.

I will try the CamVid dataset in a different environment, but I would like to run the code with Cityscapes too.

How can I fix this?

Thanks.

gevero commented 3 years ago

Hi.

I will try to look into this issue today and get back to you.

Best

Giovanni

khg2478 commented 3 years ago

Thank you so much. Please let me know. :)

gevero commented 3 years ago

Hi

So:

  • the call to process_label is wrong, and it needs to be replaced with map_label in train.py
  • it is also necessary to remove the class_weight=class_weights line from the fit commands, since class weights for 3D tensors are no longer supported in tf.keras
  • I actually never tried the code on the Cityscapes dataset (I did not have the resources), and only put in place a tentative shell script. Once you have downloaded and set up the dataset, you can try the shell script after the modifications I suggested and see where it gets you.
  • You can try the model on Colab with the CamVid dataset; the notebook should work out of the box.

Best

Giovanni
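The two fixes discussed in this thread (swapping process_label for map_label, and dropping the class_weight= argument) can be sketched as follows. This is a minimal NumPy stand-in, not the repo's actual code: ID_MAP, IGNORE_ID, and CLASS_WEIGHTS are made-up example values, and make_sample_weight is a hypothetical helper illustrating the usual workaround of turning per-class weights into a per-pixel sample_weight array, which model.fit() does accept for segmentation targets.

```python
import numpy as np

# Hypothetical stand-in for utils.map_label: remap raw label IDs in a
# segmentation mask to contiguous training IDs (the repo's real table differs).
ID_MAP = {7: 0, 8: 1, 11: 2}   # e.g. road, sidewalk, building
IGNORE_ID = 255                # anything unmapped becomes the "ignore" class

def map_label(mask):
    out = np.full_like(mask, IGNORE_ID)
    for raw_id, train_id in ID_MAP.items():
        out[mask == raw_id] = train_id
    return out

# Workaround for the unsupported class_weight= argument: expand per-class
# weights into a per-pixel weight map, then call
# model.fit(x, y, sample_weight=w) instead of passing class_weight=.
CLASS_WEIGHTS = {0: 1.0, 1: 2.5, 2: 0.75}   # example values only

def make_sample_weight(train_ids):
    w = np.zeros(train_ids.shape, dtype=np.float32)
    for cls, cw in CLASS_WEIGHTS.items():
        w[train_ids == cls] = cw
    return w

mask = np.array([[7, 8], [11, 99]])   # 99 is not a known class
ids = map_label(mask)                 # [[0 1] [2 255]]
weights = make_sample_weight(ids)     # [[1.0 2.5] [0.75 0.0]]
```

Note that unmapped pixels end up with weight 0.0, so the ignore class is effectively excluded from the loss.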

khg2478 commented 3 years ago


Thank you so much for looking into this so promptly and for the suggestions. I tested your code with the CamVid dataset in a Jupyter notebook environment and found that it works fine.

I really appreciate your help and the neat code; it's been very useful to me.

Have a great day.

Best,

Hangil