Closed mohammadali-67 closed 2 years ago
Hello,
As you will notice, there is an environment.yml file in the root of the GitHub repository. Change into that directory and run "conda env create -f environment.yml" to automatically install the correct versions of the Python libraries. After the necessary libraries are installed, you can start exploring the code. To do so, place the data downloaded from Zenodo inside the "data" folder, following the structure shown here (https://github.com/marine-debris/marine-debris.github.io#dataset-structure). If you need further help, please let me know. Could you please share the error you received?
Thanks for your reply. I followed your instructions and received the following error:
Solving environment: failed
ResolvePackageNotFound:
I've tried to install the libkml package, but it did not fix the error.
As I can see from https://anaconda.org/conda-forge/libkml/files, the package "libkml==1.3.0=h9859afa1013" is broken. However, when I try a clean installation of the MARIDA libraries using "conda env create -f environment.yml", the package "libkml-1.3.0-h9859afa1014" is actually used. Did you try "conda update conda" before installing the environment? Let me know.
I updated conda as you suggested (and double-checked) but unfortunately received the same error again. Is there a way I can install this package manually?
I suggest installing these libraries manually (https://github.com/marine-debris/marine-debris.github.io#installation-requirements) using "conda install <package name>", then trying to run the script you are interested in. If a script fails due to a missing package, you can install that package at that point. Regarding your previous question, you can use "conda install -c conda-forge libkml" to install the libkml library manually. However, I think libkml will be installed automatically when you install the gdal library.
Hi,
Usually, when I test environments across different OSs I get the PackageNotFound error. Can you provide the environment using the export command: conda env export --no-builds > environment.yml
This unties the environment's packages from OS-specific build strings. Maybe this can be a solution.
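For reference, --no-builds keeps each package's version but drops the build string (the OS-specific hash suffix that caused the libkml mismatch above). The entries below are a hypothetical illustration of what the exported file looks like, not the actual MARIDA environment:

```yaml
name: marida
channels:
  - conda-forge
dependencies:
  - python=3.8     # version kept; build string (e.g. =h9859...) dropped
  - gdal=3.0.2
  - libkml=1.3.0
```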
Hi @EmanuelCastanho ,
I attached the new environment.yml file based on your suggestions. @mohammadali-67, let me know if this one solves your issue.
Thank you
So, neither environment worked for me. I am installing the packages manually, but I am stuck on the U-Net because of this problem: https://stackoverflow.com/q/64772335/9136912
This ordered list of package installations works up until running the U-Net's python train.py:
conda install -c conda-forge gdal
pip install pandas
pip install tqdm
pip install tables
pip install torch
pip install torchvision
pip install tensorboard
pip install setuptools==59.5.0
pip install -U scikit-learn
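After a manual install like the list above, a quick way to see which packages are still missing is to check their import specs. A small sketch (note the import names differ from the install names: gdal imports as osgeo, scikit-learn as sklearn, pytables as tables):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of import names that cannot be found."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Import names for the packages installed above
required = ["osgeo", "pandas", "tqdm", "tables",
            "torch", "torchvision", "tensorboard", "sklearn"]
print(missing_packages(required))  # empty list means everything is importable
```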
-> I will try changing num_workers to 0.
Update:
Using python train.py after changing the default values of --num_workers (to 0), --prefetch_factor (to 2) and --persistent_workers (to False) seems to work on macOS. However, the following warning appears and I don't know how much it affects the final results: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use zero_division parameter to control this behavior.
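The warning comes from scikit-learn's precision computation: when a class is never predicted, its precision has a zero denominator. A minimal sketch of the behavior and the zero_division parameter (toy labels, not MARIDA data):

```python
from sklearn.metrics import precision_score

y_true = [0, 1, 1, 2]
y_pred = [0, 0, 0, 0]  # classes 1 and 2 are never predicted

# Without zero_division, sklearn emits the warning above and sets the
# precision of classes 1 and 2 to 0.0; zero_division=0 keeps the same
# value but silences the warning.
macro_p = precision_score(y_true, y_pred, average="macro", zero_division=0)
print(macro_p)  # mean of per-class precisions: (1/4 + 0 + 0) / 3
```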
Is anyone else having this problem?
Great @EmanuelCastanho! This warning appears because, during training, the provided baseline U-Net model may not predict any samples for one of the classes. For instance, as stated here (https://doi.org/10.1371/journal.pone.0262247.t004), the U-Net model predicts only a few of the annotated Natural Organic Material (NatM) pixels.
Thanks @gkakogeorgiou! I installed the required packages manually and could run the model, thanks a lot! I have two questions regarding the U-Net. 1) I saw that in the Classification Mask files (files ending with '_cl.tif') there are lots of pixels with the value zero, meaning they have no labels. How can the U-Net learn to classify these pixels? I also saw that in the U-Net you subtract 1 from the classification mask data. Why is that? 2) The output of the U-Net is different from the Classification Mask files. Shouldn't the model produce results similar to the ground truth data, or is this related to the so-called weakly supervised semantic segmentation? Thank you so much.
Nice @mohammadali-67!
We subtract 1 in order to use standard Python indexing (the 1st class is labeled as 0). Additionally, we use the "ignore_index=-1" argument so that supervision is applied only to the annotated pixels (https://github.com/marine-debris/marine-debris.github.io/blob/main/semantic_segmentation/unet/train.py#L157).
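In other words, after the shift the unlabeled pixels (value 0 in the '_cl.tif' mask) become -1 and are skipped by the loss. A small numpy sketch of that bookkeeping (toy mask values, not the actual MARIDA classes):

```python
import numpy as np

mask = np.array([[0, 3],
                 [1, 0]])            # toy '_cl.tif' values; 0 = unlabeled

labels = mask.astype(np.int64) - 1   # shift to 0-based classes; unlabeled -> -1

# ignore_index=-1 in the loss has the same effect as this boolean mask:
supervised = labels != -1            # only annotated pixels contribute
print(labels.tolist(), int(supervised.sum()))
```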
The U-Net should produce similar results; however, weakly supervised semantic segmentation in the aquatic environment is difficult. As stated in the paper, the Random Forest model achieves better and more stable results, because the U-Net model is not the best choice for this pixel-level marine debris task. Although we provide a deep learning baseline, we believe the research community should also focus on other deep learning architectures tailored to this task. There are some ideas in the "Discussion and challenges" section.
Would you please tell me the exact way to install it? Where should I move the downloaded files? I ran the script (conda env create -f environment.yml) in the Anaconda Prompt, but I received an error.
Your help is appreciated.