gomezzz opened this issue 5 months ago
(You should not try the model again yet :) )
Please add an environment.yml with dependencies so users can set up their conda envs
I am trying to add the yaml file, but the one exported from my env has too many packages in it :) . I'm afraid I will have to remove many of them manually.
Please add instructions on where to put the dataset.
I can put some instructions in the readme (to download the dataset zip from Hugging Face), right?
In yolo_sam_inference.ipynb you load some model, but this is not provided with the repo, I think? So I don't think this code will work?
The models are already there in the subfolders but not the checkpoints. I will add some checkpoints later.
The example is missing code that can be used to measure the performance in mAP on the dataset. Could you add it? :)
I am not sure what "example" means. For the mAP performance, there is the yolov8_sam.ipynb that trains SAM with bboxes from YOLO and also gives the mAP and some visualisation later. I kept the files like .ipynb because the training there is fast (compared to SAM alone), and also it is easier to see metrics.
There is also the idea of having an entry point, but I think it depends on which entry point exactly. For training YOLO and SAM alone, I have some shell job files that call the training scripts; for training SAM using predicted YOLO bboxes and then computing mAPs, there is yolov8_sam.ipynb. So I don't have a single entry point for everything. Should I also write something in the Readme about training the models separately and together? Or I could put the YOLO and SAM training in the same file and keep separate files for separate training?
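(Side note, in case a quick standalone check is ever useful: torchmetrics, which is already in the dependencies, can compute box mAP directly. This is only a rough sketch with dummy tensors, not the notebook's actual code:)
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision()  # box mAP; pass iou_type="segm" and masks for segmentation metrics

# One image: predicted boxes (xyxy) with scores and labels, plus the matching ground truth.
preds = [dict(
    boxes=torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
    scores=torch.tensor([0.9]),
    labels=torch.tensor([0]),
)]
targets = [dict(
    boxes=torch.tensor([[12.0, 8.0, 48.0, 52.0]]),
    labels=torch.tensor([0]),
)]

metric.update(preds, targets)
print(metric.compute()["map_50"])  # mAP@0.5 over all accumulated images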
I have added the environment yaml and changed many paths :) . Can you please clone the model repo and see if the inference works? You can also try the training files and let me know whether there are any errors.
@IuliaElisa unfortunately, we only have very limited LFS storage and bandwidth. Ideally, let's aim to have all model checkpoints and data on huggingface and only download here, if possible? :) Otherwise, the project is taking up 25% of the entire storage we have :smiling_face_with_tear: (Also in the process of reducing the other one)
Sure, I will put the weights on huggingface (I think we discussed that, but I haven't put them there yet).
I added the checkpoints on huggingface; I hope the space usage has decreased significantly. I also added download_dataset_and_weights.ipynb to download the dataset and the weights.
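(For reference, fetching a single checkpoint programmatically boils down to something like this; the repo id and filename below are placeholders, the notebook has the actual values:)
from huggingface_hub import hf_hub_download

# Placeholder repo id and filename -- replace with the values used in download_dataset_and_weights.ipynb.
checkpoint_path = hf_hub_download(
    repo_id="<user>/<xami-weights>",
    filename="yolo_checkpoint.pt",
)
print(checkpoint_path)  # local path of the cached download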
Should we maybe rename this repo to XAMI-Model or something? I am worried people might find it when they look for the dataset and be confused that the data is not in this repo.
Please remove the prefix: /opt/conda/envs/xami_env line in the environment.yml
@IuliaElisa as suspected, for me the environment.yml doesn't work at the moment because it specifies too many packages; I get:
Could not solve for environment specs
The following packages are incompatible
├─ _libgcc_mutex ==0.1 conda_forge does not exist (perhaps a typo or a missing channel);
├─ _openmp_mutex ==4.5 2_gnu does not exist (perhaps a typo or a missing channel);
├─ alsa-lib ==1.2.10 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ aom ==3.7.1 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ attr ==2.5.1 h166bdaf_1 does not exist (perhaps a typo or a missing channel);
├─ attrs ==23.2.0 pyh71513ae_0 does not exist (perhaps a typo or a missing channel);
├─ brotli-bin ==1.1.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ brotli ==1.1.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ bzip2 ==1.0.8 hd590300_5 does not exist (perhaps a typo or a missing channel);
├─ c-ares ==1.25.0 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ ca-certificates ==2024.2.2 hbcca054_0 does not exist (perhaps a typo or a missing channel);
├─ cairo ==1.18.0 h3faef2a_0 does not exist (perhaps a typo or a missing channel);
├─ cattrs ==23.2.3 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ contourpy ==1.2.0 py311h9547e67_0 does not exist (perhaps a typo or a missing channel);
├─ dav1d ==1.2.1 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ dbus ==1.13.6 h5008d03_3 does not exist (perhaps a typo or a missing channel);
├─ docstring-to-markdown ==0.13 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ exceptiongroup ==1.2.0 pyhd8ed1ab_2 does not exist (perhaps a typo or a missing channel);
├─ expat ==2.5.0 hcb278e6_1 does not exist (perhaps a typo or a missing channel);
├─ ffmpeg ==6.1.1 gpl_h186bccc_100 does not exist (perhaps a typo or a missing channel);
├─ font-ttf-ubuntu ==0.83 h77eed37_1 does not exist (perhaps a typo or a missing channel);
├─ fontconfig ==2.14.2 h14ed4e7_0 does not exist (perhaps a typo or a missing channel);
├─ fonttools ==4.47.2 py311h459d7ec_0 does not exist (perhaps a typo or a missing channel);
├─ freeglut ==3.2.2 hac7e632_2 does not exist (perhaps a typo or a missing channel);
├─ freetype ==2.12.1 h267a509_2 does not exist (perhaps a typo or a missing channel);
├─ fribidi ==1.0.10 h36c2ea0_0 does not exist (perhaps a typo or a missing channel);
├─ gettext ==0.21.1 h27087fc_0 does not exist (perhaps a typo or a missing channel);
├─ glib-tools ==2.78.3 hfc55251_0 does not exist (perhaps a typo or a missing channel);
├─ glib ==2.78.3 hfc55251_0 does not exist (perhaps a typo or a missing channel);
├─ gmp ==6.3.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ gnutls ==3.7.9 hb077bed_0 does not exist (perhaps a typo or a missing channel);
├─ graphite2 ==1.3.13 h58526e2_1001 does not exist (perhaps a typo or a missing channel);
├─ gst-plugins-base ==1.22.8 h8e1006c_1 does not exist (perhaps a typo or a missing channel);
├─ gstreamer ==1.22.8 h98fc4e7_1 does not exist (perhaps a typo or a missing channel);
├─ harfbuzz ==8.3.0 h3d44ed6_0 does not exist (perhaps a typo or a missing channel);
├─ hdf5 ==1.14.3 nompi_h4f84152_100 does not exist (perhaps a typo or a missing channel);
├─ icu ==73.2 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ jasper ==4.1.2 he6dfbbe_0 does not exist (perhaps a typo or a missing channel);
├─ jedi-language-server ==0.41.2 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ jedi ==0.19.1 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ keyutils ==1.6.1 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ kiwisolver ==1.4.5 py311h9547e67_1 does not exist (perhaps a typo or a missing channel);
├─ krb5 ==1.21.2 h659d440_0 does not exist (perhaps a typo or a missing channel);
├─ lame ==3.100 h166bdaf_1003 does not exist (perhaps a typo or a missing channel);
├─ lcms2 ==2.16 hb7c19ff_0 does not exist (perhaps a typo or a missing channel);
├─ ld_impl_linux-64 ==2.40 h41732ed_0 does not exist (perhaps a typo or a missing channel);
├─ lerc ==4.0.0 h27087fc_0 does not exist (perhaps a typo or a missing channel);
├─ libabseil ==20230802.1 cxx17_h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libaec ==1.1.2 h59595ed_1 does not exist (perhaps a typo or a missing channel);
├─ libass ==0.17.1 h8fe9dca_1 does not exist (perhaps a typo or a missing channel);
├─ libblas ==3.9.0 20_linux64_openblas does not exist (perhaps a typo or a missing channel);
├─ libbrotlicommon ==1.1.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ libbrotlidec ==1.1.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ libbrotlienc ==1.1.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ libcap ==2.69 h0f662aa_0 does not exist (perhaps a typo or a missing channel);
├─ libcblas ==3.9.0 20_linux64_openblas does not exist (perhaps a typo or a missing channel);
├─ libclang13 ==15.0.7 default_ha2b6cf4_4 does not exist (perhaps a typo or a missing channel);
├─ libcups ==2.3.3 h4637d8d_4 does not exist (perhaps a typo or a missing channel);
├─ libcurl ==8.5.0 hca28451_0 does not exist (perhaps a typo or a missing channel);
├─ libdeflate ==1.19 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libdrm ==2.4.114 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ libedit ==3.1.20191231 he28a2e2_2 does not exist (perhaps a typo or a missing channel);
├─ libev ==4.33 hd590300_2 does not exist (perhaps a typo or a missing channel);
├─ libevent ==2.1.12 hf998b51_1 does not exist (perhaps a typo or a missing channel);
├─ libexpat ==2.5.0 hcb278e6_1 does not exist (perhaps a typo or a missing channel);
├─ libffi ==3.4.2 h7f98852_5 does not exist (perhaps a typo or a missing channel);
├─ libflac ==1.4.3 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libgcc-ng ==13.2.0 h807b86a_3 does not exist (perhaps a typo or a missing channel);
├─ libgcrypt ==1.10.3 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libgfortran-ng ==13.2.0 h69a702a_3 does not exist (perhaps a typo or a missing channel);
├─ libgfortran5 ==13.2.0 ha4646dd_3 does not exist (perhaps a typo or a missing channel);
├─ libglib ==2.78.3 h783c2da_0 does not exist (perhaps a typo or a missing channel);
├─ libglu ==9.0.0 hac7e632_1003 does not exist (perhaps a typo or a missing channel);
├─ libgomp ==13.2.0 h807b86a_3 does not exist (perhaps a typo or a missing channel);
├─ libgpg-error ==1.47 h71f35ed_0 does not exist (perhaps a typo or a missing channel);
├─ libiconv ==1.17 hd590300_2 does not exist (perhaps a typo or a missing channel);
├─ libidn2 ==2.3.4 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ libjpeg-turbo ==3.0.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ liblapack ==3.9.0 20_linux64_openblas does not exist (perhaps a typo or a missing channel);
├─ liblapacke ==3.9.0 20_linux64_openblas does not exist (perhaps a typo or a missing channel);
├─ libllvm15 ==15.0.7 hb3ce162_4 does not exist (perhaps a typo or a missing channel);
├─ libnghttp2 ==1.58.0 h47da74e_1 does not exist (perhaps a typo or a missing channel);
├─ libnsl ==2.0.1 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libogg ==1.3.4 h7f98852_1 does not exist (perhaps a typo or a missing channel);
├─ libopenblas ==0.3.25 pthreads_h413a1c8_0 does not exist (perhaps a typo or a missing channel);
├─ libopencv ==4.9.0 py311hbe74fbb_5 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-auto-batch-plugin ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-auto-plugin ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-hetero-plugin ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-intel-cpu-plugin ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-intel-gpu-plugin ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-ir-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-onnx-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-paddle-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-pytorch-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-tensorflow-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino-tensorflow-lite-frontend ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopenvino ==2023.2.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libopus ==1.3.1 h7f98852_1 does not exist (perhaps a typo or a missing channel);
├─ libpciaccess ==0.17 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ libpng ==1.6.39 h753d276_0 does not exist (perhaps a typo or a missing channel);
├─ libpq ==16.1 h33b98f1_7 does not exist (perhaps a typo or a missing channel);
├─ libprotobuf ==4.24.4 hf27288f_0 does not exist (perhaps a typo or a missing channel);
├─ libsndfile ==1.2.2 hc60ed4a_1 does not exist (perhaps a typo or a missing channel);
├─ libsqlite ==3.44.2 h2797004_0 does not exist (perhaps a typo or a missing channel);
├─ libssh2 ==1.11.0 h0841786_0 does not exist (perhaps a typo or a missing channel);
├─ libstdcxx-ng ==13.2.0 h7e041cc_3 does not exist (perhaps a typo or a missing channel);
├─ libsystemd0 ==255 h3516f8a_0 does not exist (perhaps a typo or a missing channel);
├─ libtasn1 ==4.19.0 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ libtiff ==4.6.0 ha9c0a0a_2 does not exist (perhaps a typo or a missing channel);
├─ libunistring ==0.9.10 h7f98852_0 does not exist (perhaps a typo or a missing channel);
├─ libuuid ==2.38.1 h0b41bf4_0 does not exist (perhaps a typo or a missing channel);
├─ libuv ==1.46.0 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libva ==2.20.0 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libvorbis ==1.3.7 h9c3ff4c_0 does not exist (perhaps a typo or a missing channel);
├─ libvpx ==1.13.1 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ libwebp-base ==1.3.2 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ libxcb ==1.15 h0b41bf4_0 does not exist (perhaps a typo or a missing channel);
├─ libxcrypt ==4.4.36 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ libxkbcommon ==1.6.0 hd429924_1 does not exist (perhaps a typo or a missing channel);
├─ libxml2 ==2.12.3 h232c23b_0 does not exist (perhaps a typo or a missing channel);
├─ libzlib ==1.2.13 hd590300_5 does not exist (perhaps a typo or a missing channel);
├─ lsprotocol ==2023.0.1 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ lz4-c ==1.9.4 hcb278e6_0 does not exist (perhaps a typo or a missing channel);
├─ matplotlib-base ==3.8.2 py311h54ef318_0 does not exist (perhaps a typo or a missing channel);
├─ matplotlib ==3.8.2 py311h38be061_0 does not exist (perhaps a typo or a missing channel);
├─ mpg123 ==1.32.4 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ mysql-common ==8.0.33 hf1915f5_6 does not exist (perhaps a typo or a missing channel);
├─ mysql-libs ==8.0.33 hca2cd23_6 does not exist (perhaps a typo or a missing channel);
├─ ncurses ==6.4 h59595ed_2 does not exist (perhaps a typo or a missing channel);
├─ nettle ==3.9.1 h7ab15ed_0 does not exist (perhaps a typo or a missing channel);
├─ nodejs ==18.19.0 hb753e55_0 does not exist (perhaps a typo or a missing channel);
├─ nspr ==4.35 h27087fc_0 does not exist (perhaps a typo or a missing channel);
├─ nss ==3.96 h1d7d5a4_0 does not exist (perhaps a typo or a missing channel);
├─ numpy ==1.26.3 py311h64a7726_0 does not exist (perhaps a typo or a missing channel);
├─ ocl-icd-system ==1.0.0 1 does not exist (perhaps a typo or a missing channel);
├─ ocl-icd ==2.3.1 h7f98852_0 does not exist (perhaps a typo or a missing channel);
├─ opencv ==4.9.0 py311hf4f9f61_5 does not exist (perhaps a typo or a missing channel);
├─ openh264 ==2.4.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ openjpeg ==2.5.0 h488ebb8_3 does not exist (perhaps a typo or a missing channel);
├─ openssl ==3.2.1 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ p11-kit ==0.24.1 hc5aa10d_0 does not exist (perhaps a typo or a missing channel);
├─ packaging ==23.2 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ pcre2 ==10.42 hcad00b1_0 does not exist (perhaps a typo or a missing channel);
├─ pillow ==10.2.0 py311ha6c5da5_0 does not exist (perhaps a typo or a missing channel);
├─ pixman ==0.43.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ pthread-stubs ==0.4 h36c2ea0_1001 does not exist (perhaps a typo or a missing channel);
├─ pugixml ==1.14 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ pulseaudio-client ==16.1 hb77b528_5 does not exist (perhaps a typo or a missing channel);
├─ py-opencv ==4.9.0 py311hbeafb31_5 does not exist (perhaps a typo or a missing channel);
├─ pygls ==1.3.0 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ pyparsing ==3.1.1 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ pyqt5-sip ==12.12.2 py311hb755f60_5 does not exist (perhaps a typo or a missing channel);
├─ pyqt ==5.15.9 py311hf0fb5b6_5 does not exist (perhaps a typo or a missing channel);
├─ python ==3.11.7 hab00c5b_1_cpython does not exist (perhaps a typo or a missing channel);
├─ python_abi ==3.11 4_cp311 does not exist (perhaps a typo or a missing channel);
├─ qt-main ==5.15.8 h450f30e_18 does not exist (perhaps a typo or a missing channel);
├─ readline ==8.2 h8228510_1 does not exist (perhaps a typo or a missing channel);
├─ sip ==6.7.12 py311hb755f60_0 does not exist (perhaps a typo or a missing channel);
├─ snappy ==1.1.10 h9fff704_0 does not exist (perhaps a typo or a missing channel);
├─ svt-av1 ==1.8.0 h59595ed_0 does not exist (perhaps a typo or a missing channel);
├─ tbb ==2021.7.0 h924138e_0 does not exist (perhaps a typo or a missing channel);
├─ tk ==8.6.13 noxft_h4845f30_101 does not exist (perhaps a typo or a missing channel);
├─ typing-extensions ==4.9.0 hd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ typing_extensions ==4.9.0 pyha770c72_0 does not exist (perhaps a typo or a missing channel);
├─ wheel ==0.42.0 pyhd8ed1ab_0 does not exist (perhaps a typo or a missing channel);
├─ x264 ==1!164.3095 h166bdaf_2 does not exist (perhaps a typo or a missing channel);
├─ x265 ==3.5 h924138e_3 does not exist (perhaps a typo or a missing channel);
├─ xcb-util-image ==0.4.0 h8ee46fc_1 does not exist (perhaps a typo or a missing channel);
├─ xcb-util-keysyms ==0.4.0 h8ee46fc_1 does not exist (perhaps a typo or a missing channel);
├─ xcb-util-renderutil ==0.3.9 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ xcb-util-wm ==0.4.1 h8ee46fc_1 does not exist (perhaps a typo or a missing channel);
├─ xcb-util ==0.4.0 hd590300_1 does not exist (perhaps a typo or a missing channel);
├─ xkeyboard-config ==2.40 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-fixesproto ==5.0 h7f98852_1002 does not exist (perhaps a typo or a missing channel);
├─ xorg-inputproto ==2.3.2 h7f98852_1002 does not exist (perhaps a typo or a missing channel);
├─ xorg-kbproto ==1.0.7 h7f98852_1002 does not exist (perhaps a typo or a missing channel);
├─ xorg-libice ==1.1.1 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libsm ==1.2.4 h7391055_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libx11 ==1.8.7 h8ee46fc_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxau ==1.0.11 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxdmcp ==1.1.3 h7f98852_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxext ==1.3.4 h0b41bf4_2 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxfixes ==5.0.3 h7f98852_1004 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxi ==1.7.10 h7f98852_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-libxrender ==0.9.11 hd590300_0 does not exist (perhaps a typo or a missing channel);
├─ xorg-renderproto ==0.11.1 h7f98852_1002 does not exist (perhaps a typo or a missing channel);
├─ xorg-xextproto ==7.3.0 h0b41bf4_1003 does not exist (perhaps a typo or a missing channel);
├─ xorg-xf86vidmodeproto ==2.3.1 h7f98852_1002 does not exist (perhaps a typo or a missing channel);
├─ xorg-xproto ==7.0.31 h7f98852_1007 does not exist (perhaps a typo or a missing channel);
├─ xz ==5.2.6 h166bdaf_0 does not exist (perhaps a typo or a missing channel);
├─ zlib ==1.2.13 hd590300_5 does not exist (perhaps a typo or a missing channel);
└─ zstd ==1.5.5 hfc55251_0 does not exist (perhaps a typo or a missing channel).
I believe if you run
conda env export --from-history
you only get the manually installed packages, not their deps? :)
Hi, I have now added another yaml file. You can try it and run some notebooks/scripts. (It's possible that I didn't cover all deps...)
(I tried to use conda env export --from-history, but this gave only a few packages, so I also used the pip requirements and moved them into the conda deps where possible.)
Nope, doesn't work for me with the pinned versions:
Could not solve for environment specs
The following packages are incompatible
├─ astropy 6.0.0** does not exist (perhaps a typo or a missing channel);
├─ jedi-language-server is installable with the potential options
│ ├─ jedi-language-server 0.21.0 would require
│ │ └─ jedi 0.17.2 with the potential options
│ │ ├─ jedi 0.17.2 would require
│ │ │ └─ python >=2.7,<2.8.0a0 but there are no viable options
│ │ │ ├─ python [2.7.12|2.7.13|2.7.14|2.7.15] would require
│ │ │ │ └─ vc [9.* |>=9,<10.0a0 ], which conflicts with any installable versions previously reported;
│ │ │ └─ python [2.7.13|2.7.14|...|2.7.18] conflicts with any installable versions previously reported;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.10.* *_cp310, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.6.* *_cp36m, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.7.* *_cp37m, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.7 *_pypy37_pp73, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.8.* *_cp38, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ └─ jedi 0.17.2 conflicts with any installable versions previously reported;
│ ├─ jedi-language-server [0.22.0|0.23.0|...|0.34.8] would require
│ │ └─ jedi 0.18.0 with the potential options
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ └─ jedi 0.18.0 conflicts with any installable versions previously reported;
│ ├─ jedi-language-server [0.34.10|0.34.11|...|0.36.0] would require
│ │ └─ importlib-metadata >=3.10.0,<4 with the potential options
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.6.* *_cp36m, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.7.* *_cp37m, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.8.* *_cp38, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ └─ importlib-metadata 3.10.0 conflicts with any installable versions previously reported;
│ └─ jedi-language-server [0.36.1|0.37.0] would require
│ └─ importlib-metadata >=3.10.1,<4 , which can be installed (as previously explained);
├─ matplotlib 3.8.2** does not exist (perhaps a typo or a missing channel);
├─ numpy 1.26.4** is uninstallable because it conflicts with any installable versions previously reported;
├─ pandas 2.2.2** does not exist (perhaps a typo or a missing channel);
├─ pillow 10.3.0** is uninstallable because it conflicts with any installable versions previously reported;
├─ python 3.11** is uninstallable because there are no viable options
│ ├─ python 3.11.0 would require
│ │ └─ python_abi 3.11.* *_cp311, which conflicts with any installable versions previously reported;
│ └─ python [3.11.0|3.11.2|...|3.11.9] conflicts with any installable versions previously reported;
├─ scipy 1.13.0** is uninstallable because it conflicts with any installable versions previously reported;
└─ tqdm 4.66.1** does not exist (perhaps a typo or a missing channel).
Did it work to create the env on your machine with mamba env create -f environment.yaml and the provided file? (Or conda, although that is slow...)
It worked to create the env using conda env create -f environment.yaml on my machine. Are you also on Linux?
For me, the command with mamba doesn't work.
I also don't understand why it cannot install py311.
I moved some packages to the pip deps. Can you try again with that? I am not sure it helps, but maybe those packages cannot be installed without pip in your case.
Are you also on Linux?
Currently testing on a Windows machine.
For me, the command with mamba doesn't work.
mamba is a much faster drop-in replacement for conda ( https://github.com/conda-forge/miniforge )
I also don't understand why it cannot install py311.
You are using an outdated version of astropy and pinning it to 6.0.0. This causes a cascade of dependencies, some of which are not compatible with py311, I think?
If I remove the pinned versions, like this, I can install it:
name: xami_model_env
channels:
- conda-forge
- defaults
dependencies:
- python
- numpy
- pandas
- scipy
- matplotlib
- astropy
- Pillow
- tqdm
- pip
- jedi-language-server
- nodejs=18
- pip:
- albumentations==1.3.1
- datasets==2.18.0
- huggingface_hub==0.20.3
- opencv_python_headless==4.8.0.74
- panoptes_client==1.6.1
- photutils==1.10.0
- pycocotools==2.0.7
- pyparsing==3.1.2
- PyYAML==6.0.1
- segment_anything==1.0
- scikit-image
- supervision==0.20.0
- sympy==1.12
- tabulate==0.9.0
- timm==0.9.12
- torch==2.1.2
- torchmetrics==1.3.1
- torchvision==0.16.2
- ultralytics==8.1.37
- ipywidgets==8.1.2
And does the command complete, and can you activate the env?
It is interesting, because for the dataset repo I had these explicit versions:
name: xami_dataset_env
channels:
- conda-forge
- defaults
- pytorch
dependencies:
- astropy==6.0.0
- matplotlib==3.8.2
- numpy==1.26.4
- Pillow==10.3.0
- pandas==2.2.2
- datasets==2.18.0
- jupyter==1.0.0
- ipywidgets==8.1.1
- jinja2==3.1.3
- tabulate==0.9.0
- pip:
- opencv-python-headless==4.8.0.74
- pycocotools==2.0.7
- huggingface-hub==0.23.0
But maybe I didn't add an explicit python version, and that's why it worked... though in my env I used py 311 for both the dataset and the model.
I can install the deps with your version of the file, except that it doesn't work for me without the explicit py 311 version.
And does the command complete, and can you activate the env?
Yes, will test now if I can run the code.
But maybe I didn't add an explicit python version, and that's why it worked... though in my env I used py 311 for both the dataset and the model.
Yes, I think the py version was the deal-breaker in the end.
Had to install ipykernel and pywavelets for the notebooks.
I can run the download notebook successfully now.
In the yolo_sam one the first cell fails with:
File D:/Arbeit/Code/XAMI/inference/YoloSamPipeline.py
     13 import sys
     14 sys.path.append('/workspace/raid/OM_DeepLearning/XAMI/mobile_sam')
---> 15 from mobile_sam import sam_model_registry, SamPredictor#, build_efficientvit_l2_encoder
     17 class YoloSam:
     18     def __init__(self, device, yolo_checkpoint, sam_checkpoint, model_type='vit_t', use_yolo_masks=True):
ImportError: cannot import name 'sam_model_registry' from 'mobile_sam' (unknown location)
You seem to have some hardcoded paths in there?
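Something along these lines would avoid it (just a sketch; it assumes mobile_sam sits one level above YoloSamPipeline.py, so adjust the number of .parent calls if the layout differs):
from pathlib import Path
import sys

# Resolve the repo root relative to this file instead of a machine-specific absolute path.
REPO_ROOT = Path(__file__).resolve().parent.parent
sys.path.append(str(REPO_ROOT / "mobile_sam"))

from mobile_sam import sam_model_registry, SamPredictor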
Yes, I had to change that. I have now modified it using os.getcwd(). I also modified the yaml file; please check whether it matches yours. (I added the py version, though.)
Yes, I had to change that. I have now modified it using os.getcwd(). I also modified the yaml file; please check whether it matches yours. (I added the py version, though.)
Is there a reason you are putting all these things in pip? Mixing pip and conda can cause many problems, see
https://stackoverflow.com/a/56141684 https://www.anaconda.com/blog/using-pip-in-a-conda-environment https://discuss.python.org/t/pip-conda-compatibility/24375
TL;DR: best practice, and the fewest problems, is to install as much as possible via conda and only fall back to pip for packages that are not available on the conda channels.
And I am back to
The following packages are incompatible
├─ jedi-language-server is installable with the potential options
│ ├─ jedi-language-server 0.21.0 would require
│ │ └─ jedi 0.17.2 with the potential options
│ │ ├─ jedi 0.17.2 would require
│ │ │ └─ python >=2.7,<2.8.0a0 but there are no viable options
│ │ │ ├─ python [2.7.12|2.7.13|2.7.14|2.7.15] would require
│ │ │ │ └─ vc [9.* |>=9,<10.0a0 ], which conflicts with any installable versions previously reported;
│ │ │ └─ python [2.7.13|2.7.14|...|2.7.18] conflicts with any installable versions previously reported;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.10.* *_cp310, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.6.* *_cp36m, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.7.* *_cp37m, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.7 *_pypy37_pp73, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.8.* *_cp38, which can be installed;
│ │ ├─ jedi [0.17.2|0.18.0] would require
│ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ └─ jedi 0.17.2 conflicts with any installable versions previously reported;
│ ├─ jedi-language-server [0.22.0|0.23.0|...|0.34.8] would require
│ │ └─ jedi 0.18.0 with the potential options
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ ├─ jedi [0.17.2|0.18.0], which can be installed (as previously explained);
│ │ └─ jedi 0.18.0 conflicts with any installable versions previously reported;
│ ├─ jedi-language-server [0.34.10|0.34.11|...|0.36.0] would require
│ │ └─ importlib-metadata >=3.10.0,<4 with the potential options
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.6.* *_cp36m, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.7.* *_cp37m, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.8.* *_cp38, which can be installed;
│ │ ├─ importlib-metadata [3.10.0|3.10.1] would require
│ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ └─ importlib-metadata 3.10.0 conflicts with any installable versions previously reported;
│ └─ jedi-language-server [0.36.1|0.37.0] would require
│ └─ importlib-metadata >=3.10.1,<4 , which can be installed (as previously explained);
└─ python 3.11** is uninstallable because there are no viable options
├─ python 3.11.0 would require
│ └─ python_abi 3.11.* *_cp311, which conflicts with any installable versions previously reported;
└─ python [3.11.0|3.11.2|...|3.11.9] conflicts with any installable versions previously reported.
E.g. this works
name: xami_model_env
channels:
- conda-forge
- defaults
dependencies:
- numpy
- pandas
- scipy
- matplotlib
- astropy
- Pillow
- tqdm
- pip
- jedi-language-server
- nodejs
- albumentations
- datasets
- huggingface_hub
- photutils
- pycocotools
- pyparsing
- PyYAML
- scikit-image
- sympy
- tabulate
- timm
- pytorch
- torchmetrics
- torchvision
- pywavelets
- ipywidgets
- wandb
- pip:
- ultralytics
- segment_anything
- opencv_python_headless
- panoptes_client
- supervision
I still get
File D:/Arbeit/Code/XAMI/inference/YoloSamPipeline.py
     12 import os
     14 sys.path.append(os.getcwd()+'/mobile_sam')
---> 15 from mobile_sam import sam_model_registry, SamPredictor#, build_efficientvit_l2_encoder
     17 class YoloSam:
     18     def __init__(self, device, yolo_checkpoint, sam_checkpoint, model_type='vit_t', use_yolo_masks=True):
ImportError: cannot import name 'sam_model_registry' from 'mobile_sam' (unknown location)
That is interesting :)
E.g. this works
In my case, it doesn't work due to opencv_python_headless.
14 sys.path.append(os.getcwd()+'/mobile_sam')
For me, this works. The mobile_sam folder is in the XAMI directory; that is why I used os.getcwd(). Maybe Windows handles paths differently. Is it using '\' or '/' for paths? That would make a huge difference...
I changed how the path is built in some locations; maybe that will work.
Also, I don't know if you have the dataset already, but you should use download_dataset_and_weights.ipynb to download it and the weights.
In my case, it doesn't work due to opencv_python_headless.
Huh, but that is still installed via pip?
For me, this works. The mobile_sam folder is in the XAMI directory; that is why I used os.getcwd(). Maybe Windows handles paths differently. Is it using '\' or '/' for paths? That would make a huge difference...
I changed how the path is built in some locations; maybe that will work.
No, I tried manually setting the path as well; it didn't work.
The problem here is that modifying the user's path is more of a hack, too.
Ideally, if you have code in multiple folders you could use a module structure, see e.g. here
https://github.com/aidotse/PASEOS/tree/main/
All the code is contained in the folder paseos
which has an __init__.py that takes care of exposing the submodules.
And inside the module you can then have relative imports, like this:
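A rough sketch of what that could look like here (the package and file names below are assumptions, not the repo's current layout):
# Assumed layout:
#
#   xami_model/
#       __init__.py
#       inference/
#           __init__.py
#           yolo_sam_pipeline.py
#       mobile_sam/
#           __init__.py
#
# xami_model/__init__.py exposes the public pieces:
from .inference.yolo_sam_pipeline import YoloSam

# and inside inference/yolo_sam_pipeline.py a relative import replaces the sys.path hack:
from ..mobile_sam import sam_model_registry, SamPredictor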
Alternatively, you can turn the different subfolders into individual packages that can be installed locally with a setup.py, see here https://github.com/aidotse/PASEOS/blob/main/setup.py ; this allows local installation via pip install .
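A minimal setup.py for that could be as small as this (the package name is an assumption):
from setuptools import setup, find_packages

setup(
    name="xami-model",          # assumed name
    version="0.1.0",
    packages=find_packages(),   # picks up every folder that has an __init__.py
)
After that, pip install -e . in the repo root makes the code importable from any notebook or script without touching sys.path.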
@IuliaElisa I made a small google colab so we can test in the same env.
You can see there that it currently cannot build the environment on a Linux machine.
https://colab.research.google.com/drive/1SO9ptrrbgIOWDMPMHjeliHFZ4-_B4gkC?usp=sharing
(due to some dependencies requiring Py310 I think)
PS: I don't think you have to mention Python at all in the environment.yml since it is a dependency of the other packages
Huh, but that is still installed via pip?
Yes, for me the current env yaml file in the repo worked, where opencv was in pip.
The problem here is that modifying the user's path is more of a hack, too.
Ideally, if you have code in multiple folders you could use a module structure, see e.g. here
https://github.com/aidotse/PASEOS/tree/main/
All the code is contained in the folder paseos
which has an __init__.py that takes care of exposing the submodules.
And inside the module you can then have relative imports like this
Unfortunately, the only option I found for notebooks was to append the path. For scripts that I run individually but that import some modules from the package, I run the script as a module. I will try the setup.py method.
This yaml worked for me in colab:
name: xami_model_env
channels:
- conda-forge
- defaults
dependencies:
- python
- numpy
- pandas
- scipy
- matplotlib
- astropy
- Pillow
- tqdm
- pip
- jedi-language-server
- nodejs=18
- pip:
- albumentations==1.3.1
- datasets==2.18.0
- huggingface_hub==0.20.3
- opencv_python_headless==4.8.0.74
- panoptes_client==1.6.1
- photutils==1.10.0
- pycocotools==2.0.7
- pyparsing==3.1.2
- PyYAML==6.0.1
- segment_anything==1.0
- scikit-image
- supervision==0.20.0
- sympy==1.12
- tabulate==0.9.0
- timm==0.9.12
- torch==2.1.2
- torchmetrics==1.3.1
- torchvision==0.16.2
- ultralytics==8.1.37
- ipywidgets==8.1.2
- wandb==0.16.2
- pywavelets==1.6.0
But at the end I get:
#
# To activate this environment, use
#
# $ conda activate base
#
# To deactivate an active environment, use
#
# $ conda deactivate
This yaml worked for me in colab
Yup, with that one, yes. Could you please still update the environment.yml in the repo, though? The current default with py311 does not work.
Testing again locally now.
Unfortunately, locally I now have problems with pip again... I would still really suggest following best practices and using pip as little as possible, as other people will otherwise run into the same problems later on. I will try to install the packages manually...
Maybe we can use two yaml files, one for Linux and one for Windows? Or even an entirely separate version :))
I didn't consider Windows when developing the project.
Yeah, sorry, but I am giving up on Windows...
But I am also struggling in colab again, because of things being split across two notebooks. I am trying to merge them into one, but I am getting path errors again https://colab.research.google.com/drive/1SO9ptrrbgIOWDMPMHjeliHFZ4-_B4gkC#scrollTo=AGN65cHH_Am8
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-20-ccc535128a44> in <cell line: 6>()
4 # set here the absolute path to the directory
5 sys.path.append(os.getcwd()+'/XAMI-dataset')
----> 6 from xami_dataset import XAMIDataset
7
8 # Download the dataset
ModuleNotFoundError: No module named 'xami_dataset'
Hi,
It works after adding import xami_dataset before that line:
sys.path.append(os.getcwd()+'/XAMI-dataset')
import xami_dataset
from xami_dataset import XAMIDataset
I managed to run the notebook and also try some training for one epoch (from train_yolo_sam.ipynb):
(this is a copy of your notebook, I didn't have the permission to save it) https://colab.research.google.com/drive/1KoLnT586HL170i17mj1PZ2RZO_0Hj_F-?usp=sharing
One observation: in colab, using a batch size of 8 together with the focal loss leads to OOM errors.
Either a batch of 4, or a batch of 8 without the focal loss (line 105 in loss_utils.py), passes.
I managed to run the notebook and also try some training for one epoch (from train_yolo_sam.ipynb):
Perfect!
One observation: in colab, using a batch size of 8 together with the focal loss leads to OOM errors.
Yea memory is probably limited there.
Nice work!
Hi,
Thanks for your help!
Just wanted to re-check some aspects quickly:
But I am also struggling in colab again, because of things being split across two notebooks. I am trying to merge them into one, but I am getting path errors again https://colab.research.google.com/drive/1SO9ptrrbgIOWDMPMHjeliHFZ4-_B4gkC#scrollTo=AGN65cHH_Am8
Should I merge the notebooks?
Also, I tried to run the env yaml on Unix and it only works with the explicit python version. So currently, the yaml file in the repo is quite restricted to Linux users only... I will try to find a solution for that.
Should I merge the notebooks?
I think it would be a good idea, yes. It is always best to have a single starting point for users (especially since running one otherwise requires the other).
Also, I tried to run the env yaml on Unix and it only works with the explicit python version. So currently, the yaml file in the repo is quite restricted to Linux users only... I will try to find a solution for that.
Typically you should not need a python version in the environment.yml at all; conda will automatically pick a fitting one for you. As mentioned, the majority of the problems likely stem from the fact that you are pinning versions and using pip. Conda then cannot pick fitting versions itself, which makes it impossible to create the environment in some configurations.
Some suggestions for usability:
but most people will not have more than one GPU (if any). I would remove those lines.