Closed brando90 closed 1 year ago
# cd to pytorch-meta-dataset and meta-dataset, our forks
pip3 freeze > requirements_patrick_mds.txt
@patricks-lab
# cd div
pip3 freeze > requirements_patrick_mds.txt
try this:
pip freeze --exclude-editable > requirements.txt
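For context on why `--exclude-editable` helps: editable installs (the forks installed with `pip install -e`) show up in `pip freeze` output as `-e git+...` lines, which usually can't be reinstalled from a plain requirements file on another machine. A minimal sketch of the same filtering, using made-up package lines and `grep` instead of the flag:

```shell
# Fake freeze output for illustration: one editable install mixed with
# normal pinned packages (names and versions are hypothetical).
printf '%s\n' \
  'numpy==1.23.0' \
  '-e git+https://github.com/user/repo.git#egg=pkg' \
  'torch==1.12.0' > /tmp/raw_freeze.txt

# Drop the "-e ..." lines, keeping only reinstallable pins.
grep -v '^-e ' /tmp/raw_freeze.txt > /tmp/requirements_clean.txt
cat /tmp/requirements_clean.txt
```

`pip freeze --exclude-editable` does this filtering for you in one step.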
What we need is to be able to build a separate conda env for anyone who wants to reproduce our stuff. So for that, here is a starter script:
# - create conda script for mds
conda update -n base -c defaults conda
conda create -n mds_env_gpu python=3.9
conda activate mds_env_gpu
pip install -r $HOME/diversity-for-predictive-success-of-meta-learning/requirements_patrick_mds.txt
cd $HOME/diversity-for-predictive-success-of-meta-learning
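After the install finishes, a quick smoke test can catch missing packages before you spend time on dataset conversion. A hedged sketch; the module list here is a stdlib stand-in, and in the real env you would check the actual dependencies (e.g. tensorflow, torch, gin):

```shell
# Try importing a list of modules and fail fast if any is missing.
# Replace the stand-in stdlib modules with the env's real dependencies.
python3 - <<'EOF'
import importlib
for mod in ["json", "pathlib"]:  # stand-ins; use tensorflow, torch, etc.
    importlib.import_module(mod)
print("imports ok")
EOF
```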
then try to build the tfrecords for a few files and see if it works (maybe locally, so you don't screw up your vision cluster setup)
@patricks-lab does this make sense? Let me know if you have questions.
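As a concrete starting point for "build the tfrecords for a few files", here is a sketch of a launcher for converting just one small dataset first. The module path and flag names follow upstream meta-dataset's conversion tool and may differ in the forks, and all data paths are hypothetical; check the fork's README before running:

```shell
# Write a single-dataset conversion launcher (flags/paths are assumptions,
# based on upstream meta-dataset's convert_datasets_to_records tool).
cat > /tmp/convert_one.sh <<'EOF'
#!/bin/bash
set -euo pipefail
python -m meta_dataset.dataset_conversion.convert_datasets_to_records \
  --dataset=cu_birds \
  --cu_birds_data_root="$HOME/data/CUB_200_2011" \
  --splits_root="$HOME/splits" \
  --records_root="$HOME/records"
EOF
chmod +x /tmp/convert_one.sh
```

Converting one dataset end to end validates the environment much faster than launching the full conversion.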
So I initially tried re-installing using the requirements_patrick_mds.txt I generated today, and it looks like I'm getting quite a lot of errors in a fresh py3.9 conda environment. Instead, I created a new requirements list from the existing libraries' requirements, installed it in a fresh py3.9 environment, and verified that at least a couple of smaller datasets fully install (e.g. all tfrecords exist and the dataspec file is there).
This shorter requirement list worked for me: https://github.com/brando90/diversity-for-predictive-success-of-meta-learning/blob/main/req_mds_essentials.txt
A sample revised script for installing using the new requirements and getting aircraft/cu-birds to work: https://github.com/brando90/diversity-for-predictive-success-of-meta-learning/blob/main/mds_install_v2.sh
hope that helps!
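The "all tfrecords exist, dataspec file is there" check above can be scripted. A minimal sketch, assuming the common meta-dataset layout of per-class `.tfrecords` files plus a `dataset_spec.json` in each dataset directory (the paths below are made up for the demo):

```shell
# Report whether a converted dataset directory looks complete:
# at least one .tfrecords file and a dataset_spec.json.
check_dataset() {
  dir="$1"
  n=$(find "$dir" -name '*.tfrecords' | wc -l)
  if [ "$n" -gt 0 ] && [ -f "$dir/dataset_spec.json" ]; then
    echo "OK: $dir"
  else
    echo "MISSING: $dir"
  fi
}

# Demo against a fake directory; point it at $RECORDS/aircraft etc. for real.
mkdir -p /tmp/records/aircraft
touch /tmp/records/aircraft/0.tfrecords /tmp/records/aircraft/dataset_spec.json
check_dataset /tmp/records/aircraft
```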
@patricks-lab rather than me asking you about every package version: once you have working code, can you create a requirements_patrick.txt file that pins exactly the versions that are working for you?
e.g. https://stackoverflow.com/a/33468993/1601580
Then please create a pull request for those two files. Thanks! @patricks-lab
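The PR workflow being requested can be sketched as the usual branch-commit-push sequence. The branch name and commit message below are hypothetical, and the demo runs in a throwaway repo so it is safe to try anywhere:

```shell
# Sketch of preparing the two files on a branch for a pull request
# (branch/commit names are made up; the demo uses a temp repo).
cd "$(mktemp -d)"
git init -q .
git config user.email you@example.com
git config user.name "You"
touch requirements_patrick.txt mds_install_v2.sh
git checkout -q -b patrick-requirements
git add requirements_patrick.txt mds_install_v2.sh
git commit -q -m "Add working MDS requirements and install script"
git log --oneline
# In the real repo, follow with:
#   git push origin patrick-requirements
# then open the pull request on GitHub.
```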