ZhangYuanhan-AI / NOAH

[TPAMI] Searching prompt modules for parameter-efficient transfer learning.
MIT License
224 stars 11 forks

dataset #20

Open guoliangq opened 1 year ago

guoliangq commented 1 year ago

Hello, I can't download the VTAB dataset following your configuration. Could you send me a copy of the dataset? My email is 1971733261@qq.com. Could you also provide the versions of the following packages: tensorflow, tensorflow-addons, tensorflow-metadata, tensorflow-datasets, and tfds-nightly?

Maystern commented 1 year ago

I have the same problem. The requirements listed in your requirement.txt do not seem sufficient to load the dataset; I would also need the versions of the packages above.

ZhangYuanhan-AI commented 1 year ago

What error is raised, exactly?

Maystern commented 1 year ago

Here are my logs; the process seems to hang. I suspect it is a version issue with the packages six, TensorFlow, tensorflow-datasets, tensorflow-addons, and Pillow (these are the only additional packages I installed).

python get_vtab1k.py
2023-08-09 04:09:57.816914: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-09 04:09:57.859487: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-09 04:09:58.665901: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages/tensorflow_addons/utils/tfa_eol_msg.py:23: UserWarning: 

TensorFlow Addons (TFA) has ended development and introduction of new features.
TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024.
Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). 

For more information see: https://github.com/tensorflow/addons/issues/2807 

  warnings.warn(
2023-08-09 04:09:59.725324: W tensorflow/tsl/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "NOT_FOUND: Could not locate the credentials file.". Retrieving token from GCE failed with "FAILED_PRECONDITION: Error executing an HTTP request: libcurl code 6 meaning 'Couldn't resolve host name', error details: Could not resolve host: metadata.google.internal".
2023-08-09 04:11:01.969729: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001244 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:12:05.570709: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001114 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:13:11.994750: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.00135 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
Maystern commented 1 year ago

(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH$ cd data/vtab-source
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
Traceback (most recent call last):
  File "get_vtab1k.py", line 1, in <module>
    from task_adaptation.data import caltech
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/caltech.py", line 21, in <module>
    import task_adaptation.data.base as base
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/base.py", line 23, in <module>
    import six
ModuleNotFoundError: No module named 'six'
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install six
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting six
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/d9/5a/e7c31adbe875f2abbb91bd84cf2dc52d792b5a01506781dbcf25c91daf11/six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: six
Successfully installed six-1.16.0
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
Traceback (most recent call last):
  File "get_vtab1k.py", line 1, in <module>
    from task_adaptation.data import caltech
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/caltech.py", line 21, in <module>
    import task_adaptation.data.base as base
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/base.py", line 24, in <module>
    import tensorflow.compat.v1 as tf
ModuleNotFoundError: No module named 'tensorflow'
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install tensorflow
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting tensorflow
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/88/cc/14affad82a28640974ff2a37c177269a0a088266104ddf76a01eac21c9ac/tensorflow-2.13.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (524.1 MB)
Collecting absl-py>=1.0.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/dd/87/de5c32fa1b1c6c3305d576e299801d8655c175ca9557019906247b994331/absl_py-1.4.0-py3-none-any.whl (126 kB)
Collecting astunparse>=1.6.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/2b/03/13dde6512ad7b4557eb792fbcf0c653af6076b81e5941d36ec61f7ce6028/astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting flatbuffers>=23.1.21 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/6f/12/d5c79ee252793ffe845d58a913197bfa02ae9a0b5c9bc3dc4b58d477b9e7/flatbuffers-23.5.26-py2.py3-none-any.whl (26 kB)
Collecting gast<=0.4.0,>=0.2.1 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b6/48/583c032b79ae5b3daa02225a675aeb673e58d2cb698e78510feceb11958c/gast-0.4.0-py3-none-any.whl (9.8 kB)
Collecting google-pasta>=0.1.1 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a3/de/c648ef6835192e6e2cc03f40b19eeda4382c49b5bafb43d88b931c4c74ac/google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting grpcio<2.0,>=1.24.3 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/73/11/9201e5fad4db2fb8866de0cd434aa5b81870370ae4ea2c3a00b19bc6351e/grpcio-1.56.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.2 MB)
Collecting h5py>=2.9.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ae/3d/7d396c8be3ed11a0301d303af20a6dcb367d8ed78b4779de9e4962193303/h5py-3.9.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.8 MB)
Collecting keras<2.14,>=2.13.1 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/2e/f3/19da7511b45e80216cbbd9467137b2d28919c58ba1ccb971435cb631e470/keras-2.13.1-py3-none-any.whl (1.7 MB)
Collecting libclang>=13.0.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ea/df/55525e489c43f9dbb6c8ea27d8a567b3dcd18a22f3c45483055f5ca6611d/libclang-16.0.6-py2.py3-none-manylinux2010_x86_64.whl (22.9 MB)
Collecting numpy<=1.24.3,>=1.22 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/8b/d9/814a619ab84d8eb0d95e08d4c723e665f1e694b5a6068ca505a61bdc3745/numpy-1.24.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.3 MB)
Collecting opt-einsum>=2.3.2 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/bc/19/404708a7e54ad2798907210462fd950c3442ea51acc8790f3da48d2bee8b/opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting packaging (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ab/c3/57f0601a2d4fe15de7a553c00adbc901425661bf048f2a22dfc500caf121/packaging-23.1-py3-none-any.whl (48 kB)
Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/01/cb/445b3e465abdb8042a41957dc8f60c54620dc7540dbcf9b458a921531ca2/protobuf-4.23.4-cp37-abi3-manylinux2014_x86_64.whl (304 kB)
Requirement already satisfied: setuptools in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow) (68.0.0)
Requirement already satisfied: six>=1.12.0 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow) (1.16.0)
Collecting tensorboard<2.14,>=2.13 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/67/f2/e8be5599634ff063fa2c59b7b51636815909d5140a26df9f02ce5d99b81a/tensorboard-2.13.0-py3-none-any.whl (5.6 MB)
Collecting tensorflow-estimator<2.14,>=2.13.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/72/5c/c318268d96791c6222ad7df1651bbd1b2409139afeb6f468c0f327177016/tensorflow_estimator-2.13.0-py2.py3-none-any.whl (440 kB)
Collecting termcolor>=1.1.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/67/e1/434566ffce04448192369c1a282931cf4ae593e91907558eaecd2e9f2801/termcolor-2.3.0-py3-none-any.whl (6.9 kB)
Collecting typing-extensions<4.6.0,>=3.6.6 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/31/25/5abcd82372d3d4a3932e1fa8c3dbf9efac10cc7c0d16e78467460571b404/typing_extensions-4.5.0-py3-none-any.whl (27 kB)
Collecting wrapt>=1.11.0 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/af/7f/25913aacbe0c2c68e7354222bdefe4e840489725eb835e311c581396f91f/wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (81 kB)
Collecting tensorflow-io-gcs-filesystem>=0.23.1 (from tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/d6/e4/b8816320283b7a4774e53a51a1019a2344503e517d5b698e9db45d3b48c9/tensorflow_io_gcs_filesystem-0.33.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.4 MB)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from astunparse>=1.6.0->tensorflow) (0.38.4)
Collecting google-auth<3,>=1.6.3 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/9c/8d/bff87fc722553a5691d8514da5523c23547f3894189ba03b57592e37bdc2/google_auth-2.22.0-py2.py3-none-any.whl (181 kB)
Collecting google-auth-oauthlib<1.1,>=0.5 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/4a/07/8d9a8186e6768b55dfffeb57c719bc03770cf8a970a074616ae6f9e26a57/google_auth_oauthlib-1.0.0-py2.py3-none-any.whl (18 kB)
Collecting markdown>=2.6.8 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/1a/b5/228c1cdcfe138f1a8e01ab1b54284c8b83735476cb22b6ba251656ed13ad/Markdown-3.4.4-py3-none-any.whl (94 kB)
Collecting requests<3,>=2.21.0 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl (62 kB)
Collecting tensorboard-data-server<0.8.0,>=0.7.0 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/02/52/fb9e51fba47951aabd7a6b25e41d73eae94208ccf62d886168096941a781/tensorboard_data_server-0.7.1-py3-none-manylinux2014_x86_64.whl (6.6 MB)
Collecting werkzeug>=1.0.1 (from tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ba/d6/8040faecaba2feb84e1647af174b3243c9b90c163c7ea407820839931efe/Werkzeug-2.3.6-py3-none-any.whl (242 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth<3,>=1.6.3->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a9/c9/c8a7710f2cedcb1db9224fdd4d8307c9e48cbddc46c18b515fefc0f1abbe/cachetools-5.3.1-py3-none-any.whl (9.3 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.6.3->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/cd/8e/bea464350e1b8c6ed0da3a312659cb648804a08af6cacc6435867f74f8bd/pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.6.3->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/49/97/fa78e3d2f65c02c8e1268b9aba606569fe97f6c8f7c2d74394553347c145/rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.6.3->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/c5/05/c214b32d21c0b465506f95c4f28ccbcba15022e000b043b72b3df7728471/urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib<1.1,>=0.5->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/6f/bb/5deac77a9af870143c684ab46a7934038a53eb4aa975bc0687ed6ca2c610/requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting importlib-metadata>=4.4 (from markdown>=2.6.8->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/cc/37/db7ba97e676af155f5fcb1a35466f446eadc9104e25b83366e8088c9c926/importlib_metadata-6.8.0-py3-none-any.whl (22 kB)
Collecting charset-normalizer<4,>=2 (from requests<3,>=2.21.0->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/cb/e7/5e43745003bf1f90668c7be23fc5952b3a2b9c2558f16749411c18039b36/charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (199 kB)
Collecting idna<4,>=2.5 (from requests<3,>=2.21.0->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/fc/34/3030de6f1370931b9dbb4dad48f6ab1015ab1d32447850b9fc94e60097be/idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3,>=2.21.0->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/4c/dd/2234eab22353ffc7d94e8d13177aaa050113286e93e7b40eae01fbf7c3d9/certifi-2023.7.22-py3-none-any.whl (158 kB)
Collecting MarkupSafe>=2.1.1 (from werkzeug>=1.0.1->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/de/e2/32c14301bb023986dff527a49325b6259cab4ebb4633f69de54af312fc45/MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting zipp>=0.5 (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/8c/08/d3006317aefe25ea79d3b76c9650afabaf6d63d1c8443b236e7405447503/zipp-3.16.2-py3-none-any.whl (7.2 kB)
Collecting pyasn1<0.6.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth<3,>=1.6.3->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/14/e5/b56a725cbde139aa960c26a1a3ca4d4af437282e20b5314ee6a3501e7dfc/pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Collecting oauthlib>=3.0.0 (from requests-oauthlib>=0.7.0->google-auth-oauthlib<1.1,>=0.5->tensorboard<2.14,>=2.13->tensorflow)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/7e/80/cab10959dc1faead58dc8384a781dfbf93cb4d33d50988f7a69f1b7c9bbe/oauthlib-3.2.2-py3-none-any.whl (151 kB)
Installing collected packages: libclang, flatbuffers, zipp, wrapt, urllib3, typing-extensions, termcolor, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorboard-data-server, pyasn1, protobuf, packaging, oauthlib, numpy, MarkupSafe, keras, idna, grpcio, google-pasta, gast, charset-normalizer, certifi, cachetools, astunparse, absl-py, werkzeug, rsa, requests, pyasn1-modules, opt-einsum, importlib-metadata, h5py, requests-oauthlib, markdown, google-auth, google-auth-oauthlib, tensorboard, tensorflow
Successfully installed MarkupSafe-2.1.3 absl-py-1.4.0 astunparse-1.6.3 cachetools-5.3.1 certifi-2023.7.22 charset-normalizer-3.2.0 flatbuffers-23.5.26 gast-0.4.0 google-auth-2.22.0 google-auth-oauthlib-1.0.0 google-pasta-0.2.0 grpcio-1.56.2 h5py-3.9.0 idna-3.4 importlib-metadata-6.8.0 keras-2.13.1 libclang-16.0.6 markdown-3.4.4 numpy-1.24.3 oauthlib-3.2.2 opt-einsum-3.3.0 packaging-23.1 protobuf-4.23.4 pyasn1-0.5.0 pyasn1-modules-0.3.0 requests-2.31.0 requests-oauthlib-1.3.1 rsa-4.9 tensorboard-2.13.0 tensorboard-data-server-0.7.1 tensorflow-2.13.0 tensorflow-estimator-2.13.0 tensorflow-io-gcs-filesystem-0.33.0 termcolor-2.3.0 typing-extensions-4.5.0 urllib3-1.26.16 werkzeug-2.3.6 wrapt-1.15.0 zipp-3.16.2
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
2023-08-09 04:08:26.281169: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-09 04:08:26.321709: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-09 04:08:27.090942: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
  File "get_vtab1k.py", line 1, in <module>
    from task_adaptation.data import caltech
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/caltech.py", line 21, in <module>
    import task_adaptation.data.base as base
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/base.py", line 25, in <module>
    import tensorflow_datasets as tfds
ModuleNotFoundError: No module named 'tensorflow_datasets'
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install tensorflow-datasets
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting tensorflow-datasets
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/62/82/521e36235c7890b067415fa874f5074ff21f1005a189d79fd72b33b39ca1/tensorflow_datasets-4.9.2-py3-none-any.whl (5.4 MB)
Requirement already satisfied: absl-py in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (1.4.0)
Collecting array-record (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/8f/d7/c561a3ee49d837478a78639502e8f21f52879d8d56b74fed3e30275836ea/array_record-0.4.0-py38-none-any.whl (3.0 MB)
Collecting click (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/1a/70/e63223f8116931d365993d4a6b7ef653a4d920b41d03de7c59499962821f/click-8.1.6-py3-none-any.whl (97 kB)
Collecting dm-tree (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/e3/03/cd776c4f224aabe06225c3661f79e1114dbe337506ae9039575eb06cc568/dm_tree-0.1.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (152 kB)
Collecting etils[enp,epath]>=0.9.0 (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ca/db/47ffb866d7a1aa21132a72f67e84c4f03a4cad11ae9d069dd61c52f929de/etils-1.3.0-py3-none-any.whl (126 kB)
Requirement already satisfied: numpy in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (1.24.3)
Collecting promise (from tensorflow-datasets)
  Using cached promise-2.3-py3-none-any.whl
Requirement already satisfied: protobuf>=3.20 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (4.23.4)
Collecting psutil (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/af/4d/389441079ecef400e2551a3933224885a7bde6b8a4810091d628cdd75afe/psutil-5.9.5-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (282 kB)
Requirement already satisfied: requests>=2.19.0 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (2.31.0)
Collecting tensorflow-metadata (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/6e/b7/41ed520712c659dee0653dbd1ae71ed991bc51c7622e3e4dafbbf208faaa/tensorflow_metadata-1.13.1-py3-none-any.whl (28 kB)
Requirement already satisfied: termcolor in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (2.3.0)
Collecting toml (from tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/44/6f/7120676b6d73228c96e17f1f794d8ab046fc910d781c8d151120c3f1569e/toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting tqdm (from tensorflow-datasets)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/40/14/63f9a5bc62e8a50585b8a7a6de1ffab8eab09aaa5321b86127919ee7de02/tqdm-4.65.1-py3-none-any.whl (93 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 93.2/93.2 kB 1.5 MB/s eta 0:00:00
Requirement already satisfied: wrapt in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-datasets) (1.15.0)
Collecting importlib-resources (from tensorflow-datasets)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/25/d4/592f53ce2f8dde8be5720851bd0ab71cc2e76c55978e4163ef1ab7e389bb/importlib_resources-6.0.1-py3-none-any.whl (34 kB)
Requirement already satisfied: typing_extensions in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from etils[enp,epath]>=0.9.0->tensorflow-datasets) (4.5.0)
Requirement already satisfied: zipp in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from etils[enp,epath]>=0.9.0->tensorflow-datasets) (3.16.2)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from requests>=2.19.0->tensorflow-datasets) (3.2.0)
Requirement already satisfied: idna<4,>=2.5 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from requests>=2.19.0->tensorflow-datasets) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from requests>=2.19.0->tensorflow-datasets) (1.26.16)
Requirement already satisfied: certifi>=2017.4.17 in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from requests>=2.19.0->tensorflow-datasets) (2023.7.22)
Requirement already satisfied: six in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from promise->tensorflow-datasets) (1.16.0)
Collecting googleapis-common-protos<2,>=1.52.0 (from tensorflow-metadata->tensorflow-datasets)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a7/bc/416a1ffeba4dcd072bc10523dac9ed97f2e7fc4b760580e2bdbdc1e2afdd/googleapis_common_protos-1.60.0-py2.py3-none-any.whl (227 kB)
Installing collected packages: dm-tree, tqdm, toml, psutil, promise, importlib-resources, googleapis-common-protos, etils, click, tensorflow-metadata, array-record, tensorflow-datasets
Successfully installed array-record-0.4.0 click-8.1.6 dm-tree-0.1.8 etils-1.3.0 googleapis-common-protos-1.60.0 importlib-resources-6.0.1 promise-2.3 psutil-5.9.5 tensorflow-datasets-4.9.2 tensorflow-metadata-1.13.1 toml-0.10.2 tqdm-4.65.1
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
2023-08-09 04:08:45.926715: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-09 04:08:45.970179: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-09 04:08:46.694028: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
  File "get_vtab1k.py", line 4, in <module>
    from task_adaptation.data import diabetic_retinopathy
  File "/data/sdb/jiacheng/NOAH/data/vtab-source/task_adaptation/data/diabetic_retinopathy.py", line 24, in <module>
    import tensorflow_addons.image as tfa_image
ModuleNotFoundError: No module named 'tensorflow_addons'
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install tensorflow-addons
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting tensorflow-addons
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/06/bb/46bb83fee4207bf885e679b057d42c816d67192e006ba0c723ff7c861c28/tensorflow_addons-0.21.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (612 kB)
Requirement already satisfied: packaging in /home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages (from tensorflow-addons) (23.1)
Collecting typeguard<3.0.0,>=2.7 (from tensorflow-addons)
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/9a/bb/d43e5c75054e53efce310e79d63df0ac3f25e34c926be5dffb7d283fb2a8/typeguard-2.13.3-py3-none-any.whl (17 kB)
Installing collected packages: typeguard, tensorflow-addons
Successfully installed tensorflow-addons-0.21.0 typeguard-2.13.3
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
2023-08-09 04:09:03.610784: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-09 04:09:03.652733: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-09 04:09:04.385836: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages/tensorflow_addons/utils/tfa_eol_msg.py:23: UserWarning: 

TensorFlow Addons (TFA) has ended development and introduction of new features.
TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024.
Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). 

For more information see: https://github.com/tensorflow/addons/issues/2807 

  warnings.warn(
Traceback (most recent call last):
  File "get_vtab1k.py", line 21, in <module>
    from PIL import Image
ModuleNotFoundError: No module named 'PIL'
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install PIL
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
ERROR: Could not find a version that satisfies the requirement PIL (from versions: none)
ERROR: No matching distribution found for PIL
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ pip install Pillow
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting Pillow
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/ff/8c/5927a58c43ebc16e508eef325fdc6473b569e2474d3b4be49798aa371007/Pillow-10.0.0-cp38-cp38-manylinux_2_28_x86_64.whl (3.4 MB)
Installing collected packages: Pillow
Successfully installed Pillow-10.0.0
(NOAH) jiacheng@cvip-3090x8:/data/sdb/jiacheng/NOAH/data/vtab-source$ python get_vtab1k.py
2023-08-09 04:09:57.816914: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-08-09 04:09:57.859487: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-08-09 04:09:58.665901: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/home/jiacheng/anaconda3/envs/NOAH/lib/python3.8/site-packages/tensorflow_addons/utils/tfa_eol_msg.py:23: UserWarning: 

TensorFlow Addons (TFA) has ended development and introduction of new features.
TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024.
Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). 

For more information see: https://github.com/tensorflow/addons/issues/2807 

  warnings.warn(
2023-08-09 04:09:59.725324: W tensorflow/tsl/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "NOT_FOUND: Could not locate the credentials file.". Retrieving token from GCE failed with "FAILED_PRECONDITION: Error executing an HTTP request: libcurl code 6 meaning 'Couldn't resolve host name', error details: Could not resolve host: metadata.google.internal".
2023-08-09 04:11:01.969729: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001244 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:12:05.570709: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001114 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:13:11.994750: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.00135 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:14:21.092205: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001123 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
^C2023-08-09 04:15:39.612205: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.001054 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
2023-08-09 04:17:12.397243: E tensorflow/tsl/platform/cloud/curl_http_request.cc:610] The transmission  of request 0x2097ff0 (URI: https://www.googleapis.com/storage/v1/b/tfds-data/o/dataset_info%2Fcaltech101%2F3.0.1?fields=size%2Cgeneration%2Cupdated) has been stuck at 0 of 0 bytes for 61 seconds and will be aborted. CURL timing information: lookup time: 0.000824 (No error), connect time: 0 (No error), pre-transfer time: 0 (No error), start-transfer time: 0 (No error)
Maystern commented 1 year ago

Here are the packages in my anaconda environment:

# packages in environment at /home/jiacheng/anaconda3/envs/NOAH:
#
# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                        main  
_openmp_mutex             5.1                       1_gnu  
absl-py                   1.4.0                    pypi_0    pypi
array-record              0.4.0                    pypi_0    pypi
astunparse                1.6.3                    pypi_0    pypi
ca-certificates           2023.05.30           h06a4308_0  
cachetools                5.3.1                    pypi_0    pypi
certifi                   2023.7.22                pypi_0    pypi
charset-normalizer        3.2.0                    pypi_0    pypi
click                     8.1.6                    pypi_0    pypi
dm-tree                   0.1.8                    pypi_0    pypi
etils                     1.3.0                    pypi_0    pypi
flatbuffers               23.5.26                  pypi_0    pypi
gast                      0.4.0                    pypi_0    pypi
google-auth               2.22.0                   pypi_0    pypi
google-auth-oauthlib      1.0.0                    pypi_0    pypi
google-pasta              0.2.0                    pypi_0    pypi
googleapis-common-protos  1.60.0                   pypi_0    pypi
grpcio                    1.56.2                   pypi_0    pypi
h5py                      3.9.0                    pypi_0    pypi
idna                      3.4                      pypi_0    pypi
importlib-metadata        6.8.0                    pypi_0    pypi
importlib-resources       6.0.1                    pypi_0    pypi
keras                     2.13.1                   pypi_0    pypi
ld_impl_linux-64          2.38                 h1181459_1  
libclang                  16.0.6                   pypi_0    pypi
libffi                    3.4.4                h6a678d5_0  
libgcc-ng                 11.2.0               h1234567_1  
libgomp                   11.2.0               h1234567_1  
libstdcxx-ng              11.2.0               h1234567_1  
markdown                  3.4.4                    pypi_0    pypi
markupsafe                2.1.3                    pypi_0    pypi
ncurses                   6.4                  h6a678d5_0  
numpy                     1.24.3                   pypi_0    pypi
oauthlib                  3.2.2                    pypi_0    pypi
openssl                   3.0.10               h7f8727e_0  
opt-einsum                3.3.0                    pypi_0    pypi
packaging                 23.1                     pypi_0    pypi
pillow                    10.0.0                   pypi_0    pypi
pip                       23.2.1           py38h06a4308_0  
promise                   2.3                      pypi_0    pypi
protobuf                  4.23.4                   pypi_0    pypi
psutil                    5.9.5                    pypi_0    pypi
pyasn1                    0.5.0                    pypi_0    pypi
pyasn1-modules            0.3.0                    pypi_0    pypi
python                    3.8.17               h955ad1f_0  
readline                  8.2                  h5eee18b_0  
requests                  2.31.0                   pypi_0    pypi
requests-oauthlib         1.3.1                    pypi_0    pypi
rsa                       4.9                      pypi_0    pypi
setuptools                68.0.0           py38h06a4308_0  
six                       1.16.0                   pypi_0    pypi
sqlite                    3.41.2               h5eee18b_0  
tensorboard               2.13.0                   pypi_0    pypi
tensorboard-data-server   0.7.1                    pypi_0    pypi
tensorflow                2.13.0                   pypi_0    pypi
tensorflow-addons         0.21.0                   pypi_0    pypi
tensorflow-datasets       4.9.2                    pypi_0    pypi
tensorflow-estimator      2.13.0                   pypi_0    pypi
tensorflow-io-gcs-filesystem 0.33.0                   pypi_0    pypi
tensorflow-metadata       1.13.1                   pypi_0    pypi
termcolor                 2.3.0                    pypi_0    pypi
tk                        8.6.12               h1ccaba5_0  
toml                      0.10.2                   pypi_0    pypi
tqdm                      4.65.1                   pypi_0    pypi
typeguard                 2.13.3                   pypi_0    pypi
typing-extensions         4.5.0                    pypi_0    pypi
urllib3                   1.26.16                  pypi_0    pypi
werkzeug                  2.3.6                    pypi_0    pypi
wheel                     0.38.4           py38h06a4308_0  
wrapt                     1.15.0                   pypi_0    pypi
xz                        5.4.2                h5eee18b_0  
zipp                      3.16.2                   pypi_0    pypi
zlib                      1.2.13               h5eee18b_0  
ZhangYuanhan-AI commented 1 year ago

My tfds: 4.4.0+nightly

Maystern commented 1 year ago

Thank you very much. I have solved this problem.

Maystern commented 1 year ago

Due to the network firewall in mainland China, TensorFlow cannot reach Google's servers, so its built-in automatic dataset download fails and the VTAB-1k dataset cannot be prepared.

My suggestion is to run the download on a server outside mainland China, or to use pre-downloaded datasets so that TensorFlow does not try to fetch them from Google automatically.
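For the pre-downloaded route, a minimal sketch (assuming the prepared dataset files already exist locally; the path below is an assumption) is to point tensorflow-datasets at a local directory via the `TFDS_DATA_DIR` environment variable before any `tfds` call, so nothing is fetched from Google:

```python
import os

# Point tensorflow-datasets at a local directory of pre-downloaded datasets
# BEFORE calling tfds, so it never tries to reach Google.
# The path below is an assumption -- use wherever your prepared data lives.
os.environ["TFDS_DATA_DIR"] = os.path.expanduser("~/tensorflow_datasets")

# A later call such as
#   tfds.load("caltech101:3.0.1", split="train", download=False)
# then reads directly from TFDS_DATA_DIR instead of downloading.
print(os.environ["TFDS_DATA_DIR"])
```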

In any case, many thanks to the author for the generous help. @ZhangYuanhan-AI

ZhangYuanhan-AI commented 1 year ago

Thanks for this important information, and enjoy NOAH!

ShunLu91 commented 1 year ago

Thanks for the tips from @Maystern. Here is the code for using a proxy in get_vtab1k.py:

import os
os.environ['HTTP_PROXY'] = 'http://your_proxy_ip:your_proxy_port'
os.environ['HTTPS_PROXY'] = 'http://your_proxy_ip:your_proxy_port'

Also, don't forget to remove the softlink 'data/vtab-source/data/vtab' from the original repo after cloning the code. Then the datasets will download successfully.
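As a sketch of that last step (the path is the one mentioned above; recreating it as a real directory is an assumption about the intended layout, so adapt this to your checkout):

```shell
# Replace the symlink shipped with the repo by a real directory,
# so get_vtab1k.py can write the downloaded data into it.
# Path taken from the comment above; adjust if your layout differs.
if [ -L data/vtab-source/data/vtab ]; then
    rm data/vtab-source/data/vtab
fi
mkdir -p data/vtab-source/data/vtab
```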

ZhangYuanhan-AI commented 1 year ago

Thanks! Using a proxy is indeed a good way to download.

ShunLu91 commented 1 year ago

I want to share a more convenient way to download the dataset. Thanks to the efforts of RepAdapter, we can download the dataset via the Google Drive link provided in their repo.