Closed: OeslleLucena closed this issue 4 years ago.
Hello @OeslleLucena, thanks for the interest. We'll take a look at this; in the meantime, contributions are welcome as well.
Hi @ebursztein, that's good to know! Thank you!
@eisenjulian
I tried to build the Docker image but got errors when installing tapas, similar to issue #33. Any idea how to fix that? Please find the error logs below.
Cloning into 'tapas'...
Collecting pip
Downloading https://files.pythonhosted.org/packages/43/84/23ed6a1796480a6f1a2d38f2802901d078266bda38388954d01d3f2e821d/pip-20.1.1-py2.py3-none-any.whl (1.5MB)
Installing collected packages: pip
Found existing installation: pip 9.0.1
Not uninstalling pip at /usr/lib/python3/dist-packages, outside environment /usr
Successfully installed pip-20.1.1
Obtaining file:///home/tapas
Collecting apache-beam[gcp]==2.20.0
Downloading apache_beam-2.20.0-cp36-cp36m-manylinux1_x86_64.whl (3.5 MB)
Collecting frozendict==1.2
Downloading frozendict-1.2.tar.gz (2.6 kB)
Collecting pandas~=1.0.0
Downloading pandas-1.0.5-cp36-cp36m-manylinux1_x86_64.whl (10.1 MB)
Collecting scikit-learn~=0.22.1
Downloading scikit_learn-0.22.2.post1-cp36-cp36m-manylinux1_x86_64.whl (7.1 MB)
Collecting tensorflow-probability==0.10.0
Downloading tensorflow_probability-0.10.0-py2.py3-none-any.whl (3.5 MB)
Collecting tensorflow~=2.2.0
Downloading tensorflow-2.2.0-cp36-cp36m-manylinux2010_x86_64.whl (516.2 MB)
Collecting tf-models-official~=2.2.0
Downloading tf_models_official-2.2.1-py2.py3-none-any.whl (711 kB)
Collecting tf_slim~=1.1.0
Downloading tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)
Collecting dataclasses~=0.7
Downloading dataclasses-0.7-py3-none-any.whl (18 kB)
Collecting future<1.0.0,>=0.16.0
Downloading future-0.18.2.tar.gz (829 kB)
Collecting pyarrow<0.17.0,>=0.15.1; python_version >= "3.0" or platform_system != "Windows"
Downloading pyarrow-0.16.0-cp36-cp36m-manylinux2014_x86_64.whl (63.1 MB)
Collecting mock<3.0.0,>=1.0.1
Downloading mock-2.0.0-py2.py3-none-any.whl (56 kB)
Collecting httplib2<=0.12.0,>=0.8
Downloading httplib2-0.12.0.tar.gz (218 kB)
Collecting numpy<2,>=1.14.3
Downloading numpy-1.19.0-cp36-cp36m-manylinux2010_x86_64.whl (14.6 MB)
Collecting hdfs<3.0.0,>=2.1.0
Downloading hdfs-2.5.8.tar.gz (41 kB)
Collecting pytz>=2018.3
Downloading pytz-2020.1-py2.py3-none-any.whl (510 kB)
Collecting avro-python3!=1.9.2,<1.10.0,>=1.8.1; python_version >= "3.0"
Downloading avro-python3-1.9.2.1.tar.gz (37 kB)
WARNING: Requested avro-python3!=1.9.2,<1.10.0,>=1.8.1; python_version >= "3.0" from https://files.pythonhosted.org/packages/5a/80/acd1455bea0a9fcdc60a748a97dcbb3ff624726fb90987a0fc1c19e7a5a5/avro-python3-1.9.2.1.tar.gz#sha256=ca1e77a3da5ac98e8833588f71fb2e170b38e34787ee0e04920de0e9470b7d32 (from apache-beam[gcp]==2.20.0->tapas==0.0.1.dev0), but installing version file-.avro-VERSION.txt
Collecting pymongo<4.0.0,>=3.8.0
Downloading pymongo-3.10.1-cp36-cp36m-manylinux2014_x86_64.whl (460 kB)
Collecting crcmod<2.0,>=1.7
Downloading crcmod-1.7.tar.gz (89 kB)
Collecting oauth2client<4,>=2.0.1
Downloading oauth2client-3.0.0.tar.gz (77 kB)
Collecting pydot<2,>=1.2.0
Downloading pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting protobuf<4,>=3.5.0.post1
Downloading protobuf-3.12.2-cp36-cp36m-manylinux1_x86_64.whl (1.3 MB)
Collecting python-dateutil<3,>=2.8.0
Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting grpcio<2,>=1.12.1
Downloading grpcio-1.30.0-cp36-cp36m-manylinux2010_x86_64.whl (3.0 MB)
Collecting typing-extensions<3.8.0,>=3.7.0
Downloading typing_extensions-3.7.4.2-py3-none-any.whl (22 kB)
Collecting fastavro<0.22,>=0.21.4
Downloading fastavro-0.21.24-cp36-cp36m-manylinux1_x86_64.whl (1.2 MB)
Collecting dill<0.3.2,>=0.3.1.1
Downloading dill-0.3.1.1.tar.gz (151 kB)
Collecting google-cloud-datastore<1.8.0,>=1.7.1; extra == "gcp"
Downloading google_cloud_datastore-1.7.4-py2.py3-none-any.whl (82 kB)
Collecting grpcio-gcp<1,>=0.2.2; extra == "gcp"
Downloading grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-core<2,>=0.28.1; extra == "gcp"
Downloading google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0; extra == "gcp"
Downloading google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0; extra == "gcp"
Downloading google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1; extra == "gcp"
Downloading google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-language<2,>=1.3.0; extra == "gcp"
Downloading google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0; extra == "gcp"
Downloading google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Collecting google-cloud-bigquery<=1.24.0,>=1.6.0; extra == "gcp"
Downloading google_cloud_bigquery-1.24.0-py2.py3-none-any.whl (165 kB)
Collecting google-cloud-pubsub<1.1.0,>=0.39.0; extra == "gcp"
Downloading google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-apitools<0.5.29,>=0.5.28; extra == "gcp"
Downloading google-apitools-0.5.28.tar.gz (172 kB)
Collecting cachetools<4,>=3.1.0; extra == "gcp"
Downloading cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0; extra == "gcp"
Downloading google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting joblib>=0.11
Downloading joblib-0.16.0-py3-none-any.whl (300 kB)
Collecting scipy>=0.17.0
Downloading scipy-1.5.1-cp36-cp36m-manylinux1_x86_64.whl (25.9 MB)
Collecting cloudpickle>=1.2.2
Downloading cloudpickle-1.5.0-py3-none-any.whl (22 kB)
Collecting gast>=0.3.2
Downloading gast-0.3.3-py2.py3-none-any.whl (9.7 kB)
Requirement already satisfied: six>=1.10.0 in /usr/lib/python3/dist-packages (from tensorflow-probability==0.10.0->tapas==0.0.1.dev0) (1.11.0)
Collecting decorator
Downloading decorator-4.4.2-py2.py3-none-any.whl (9.2 kB)
Collecting absl-py>=0.7.0
Downloading absl-py-0.9.0.tar.gz (104 kB)
Collecting h5py<2.11.0,>=2.10.0
Downloading h5py-2.10.0-cp36-cp36m-manylinux1_x86_64.whl (2.9 MB)
Collecting termcolor>=1.1.0
Downloading termcolor-1.1.0.tar.gz (3.9 kB)
Collecting opt-einsum>=2.3.2
Downloading opt_einsum-3.2.1-py3-none-any.whl (63 kB)
Collecting google-pasta>=0.1.8
Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting wrapt>=1.11.1
Downloading wrapt-1.12.1.tar.gz (27 kB)
Requirement already satisfied: wheel>=0.26; python_version >= "3" in /usr/lib/python3/dist-packages (from tensorflow~=2.2.0->tapas==0.0.1.dev0) (0.30.0)
Collecting tensorflow-estimator<2.3.0,>=2.2.0
Downloading tensorflow_estimator-2.2.0-py2.py3-none-any.whl (454 kB)
Collecting astunparse==1.6.3
Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting tensorboard<2.3.0,>=2.2.0
Downloading tensorboard-2.2.2-py3-none-any.whl (3.0 MB)
Collecting keras-preprocessing>=1.1.0
Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Collecting py-cpuinfo>=3.3.0
Downloading py-cpuinfo-7.0.0.tar.gz (95 kB)
Collecting pyyaml
Downloading PyYAML-5.3.1.tar.gz (269 kB)
Collecting tensorflow-hub>=0.6.0
Downloading tensorflow_hub-0.8.0-py2.py3-none-any.whl (101 kB)
Collecting tensorflow-datasets
Downloading tensorflow_datasets-3.2.0-py3-none-any.whl (3.4 MB)
Collecting tensorflow-model-optimization>=0.2.1
Downloading tensorflow_model_optimization-0.3.0-py2.py3-none-any.whl (165 kB)
Collecting matplotlib
Downloading matplotlib-3.2.2-cp36-cp36m-manylinux1_x86_64.whl (12.4 MB)
Collecting opencv-python-headless
Downloading opencv_python_headless-4.3.0.36-cp36-cp36m-manylinux2014_x86_64.whl (36.4 MB)
Collecting mlperf-compliance==0.0.10
Downloading mlperf_compliance-0.0.10-py3-none-any.whl (24 kB)
Collecting tensorflow-addons
Downloading tensorflow_addons-0.10.0-cp36-cp36m-manylinux2010_x86_64.whl (1.0 MB)
Collecting gin-config
Downloading gin_config-0.3.0-py3-none-any.whl (44 kB)
Collecting google-api-python-client>=1.6.7
Downloading google_api_python_client-1.9.3-py3-none-any.whl (59 kB)
Collecting Cython
Downloading Cython-0.29.21-cp36-cp36m-manylinux1_x86_64.whl (2.0 MB)
Collecting psutil>=5.4.3
Downloading psutil-5.7.0.tar.gz (449 kB)
Collecting typing==3.7.4.1
Downloading typing-3.7.4.1-py3-none-any.whl (25 kB)
Collecting sentencepiece
Downloading sentencepiece-0.1.91-cp36-cp36m-manylinux1_x86_64.whl (1.1 MB)
Collecting kaggle>=1.3.9
Downloading kaggle-1.5.6.tar.gz (58 kB)
Collecting Pillow
Downloading Pillow-7.2.0-cp36-cp36m-manylinux1_x86_64.whl (2.2 MB)
Collecting pbr>=0.11
Downloading pbr-5.4.5-py2.py3-none-any.whl (110 kB)
Collecting docopt
Downloading docopt-0.6.2.tar.gz (25 kB)
Collecting requests>=2.7.0
Downloading requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting pyasn1-modules>=0.0.5
Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting pyasn1>=0.1.7
Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting rsa>=3.1.4
Downloading rsa-4.6-py3-none-any.whl (47 kB)
Collecting pyparsing>=2.1.4
Downloading pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Requirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from protobuf<4,>=3.5.0.post1->apache-beam[gcp]==2.20.0->tapas==0.0.1.dev0) (39.0.1)
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
Downloading google_api_core-1.21.0-py2.py3-none-any.whl (90 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
Downloading grpc-google-iam-v1-0.12.3.tar.gz (13 kB)
Collecting google-resumable-media<0.6dev,>=0.5.0
Downloading google_resumable_media-0.5.1-py2.py3-none-any.whl (38 kB)
Collecting google-auth<2.0dev,>=1.9.0
Downloading google_auth-1.18.0-py2.py3-none-any.whl (90 kB)
Collecting fasteners>=0.14
Downloading fasteners-0.15-py2.py3-none-any.whl (23 kB)
Collecting werkzeug>=0.11.15
Downloading Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
Collecting google-auth-oauthlib<0.5,>=0.4.1
Downloading google_auth_oauthlib-0.4.1-py2.py3-none-any.whl (18 kB)
Collecting markdown>=2.6.8
Downloading Markdown-3.2.2-py3-none-any.whl (88 kB)
Collecting tensorboard-plugin-wit>=1.6.0
Downloading tensorboard_plugin_wit-1.7.0-py3-none-any.whl (779 kB)
Collecting attrs>=18.1.0
Downloading attrs-19.3.0-py2.py3-none-any.whl (39 kB)
Collecting promise
Downloading promise-2.3.tar.gz (19 kB)
Collecting tensorflow-metadata
Downloading tensorflow_metadata-0.22.2-py2.py3-none-any.whl (32 kB)
Collecting tqdm
Downloading tqdm-4.47.0-py2.py3-none-any.whl (66 kB)
Collecting dm-tree~=0.1.1
Downloading dm_tree-0.1.5-cp36-cp36m-manylinux1_x86_64.whl (294 kB)
Collecting kiwisolver>=1.0.1
Downloading kiwisolver-1.2.0-cp36-cp36m-manylinux1_x86_64.whl (88 kB)
Collecting cycler>=0.10
Downloading cycler-0.10.0-py2.py3-none-any.whl (6.5 kB)
Collecting typeguard>=2.7
Downloading typeguard-2.9.1-py3-none-any.whl (16 kB)
Collecting google-auth-httplib2>=0.0.3
Downloading google_auth_httplib2-0.0.4-py2.py3-none-any.whl (9.1 kB)
Collecting uritemplate<4dev,>=3.0.0
Downloading uritemplate-3.0.1-py2.py3-none-any.whl (15 kB)
Collecting certifi
Downloading certifi-2020.6.20-py2.py3-none-any.whl (156 kB)
Collecting python-slugify
Downloading python-slugify-4.0.1.tar.gz (11 kB)
Collecting urllib3<1.25,>=1.21.1
Downloading urllib3-1.24.3-py2.py3-none-any.whl (118 kB)
Collecting chardet<4,>=3.0.2
Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Requirement already satisfied: idna<3,>=2.5 in /usr/lib/python3/dist-packages (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam[gcp]==2.20.0->tapas==0.0.1.dev0) (2.6)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
Downloading googleapis_common_protos-1.52.0-py2.py3-none-any.whl (100 kB)
Collecting monotonic>=0.1
Downloading monotonic-1.5-py2.py3-none-any.whl (5.3 kB)
Collecting requests-oauthlib>=0.7.0
Downloading requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting importlib-metadata; python_version < "3.8"
Downloading importlib_metadata-1.7.0-py2.py3-none-any.whl (31 kB)
Collecting text-unidecode>=1.3
Downloading text_unidecode-1.3-py2.py3-none-any.whl (78 kB)
Collecting oauthlib>=3.0.0
Downloading oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting zipp>=0.5
Downloading zipp-3.1.0-py3-none-any.whl (4.9 kB)
Building wheels for collected packages: frozendict, future, httplib2, hdfs, avro-python3, crcmod, oauth2client, dill, google-apitools, absl-py, termcolor, wrapt, py-cpuinfo, pyyaml, psutil, kaggle, docopt, grpc-google-iam-v1, promise, python-slugify
Building wheel for frozendict (setup.py): started
Building wheel for frozendict (setup.py): finished with status 'done'
Created wheel for frozendict: filename=frozendict-1.2-py3-none-any.whl size=3580 sha256=fe0e18fbeb4738e9eeb9f46b74689e7a61fd79a3b5213aaf87f9fa2214ccb3ba
Stored in directory: /root/.cache/pip/wheels/c9/13/a1/b4f2255117a7dccdd6219408dce1d87446716b1bf77451cb97
Building wheel for future (setup.py): started
Building wheel for future (setup.py): finished with status 'done'
Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=493275 sha256=8cf6913164b6eb5ece253e498678212d93ab05f0fb8c25a5f9011e4db9fd81b7
Stored in directory: /root/.cache/pip/wheels/6e/9c/ed/4499c9865ac1002697793e0ae05ba6be33553d098f3347fb94
Building wheel for httplib2 (setup.py): started
Building wheel for httplib2 (setup.py): finished with status 'done'
Created wheel for httplib2: filename=httplib2-0.12.0-py3-none-any.whl size=95116 sha256=c9129588c1d93621d3e4472b1829055bd036daf425277168a731918f562347d4
Stored in directory: /root/.cache/pip/wheels/8f/42/db/47792ef8349ffb6fa2b96e203852f60681a351162b5515b45f
Building wheel for hdfs (setup.py): started
Building wheel for hdfs (setup.py): finished with status 'done'
Created wheel for hdfs: filename=hdfs-2.5.8-py3-none-any.whl size=34733 sha256=7cd3d63724f3395697a985ee3504b207cb1acf8db1bc389ed71000793b32b4b4
Stored in directory: /root/.cache/pip/wheels/3e/0c/c3/26ad975f80274d6bf73ed4d8facd055648f452428bc1623283
Building wheel for avro-python3 (setup.py): started
Building wheel for avro-python3 (setup.py): finished with status 'done'
Created wheel for avro-python3: filename=avro_python3-file_.avro_VERSION.txt-py3-none-any.whl size=44890 sha256=aedd5b188459df8950f540a6fe3300b2054c7e7b7b6cf36f67546e9e5cd751fe
Stored in directory: /root/.cache/pip/wheels/4e/08/0c/727bff8f20fedbdeb8a2c5214e460b214d41c10dc879cf6dac
Building wheel for crcmod (setup.py): started
Building wheel for crcmod (setup.py): finished with status 'done'
Created wheel for crcmod: filename=crcmod-1.7-cp36-cp36m-linux_x86_64.whl size=37197 sha256=e015e52be20aa34adb30926c8f37b6a9145e7f705f415ce88555e6d2ed9b8f46
Stored in directory: /root/.cache/pip/wheels/ac/bb/07/adfb4ffd0aaace2022ea25c082a7cdc688b10d30e86d6d2fde
Building wheel for oauth2client (setup.py): started
Building wheel for oauth2client (setup.py): finished with status 'done'
Created wheel for oauth2client: filename=oauth2client-3.0.0-py3-none-any.whl size=107377 sha256=25f0e99226c832e7867ea6453352087043014ab8e9ea0dd30cef1693b27172e5
Stored in directory: /root/.cache/pip/wheels/85/84/41/0db9b5f02fab88d266e64a52c5a468a3a70f6d331e75ec0e49
Building wheel for dill (setup.py): started
Building wheel for dill (setup.py): finished with status 'done'
Created wheel for dill: filename=dill-0.3.1.1-py3-none-any.whl size=80833 sha256=4ececb8a8549538e458d7434c7fa8aa6a23880f05f2c9710a2b4efd4bd9ff378
Stored in directory: /root/.cache/pip/wheels/09/84/74/d2b4feb9ac9488bc83c475cb2cbe8e8b7d9cea8320d32f3787
Building wheel for google-apitools (setup.py): started
Building wheel for google-apitools (setup.py): finished with status 'done'
Created wheel for google-apitools: filename=google_apitools-0.5.28-py3-none-any.whl size=131643 sha256=edfd8c1459d1d12f9c69633d201eb62d6db94903a342aaa6c83a61e7455ac2b3
Stored in directory: /root/.cache/pip/wheels/49/56/1c/73a513e437099b768ededdcb95106a58f5cdd048fb27ff640b
Building wheel for absl-py (setup.py): started
Building wheel for absl-py (setup.py): finished with status 'done'
Created wheel for absl-py: filename=absl_py-0.9.0-py3-none-any.whl size=119396 sha256=fed19b1ecff172b5ac447d787c5d96c23da4dd5b108ad470b74c0a38690d6ae6
Stored in directory: /root/.cache/pip/wheels/c3/af/84/3962a6af7b4ab336e951b7877dcfb758cf94548bb1771e0679
Building wheel for termcolor (setup.py): started
Building wheel for termcolor (setup.py): finished with status 'done'
Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=5679 sha256=d0317f1dc626dce91581e0ef6ed31438b60985b8f11d4acd60aef02968c16de4
Stored in directory: /root/.cache/pip/wheels/93/2a/eb/e58dbcbc963549ee4f065ff80a59f274cc7210b6eab962acdc
Building wheel for wrapt (setup.py): started
Building wheel for wrapt (setup.py): finished with status 'done'
Created wheel for wrapt: filename=wrapt-1.12.1-cp36-cp36m-linux_x86_64.whl size=69350 sha256=ae1fe13c5c4f1f871becedec3a67606d59a65ebd91c96bc46cc80286e24b784a
Stored in directory: /root/.cache/pip/wheels/32/42/7f/23cae9ff6ef66798d00dc5d659088e57dbba01566f6c60db63
Building wheel for py-cpuinfo (setup.py): started
Building wheel for py-cpuinfo (setup.py): finished with status 'done'
Created wheel for py-cpuinfo: filename=py_cpuinfo-7.0.0-py3-none-any.whl size=20299 sha256=1ab842eeb035b82b50a210f828443e9e4ea42d6374ac9a3610dcf50ca16f2c2b
Stored in directory: /root/.cache/pip/wheels/46/6d/cc/73a126dc2e09fe56fcec0a7386d255762611fbed1c86d3bbcc
Building wheel for pyyaml (setup.py): started
Building wheel for pyyaml (setup.py): finished with status 'done'
Created wheel for pyyaml: filename=PyYAML-5.3.1-cp36-cp36m-linux_x86_64.whl size=45919 sha256=c3ff6a042df4955c75d72df1bdfd47ce9c21371415e3207bb1d7870fba2c79ae
Stored in directory: /root/.cache/pip/wheels/e5/9d/ad/2ee53cf262cba1ffd8afe1487eef788ea3f260b7e6232a80fc
Building wheel for psutil (setup.py): started
Building wheel for psutil (setup.py): finished with status 'done'
Created wheel for psutil: filename=psutil-5.7.0-cp36-cp36m-linux_x86_64.whl size=279827 sha256=841838e6feb3569d06a18c48868015ae0ee27360b55f681e1aaff890a7db8446
Stored in directory: /root/.cache/pip/wheels/a1/d9/f2/b5620c01e9b3e858c6877b1045fda5b115cf7df6490f883382
Building wheel for kaggle (setup.py): started
Building wheel for kaggle (setup.py): finished with status 'done'
Created wheel for kaggle: filename=kaggle-1.5.6-py3-none-any.whl size=73813 sha256=6cd19202645dcd630f55d7ea8afb8e4dd7abf35d8d177472551fb44a84fefb8b
Stored in directory: /root/.cache/pip/wheels/01/3e/ff/77407ebac3ef71a79b9166a8382aecf88415a0bcbe3c095a01
Building wheel for docopt (setup.py): started
Building wheel for docopt (setup.py): finished with status 'done'
Created wheel for docopt: filename=docopt-0.6.2-py2.py3-none-any.whl size=19852 sha256=cc0804ade908147b9a28df2c5e97ab43acb09b3776e62ba31f0f957343cbcdd6
Stored in directory: /root/.cache/pip/wheels/3f/2a/fa/4d7a888e69774d5e6e855d190a8a51b357d77cc05eb1c097c9
Building wheel for grpc-google-iam-v1 (setup.py): started
Building wheel for grpc-google-iam-v1 (setup.py): finished with status 'done'
Created wheel for grpc-google-iam-v1: filename=grpc_google_iam_v1-0.12.3-py3-none-any.whl size=15418 sha256=b4681926cc5bc5928792b72aaf1f1796288ba1623a9af6b5478540dabad0e94c
Stored in directory: /root/.cache/pip/wheels/76/65/cd/392da05e43270f143b6c5076ba88d39144abff586792593e7c
Building wheel for promise (setup.py): started
Building wheel for promise (setup.py): finished with status 'done'
Created wheel for promise: filename=promise-2.3-py3-none-any.whl size=23948 sha256=f0a7e022866e1c079593b3a512539d53d65e643bd162c561b3e76e86d185402a
Stored in directory: /root/.cache/pip/wheels/59/9a/1d/3f1afbbb5122d0410547bf9eb50955f4a7a98e53a6d8b99bd1
Building wheel for python-slugify (setup.py): started
Building wheel for python-slugify (setup.py): finished with status 'done'
Created wheel for python-slugify: filename=python_slugify-4.0.1-py2.py3-none-any.whl size=7016 sha256=af64344f11e71991a0cc9e0cd2ffff79189af8303bc3b27da66a746ae15f07f4
Stored in directory: /root/.cache/pip/wheels/72/e6/db/122611605e60148f54ee2abaca98b2bbeafc6e22486a867bad
Successfully built frozendict future httplib2 hdfs avro-python3 crcmod oauth2client dill google-apitools absl-py termcolor wrapt py-cpuinfo pyyaml psutil kaggle docopt grpc-google-iam-v1 promise python-slugify
ERROR: google-auth 1.18.0 has requirement setuptools>=40.3.0, but you'll have setuptools 39.0.1 which is incompatible.
ERROR: google-cloud-bigquery 1.24.0 has requirement six<2.0.0dev,>=1.13.0, but you'll have six 1.11.0 which is incompatible.
ERROR: google-apitools 0.5.28 has requirement six>=1.12.0, but you'll have six 1.11.0 which is incompatible.
ERROR: apache-beam 2.20.0 has requirement avro-python3!=1.9.2,<1.10.0,>=1.8.1; python_version >= "3.0", but you'll have avro-python3 file-.avro-VERSION.txt which is incompatible.
ERROR: tensorboard 2.2.2 has requirement setuptools>=41.0.0, but you'll have setuptools 39.0.1 which is incompatible.
ERROR: tensorflow 2.2.0 has requirement scipy==1.4.1; python_version >= "3", but you'll have scipy 1.5.1 which is incompatible.
ERROR: tensorflow 2.2.0 has requirement six>=1.12.0, but you'll have six 1.11.0 which is incompatible.
ERROR: tensorflow-hub 0.8.0 has requirement six>=1.12.0, but you'll have six 1.11.0 which is incompatible.
ERROR: dm-tree 0.1.5 has requirement six>=1.12.0, but you'll have six 1.11.0 which is incompatible.
ERROR: tf-models-official 2.2.1 has requirement oauth2client>=4.1.2, but you'll have oauth2client 3.0.0 which is incompatible.
Installing collected packages: future, numpy, pyarrow, pbr, mock, httplib2, docopt, certifi, urllib3, chardet, requests, hdfs, pytz, avro-python3, pymongo, crcmod, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, protobuf, python-dateutil, grpcio, typing-extensions, fastavro, dill, cachetools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpcio-gcp, google-cloud-videointelligence, google-cloud-dlp, grpc-google-iam-v1, google-cloud-bigtable, google-cloud-language, google-cloud-vision, google-resumable-media, google-cloud-bigquery, google-cloud-pubsub, monotonic, fasteners, google-apitools, google-cloud-spanner, apache-beam, frozendict, pandas, joblib, scipy, scikit-learn, cloudpickle, gast, decorator, tensorflow-probability, absl-py, h5py, termcolor, opt-einsum, google-pasta, wrapt, tensorflow-estimator, astunparse, werkzeug, oauthlib, requests-oauthlib, google-auth-oauthlib, zipp, importlib-metadata, markdown, tensorboard-plugin-wit, tensorboard, keras-preprocessing, tensorflow, py-cpuinfo, pyyaml, tensorflow-hub, attrs, promise, tensorflow-metadata, tqdm, tensorflow-datasets, dm-tree, tensorflow-model-optimization, kiwisolver, cycler, matplotlib, opencv-python-headless, mlperf-compliance, dataclasses, typeguard, tensorflow-addons, gin-config, google-auth-httplib2, uritemplate, google-api-python-client, Cython, psutil, typing, sentencepiece, text-unidecode, python-slugify, kaggle, Pillow, tf-models-official, tf-slim, tapas
Running setup.py develop for tapas
Successfully installed Cython-0.29.21 Pillow-7.2.0 absl-py-0.9.0 apache-beam-2.20.0 astunparse-1.6.3 attrs-19.3.0 avro-python3-file-.avro-VERSION.txt cachetools-3.1.1 certifi-2020.6.20 chardet-3.0.4 cloudpickle-1.5.0 crcmod-1.7 cycler-0.10.0 dataclasses-0.7 decorator-4.4.2 dill-0.3.1.1 dm-tree-0.1.5 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 frozendict-1.2 future-0.18.2 gast-0.3.3 gin-config-0.3.0 google-api-core-1.21.0 google-api-python-client-1.9.3 google-apitools-0.5.28 google-auth-1.18.0 google-auth-httplib2-0.0.4 google-auth-oauthlib-0.4.1 google-cloud-bigquery-1.24.0 google-cloud-bigtable-1.0.0 google-cloud-core-1.3.0 google-cloud-datastore-1.7.4 google-cloud-dlp-0.13.0 google-cloud-language-1.3.0 google-cloud-pubsub-1.0.2 google-cloud-spanner-1.13.0 google-cloud-videointelligence-1.13.0 google-cloud-vision-0.42.0 google-pasta-0.2.0 google-resumable-media-0.5.1 googleapis-common-protos-1.52.0 grpc-google-iam-v1-0.12.3 grpcio-1.30.0 grpcio-gcp-0.2.2 h5py-2.10.0 hdfs-2.5.8 httplib2-0.12.0 importlib-metadata-1.7.0 joblib-0.16.0 kaggle-1.5.6 keras-preprocessing-1.1.2 kiwisolver-1.2.0 markdown-3.2.2 matplotlib-3.2.2 mlperf-compliance-0.0.10 mock-2.0.0 monotonic-1.5 numpy-1.19.0 oauth2client-3.0.0 oauthlib-3.1.0 opencv-python-headless-4.3.0.36 opt-einsum-3.2.1 pandas-1.0.5 pbr-5.4.5 promise-2.3 protobuf-3.12.2 psutil-5.7.0 py-cpuinfo-7.0.0 pyarrow-0.16.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pydot-1.4.1 pymongo-3.10.1 pyparsing-2.4.7 python-dateutil-2.8.1 python-slugify-4.0.1 pytz-2020.1 pyyaml-5.3.1 requests-2.24.0 requests-oauthlib-1.3.0 rsa-4.6 scikit-learn-0.22.2.post1 scipy-1.5.1 sentencepiece-0.1.91 tapas tensorboard-2.2.2 tensorboard-plugin-wit-1.7.0 tensorflow-2.2.0 tensorflow-addons-0.10.0 tensorflow-datasets-3.2.0 tensorflow-estimator-2.2.0 tensorflow-hub-0.8.0 tensorflow-metadata-0.22.2 tensorflow-model-optimization-0.3.0 tensorflow-probability-0.10.0 termcolor-1.1.0 text-unidecode-1.3 tf-models-official-2.2.1 tf-slim-1.1.0 tqdm-4.47.0 typeguard-2.9.1 typing-3.7.4.1 typing-extensions-3.7.4.2 uritemplate-3.0.1 urllib3-1.24.3 werkzeug-1.0.1 wrapt-1.12.1 zipp-3.1.0
Collecting tox
Downloading tox-3.16.1-py2.py3-none-any.whl (137 kB)
Collecting py>=1.4.17
Downloading py-1.9.0-py2.py3-none-any.whl (99 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
Downloading virtualenv-20.0.26-py2.py3-none-any.whl (4.9 MB)
Requirement already satisfied: importlib-metadata<2,>=0.12; python_version < "3.8" in /usr/local/lib/python3.6/dist-packages (from tox) (1.7.0)
Collecting packaging>=14
Downloading packaging-20.4-py2.py3-none-any.whl (37 kB)
Collecting toml>=0.9.4
Downloading toml-0.10.1-py2.py3-none-any.whl (19 kB)
Collecting pluggy>=0.12.0
Downloading pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting six>=1.14.0
Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting filelock>=3.0.0
Downloading filelock-3.0.12-py3-none-any.whl (7.6 kB)
Collecting distlib<1,>=0.3.1
Downloading distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting appdirs<2,>=1.4.3
Downloading appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB)
Collecting importlib-resources>=1.0; python_version < "3.7"
Downloading importlib_resources-3.0.0-py2.py3-none-any.whl (23 kB)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata<2,>=0.12; python_version < "3.8"->tox) (3.1.0)
Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging>=14->tox) (2.4.7)
ERROR: tf-models-official 2.2.1 has requirement oauth2client>=4.1.2, but you'll have oauth2client 3.0.0 which is incompatible.
ERROR: tensorflow 2.2.0 has requirement scipy==1.4.1; python_version >= "3", but you'll have scipy 1.5.1 which is incompatible.
ERROR: tensorboard 2.2.2 has requirement setuptools>=41.0.0, but you'll have setuptools 39.0.1 which is incompatible.
ERROR: google-auth 1.18.0 has requirement setuptools>=40.3.0, but you'll have setuptools 39.0.1 which is incompatible.
ERROR: apache-beam 2.20.0 has requirement avro-python3!=1.9.2,<1.10.0,>=1.8.1; python_version >= "3.0", but you'll have avro-python3 file-.avro-VERSION.txt which is incompatible.
Installing collected packages: py, distlib, appdirs, filelock, importlib-resources, six, virtualenv, packaging, toml, pluggy, tox
Attempting uninstall: six
Found existing installation: six 1.11.0
Uninstalling six-1.11.0:
Successfully uninstalled six-1.11.0
Successfully installed appdirs-1.4.4 distlib-0.3.1 filelock-3.0.12 importlib-resources-3.0.0 packaging-20.4 pluggy-0.13.1 py-1.9.0 six-1.15.0 toml-0.10.1 tox-3.16.1 virtualenv-20.0.26
GLOB sdist-make: /home/tapas/setup.py
py36 create: /home/tapas/.tox/py36
py36 installdeps: -r/home/tapas/requirements.txt
py36 inst: /home/tapas/.tox/.tmp/package/1/tapas-0.0.1.dev0.zip
py36 installed: absl-py==0.9.0,apache-beam==2.20.0,astunparse==1.6.3,attrs==19.3.0,avro-python3===file-.avro-VERSION.txt,cachetools==3.1.1,certifi==2020.6.20,chardet==3.0.4,cloudpickle==1.5.0,crcmod==1.7,cycler==0.10.0,Cython==0.29.21,dataclasses==0.7,decorator==4.4.2,dill==0.3.1.1,dm-tree==0.1.5,docopt==0.6.2,fastavro==0.21.24,fasteners==0.15,frozendict==1.2,future==0.18.2,gast==0.3.3,gin-config==0.3.0,google-api-core==1.21.0,google-api-python-client==1.9.3,google-apitools==0.5.28,google-auth==1.18.0,google-auth-httplib2==0.0.4,google-auth-oauthlib==0.4.1,google-cloud-bigquery==1.24.0,google-cloud-bigtable==1.0.0,google-cloud-core==1.3.0,google-cloud-datastore==1.7.4,google-cloud-dlp==0.13.0,google-cloud-language==1.3.0,google-cloud-pubsub==1.0.2,google-cloud-spanner==1.13.0,google-cloud-videointelligence==1.13.0,google-cloud-vision==0.42.0,google-pasta==0.2.0,google-resumable-media==0.5.1,googleapis-common-protos==1.52.0,grpc-google-iam-v1==0.12.3,grpcio==1.30.0,grpcio-gcp==0.2.2,h5py==2.10.0,hdfs==2.5.8,httplib2==0.12.0,idna==2.10,importlib-metadata==1.7.0,joblib==0.16.0,kaggle==1.5.6,Keras-Preprocessing==1.1.2,kiwisolver==1.2.0,Markdown==3.2.2,matplotlib==3.2.2,mlperf-compliance==0.0.10,mock==2.0.0,monotonic==1.5,numpy==1.19.0,oauth2client==3.0.0,oauthlib==3.1.0,opencv-python-headless==4.3.0.36,opt-einsum==3.2.1,pandas==1.0.5,pbr==5.4.5,Pillow==7.2.0,promise==2.3,protobuf==3.12.2,psutil==5.7.0,py-cpuinfo==7.0.0,pyarrow==0.16.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.1,pymongo==3.10.1,pyparsing==2.4.7,python-dateutil==2.8.1,python-slugify==4.0.1,pytz==2020.1,PyYAML==5.3.1,requests==2.24.0,requests-oauthlib==1.3.0,rsa==4.6,scikit-learn==0.22.2.post1,scipy==1.5.1,sentencepiece==0.1.91,six==1.15.0,tapas @ file:///home/tapas/.tox/.tmp/package/1/tapas-0.0.1.dev0.zip,tensorboard==2.2.2,tensorboard-plugin-wit==1.7.0,tensorflow==2.2.0,tensorflow-addons==0.10.0,tensorflow-datasets==3.2.0,tensorflow-estimator==2.2.0,tensorflow-hub==0.8.0,tensorflow-metadata==0.22.2,tensorflow-model-optimization==0.3.0,tensorflow-probability==0.10.0,termcolor==1.1.0,text-unidecode==1.3,tf-models-official==2.2.1,tf-slim==1.1.0,tqdm==4.47.0,typeguard==2.9.1,typing==3.7.4.1,typing-extensions==3.7.4.2,uritemplate==3.0.1,urllib3==1.24.3,Werkzeug==1.0.1,wrapt==1.12.1,zipp==3.1.0
py36 run-test-pre: PYTHONHASHSEED='2962058561'
py36 run-test: commands[0] | python -m unittest discover -p '*_test.py'
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/compat/v2_compat.py:96: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.
Instructions for updating:
non-resource variables are not supported in the long term
WARNING:tensorflow:From /home/tapas/tapas/datasets/dataset.py:58: parallel_interleave (from tensorflow.python.data.experimental.ops.interleave_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.data.Dataset.interleave(map_func, cycle_length, block_length, num_parallel_calls=tf.data.experimental.AUTOTUNE)` instead. If sloppy execution is desired, use `tf.data.Options.experimental_deterministic`.
WARNING:tensorflow:From /home/tapas/tapas/datasets/dataset.py:73: map_and_batch (from tensorflow.python.data.experimental.ops.batching) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.data.Dataset.map(map_func, num_parallel_calls)` followed by `tf.data.Dataset.batch(batch_size, drop_remainder)`. Static tf.data optimizations will take care of using the fused implementation.
2020-07-13 11:58:09.847162: W tensorflow/stream_executor/platform/default/dso_loader.cc:55] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/nvidia/lib:/usr/local/nvidia/lib64
2020-07-13 11:58:09.847223: E tensorflow/stream_executor/cuda/cuda_driver.cc:313] failed call to cuInit: UNKNOWN ERROR (303)
2020-07-13 11:58:09.847270: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:163] no NVIDIA GPU device is present: /dev/nvidia0 does not exist
2020-07-13 11:58:09.847543: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-07-13 11:58:09.852386: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 2799925000 Hz
2020-07-13 11:58:09.852779: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f48e4000b20 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-07-13 11:58:09.852795: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
.......s/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/framework/tensor_util.py:523: DeprecationWarning: tostring() is deprecated. Use tobytes() instead.
tensor_proto.tensor_content = nparray.tostring()
..........sE.WARNING:tensorflow:From /home/tapas/tapas/models/bert/modeling.py:252: dense (from tensorflow.python.layers.core) is deprecated and will be removed in a future version.
Instructions for updating:
Use keras.layers.Dense instead.
WARNING:tensorflow:From /home/tapas/tapas/models/bert/modeling.py:252: dense (from tensorflow.python.layers.core) is deprecated and will be removed in a future version.
Instructions for updating:
Use keras.layers.Dense instead.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/layers/core.py:187: Layer.apply (from tensorflow.python.keras.engine.base_layer_v1) is deprecated and will be removed in a future version.
Instructions for updating:
Please use `layer.__call__` method instead.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/layers/core.py:187: Layer.apply (from tensorflow.python.keras.engine.base_layer_v1) is deprecated and will be removed in a future version.
Instructions for updating:
Please use `layer.__call__` method instead.
.sWARNING:tensorflow:From /usr/lib/python3.6/contextlib.py:60: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `self.session()` or `self.cached_session()` instead.
WARNING:tensorflow:From /usr/lib/python3.6/contextlib.py:60: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `self.session()` or `self.cached_session()` instead.
..s.........sEWARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model_test.py:65: The name tf.estimator.tpu.TPUEstimator is deprecated. Please use tf.compat.v1.estimator.tpu.TPUEstimator instead.
WARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model_test.py:65: The name tf.estimator.tpu.TPUEstimator is deprecated. Please use tf.compat.v1.estimator.tpu.TPUEstimator instead.
WARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model_test.py:68: The name tf.estimator.tpu.RunConfig is deprecated. Please use tf.compat.v1.estimator.tpu.RunConfig instead.
WARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model_test.py:68: The name tf.estimator.tpu.RunConfig is deprecated. Please use tf.compat.v1.estimator.tpu.RunConfig instead.
WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f491014b048>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f491014b048>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/ops/resource_variable_ops.py:1666: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/ops/resource_variable_ops.py:1666: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/training/training_util.py:236: Variable.initialized_value (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version.
Instructions for updating:
Use Variable.read_value. Variables in 2.X are initialized automatically both in eager and graph (inside tf.defun) contexts.
WARNING:tensorflow:From /home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow/python/training/training_util.py:236: Variable.initialized_value (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version.
Instructions for updating:
Use Variable.read_value. Variables in 2.X are initialized automatically both in eager and graph (inside tf.defun) contexts.
WARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model.py:192: The name tf.estimator.tpu.TPUEstimatorSpec is deprecated. Please use tf.compat.v1.estimator.tpu.TPUEstimatorSpec instead.
WARNING:tensorflow:From /home/tapas/tapas/models/tapas_pretraining_model.py:192: The name tf.estimator.tpu.TPUEstimatorSpec is deprecated. Please use tf.compat.v1.estimator.tpu.TPUEstimatorSpec instead.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 7 vs previous value: 7. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 7 vs previous value: 7. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 10 vs previous value: 10. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 10 vs previous value: 10. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 12 vs previous value: 12. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 12 vs previous value: 12. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 14 vs previous value: 14. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 14 vs previous value: 14. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 16 vs previous value: 16. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 16 vs previous value: 16. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.
.WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f48f8712400>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f48f8712400>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f49227ab8c8>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:Estimator's model_fn (<function model_fn_builder.<locals>.model_fn at 0x7f49227ab8c8>) includes params argument, but params are not passed to Estimator.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
WARNING:tensorflow:eval_on_tpu ignored because use_tpu is False.
.s.........WARNING:tensorflow:From /home/tapas/tapas/scripts/prediction_utils.py:41: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version.
Instructions for updating:
Use eager execution and:
`tf.data.TFRecordDataset(path)`
WARNING:tensorflow:From /home/tapas/tapas/scripts/prediction_utils.py:41: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version.
Instructions for updating:
Use eager execution and:
`tf.data.TFRecordDataset(path)`
................................................................./home/tapas/.tox/py36/lib/python3.6/site-packages/apache_beam/io/filesystem.py:599: DeprecationWarning: Flags not at the start of the expression 'tapas\\/utils\\/testda' (truncated)
re_pattern = re.compile(self.translate_pattern(pattern))
WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
/home/tapas/.tox/py36/lib/python3.6/site-packages/apache_beam/io/filesystem.py:599: DeprecationWarning: Flags not at the start of the expression '\\/tmp\\/tmpfkm018bx\\/' (truncated)
re_pattern = re.compile(self.translate_pattern(pattern))
.................................................
======================================================================
ERROR: tapas.experiments.prediction_utils_test (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: tapas.experiments.prediction_utils_test
Traceback (most recent call last):
File "/usr/lib/python3.6/unittest/loader.py", line 428, in _find_test_path
module = self._get_module_from_name(name)
File "/usr/lib/python3.6/unittest/loader.py", line 369, in _get_module_from_name
__import__(name)
File "/home/tapas/tapas/experiments/prediction_utils_test.py", line 21, in <module>
from tapas.experiments import prediction_utils
File "/home/tapas/tapas/experiments/prediction_utils.py", line 25, in <module>
from tapas.models import tapas_classifier_model
File "/home/tapas/tapas/models/tapas_classifier_model.py", line 31, in <module>
import tensorflow_probability as tfp
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/__init__.py", line 76, in <module>
from tensorflow_probability.python import * # pylint: disable=wildcard-import
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/__init__.py", line 23, in <module>
from tensorflow_probability.python import distributions
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/distributions/__init__.py", line 88, in <module>
from tensorflow_probability.python.distributions.pixel_cnn import PixelCNN
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/distributions/pixel_cnn.py", line 37, in <module>
from tensorflow_probability.python.layers import weight_norm
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/layers/__init__.py", line 31, in <module>
from tensorflow_probability.python.layers.distribution_layer import CategoricalMixtureOfOneHotCategorical
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/layers/distribution_layer.py", line 28, in <module>
from cloudpickle.cloudpickle import CloudPickler
ImportError: cannot import name 'CloudPickler'
======================================================================
ERROR: tapas.models.tapas_classifier_model_test (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: tapas.models.tapas_classifier_model_test
Traceback (most recent call last):
File "/usr/lib/python3.6/unittest/loader.py", line 428, in _find_test_path
module = self._get_module_from_name(name)
File "/usr/lib/python3.6/unittest/loader.py", line 369, in _get_module_from_name
__import__(name)
File "/home/tapas/tapas/models/tapas_classifier_model_test.py", line 22, in <module>
from tapas.models import tapas_classifier_model
File "/home/tapas/tapas/models/tapas_classifier_model.py", line 31, in <module>
import tensorflow_probability as tfp
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/__init__.py", line 76, in <module>
from tensorflow_probability.python import * # pylint: disable=wildcard-import
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/__init__.py", line 23, in <module>
from tensorflow_probability.python import distributions
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/distributions/__init__.py", line 88, in <module>
from tensorflow_probability.python.distributions.pixel_cnn import PixelCNN
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/distributions/pixel_cnn.py", line 37, in <module>
from tensorflow_probability.python.layers import weight_norm
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/layers/__init__.py", line 31, in <module>
from tensorflow_probability.python.layers.distribution_layer import CategoricalMixtureOfOneHotCategorical
File "/home/tapas/.tox/py36/lib/python3.6/site-packages/tensorflow_probability/python/layers/distribution_layer.py", line 28, in <module>
from cloudpickle.cloudpickle import CloudPickler
ImportError: cannot import name 'CloudPickler'
----------------------------------------------------------------------
Ran 163 tests in 23.146s
FAILED (errors=2, skipped=6)
Running pipeline with direct runner this might take a long time!
ERROR: InvocationError for command /home/tapas/.tox/py36/bin/python -m unittest discover -p '*_test.py' (exited with code 1)
py37 create: /home/tapas/.tox/py37
ERROR: InterpreterNotFound: python3.7
___________________________________ summary ____________________________________
ERROR: py36: commands failed
ERROR: py37: InterpreterNotFound: python3.7
The command '/bin/sh -c cd /home && git clone https://github.com/google-research/tapas.git && cd tapas && pip3 install --upgrade pip && pip install -e . && pip install tox && tox' returned a non-zero code: 1
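For reference, the failing RUN command above corresponds roughly to a Dockerfile like the following. This is only a sketch: the base image and the apt packages are assumptions, since the actual Dockerfile is not shown in this issue.

```Dockerfile
# Minimal sketch reconstructed from the failing /bin/sh RUN command above.
# FROM line and apt packages are assumptions (the log is consistent with
# Ubuntu 18.04 / Python 3.6, but the real base image is not stated).
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y git python3 python3-pip

# Same chain of commands reported in the build error.
RUN cd /home && \
    git clone https://github.com/google-research/tapas.git && \
    cd tapas && \
    pip3 install --upgrade pip && \
    pip install -e . && \
    pip install tox && \
    tox
```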
Apparently there was an issue with a tensorflow-probability dependency. The right fix at the moment is changing the version dependency to 0.10.1, as mentioned in https://github.com/tensorflow/probability/issues/991. We will release a new version soon; in the meantime, you can update the requirements file directly.
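For example, assuming the pin in requirements.txt currently reads tensorflow-probability==0.10.0 (as the log above suggests), a minimal sketch of the workaround before installing:

```sh
# Inside the cloned tapas directory: bump the tensorflow-probability pin
# from 0.10.0 to 0.10.1 (hypothetical one-liner; edit the file by hand if
# the pin is written differently), then install as usual.
sed -i 's/tensorflow-probability==0.10.0/tensorflow-probability==0.10.1/' requirements.txt
pip install -e .

# Or, in an environment where tapas is already installed, simply upgrade:
pip install tensorflow-probability==0.10.1
```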
@OeslleLucena I have tried to dockerize TAPAS and have pushed my image to a Docker Hub repo. Note that the code is an adaptation of the TAPAS framework for prediction only, using the SQA base model. Just FYI, in case that helps.
Any pointers to a Docker image for TAPAS? I just want to try it out.
Hi all, would it be possible for you to release a Docker image for TAPAS? That would be extremely helpful for batch jobs on AWS and GCP.