QData / TextAttack

TextAttack 🐙 is a Python framework for adversarial attacks, data augmentation, and model training in NLP.
https://textattack.readthedocs.io/en/master/
MIT License

Trying to run textattack --help out of the box fails with AttributeError: module 'tensorflow.tools.docs.doc_controls' has no attribute 'inheritable_header' #560

Closed: stroypet closed this issue 2 years ago

stroypet commented 3 years ago

Make a fresh env, pip install textattack[tensorflow,optional], and textattack --help fails. See below.

C:\Users\stanl λ cd dev

C:\Users\stanl\dev λ mkdir textattacktest

C:\Users\stanl\dev λ cd textattacktest\

C:\Users\stanl\dev\textattacktest λ python -m venv tavenv

C:\Users\stanl\dev\textattacktest λ tavenv\Scripts\activate.bat

C:\Users\stanl\dev\textattacktest (tavenv) λ pip install textattack[tensorflow,optional]
Collecting textattack[optional,tensorflow]
  Using cached textattack-0.3.3-py3-none-any.whl (361 kB)
[... the long run of Collecting / Using cached / Downloading / Processing lines is trimmed here; the full resolved set, with versions, appears in the Successfully installed line below ...]
Using legacy 'setup.py install' for sentence-transformers, since package 'wheel' is not installed.
Using legacy 'setup.py install' for visdom, since package 'wheel' is not installed.
Using legacy 'setup.py install' for torchfile, since package 'wheel' is not installed.
Using legacy 'setup.py install' for emoji, since package 'wheel' is not installed.
Installing collected packages: word2number, more-itertools, regex, segtok, typing-extensions, torch, sqlitedict, mpld3, sentencepiece, future, tqdm, six, cloudpickle, numpy, scipy, networkx, hyperopt, filelock, idna, charset-normalizer, certifi, urllib3, requests, gdown, conllu, janome, python-dateutil, pyparsing, kiwisolver, pillow, cycler, matplotlib, smart-open, Cython, gensim, langdetect, joblib, threadpoolctl, scikit-learn, bpemb, tabulate, packaging, pyyaml, huggingface-hub, zipp, importlib-metadata, overrides, konoha, wikipedia-api, wcwidth, ftfy, wrapt, deprecated, lxml, tokenizers, colorama, click, sacremoses, transformers, flair, pytz, pandas, bert-score, docopt, num2words, PySocks, editdistance, language-tool-python, lemminflect, attrs, async-timeout, chardet, multidict, yarl, aiohttp, dill, multiprocess, pyarrow, fsspec, xxhash, datasets, terminaltables, lru-dict, nltk, docker-pycreds, pathtools, termcolor, yaspin, promise, smmap, gitdb, GitPython, sentry-sdk, subprocess32, configparser, protobuf, psutil, shortuuid, wandb, torchvision, sentence-transformers, tornado, pyzmq, jsonpointer, jsonpatch, torchfile, websocket-client, visdom, emoji, stanza, tensorflow-hub, wheel, astunparse, tensorflow-estimator, h5py, absl-py, clang, keras-preprocessing, flatbuffers, oauthlib, requests-oauthlib, pyasn1, pyasn1-modules, cachetools, rsa, google-auth, google-auth-oauthlib, tensorboard-data-server, grpcio, markdown, tensorboard-plugin-wit, werkzeug, tensorboard, google-pasta, gast, opt-einsum, keras, tensorflow, tensorflow-text, tensorboardX, textattack
  Running setup.py install for sentence-transformers ... done
  Running setup.py install for torchfile ... done
  Running setup.py install for visdom ... done
  Running setup.py install for emoji ... done
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.

We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.

packaging 21.2 requires pyparsing<3,>=2.0.2, but you'll have pyparsing 3.0.4 which is incompatible.
flair 0.9 requires more-itertools~=8.8.0, but you'll have more-itertools 8.10.0 which is incompatible.
datasets 1.14.0 requires tqdm>=4.62.1, but you'll have tqdm 4.49.0 which is incompatible.
tensorflow 2.6.0 requires numpy~=1.19.2, but you'll have numpy 1.21.3 which is incompatible.
tensorflow 2.6.0 requires six~=1.15.0, but you'll have six 1.16.0 which is incompatible.
tensorflow 2.6.0 requires typing-extensions~=3.7.4, but you'll have typing-extensions 3.10.0.2 which is incompatible.
Successfully installed Cython-0.29.14 GitPython-3.1.24 PySocks-1.7.1 absl-py-0.15.0 aiohttp-3.7.4.post0 astunparse-1.6.3 async-timeout-3.0.1 attrs-21.2.0 bert-score-0.3.10 bpemb-0.3.3 cachetools-4.2.4 certifi-2021.10.8 chardet-4.0.0 charset-normalizer-2.0.7 clang-5.0 click-8.0.3 cloudpickle-2.0.0 colorama-0.4.4 configparser-5.0.2 conllu-4.4.1 cycler-0.11.0 datasets-1.14.0 deprecated-1.2.13 dill-0.3.4 docker-pycreds-0.4.0 docopt-0.6.2 editdistance-0.6.0 emoji-1.6.1 filelock-3.3.2 flair-0.9 flatbuffers-1.12 fsspec-2021.10.1 ftfy-6.0.3 future-0.18.2 gast-0.4.0 gdown-3.12.2 gensim-3.8.3 gitdb-4.0.9 google-auth-2.3.2 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 grpcio-1.41.1 h5py-3.1.0 huggingface-hub-0.0.19 hyperopt-0.2.5 idna-3.3 importlib-metadata-3.10.1 janome-0.4.1 joblib-1.1.0 jsonpatch-1.32 jsonpointer-2.1 keras-2.6.0 keras-preprocessing-1.1.2 kiwisolver-1.3.2 konoha-4.6.5 langdetect-1.0.9 language-tool-python-2.6.1 lemminflect-0.2.2 lru-dict-1.1.7 lxml-4.6.3 markdown-3.3.4 matplotlib-3.4.3 more-itertools-8.10.0 mpld3-0.3 multidict-5.2.0 multiprocess-0.70.12.2 networkx-2.6.3 nltk-3.6.5 num2words-0.5.10 numpy-1.21.3 oauthlib-3.1.1 opt-einsum-3.3.0 overrides-3.1.0 packaging-21.2 pandas-1.3.4 pathtools-0.1.2 pillow-8.4.0 promise-2.3 protobuf-3.19.1 psutil-5.8.0 pyarrow-6.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pyparsing-3.0.4 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 pyzmq-22.3.0 regex-2021.10.23 requests-2.26.0 requests-oauthlib-1.3.0 rsa-4.7.2 sacremoses-0.0.46 scikit-learn-1.0.1 scipy-1.7.1 segtok-1.5.10 sentence-transformers-2.1.0 sentencepiece-0.1.95 sentry-sdk-1.4.3 shortuuid-1.0.1 six-1.16.0 smart-open-5.2.1 smmap-5.0.0 sqlitedict-1.7.0 stanza-1.3.0 subprocess32-3.5.4 tabulate-0.8.9 tensorboard-2.7.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.0 tensorboardX-2.4 tensorflow-2.6.0 tensorflow-estimator-2.7.0 tensorflow-hub-0.12.0 tensorflow-text-2.6.0 termcolor-1.1.0 terminaltables-3.1.0 textattack-0.3.3 threadpoolctl-3.0.0 tokenizers-0.10.3 torch-1.10.0 torchfile-0.1.0 torchvision-0.11.1 tornado-6.1 tqdm-4.49.0 transformers-4.12.2 typing-extensions-3.10.0.2 urllib3-1.26.7 visdom-0.1.8.9 wandb-0.12.6 wcwidth-0.2.5 websocket-client-1.2.1 werkzeug-2.0.2 wikipedia-api-0.5.4 word2number-1.1 wrapt-1.12.1 xxhash-2.0.2 yarl-1.7.0 yaspin-2.1.0 zipp-3.6.0
WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available.
You should consider upgrading via the 'c:\users\stanl\dev\textattacktest\tavenv\scripts\python.exe -m pip install --upgrade pip' command.

C:\Users\stanl\dev\textattacktest (tavenv) λ textattack --help
Traceback (most recent call last):
  File "C:\Users\stanl\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\stanl\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\stanl\dev\textattacktest\tavenv\Scripts\textattack.exe\__main__.py", line 4, in <module>
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\__init__.py", line 12, in <module>
    from .attack_args import AttackArgs, CommandLineAttackArgs
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\attack_args.py", line 15, in <module>
    from .attack import Attack
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\attack.py", line 19, in <module>
    from textattack.constraints import Constraint, PreTransformationConstraint
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\constraints\__init__.py", line 25, in <module>
    from . import semantics
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\constraints\semantics\__init__.py", line 7, in <module>
    from . import sentence_encoders
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\constraints\semantics\sentence_encoders\__init__.py", line 12, in <module>
    from .universal_sentence_encoder import (
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\constraints\semantics\sentence_encoders\universal_sentence_encoder\__init__.py", line 8, in <module>
    from .multilingual_universal_sentence_encoder import (
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\textattack\constraints\semantics\sentence_encoders\universal_sentence_encoder\multilingual_universal_sentence_encoder.py", line 5, in <module>
    import tensorflow_text  # noqa: F401
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_text\__init__.py", line 22, in <module>
    from tensorflow_text.python.ops import *
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_text\python\ops\__init__.py", line 26, in <module>
    from tensorflow_text.python.ops.hub_module_splitter import HubModuleSplitter
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_text\python\ops\hub_module_splitter.py", line 18, in <module>
    import tensorflow_hub as hub
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_hub\__init__.py", line 88, in <module>
    from tensorflow_hub.estimator import LatestModuleExporter
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_hub\estimator.py", line 62, in <module>
    class LatestModuleExporter(tf.compat.v1.estimator.Exporter):
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow\python\util\lazy_loader.py", line 62, in __getattr__
    module = self._load()
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow\python\util\lazy_loader.py", line 45, in _load
    module = importlib.import_module(self.__name__)
  File "C:\Users\stanl\AppData\Local\Programs\Python\Python38\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_estimator\__init__.py", line 10, in <module>
    from tensorflow_estimator._api.v1 import estimator
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_estimator\_api\v1\estimator\__init__.py", line 10, in <module>
    from tensorflow_estimator._api.v1.estimator import experimental
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_estimator\_api\v1\estimator\experimental\__init__.py", line 10, in <module>
    from tensorflow_estimator.python.estimator.canned.dnn import dnn_logit_fn_builder
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_estimator\python\estimator\canned\dnn.py", line 27, in <module>
    from tensorflow_estimator.python.estimator import estimator
  File "c:\users\stanl\dev\textattacktest\tavenv\lib\site-packages\tensorflow_estimator\python\estimator\estimator.py", line 70, in <module>
    @doc_controls.inheritable_header("""\
AttributeError: module 'tensorflow.tools.docs.doc_controls' has no attribute 'inheritable_header'

C:\Users\stanl\dev\textattacktest (tavenv) λ textattack attack --help
Traceback (most recent call last):
  [identical traceback to the one above]
AttributeError: module 'tensorflow.tools.docs.doc_controls' has no attribute 'inheritable_header'

Aniloid2 commented 3 years ago

Same problem here. Try:

pip uninstall tensorflow-text
pip install tensorflow-text==2.5

This should downgrade the tensorflow-text package (and tensorflow along with it). It works for me, though I'm not sure why; it must be some dependency problem.
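For anyone debugging this, here is a rough diagnostic sketch (an assumed helper, not part of TextAttack) that prints the versions of the TensorFlow-related packages from the install log above and flags the kind of minor-version mismatch at play; note the log resolves tensorflow-estimator~=2.6 to 2.7.0 while tensorflow itself is 2.6.0:

# Diagnostic sketch (assumes Python 3.8+ for importlib.metadata): print the
# installed versions of the TensorFlow-related packages and warn when
# tensorflow and tensorflow-estimator disagree on major.minor.
from importlib.metadata import PackageNotFoundError, version

PACKAGES = ["tensorflow", "tensorflow-estimator", "tensorflow-text", "tensorflow-hub"]

installed = {}
for name in PACKAGES:
    try:
        installed[name] = version(name)
    except PackageNotFoundError:
        installed[name] = "not installed"

for name, ver in installed.items():
    print(f"{name}: {ver}")


def minor(ver: str) -> str:
    # "2.6.0" -> "2.6"; non-version strings fall through unchanged.
    return ".".join(ver.split(".")[:2])


if minor(installed["tensorflow"]) != minor(installed["tensorflow-estimator"]):
    print("WARNING: tensorflow and tensorflow-estimator minor versions differ")

If the two disagree (the log above ends up with tensorflow 2.6.0 next to tensorflow-estimator 2.7.0), aligning them, for example via the downgrade above, is the likely fix.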

stroypet commented 3 years ago

@Aniloid2 Thanks, that did it!

This should probably be raised as a bug.

qiyanjun commented 2 years ago

Please install with pip install textattack[tensorflow].
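As a quick sanity check, assuming the reinstall completes cleanly, reproducing the failing import directly shows whether the AttributeError is gone before rerunning the CLI:

# Minimal check: the tracebacks above fail inside "import tensorflow_text",
# so if these two imports succeed, "textattack --help" should work again.
import tensorflow_text  # the import that raised the doc_controls AttributeError
import textattack

print("imports OK")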