Ch4osMy7h / FramenetParser

PyTorch code for "A Graph-Based Neural Model for End-to-End Frame Semantic Parsing" (EMNLP 2021)
17 stars · 5 forks

Unable to train data using the train_parser.sh file #1

Open ajaysurya1221 opened 2 years ago

ajaysurya1221 commented 2 years ago

```
train_parser.sh: line 14: allennlp: command not found
train_parser.sh: line 26: syntax error near unexpected token `done'
train_parser.sh: line 26: `done'
```

`allennlp -v` produced:

```
ModuleNotFoundError: No module named 'torch.ao.quantization'
```

Python version: 3.7.9, dependencies installed with `pip install -r requirements.txt`.

Ch4osMy7h commented 2 years ago

I guess this problem may be caused by a mismatch between the allennlp and torch versions. You can try removing the constraint `torch===1.9.0` from requirements.txt and letting allennlp automatically download the best-matching torch version.
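The suggested edit to requirements.txt, shown as a diff (assuming the pin sits on its own line, as pip requirements files usually have it):

```diff
-torch===1.9.0
```

With the pin removed, `pip install -r requirements.txt` lets allennlp's own dependency specification select a compatible torch release.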

ajaysurya1221 commented 2 years ago

```
(winenv) C:\Users\Ajay\Desktop\FramenetParser>bash train_parser.sh
train_parser.sh: line 14: allennlp: command not found
train_parser.sh: line 26: syntax error near unexpected token `done'
train_parser.sh: line 26: `done'
```

I'm still getting this :(

```
(winenv) C:\Users\Ajay\Desktop\FramenetParser>pip list
Package Version
allennlp 2.10.0
appdirs 1.4.4
atomicwrites 1.4.1
attrs 22.1.0
base58 2.1.1
black 20.8b1
blis 0.7.8
boto3 1.24.43
botocore 1.27.43
cached-path 1.1.5
cachetools 5.2.0
catalogue 2.0.8
certifi 2022.6.15
charset-normalizer 2.1.0
click 8.1.3
colorama 0.4.5
commonmark 0.9.1
cymem 2.0.6
dill 0.3.5.1
docker-pycreds 0.4.0
fairscale 0.4.6
filelock 3.7.1
flake8 5.0.3
gitdb 4.0.9
GitPython 3.1.27
google-api-core 2.8.2
google-auth 2.9.1
google-cloud-core 2.3.2
google-cloud-storage 2.4.0
google-crc32c 1.3.0
google-resumable-media 2.3.3
googleapis-common-protos 1.56.4
h5py 3.7.0
huggingface-hub 0.8.1
idna 3.3
importlib-metadata 4.12.0
iniconfig 1.1.1
Jinja2 3.1.2
jmespath 1.0.1
joblib 1.1.0
langcodes 3.3.0
lmdb 1.3.0
MarkupSafe 2.1.1
mccabe 0.7.0
more-itertools 8.13.0
murmurhash 1.0.7
mypy 0.800
mypy-extensions 0.4.3
nltk 3.7
numpy 1.21.6
packaging 21.3
pathspec 0.9.0
pathtools 0.1.2
pathy 0.6.2
Pillow 9.2.0
pip 20.1.1
pluggy 1.0.0
preshed 3.0.6
promise 2.3
protobuf 3.20.0
psutil 5.9.1
py 1.11.0
pyasn1 0.4.8
pyasn1-modules 0.2.8
pycodestyle 2.9.0
pydantic 1.8.2
pyflakes 2.5.0
Pygments 2.12.0
pyparsing 3.0.9
pytest 7.1.2
python-dateutil 2.8.2
PyYAML 6.0
regex 2022.7.25
requests 2.28.1
rich 12.1.0
rsa 4.9
s3transfer 0.6.0
sacremoses 0.0.53
scikit-learn 1.0.2
scipy 1.7.3
sentencepiece 0.1.96
sentry-sdk 1.9.0
setproctitle 1.3.0
setuptools 47.1.0
shortuuid 1.0.9
six 1.16.0
smart-open 5.2.1
smmap 5.0.0
spacy 3.3.1
spacy-legacy 3.0.9
spacy-loggers 1.0.3
srsly 2.4.4
tensorboardX 2.5.1
termcolor 1.1.0
thinc 8.0.17
threadpoolctl 3.1.0
tokenizers 0.12.1
toml 0.10.2
tomli 2.0.1
torch 1.11.0
torchvision 0.12.0
tqdm 4.64.0
traitlets 5.3.0
transformers 4.20.1
typed-ast 1.4.3
typer 0.6.1
typing-extensions 4.3.0
urllib3 1.26.11
wandb 0.12.21
wasabi 0.10.1
zipp 3.8.1
```

Ch4osMy7h commented 2 years ago

The AllenNLP docs say "we presently do not support Windows but are open to contributions." I'm sorry, but you may need to run this repo on a Linux system.

ajaysurya1221 commented 2 years ago

Now in Ubuntu:

```
(venv) ajay@ROG:/mnt/c/Users/Ajay/Desktop/FramenetParser$ bash train_parser.sh
Traceback (most recent call last):
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/bin/allennlp", line 8, in <module>
    sys.exit(run())
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/__main__.py", line 39, in run
    main(prog="allennlp")
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/commands/__init__.py", line 111, in main
    parser, args = parse_args(prog)
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/commands/__init__.py", line 99, in parse_args
    import_plugins()
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/common/plugins.py", line 102, in import_plugins
    importlib.import_module(module_name)
  File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/framenet_parser/__init__.py", line 8, in <module>
    from framenet_parser.dataset_readers import *
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/framenet_parser/dataset_readers/__init__.py", line 1, in <module>
    from framenet_parser.dataset_readers.framenet_reader import FramenetParserReader
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/framenet_parser/dataset_readers/framenet_reader.py", line 71, in <module>
    class FramenetParserReader(DatasetReader):
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/framenet_parser/dataset_readers/framenet_reader.py", line 84, in FramenetParserReader
    def _read(self, file_path: str):
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/overrides/overrides.py", line 88, in overrides
    return _overrides(method, check_signature, check_at_runtime)
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/overrides/overrides.py", line 114, in _overrides
    _validate_method(method, super_class, check_signature)
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/overrides/overrides.py", line 135, in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/overrides/signature.py", line 93, in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/overrides/signature.py", line 288, in ensure_return_type_compatibility
    f"{method_name}: return type {sub_return} is not a {super_return}."
TypeError: FramenetParserReader._read: return type None is not a typing.Iterable[allennlp.data.instance.Instance].
train_parser.sh: line 26: syntax error near unexpected token `done'
train_parser.sh: line 26: `done'
```

Please help.

Ch4osMy7h commented 2 years ago

Try removing the `@overrides` decorators in every file, not only in dataset_reader.py. Alternatively, you can downgrade allennlp to version 2.3.0 with `pip install allennlp==2.3.0`.
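A minimal stdlib sketch (not from the repo) of why the `@overrides` check rejects `_read`: the `overrides` package compares type hints between the base and subclass methods, and a method with no return annotation has no `"return"` hint, which the package reports as "return type None is not a typing.Iterable[...]". So besides removing the decorators, annotating `_read`'s return type to match the base class should also satisfy the check:

```python
from typing import Iterable, get_type_hints

class BaseReader:
    # Stand-in for allennlp's DatasetReader._read signature
    def _read(self, file_path: str) -> Iterable[str]: ...

class Unannotated(BaseReader):
    def _read(self, file_path: str):  # no return annotation, as in the traceback
        yield file_path

class Annotated(BaseReader):
    def _read(self, file_path: str) -> Iterable[str]:  # matches the base class
        yield file_path

# The overrides package inspects these hints when validating the subclass:
print(get_type_hints(Unannotated._read).get("return"))  # None
print(get_type_hints(Annotated._read).get("return"))    # typing.Iterable[str]
```

`BaseReader` here is a toy stand-in; the real check runs against `allennlp.data.DatasetReader`, whose `_read` is annotated `-> Iterable[Instance]`.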

ajaysurya1221 commented 2 years ago

Thanks, @Ch4osMy7h worked like a charm.

ajaysurya1221 commented 2 years ago

sorry to bother you @Ch4osMy7h

```
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/common/params.py", line 423, in assert_empty
    "Extra parameters passed to {}: {}".format(class_name, self.params)
allennlp.common.checks.ConfigurationError: Extra parameters passed to Checkpointer: {'keep_most_recent_by_count': 1}
train_parser.sh: line 26: syntax error near unexpected token `done'
train_parser.sh: line 26: `done'
```

The full traceback is too big to paste.

Ch4osMy7h commented 2 years ago

Change `"keep_most_recent_by_count"` to `"num_serialized_models_to_keep"` in the config file.
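The rename in context, assuming a trainer section shaped roughly like the repo's training config (the surrounding keys are illustrative; only the checkpointer parameter name matters):

```jsonnet
{
  trainer: {
    checkpointer: {
      // allennlp 2.3.0's Checkpointer expects the older parameter name:
      num_serialized_models_to_keep: 1,  // was: keep_most_recent_by_count: 1
    },
  },
}
```

The parameter was renamed between allennlp releases, so configs written for newer versions trip the `assert_empty` check after downgrading.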

ajaysurya1221 commented 2 years ago

Now this:

```
  File "/mnt/c/Users/Ajay/Desktop/FramenetParser/venv/lib/python3.7/site-packages/allennlp/modules/token_embedders/pretrained_transformer_embedder.py", line 195, in forward
    max_type_id = type_ids.max()
RuntimeError: CUDA error: no kernel image is available for execution on the device
```

Ch4osMy7h commented 2 years ago

Which type of GPU did you use? If it is an RTX 3090 or a Tesla A100, you may need to install a torch build with cudatoolkit >= 11.0 from https://pytorch.org/get-started/previous-versions.
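A hedged sketch (not from the repo) of why this error appears on Ampere GPUs: torch builds ship CUDA kernels compiled per compute capability ("sm_XX"), and if no compiled arch covers the GPU, any kernel launch raises "no kernel image is available". The RTX 3090 (capability 8.6) and A100 (8.0) need cu110+ builds:

```python
def build_supports(arch_list, capability):
    """True if the build ships kernels covering this GPU's (major, minor) capability."""
    compiled = [divmod(int(a.split("_")[1]), 10) for a in arch_list]  # "sm_86" -> (8, 6)
    return any(c >= capability for c in compiled)

# cu102-era builds shipped kernels only up to sm_75 (Turing), so an A100 fails:
print(build_supports(["sm_37", "sm_50", "sm_60", "sm_70", "sm_75"], (8, 0)))  # False
# cu113 builds add sm_80/sm_86 (Ampere), covering an RTX 3090:
print(build_supports(["sm_60", "sm_70", "sm_75", "sm_80", "sm_86"], (8, 6)))  # True
```

On a real machine you can run the same comparison with `torch.cuda.get_arch_list()` (the build's compiled archs) against `torch.cuda.get_device_capability()` (the GPU's capability).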

ajaysurya1221 commented 2 years ago

ok thanks :) @Ch4osMy7h

ajaysurya1221 commented 2 years ago

Hey @Ch4osMy7h, I've started training and it is running fine. Thanks a lot! I need to write an inference script for my final project; could you please share one if you have written any? It would be a great help. Thanks in advance. :)

Ch4osMy7h commented 2 years ago

I included example code in README.md; you can see it at the end of the page.

ajaysurya1221 commented 2 years ago

Thanks, @Ch4osMy7h. I haven't come across a developer this enthusiastic who replies so quickly!

ajaysurya1221 commented 2 years ago

Hey @Ch4osMy7h, I started my training in Colab Pro+, and unfortunately the runtime got disconnected when it hit the 62nd epoch. Do you have your best model saved somewhere that you could share via Drive or something, please?

Ch4osMy7h commented 2 years ago

Could you please give me your email address? I will send the checkpoints to you.

ajaysurya1221 commented 2 years ago

ajaysuryasenthilrajan@gmail.com

you're a saviour @Ch4osMy7h 😊

ajaysurya1221 commented 2 years ago

Hi @Ch4osMy7h, I'm at the final stage now and I'm facing one issue. Could you please help?

Command:

```
!allennlp predict --output-file output.txt --include-package framenet_parser --predictor framenet_parser --cuda-device 0 experiments/training/framenet_parser_v1.7_233 experiments/inference/sample.json
```

```
Traceback (most recent call last):
  File "/usr/local/bin/allennlp", line 8, in <module>
    sys.exit(run())
  File "/usr/local/lib/python3.7/dist-packages/allennlp/__main__.py", line 34, in run
    main(prog="allennlp")
  File "/usr/local/lib/python3.7/dist-packages/allennlp/commands/__init__.py", line 119, in main
    args.func(args)
  File "/usr/local/lib/python3.7/dist-packages/allennlp/commands/predict.py", line 259, in _predict
    predictor = _get_predictor(args)
  File "/usr/local/lib/python3.7/dist-packages/allennlp/commands/predict.py", line 142, in _get_predictor
    extra_args=predictor_args,
  File "/usr/local/lib/python3.7/dist-packages/allennlp/predictors/predictor.py", line 409, in from_archive
    return predictor_class(model, dataset_reader, extra_args)
  File "/content/framenet_parser/predictors/framenet_parser.py", line 21, in __init__
    self._tokenizer = SpacyTokenizer(language=language, pos_tags=True)
  File "/usr/local/lib/python3.7/dist-packages/allennlp/data/tokenizers/spacy_tokenizer.py", line 63, in __init__
    self.spacy = get_spacy_model(language, pos_tags, parse, ner)
  File "/usr/local/lib/python3.7/dist-packages/allennlp/common/util.py", line 277, in get_spacy_model
    spacy_model = spacy.load(spacy_model_name, disable=disable)
  File "/usr/local/lib/python3.7/dist-packages/spacy/__init__.py", line 51, in load
    name, vocab=vocab, disable=disable, exclude=exclude, config=config
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 324, in load_model
    return load_model_from_package(name, kwargs)
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 357, in load_model_from_package
    return cls.load(vocab=vocab, disable=disable, exclude=exclude, config=config)
  File "/usr/local/lib/python3.7/dist-packages/en_core_web_sm/__init__.py", line 10, in load
    return load_model_from_init_py(__file__, **overrides)
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 523, in load_model_from_init_py
    config=config,
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 392, in load_model_from_path
    nlp = load_model_from_config(config, vocab=vocab, disable=disable, exclude=exclude)
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 435, in load_model_from_config
    validate=validate,
  File "/usr/local/lib/python3.7/dist-packages/spacy/language.py", line 1677, in from_config
    raw_config=raw_config,
  File "/usr/local/lib/python3.7/dist-packages/spacy/language.py", line 779, in add_pipe
    validate=validate,
  File "/usr/local/lib/python3.7/dist-packages/spacy/language.py", line 660, in create_pipe
    resolved = registry.resolve(cfg, validate=validate)
  File "/usr/local/lib/python3.7/dist-packages/thinc/config.py", line 747, in resolve
    config, schema=schema, overrides=overrides, validate=validate, resolve=True
  File "/usr/local/lib/python3.7/dist-packages/thinc/config.py", line 796, in _make
    config, schema, validate=validate, overrides=overrides, resolve=resolve
  File "/usr/local/lib/python3.7/dist-packages/thinc/config.py", line 856, in _fill
    overrides=overrides,
  File "/usr/local/lib/python3.7/dist-packages/thinc/config.py", line 849, in _fill
    promise_schema = cls.make_promise_schema(value, resolve=resolve)
  File "/usr/local/lib/python3.7/dist-packages/thinc/config.py", line 1040, in make_promise_schema
    func = cls.get(reg_name, func_name)
  File "/usr/local/lib/python3.7/dist-packages/spacy/util.py", line 143, in get
    ) from None
catalogue.RegistryError: [E893] Could not find function 'spacy.Tagger.v2' in function registry 'architectures'. If you're using a custom function, make sure the code is available. If the function is provided by a third-party package, e.g. spacy-transformers, make sure the package is installed in your environment.

Available names: spacy-legacy.CharacterEmbed.v1, spacy-legacy.EntityLinker.v1, spacy-legacy.HashEmbedCNN.v1, spacy-legacy.MaxoutWindowEncoder.v1, spacy-legacy.MishWindowEncoder.v1, spacy-legacy.MultiHashEmbed.v1, spacy-legacy.Tagger.v1, spacy-legacy.TextCatBOW.v1, spacy-legacy.TextCatCNN.v1, spacy-legacy.TextCatEnsemble.v1, spacy-legacy.Tok2Vec.v1, spacy-legacy.TransitionBasedParser.v1, spacy.CharacterEmbed.v2, spacy.EntityLinker.v1, spacy.HashEmbedCNN.v2, spacy.MaxoutWindowEncoder.v2, spacy.MishWindowEncoder.v2, spacy.MultiHashEmbed.v2, spacy.PretrainCharacters.v1, spacy.PretrainVectors.v1, spacy.Tagger.v1, spacy.TextCatBOW.v1, spacy.TextCatCNN.v1, spacy.TextCatEnsemble.v2, spacy.TextCatLowData.v1, spacy.Tok2Vec.v2, spacy.Tok2VecListener.v1, spacy.TorchBiLSTMEncoder.v1, spacy.TransitionBasedParser.v1, spacy.TransitionBasedParser.v2
```

Ch4osMy7h commented 2 years ago

My current environment is:

```
torch    1.11.0
allennlp 2.10.0
spacy    3.3.1
```

You can check your versions against it.

ajaysurya1221 commented 2 years ago

Thanks, @Ch4osMy7h, the code is all working now.