allenai / allennlp

An open-source NLP research library, built on PyTorch.
http://www.allennlp.org
Apache License 2.0

allennlp.common.checks.ConfigurationError: 'XXXXXXXXX not in acceptable choices for model.type #3524

Closed · manelAffi closed this issue 4 years ago

manelAffi commented 4 years ago

Question

I'm trying to train a custom model using allennlp train /path/to/the/config/file -s /path/to/the/output/directory. config.json:

{
    ....
    "model": {
        "type": "MyModelName",
    ....},
    ...
}
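For context, the "type" value in the config is resolved through a name registry that is only populated when the module defining the model is imported. Below is a minimal plain-Python sketch of that registry pattern (a stand-in for illustration, not the real allennlp code; the class and model names are hypothetical):

```python
# Sketch of the registration pattern: a decorator maps a string name to a
# class, and config resolution looks the name up in that mapping.
class Model:
    _registry = {}

    @classmethod
    def register(cls, name):
        def decorator(subclass):
            # Runs at import time of the module containing the subclass.
            cls._registry[name] = subclass
            return subclass
        return decorator

    @classmethod
    def by_name(cls, name):
        if name not in cls._registry:
            # Mirrors the "not in acceptable choices" failure mode.
            raise KeyError(
                f"'{name}' not in acceptable choices: {sorted(cls._registry)}"
            )
        return cls._registry[name]


# If the module containing this class is never imported, "MyModelName"
# never enters the registry and lookup fails.
@Model.register("MyModelName")
class MyModel(Model):
    pass


print(Model.by_name("MyModelName").__name__)  # -> MyModel
```

The key point: the decorator only runs when its module is imported, which is exactly what the error message's mention of --include-package is about.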

I encounter the ConfigurationError given in the title, and I do not know how to deal with it. The traceback is as follows:

Traceback (most recent call last):
  File "/usr/local/bin/allennlp", line 8, in <module>
    sys.exit(run())
  File "/usr/local/lib/python3.6/dist-packages/allennlp/run.py", line 18, in run
    main(prog="allennlp")
  File "/usr/local/lib/python3.6/dist-packages/allennlp/commands/__init__.py", line 102, in main
    args.func(args)
  File "/usr/local/lib/python3.6/dist-packages/allennlp/commands/train.py", line 124, in train_model_from_args
    args.cache_prefix)
  File "/usr/local/lib/python3.6/dist-packages/allennlp/commands/train.py", line 168, in train_model_from_file
    cache_directory, cache_prefix)
  File "/usr/local/lib/python3.6/dist-packages/allennlp/commands/train.py", line 226, in train_model
    cache_prefix)
  File "/usr/local/lib/python3.6/dist-packages/allennlp/training/trainer_pieces.py", line 65, in from_params
    model = Model.from_params(vocab=vocab, params=params.pop('model'))
  File "/usr/local/lib/python3.6/dist-packages/allennlp/common/from_params.py", line 359, in from_params
    default_to_first_choice=default_to_first_choice)
  File "/usr/local/lib/python3.6/dist-packages/allennlp/common/params.py", line 363, in pop_choice
    raise ConfigurationError(message)
allennlp.common.checks.ConfigurationError: 'MyModelName' not in acceptable choices for model.type: ['bert_for_classification', 'bcn', 'constituency_parser', 'biaffine_parser', 'biaffine_parser_multilang', 'coref', 'crf_tagger', 'decomposable_attention', 'event2mind', 'simple_seq2seq', 'composed_seq2seq', 'copynet_seq2seq', 'bidaf', 'bidaf-ensemble', 'dialog_qa', 'naqanet', 'qanet', 'nlvr_coverage_parser', 'nlvr_direct_parser', 'quarel_parser', 'wikitables_mml_parser', 'wikitables_erm_parser', 'atis_parser', 'text2sql_parser', 'srl', 'simple_tagger', 'esim', 'bimpm', 'graph_parser', 'language_model', 'bidirectional_language_model', 'bidirectional-language-model', 'masked_language_model', 'next_token_lm', 'basic_classifier', 'srl_bert']. You should either use the --include-package flag to make sure the correct module is loaded, or use a fully qualified class name in your config file like {"model": "my_module.models.MyModel"} to have it imported automatically.

I'm trying to use the --include-package flag, but I don't know what exactly the --include-package argument should be. @matt-gardner

dirkgr commented 4 years ago

Stolen from the tagger documentation:

The decorator that registers your classes only runs when the module containing it is loaded. And the allennlp execution script has no way of knowing it needs to load the modules containing your custom code (indeed, it doesn't even know those modules exist). And so the --include-package argument tells AllenNLP to load the specified modules (and in particular run their register decorators) before instantiating and training your module.

In other words, load your package with --include-package my.package.name, the same way you would import it with import my.package.name.
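To make the equivalence concrete, here is a small sketch of what --include-package effectively does: import the named module for its side effects, so the register decorators run before the config is resolved. The package name "my_project.models" below is a hypothetical example, not something from this issue:

```python
import importlib


def include_package(package_name: str) -> None:
    """Import a package purely for its registration side effects,
    roughly what AllenNLP does for each --include-package argument."""
    importlib.import_module(package_name)


# A command like:
#   allennlp train config.json -s out/ --include-package my_project.models
# is then roughly equivalent to calling
#   include_package("my_project.models")
# before the model is built from the config.
```

So the argument to --include-package is the same dotted name you would write after the import keyword in Python.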

manelAffi commented 4 years ago

Hi, thanks a lot. I tried to use --include-package models.MyModelName, but it fails with ModuleNotFoundError: No module named 'models'. Another question: what should my model path be under Google Colab? I was wondering if you could help me.

Thanks, best regards, Manel. @dirkgr

dirkgr commented 4 years ago

I don't know anything about Google Colab, but here is the general documentation of how Python loads modules: https://docs.python.org/3.7/reference/import.html

One thing I have found helpful in the past is explicitly setting the PYTHONPATH environment variable so your module is on the path.
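For a notebook environment like Colab, where exporting PYTHONPATH before launch is awkward, one common workaround is to add the project directory at runtime. A minimal sketch, assuming your code lives at the hypothetical path "/content/my_project" (not a path from this issue):

```python
import os
import sys

project_root = "/content/my_project"  # assumption: where your package lives

# Make the package importable in the current Python process:
if project_root not in sys.path:
    sys.path.insert(0, project_root)

# Also export it via PYTHONPATH so subprocess invocations of the CLI
# (e.g. shell cells running `allennlp train ...`) can find the package:
os.environ["PYTHONPATH"] = (
    project_root + os.pathsep + os.environ.get("PYTHONPATH", "")
)
```

The sys.path edit covers in-process imports, while the PYTHONPATH export covers child processes; in a notebook you typically want both.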

mojesty commented 4 years ago

@manelAffi this is the relative path to the directory from which you invoke allennlp train. AllenNLP has a repo with an example project that extends the library's functionality with custom components: https://github.com/allenai/allennlp-as-a-library-example

manelAffi commented 4 years ago

@mojesty @dirkgr Thanks. Fixed as in #1942; Google Colab doesn't permit changing the package. Posting it here just in case this helps someone.