Closed: manelAffi closed this issue 4 years ago.
Stolen from the tagger documentation:

The decorator that registers your classes only runs when the module containing it is loaded. And the `allennlp` execution script has no way of knowing it needs to load the modules containing your custom code (indeed, it doesn't even know those modules exist). And so the `--include-package` argument tells AllenNLP to load the specified modules (and in particular, run their `register` decorators) before instantiating and training your model.

In other words, load your package with `--include-package my.package.name`, the same way you would import it with `import my.package.name`.
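For concreteness, here is a minimal sketch of what a registered custom component might look like. The file, package, and class names (`models/my_model.py`, `MyModelName`, `my_model_name`) are placeholders chosen to mirror the names used later in this thread, not anything AllenNLP requires:

```python
# models/my_model.py -- hypothetical file; names are placeholders.
from typing import Dict

import torch
from allennlp.data import Vocabulary
from allennlp.models import Model


@Model.register("my_model_name")  # runs only when this module is actually imported
class MyModelName(Model):
    """A do-nothing model, just to show where the registration decorator goes."""

    def __init__(self, vocab: Vocabulary) -> None:
        super().__init__(vocab)

    def forward(self) -> Dict[str, torch.Tensor]:  # a real model takes field tensors here
        return {}
```

The `"type": "my_model_name"` entry under `"model"` in your config then refers to that registered name, and `--include-package models` is what ensures the module is imported (and the decorator run) before the config is interpreted.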
Hi, thanks a lot. I tried to use `--include-package models.MyModelName`, but it says `ModuleNotFoundError: No module named 'models'`. Another question: what is my model path under Google Colab? I was wondering if you could help me.

Thanks, best regards,
Manel @dirkgr
I don't know anything about Google Colab, but here is the general documentation of how Python loads modules: https://docs.python.org/3.7/reference/import.html

One thing I have found helpful in the past is explicitly setting the `PYTHONPATH` environment variable so your module is on the path.
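Not AllenNLP-specific, just standard Python, but a quick way to check whether the interpreter can see your package at all (and what `PYTHONPATH` currently is) before blaming `--include-package`; the package name `models` is the one from this thread:

```python
# Run this from the same directory and environment in which you invoke
# `allennlp train`; substitute your own package name for "models".
import importlib.util
import os
import sys

print("cwd:        ", os.getcwd())
print("PYTHONPATH: ", os.environ.get("PYTHONPATH"))
print("sys.path[0]:", sys.path[0])
print("found spec: ", importlib.util.find_spec("models"))
```

If the last line prints `None`, then `--include-package models` will fail with the same `ModuleNotFoundError`, regardless of what AllenNLP does.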
@manelAffi the `--include-package` value is the package path relative to the directory from which you invoke `allennlp train`. AllenNLP has a repository with an example project that extends the library with custom components: https://github.com/allenai/allennlp-as-a-library-example
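To make "relative to the directory you invoke it from" concrete, here is a hypothetical layout (all names are placeholders):

```
my_project/                 <- run `allennlp train experiment.jsonnet -s out --include-package models` from here
├── experiment.jsonnet
└── models/
    ├── __init__.py         <- needed so `models` is an importable package
    └── my_model.py         <- contains the @Model.register(...) class
```

Note that the flag expects an importable module or package name (here `models`), not a class path, which is why `models.MyModelName` is not what you want even once the path issue is fixed.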
@mojesty @dirkgr Thanks. Fixed as in #1942. Google Colab doesn't permit changing the package. Posting it here just in case this helps someone.
System
Question

I'm trying to train a custom model using `allennlp train \path\to\the\config\file -s \path to the output directory`.

config.json:

I encounter the TypeError described in the title, and I do not know how to deal with it. The traceback is as follows:

I'm trying to use the `--include-package` flag, but I don't know what exactly the `--include-package` argument should be. @matt-gardner