GateNLP / gate-lf-pytorch-json

PyTorch wrapper for the LearningFramework GATE plugin
Apache License 2.0

Better way to store the meta file with the modelwrapper or not store at all #39

Closed johann-petrak closed 5 years ago

johann-petrak commented 5 years ago

The modelwrapper needs to know about the meta file after restoring, either for continuing training or for application. But if we move the whole directory to another machine, a stored absolute path will not work.

So we should either store the relative path as effectively used, or simply expect the path to be specified when we load the wrapper. The latter may be the better solution.

johann-petrak commented 5 years ago

The meta file location should probably never be stored with the wrapper; instead, every task should require the meta file to be specified on the command line. That way the file can live in any location relative to the process running the task, or be given as an absolute path, without causing problems when the directory is moved.

johann-petrak commented 5 years ago

OK, for now we still store the metafile, but allow it to be overridden at application time, or when resuming training is requested during training time. The apply program always passes on the metafile that has been specified, and the apply script should now always pass on the metafile. The apply program already had the `--metafile` option, so we can use it right away when invoking from the LF scripts.

johann-petrak commented 5 years ago

This has now been implemented, needs testing.

johann-petrak commented 5 years ago

Works