davidberenstein1957 / crosslingual-coreference

A multi-lingual approach to AllenNLP CoReference Resolution along with a wrapper for spaCy.
MIT License

Is it possible to load own model locally #12

Closed SavitaKumariPandit closed 1 year ago

SavitaKumariPandit commented 1 year ago

@davidberenstein1957 @dvsrepo @martin-kirilov @DavidFromPandora is there any possible way to use our own model locally?

davidberenstein1957 commented 1 year ago

Yes, this is possible! You can train a custom model via AllenNLP. Since they don't actively maintain the package, I created a wrapper for it with some custom updates and models.

SavitaKumariPandit commented 1 year ago

> Yes, this is possible! You can train a custom model via AllenNLP. Since they don't actively maintain the package, I created a wrapper for it with some custom updates and models.

@davidberenstein1957 @dvsrepo @martin-kirilov @DavidFromPandora

I am using a quantized MiniLM model in `model.onnx` format. How can I use this model format for prediction with the crosslingual-coreference package? Is there any way to pass the `model.onnx` file via the `model_name` argument, as in `predictor = Predictor(language="en_core_web_sm", device=-1, model_name="minilm")`?

davidberenstein1957 commented 1 year ago

I contemplated speeding the model up using that approach but have not gotten around to looking into it. So I suggest you take a look at https://aclanthology.org/N18-2108.pdf and at the AllenNLP implementation: `from allennlp.predictors.predictor import Predictor` and the `Predictor.from_path()` method.

SavitaKumariPandit commented 1 year ago

> I contemplated speeding the model up using that approach but have not gotten around to looking into it. So I suggest you take a look at https://aclanthology.org/N18-2108.pdf and at the AllenNLP implementation: `from allennlp.predictors.predictor import Predictor` and the `Predictor.from_path()` method.

Thanks for the update, @davidberenstein1957. I really appreciate you taking the time to look at this.