Closed pradeepdev-1995 closed 1 year ago
@AkshitaB this is just a friendly ping to make sure you haven't forgotten about this issue 😜
@AkshitaB No, never. Strictly following 😇
@pradeepdev-1995 Closing this in favor of https://github.com/allenai/allennlp/issues/5723 . Let us know if the guide chapter linked there does not help.
Here is the code I tried for coreference resolution.
It worked well; I got output with the coreferences resolved, like below.
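The original code and output are not shown in the thread, but the flow described appears to be the standard AllenNLP coreference predictor loaded from the `model.tar.gz` URL mentioned below. A minimal sketch under that assumption (the function name is illustrative):

```python
def resolve_coreferences(text: str) -> str:
    # Lazy import so the sketch stays importable even when
    # allennlp / allennlp-models are not installed.
    from allennlp.predictors.predictor import Predictor

    # Load the crosslingual-coreference MiniLM model archive;
    # this URL is taken from the thread.
    predictor = Predictor.from_path(
        "https://storage.googleapis.com/pandora-intelligence/models/"
        "crosslingual-coreference/minilm/model.tar.gz"
    )
    # CorefPredictor.coref_resolved returns the input text with each
    # mention replaced by the head of its coreference cluster.
    return predictor.coref_resolved(text)
```

Calling `resolve_coreferences("Sara bought a dog. She loves it.")` would then return the text with pronouns replaced by their antecedents.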
Now I have quantized the model used here (https://storage.googleapis.com/pandora-intelligence/models/crosslingual-coreference/minilm/model.tar.gz), and the new quantized model is stored at a path on my local machine.
Can I point the `model_url` value at that customized (quantized) model on my local path and use the prediction command like below?