Closed mruwnik closed 1 year ago
Thanks! This might just be a documentation detail, but I want to clarify:

* reader model - returns an answer given a question & context ... **not for duplicates**. I followed the naming convention from research papers. Maybe it would be clearer to rename this to `qa_model`, like this Hugging Face example from https://huggingface.co/tasks/question-answering:

  ```python
  question = "Where do I live?"
  context = "My name is Merve and I live in İstanbul."
  qa_model(question=question, context=context)
  ## {'answer': 'İstanbul', 'end': 39, 'score': 0.953, 'start': 31}
  ```

* retriever model - returns encodings given a text or a list of texts. **Additionally, it can return duplicates via `paraphrase_mining`** when given a list of strings/titles/questions.
* literature search - returns encodings. We won't use `paraphrase_mining` here, but it should be supported by the model, so there's no harm in keeping it.
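To make the retriever's duplicate-detection role concrete, here is a minimal sketch of what paraphrase mining does: rank pairs of texts by the cosine similarity of their embeddings and keep pairs above a threshold. The toy embeddings and threshold below are invented for illustration; the real service would use the retriever model's encodings (e.g. via sentence-transformers' `util.paraphrase_mining`).

```python
# Sketch of paraphrase mining over precomputed embeddings.
# The real retriever model would produce the embeddings; these are toy vectors.
import math
from itertools import combinations

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def paraphrase_mining(embeddings, threshold=0.9):
    """Return (score, i, j) pairs whose similarity meets the threshold,
    sorted from most to least similar."""
    pairs = []
    for i, j in combinations(range(len(embeddings)), 2):
        score = cosine(embeddings[i], embeddings[j])
        if score >= threshold:
            pairs.append((score, i, j))
    return sorted(pairs, reverse=True)

# Toy embeddings: texts 0 and 1 are near-duplicates, text 2 is unrelated.
embs = [[1.0, 0.1, 0.0], [0.9, 0.12, 0.01], [0.0, 0.2, 1.0]]
duplicates = paraphrase_mining(embs, threshold=0.9)
print(duplicates)  # only the (0, 1) pair survives the threshold
```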
That does make things a lot clearer :D I changed the reader-model to qa-model and added a `question_answering` endpoint for the encoders.
The models can be accessed over HTTP; here are example curl calls:

```bash
curl -X POST -H "Content-Type: application/json" \
  -d '{"query": "generate something for this", "context": "asdasdasd"}' \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  https://reader-model-t6p37v2uia-uw.a.run.app/

curl -X POST -H "Content-Type: application/json" \
  -d '{"query": "generate something for this", "context": "asdasdasd"}' \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  https://retriever-model-t6p37v2uia-uw.a.run.app

curl -X POST -H "Content-Type: application/json" \
  -d '{"query": "generate something for this", "context": "asdasdasd"}' \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  https://lit-search-model-t6p37v2uia-uw.a.run.app
```
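The same call can be made from Python; below is a sketch using only the standard library. The payload fields (`query`, `context`) mirror the curl examples, and the `<identity-token>` placeholder stands in for the output of `gcloud auth print-identity-token`. Nothing is sent over the network until `urllib.request.urlopen(req)` is actually called.

```python
# Sketch of calling one of the model services from Python instead of curl.
import json
from urllib import request

def build_model_request(url, query, context, token):
    """Build an authenticated JSON POST request for a model service."""
    payload = json.dumps({"query": query, "context": context}).encode()
    return request.Request(
        url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_model_request(
    "https://reader-model-t6p37v2uia-uw.a.run.app/",
    query="Where do I live?",
    context="My name is Merve and I live in İstanbul.",
    token="<identity-token>",  # placeholder for a real identity token
)
# request.urlopen(req) would send the request and return the model's JSON reply.
```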
To deploy a microservice, run:

```bash
./deploy_model.sh <service name> <huggingface model> <model type>
```

e.g. `./deploy_model.sh reader-model deepset/electra-base-squad2 pipeline`