Closed GaiserChan closed 2 years ago
@GaiserChan
You need to provide a Translator to load your own model:

```java
PtBertQATranslator translator = PtBertQATranslator.builder()
        .optTokenizerName("distilbert")
        .toLowerCase(true)
        .build();
Criteria<QAInput, String> criteria = Criteria.builder()
        .setTypes(QAInput.class, String.class)
        .optEngine("PyTorch")
        .optOption("mapLocation", "true")
        .optModelPath(Paths.get("build/input/models"))
        .optModelName("bi_encoder")
        .optTranslator(translator)
        .build();
```
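For context, here is a minimal sketch of how such a `Criteria` could then be used to load the model and run a single prediction. The question/paragraph strings are made-up examples, and the imports assume DJL's API and PyTorch model zoo packages; the model path and name come from the snippet above and must exist on disk for this to run:

```java
import java.nio.file.Paths;

import ai.djl.inference.Predictor;
import ai.djl.modality.nlp.qa.QAInput;
import ai.djl.pytorch.zoo.nlp.qa.PtBertQATranslator;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class QAExample {
    public static void main(String[] args) throws Exception {
        // Build the translator and criteria as described above
        PtBertQATranslator translator = PtBertQATranslator.builder()
                .optTokenizerName("distilbert")
                .toLowerCase(true)
                .build();
        Criteria<QAInput, String> criteria = Criteria.builder()
                .setTypes(QAInput.class, String.class)
                .optEngine("PyTorch")
                .optOption("mapLocation", "true")
                .optModelPath(Paths.get("build/input/models"))
                .optModelName("bi_encoder")
                .optTranslator(translator)
                .build();

        // Load the model and feed one question/paragraph pair through a predictor;
        // try-with-resources closes both the model and the predictor
        try (ZooModel<QAInput, String> model = criteria.loadModel();
             Predictor<QAInput, String> predictor = model.newPredictor()) {
            QAInput input = new QAInput(
                    "When was BERT released?",
                    "BERT was released by Google in 2018.");
            String answer = predictor.predict(input);
            System.out.println(answer);
        }
    }
}
```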
@GaiserChan
Here is an example of how the translator is implemented: https://pub.towardsai.net/deploy-huggingface-nlp-models-in-java-with-deep-java-library-e36c635b2053
You can also look into the example ai/djl/examples/inference/BertQaInference.java
Feel free to reopen this issue if you still have questions.
```java
@Configuration
public class BertQaInferenceConfiguration {

    static final Logger logger = LoggerFactory.getLogger(BertQaInferenceConfiguration.class);

    @Bean
    public Criteria<QAInput, String> nlpCriteria() {
        Criteria<QAInput, String> result = Criteria.builder()
                .optApplication(Application.NLP.QUESTION_ANSWER)
                .setTypes(QAInput.class, String.class)
                .optFilter("backbone", "bert")
                .optEngine("PyTorch")
                .optOption("mapLocation", "true")
                .optDevice(Device.cpu())
                .optProgress(new ProgressBar())
                .optModelPath(Paths.get("build/input/models"))
                .optModelName("bi_encoder.pt")
                .build();
        logger.info(result.toString());
        return result;
    }
}
```
I ran the code above to load my own model, but it does not work.