-
I ran into a problem while setting up a Paddle service. When exporting the model I set the input shape to [None, None], meaning [batch, token_size], and the preprocess function in the service also feeds data of that shape, but I still end up with this error
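A minimal sketch of the kind of dynamic-shape export being described, assuming Paddle 2.x; `TinyTextModel`, the dtype, and the output path are placeholders, not taken from the issue:
```python
import paddle
from paddle.static import InputSpec


class TinyTextModel(paddle.nn.Layer):
    """Stand-in network; the real model from the issue is not shown."""

    def __init__(self, vocab_size=1000, hidden=32):
        super().__init__()
        self.emb = paddle.nn.Embedding(vocab_size, hidden)
        self.fc = paddle.nn.Linear(hidden, 2)

    def forward(self, input_ids):
        h = self.emb(input_ids)          # [batch, token_size, hidden]
        return self.fc(h.mean(axis=1))   # [batch, 2]


model = TinyTextModel()
model.eval()

# [None, None] keeps both batch and token_size dynamic at export time
spec = InputSpec(shape=[None, None], dtype="int64", name="input_ids")
static_model = paddle.jit.to_static(model, input_spec=[spec])
paddle.jit.save(static_model, "inference/model")
```
With both dimensions left as None, the saved inference model should accept any batch size and token count, so a shape mismatch at serving time is worth checking against the data that preprocess actually produces.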
-
Hi, BentoML team.
This is a new suggestion for Yatai. In general, when serving an ML model, the input provided to the model and the output it returns are stored in external storage, and …
-
I was working through this repository about deploying models with Azure and noticed that the model input shape is (1,6). However, in both 06_Model_Deployment and 07_CICD_Pipelines, score.py expects an…
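Since the exact expectation in score.py is cut off above, here is only a rough sketch of an Azure ML scoring script that coerces the request payload into the (1, 6) shape; the model file name and the JSON field are assumptions, not taken from the repository:
```python
# score.py sketch: init()/run() follow the Azure ML scoring-script convention.
import json
import os

import joblib
import numpy as np

model = None


def init():
    global model
    # AZUREML_MODEL_DIR points at the registered model's folder at deploy time
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR", "."), "model.pkl")
    model = joblib.load(model_path)


def run(raw_data):
    # Expect {"data": [f1, f2, f3, f4, f5, f6]} and coerce it to shape (1, 6)
    features = np.array(json.loads(raw_data)["data"], dtype=float).reshape(1, 6)
    prediction = model.predict(features)
    return json.dumps({"prediction": prediction.tolist()})
```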
-
It looks like FB's license prohibits distribution of the OPT 175B weights, so any OPT 175B implementation is going to require a little extra work:
* https://alpa.ai/tutorials/opt_serving.html#launch-…
-
```
usage: /home/lbc/.local/bin/bert-serving-start -model_dir /home/lbc/chinese_L-12_H-768_A-12/chinese_L-12_H-768_A-12/ -num_worker=4
             ARG   VALUE
______________________________________…
```
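For context, a minimal client-side sketch for querying a bert-serving-start instance like the one above, assuming the default ports; the sample sentences are placeholders:
```python
# Query a running bert-serving-start server on localhost (default ports 5555/5556).
from bert_serving.client import BertClient

bc = BertClient(ip="localhost")
vectors = bc.encode(["今天天气不错", "自然语言处理很有趣"])
print(vectors.shape)  # e.g. (2, 768) for the 12-layer, 768-hidden Chinese model
```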
-
**Prerequisites**
* [x] Are you running the latest `bert-as-service`?
* [x] Did you check the [FAQ list in `README.md`](https://github.com/hanxiao/bert-as-service#speech_balloon-faq)?
* [x] Did…
-
How do I change the default accelerator type used for Dreambooth training?
Simply changing the following line throws a cascade of RPC errors; please point me towards the right way to do it.
https://github…
-
All components have to follow a certain template:
```
#Release#
- component_name|release-branch|release-notes
- component_name|release-branch|release-notes
```
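For illustration, a filled-in block following that template might look like this; the component names, branches, and notes are hypothetical:
```
#Release#
- payments-api|release/1.8.2|Fix null check in invoice export
- web-frontend|release/1.8.2|Bump pinned dependency versions
```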
please use the following component …
-
Hi, nice repo and good job!
Is there an officially recommended way to run this inside Docker / Docker Compose?
I'd appreciate it if an official image were published, as we have a non-Python backend an…
-
We need support for assembling bundles which model a file-system, and probably support for loading multiple manifests. This will clear the way for chunk-based frontend JS serving, etc.