-
## Bug Report
If this is a bug report, please fill out the following form in full:
### System information
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**:
- Ubuntu 20.04
- **T…
-
Somewhat similar to https://github.com/tensorflow/tensor2tensor/issues/344: I have my model served by the TensorFlow Serving API; how can I translate an English sentence against this model now?
Model c…
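One way to query such a model is through TensorFlow Serving's REST predict endpoint. A minimal sketch follows; the host and port, the model name `translator`, and the input key `inputs` are assumptions rather than details from this issue, since the exact signature depends on how the model was exported.
```python
# Minimal sketch: send one English sentence to TF Serving's REST predict API.
# Assumptions: default REST port 8501, model name "translator", input key "inputs".
import json
import requests

SERVER = "http://localhost:8501"                  # TF Serving's default REST port
MODEL_NAME = "translator"                         # hypothetical served model name
url = f"{SERVER}/v1/models/{MODEL_NAME}:predict"

payload = {"instances": [{"inputs": "Hello, how are you?"}]}  # assumed input key
resp = requests.post(url, data=json.dumps(payload))
resp.raise_for_status()
print(resp.json()["predictions"])                 # output format depends on the export signature
```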
-
### Problem/Solution
It has been great to see Ollama added as a first-class option in https://github.com/jupyterlab/jupyter-ai/pull/646; this has made it easy to access a huge variety of models and…
-
Based on the [mnist serving tutorial](https://www.tensorflow.org/serving/serving_basic), I'm trying to export the quickdraw data set to load it into the [tensorflow-model-server](https://github.com/ten…
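For reference, here is a rough sketch of the export step that tutorial walks through, written against the TF 2 API rather than the older builder-style API the tutorial uses. The model architecture, class count, and export path are assumptions, not the quickdraw exporter from this issue.
```python
# Rough sketch (TF 2 API; architecture, class count, and paths are assumptions):
# export a trained Keras classifier in the versioned directory layout that
# tensorflow_model_server expects: <base_path>/<version>/saved_model.pb + variables/.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),       # bitmap input, MNIST-style
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(345, activation="softmax"),    # assumed: 345 quickdraw classes
])
# ... model.compile(...) and model.fit(...) on the quickdraw data would go here ...

export_base = "/tmp/quickdraw"   # hypothetical export location
version = 1
tf.saved_model.save(model, f"{export_base}/{version}")
# Then: tensorflow_model_server --model_name=quickdraw --model_base_path=/tmp/quickdraw
```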
-
### Your current environment
docker with vllm/vllm-openai:v0.4.3 (latest)
### 🐛 Describe the bug
python3 -m vllm.entrypoints.openai.api_server --model ./Qwen1.5-72B-Chat/ --max-model-len 2400…
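For context on how a server launched this way is exercised once it is up (the rest of the bug description is truncated above), here is a minimal client sketch against vLLM's OpenAI-compatible endpoint; the port (vLLM's default 8000) and the model identifier are assumptions.
```python
# Minimal sketch: call vLLM's OpenAI-compatible chat completions endpoint.
# Assumptions: server reachable on the default port 8000, and the "model"
# field matches the --model value used when launching the server.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "./Qwen1.5-72B-Chat/",
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 64,
    },
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```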
-
### Describe the bug
Hi,
We have k8s cluster setup in orbstack.
-
![image](https://user-images.githubusercontent.com/58454582/217485204-842cb026-eca2-4d65-b72f-5368f40a8d40.png)
![image](https://user-images.githubusercontent.com/58454582/217485237-0e8d0ef0-5fad-412…
-
### Search before asking
- [X] I have searched the question and found no related answer.
### Please ask your question
Hello, my machine configuration is:
paddlepaddle-gpu 2.4.0.pos…
-
Currently we use DL4J for serving Keras models.
For the "old" multi-backend Keras, this is pretty robust.
TensorFlow 2.0 Keras models, however, may include general TF ops (not just Keras layers) tha…
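To make that last point concrete, here is a minimal illustration (the specific op is only an example, not taken from the issue) of a TF 2 Keras model whose graph contains a plain TensorFlow op via a `Lambda` layer, which a Keras-layer-only importer cannot map back to standard layers.
```python
# Minimal illustration: a TF 2 Keras model that embeds a general TF op
# (tf.math.log here, as an arbitrary example) through a Lambda layer, so the
# exported graph contains ops beyond the standard Keras layer set.
import tensorflow as tf

inputs = tf.keras.Input(shape=(10,))
x = tf.keras.layers.Dense(16, activation="relu")(inputs)
x = tf.keras.layers.Lambda(lambda t: tf.math.log(t + 1.0))(x)  # plain TF op in the graph
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

model.save("/tmp/keras_with_tf_ops")  # hypothetical path; TF 2 SavedModel format
```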
-
Hello, I have a suggestion for a notebook: an **example of a cuML-trained model being exported so it can be served by TensorRT.**
More information on TensorRT:
- https://docs.nvidia.com/deeplear…