-
{"timestamp":"2024-04-15T05:20:55.796456Z","level":"ERROR","error":"AppError(error message received from triton: [request id: ] expected number of inputs between 1 and 3 but got 9 inputs for model 'my…
-
**Describe the bug**
None of the OpenAI o1 models work.
**To Reproduce**
Add the OpenAI provider using LiteLLM, and specify o1-mini or o1-preview.
**Expected behavior**
The model answers my …
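For context, a minimal repro sketch against LiteLLM's Python `completion()` API (a hedged sketch only: the prompt is a placeholder, `OPENAI_API_KEY` is assumed to be set, and the report may instead be routing through a provider/proxy configuration):

```python
# Repro sketch: call an o1 model through LiteLLM directly.
# Assumes OPENAI_API_KEY is exported; the prompt is a placeholder.
from litellm import completion

response = completion(
    model="o1-mini",  # the same failure is reported with "o1-preview"
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```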
-
1.
When I test the LLM backend service:
```
curl http://${host_ip}:9009/generate \
-X POST \
-d '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":17, "do_sample": true}}' \
-H…
```
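For reference, the same request can be issued from Python; this is just a sketch assuming the same TGI-style `/generate` endpoint and that `host_ip` resolves to the backend host:

```python
# Sketch: Python equivalent of the curl check above.
import requests

host_ip = "localhost"  # placeholder; use the actual backend host IP
resp = requests.post(
    f"http://{host_ip}:9009/generate",
    json={
        "inputs": "What is Deep Learning?",
        "parameters": {"max_new_tokens": 17, "do_sample": True},
    },
    timeout=30,
)
print(resp.status_code, resp.text)
```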
-
### What happened?
ReadFromKafka is not forwarding records downstream in streaming mode.
Using apache-beam 2.44.0:
beam_options = PipelineOptions(streaming = True)
pipeline = beam.Pipeline(options=beam_options)
m…
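For context, a minimal sketch of the streaming read described above (broker address and topic are placeholders; `ReadFromKafka` is a cross-language transform, so a runner with Java/expansion-service support is assumed):

```python
# Sketch: streaming pipeline that reads from Kafka and prints each record.
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

beam_options = PipelineOptions(streaming=True)

with beam.Pipeline(options=beam_options) as pipeline:
    (
        pipeline
        | "ReadFromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "localhost:9092"},  # placeholder broker
            topics=["my_topic"],                                      # placeholder topic
        )
        | "Print" >> beam.Map(print)  # each element is a (key, value) pair of bytes
    )
```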
-
I'm encountering issues with streaming responses in flask-smorest. I'm following the guidance here: https://flask.palletsprojects.com/en/2.3.x/patterns/streaming/ for streaming responses from my …
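For reference, the pattern from the linked Flask docs as a plain-Flask sketch (route name, data, and mimetype are placeholders; the flask-smorest blueprint/decorator setup from the report is omitted):

```python
# Sketch of the Flask streaming pattern: return a generator wrapped in a Response.
from flask import Flask, Response, stream_with_context

app = Flask(__name__)

@app.route("/stream")
def stream():
    def generate():
        # Placeholder data source; yields chunks instead of building one big body.
        for row in ["chunk 1\n", "chunk 2\n", "chunk 3\n"]:
            yield row
    return Response(stream_with_context(generate()), mimetype="text/plain")
```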
-
Using the development version. My expectation is that the delete should succeed, since the dependents are also being deleted.
```ucm
@alvaroc1/webauth/main> version
39b76f3
@alvaroc1/weba…
```
-
When trying to persist an event that is too large, one currently gets `RangeError: Invalid string length`.
Maybe this could be improved somewhat, e.g. with a more descriptive error.
-
## Description
`API._streaming_download()` chokes on HTTP `416` `Range Not Satisfiable` errors, because the JSON error object may be received in the first chunk of any request, including any retry,…
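Not the library's actual code, but an illustrative sketch of the failure mode with `requests` (function and parameter names are hypothetical): a ranged retry can come back as HTTP `416` whose body is a JSON error object, so the first chunk has to be checked before it is treated as file data.

```python
# Sketch: resume a download with a Range header and detect a 416 JSON error body.
import json
import requests

def resume_download(url: str, path: str, offset: int) -> None:
    resp = requests.get(url, headers={"Range": f"bytes={offset}-"}, stream=True)
    if resp.status_code == 416:
        # The body here is a JSON error object, not file data; do not append it.
        raise RuntimeError(f"Range not satisfiable: {json.loads(resp.content)}")
    resp.raise_for_status()
    with open(path, "ab") as fh:
        for chunk in resp.iter_content(chunk_size=65536):
            fh.write(chunk)
```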
-
I tried to run the example in the README file (replacing the url and chatId).
"flowise-sdk": "^1.0.9",
Node version: 20.15.0
OS: Windows
```
// changed this because the original code breaks. …
```
-
### Description
Hello,
when using video.js v7.21.5 in our project, we need it to use @videojs/http-streaming 2.16.3, but in its dependencies it is
`"@videojs/http-streaming": "2.16.2"`, which cannot even be …