-
## Proposal
**Use case. Why is this important?**
The default `max_samples` is 50 million, which is about 800 MB for the "domain model", where each sample point is about 16 bytes. But if those 50 mi…
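As a quick sanity check on the arithmetic above (50 million samples at roughly 16 bytes each):

```python
# Back-of-envelope memory estimate for the default max_samples.
max_samples = 50_000_000
bytes_per_sample = 16  # approximate size of one sample point
total_mb = max_samples * bytes_per_sample / 1_000_000
print(f"{total_mb:.0f} MB")  # 800 MB
```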
-
```
I'd like to propose three enhancements:
1. Support the Streaming API. I believe this can be done via libcurl, but I
haven't tried it yet.
2. Return objects in JSON format in addition to or inste…
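The proposal mentions libcurl for the streaming support; as an illustration of the streaming pattern itself (not of any particular library's API), here is a minimal Python sketch that parses a line-delimited JSON stream incrementally. The stream contents are invented for the example.

```python
import json

def consume_stream(lines):
    """Parse each line of a line-delimited JSON stream as it arrives,
    instead of buffering the whole response body first."""
    for raw in lines:
        raw = raw.strip()
        if not raw:
            continue
        yield json.loads(raw)

# Simulated stream: in a real client these lines would come from the
# HTTP library's chunked-read interface (e.g. libcurl's write callback).
stream = [b'{"id": 1}', b'{"id": 2}']
ids = [obj["id"] for obj in consume_stream(stream)]
print(ids)  # [1, 2]
```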
-
Streaming mode does not parse the data into JSON the way non-streaming mode does. This data path needs to be fixed to match the non-streaming one.
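A minimal sketch of what aligning the two paths could look like: normalize each raw streamed chunk into parsed JSON, the same shape the non-streaming path already returns. The function name and chunk format here are hypothetical, not taken from the project's code.

```python
import json

def parse_chunk(chunk):
    """Normalize a raw streamed chunk (bytes) into the same parsed-JSON
    shape the non-streaming code path returns."""
    return json.loads(chunk)

# Hypothetical raw chunks as they might arrive off the wire.
raw_chunks = [b'{"value": 1}', b'{"value": 2}']
parsed = [parse_chunk(c) for c in raw_chunks]
print(parsed)  # [{'value': 1}, {'value': 2}]
```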
-
Response bodies for List objects returned by Kubernetes/OpenShift can be very large; the current implementation of `Kubeclient::Client.get_entities` requires loading the entire response body into memor…
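To illustrate the alternative, here is a rough Python sketch (not Kubeclient's Ruby implementation) of yielding items from a `{"items": [...]}` List body incrementally, so the whole body never has to be held in memory at once. A production version would use a real incremental JSON parser; this hand-rolled scanner is only a demonstration and assumes `"items"` does not also appear inside another string value.

```python
import io
import json

def iter_list_items(stream, chunk_size=1024):
    """Yield items from a Kubernetes-style List body one at a time,
    reading the stream in fixed-size chunks."""
    decoder = json.JSONDecoder()
    buf = ""
    start = -1
    # Read until we can locate the opening "[" of the "items" array.
    while start < 0:
        chunk = stream.read(chunk_size)
        if not chunk:
            return
        buf += chunk
        i = buf.find('"items"')
        if i >= 0:
            start = buf.find("[", i)
    buf = buf[start + 1:]
    while True:
        buf = buf.lstrip(", \n\t")
        if buf.startswith("]"):
            return  # end of the items array
        try:
            item, end = decoder.raw_decode(buf)
        except json.JSONDecodeError:
            # Item is incomplete; read more of the body.
            chunk = stream.read(chunk_size)
            if not chunk:
                return
            buf += chunk
            continue
        yield item
        buf = buf[end:]  # drop the consumed item from the buffer

# Small simulated List body; real bodies would be streamed off the wire.
body = '{"kind": "List", "items": [{"name": "a"}, {"name": "b"}], "metadata": {}}'
names = [i["name"] for i in iter_list_items(io.StringIO(body), chunk_size=8)]
print(names)  # ['a', 'b']
```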
-
### What happened?
When using LiteLLM Proxy with streaming, the response often (around 20% of the time) gets cut off. The model was going to use a tool in that response, but it was cut off before th…
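One way to observe this symptom from the client side: an OpenAI-compatible stream should end with a chunk carrying a non-null `finish_reason` (e.g. `"stop"` or `"tool_calls"`). Below is a sketch of a heuristic truncation check; the chunk shape is assumed from the OpenAI streaming format, not taken from LiteLLM's source.

```python
def stream_looks_truncated(chunks):
    """Return True if no chunk in the stream carried a finish_reason,
    i.e. the response likely got cut off mid-generation."""
    finish_reason = None
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            if choice.get("finish_reason") is not None:
                finish_reason = choice["finish_reason"]
    return finish_reason is None

# Simulated chunk sequences (already decoded from SSE events).
complete = [
    {"choices": [{"delta": {"content": "Hi"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "tool_calls"}]},
]
cut_off = [
    {"choices": [{"delta": {"content": "Hi"}, "finish_reason": None}]},
]
print(stream_looks_truncated(complete), stream_looks_truncated(cut_off))  # False True
```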