-
### The Feature
I would like to request the addition of native support for OCIGenAI models in litellm. This feature would enable litellm users to interact seamlessly with Oracle’s Generative AI model…
-
### The problem
Recorder: telegram.ext.Updater
Source: /usr/local/lib/python3.12/site-packages/telegram/ext/_updater.py:411
First message: 23:48:13 (5 messages)
Last message: 04:11:25
Exceptio…
-
### Which packages are you using?
stream_chat_flutter
### On what platforms did you experience the issue?
iOS, Android
### What version are you using?
```yml
stream_chat_flutter: ^8.0.0-beta.3
…
-
2024-05-04 13:26:33,956 - openai._base_client - INFO - Retrying request to /chat/completions in 0.919341 seconds
2024-05-04 13:26:35,124 - openai._base_client - INFO - Retrying request to /chat/compl…
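The log lines above come from the openai client's built-in retry logic. As a rough illustration of the general technique only (not the client's actual implementation), exponential backoff with jitter can be sketched like this:

```python
import random
import time

def retry_with_backoff(fn, max_retries=2, base_delay=0.5):
    """Call fn(), retrying on failure with exponential backoff plus jitter.

    This is a sketch of the general pattern; the real openai client
    computes its retry delays internally.
    """
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries, surface the last error
            # Double the delay each attempt and add jitter to avoid
            # synchronized retry storms.
            delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.0)
            time.sleep(delay)
```

The openai Python client also accepts a `max_retries` argument when the client is constructed, which is the supported way to tune this behavior.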
-
I have:
```json
{"type":"function","function":{"name":"foo","parameters":"bar"}}
{"type":"search","search":{"keyword":"foo"}}
```
I want them to be deserialized into
```rust
#[derive(Dese…
-
### Describe the bug
Repetitive output loops occur when writing code with deepseek:
Understood. We will adjust the plot to have a 6:2 (3:1) aspect ratio.
Step 6: Modify and Plot the Data with S…
-
I created a sample custom function to demonstrate the issue we are seeing - https://github.com/mihirkothari25/bolt-js-getting-started-app.
It's adapted from the sample starter app. I run ngrok and se…
-
OpenAI added a new usage field:
```
"usage": {
"prompt_tokens": 13,
"completion_tokens": 7,
"total_tokens": 20,
"completion_tokens_details": {
…
-
### Motivation
When using lmdeploy for inference, we sometimes want to set do_sample = false, but according to the official documentation there is no do_sample config. Could this be added, just like in AutoModel? e…
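For context, `do_sample=False` means greedy decoding: the generator always picks the highest-probability token instead of sampling from the distribution. A minimal sketch of the difference (a hypothetical helper, not lmdeploy's API):

```python
import random

def next_token(probs, do_sample=False):
    """Pick the next token id from a probability distribution.

    do_sample=False -> greedy argmax (deterministic),
    do_sample=True  -> sample a token proportionally to probs.
    """
    if not do_sample:
        # Greedy: index of the largest probability.
        return max(range(len(probs)), key=probs.__getitem__)
    # Sampling: draw one index weighted by the probabilities.
    return random.choices(range(len(probs)), weights=probs, k=1)[0]
```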
-
Thanks for this amazing work!
Would you be interested in having a `--service` mode, so that llama3.java could run as a service with a third-party chat client communicating with it?
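To make the suggestion concrete: a `--service` mode would essentially put the model behind a small HTTP endpoint that a chat client can POST prompts to. A minimal Python sketch of the idea (the real implementation would be in Java and call llama3.java's inference; the echo reply here is a placeholder):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Placeholder: a real --service mode would run model inference
        # here; we just echo the prompt back.
        reply = {"reply": "echo: " + request.get("prompt", "")}
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass

def serve(port=8080):
    HTTPServer(("127.0.0.1", port), ChatHandler).serve_forever()

if __name__ == "__main__":
    serve()
```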