-
### Confirm this is an issue with the Python library and not an underlying OpenAI API issue
- [X] This is an issue with the Python library
### Describe the bug
When running the AzureOpenAI service with a…
-
### System Info
```Shell
accelerate version: main
python version: 3.11
torch version: 2.4
numpy version: 1.26.4
```
### Information
- [X] The official example scripts
- [ ] My own modified scri…
-
### Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library
### Describe the bug
I am using the example from https://github.co…
-
### Description
Testing the connection as well as the chat was working before. A day later, when I checked again, it was no longer working, even though no changes had been made to the code.
Afterwards,
Deleted the ent…
-
## Description
As I can understand, currently, BAML sends JSON schemas as part of the LLM prompt to structure outputs. With the recent introduction of GPT-4's Structured Outputs feature, we have an o…
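For context, Structured Outputs are requested through the chat completions `response_format` parameter rather than by embedding the JSON schema in the prompt text. A minimal sketch with the OpenAI Python SDK (model name and schema are placeholders, not BAML's actual integration):

```python
from openai import OpenAI

client = OpenAI()

# Ask the model to conform to a JSON schema natively (Structured Outputs)
# instead of describing the schema inside the prompt.
response = client.chat.completions.create(
    model="gpt-4o-2024-08-06",  # assumption: any model that supports Structured Outputs
    messages=[{"role": "user", "content": "Extract the name and age from: 'Ada is 36.'"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "age": {"type": "integer"},
                },
                "required": ["name", "age"],
                "additionalProperties": False,
            },
        },
    },
)

# The returned message content is JSON constrained to the schema above.
print(response.choices[0].message.content)
```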
-
Collecting all the requests for better AMD support for AO here:
* [Reddit](https://www.reddit.com/r/LocalLLaMA/comments/1fteod6/pytorch_native_architecture_optimization_torchao/lprdwzm/) feedback
…
-
only gpt-3.5-turbo is generating responses when running locally
does this exclude gpt-4, gpt-3.5-turbo-0301, and gpt-4-0314?
-
Found a bug? Please fill out the sections below. 👍
### Describe the bug
When I launch "operate" and give my OpenAI key, this is the result:
Hello, I can help you with anything. What would y…
-
### What is the issue?
The streamed chat-completion response from ollama's openai-compatible API repeats `"role": "assistant"` in all returned chunks. This is different to OpenAI's API which just has…
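A quick way to observe the difference is to point the OpenAI Python client at ollama's OpenAI-compatible endpoint, stream a completion, and print each chunk's delta. The endpoint and model name below are assumptions:

```python
from openai import OpenAI

# Assumption: ollama is serving its OpenAI-compatible API on the default port.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

stream = client.chat.completions.create(
    model="llama3",  # assumption: any locally pulled model
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
)

# With OpenAI's own API, delta.role is set only on the first chunk and is
# None afterwards; the reported behaviour is that ollama repeats it in every chunk.
for chunk in stream:
    delta = chunk.choices[0].delta
    print(repr(delta.role), repr(delta.content))
```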
-
curl 'localhost:8008/message?message=what%20is%20the%20advantage%20of%20rust%20over%20c&name=a%20curious%20explorer'
I got an error like this:
{"statusCode":500,"error":"Internal Server Error","messa…