-
> Please provide us with the following information:
> ---------------------------------------------------------------
### This issue is for a: (mark with an `x`)
```
- [x] bug report -> pleas…
-
Hi, I've largely copy-pasted the config, with minor modifications to use dolphin-mistral
```
(setq …
-
**Bug description**
I have a tool that returns large JSON/tabular data. I find that the DI takes all of this data (50,000 chars) and tries to add it to the prompt for subsequent analysis. (This is t…
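A minimal sketch (not the project's actual code) of the kind of guard that would help here: cap the raw tool output before it is appended to the prompt. `fit_tool_output` and `MAX_TOOL_CHARS` are hypothetical names introduced only for illustration.
```python
# Minimal sketch: cap a tool's raw output before adding it to the prompt.
# fit_tool_output and MAX_TOOL_CHARS are hypothetical, not part of this project.
MAX_TOOL_CHARS = 4_000  # roughly 1k tokens; tune for the target model's context window

def fit_tool_output(tool_output: str, max_chars: int = MAX_TOOL_CHARS) -> str:
    """Return the output unchanged if it is small, otherwise keep the head and tail."""
    if len(tool_output) <= max_chars:
        return tool_output
    half = max_chars // 2
    omitted = len(tool_output) - max_chars
    return f"{tool_output[:half]}\n... [{omitted} chars omitted] ...\n{tool_output[-half:]}"
```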
-
I used https://platform.openai.com/tokenizer to confirm that the text is 19,102 tokens long. With gpt-4-1106-preview, GPT cannot answer based on the text content, and the same happens with GPT-3.5-16K. There seems to be some other token-length limit, perhaps a per-message token cap imposed to keep the conversation intact?
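A minimal sketch (assuming the `tiktoken` package and a hypothetical `long_document.txt`) that reproduces this kind of count locally, the same way the tokenizer page does:
```python
# Minimal sketch: count tokens for a long document with tiktoken.
# "long_document.txt" is a placeholder for the text being tested.
import tiktoken

def count_tokens(text: str, model: str = "gpt-4-1106-preview") -> int:
    enc = tiktoken.encoding_for_model(model)  # GPT-4 models resolve to the cl100k_base encoding
    return len(enc.encode(text))

with open("long_document.txt", encoding="utf-8") as f:
    print(count_tokens(f.read()))  # the document described above measured 19,102 tokens
```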
-
**Routine checks**
[//]: # (Delete the space inside the brackets and fill in an x)
+ [x] I have confirmed that there is no similar existing issue
+ [x] I have confirmed that I have upgraded to the latest version
+ [x] I have read the project README in full, especially the FAQ section
+ [x] I understand and am willing to follow up on this issue, helping with testing and providing feedback
+ [x] I understand and accept the above, and I understand that the maintainers' time is limited; **issues that do not follow the rules may…
-
I tried to add the stream attribute, but the request seems to return an error message:
```
const assistant = await Assistant.create(ctx, {
model: 'gpt-4-1106-preview',
instructions: message…
-
### Version
Command-line (Python) version
### Operating System
macOS
### What happened?
First off, let me say that I really like the idea of the tool, and it somewhat works really well w…
-
Hello, GPT-4 Turbo and the Vision Model have been released by OpenAI. Can you provide support for these models?
-
After running `stream run 1_home.py`, how can I build an agent? What should I input, and how can I upload or point to a file? Is there an example? No matter what I input, it always shows:
system_prompt=None file_paths=[] docs…
-
When I try to use GPT-4 Turbo via
```python
@lmql.query(model="openai/gpt-4-1106-preview")
async def foo():
...
```
I get the following error message:
`lmql.runtime.bopenai.openai_api.…