-
There are a significant number of non-decodable tokens in the cl100k-base BPE. These tokens do not decode back to strings using the UTF-8 encoding.
Neither do the models `gpt-3.5-turbo` or `gpt-4` gener…
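A minimal Python sketch of one way to enumerate such tokens with tiktoken (the KeyError handling for unused token ids below `n_vocab` is an assumption about the encoding's layout):
```python
import tiktoken

# Load the cl100k_base encoding used by gpt-3.5-turbo and gpt-4.
enc = tiktoken.get_encoding("cl100k_base")

non_decodable = []
for token_id in range(enc.n_vocab):
    try:
        token_bytes = enc.decode_single_token_bytes(token_id)
    except KeyError:
        # Some ids below n_vocab are unused (reserved/special slots); skip them.
        continue
    try:
        token_bytes.decode("utf-8")
    except UnicodeDecodeError:
        # The token's byte sequence is not valid UTF-8 on its own.
        non_decodable.append(token_id)

print(f"{len(non_decodable)} cl100k_base tokens do not decode to valid UTF-8")
```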
-
### ⚠️ Search for existing issues first ⚠️
- [X] I have searched the existing issues, and there is no existing issue for my problem
### Which Operating System are you using?
Windows
### Wh…
-
**I keep running out of tokens before managing to accomplish any multi-step task. The program hits errors from its own mistakes or, as in this case, it does not manage to pip install the librarie…
-
https://regex101.com/r/ahGRp4/6
Please tell me the steps to add es.ts to the mask folder and compile it (or something similar) so I can have my own default mask.
-
gpt-4-0613 is not supported.
Would you add gpt-4-0613?
-
As of 6/27, token counting has changed slightly.
I changed the ChatMessage example:
```
// The link below may not work in Chrome (error: Unable to render code block)
// If so, use Firefox
//…
-
Why does it still generate the code when --implement is False?
2023-10-23 09:52:31.562 | INFO | metagpt.config:__init__:44 - Config loading done.
2023-10-23 09:52:34.146 | INFO | metagpt.software_…
-
With the release of gpt-3.5-turbo-0613 and gpt-4-0613, OpenAI implemented function calling, which now makes it possible to call Python functions via chat responses.
It can also work with GPT 3.5 for a…
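As a rough sketch of the flow, using the pre-1.0 `openai` Python package (the `get_current_weather` function and its schema are invented for illustration):
```python
import json
import openai  # pre-1.0 API (openai.ChatCompletion); reads OPENAI_API_KEY from the environment

# Hypothetical local function the model is allowed to "call".
def get_current_weather(location: str) -> dict:
    return {"location": location, "forecast": "sunny", "temperature_c": 22}

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    functions=functions,
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model only returns the function name and JSON-encoded arguments;
    # the caller is responsible for actually executing the function.
    args = json.loads(message["function_call"]["arguments"])
    print(get_current_weather(**args))
```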
-
In the [function calling example](https://github.com/load1n9/openai/blob/main/examples/chatCompletionFunction.ts), the model should be updated to `gpt-3.5-turbo-0613`.