-
Using the latest version from `main` with model `gpt-4o` throws the following error when running on AWS Lambda:
```
Failed to clip tokens: [Errno 30] Read-only file system: '/var/lang/lib/python3.10/site…
```
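By default tiktoken caches downloaded BPE files in a temp/package location that is read-only on Lambda; only `/tmp` is writable there. A minimal sketch of the usual workaround, pointing tiktoken's cache at `/tmp` via the `TIKTOKEN_CACHE_DIR` environment variable (which tiktoken honors) before first use — the directory name is arbitrary:

```python
import os

# On AWS Lambda only /tmp is writable; the default cache location
# triggers "[Errno 30] Read-only file system". Set the cache dir
# *before* tiktoken downloads or reads any encoding file.
cache_dir = "/tmp/tiktoken_cache"
os.makedirs(cache_dir, exist_ok=True)
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir

# import tiktoken                             # after the env var is set,
# enc = tiktoken.get_encoding("o200k_base")   # the gpt-4o encoding loads
```

The import is left commented out so the snippet runs even where tiktoken is not installed; in a real handler, do the env-var setup at module top level, before any code path touches tiktoken.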
-
I'm trying to use the `langchainrb` gem, which has `tiktoken_rb` as a dependency. I get a `LoadError` when using the gem:
```
LoadError - cannot load such file -- /gems/ruby/3.0.0/gems/t…
```
-
Hi,
I am trying to train a tiktoken tokenizer on a custom dataset (15 GB) with a 30k vocab size. It seems it will take a very long time to finish: one vocab update took almost 8 hours. Any suggestion to make it fas…
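The 8-hours-per-merge figure is consistent with a naive BPE trainer, which rescans the whole corpus once per vocabulary update; 30k merges over 15 GB is then ~30k full passes. A minimal pure-Python sketch of that naive loop (illustrative, not tiktoken's internals) makes the cost visible — the fix is to update pair counts incrementally after each merge (as SentencePiece and HF `tokenizers` do) or to train on a sample of the corpus:

```python
from collections import Counter

def train_bpe(data: bytes, num_merges: int):
    """Naive byte-level BPE: one full corpus pass PER merge."""
    seq = list(data)          # start from raw bytes (ids 0..255)
    merges = []
    next_id = 256
    for _ in range(num_merges):
        # This Counter pass is O(len(corpus)) and repeats for every
        # merge -- the reason 30k merges over 15 GB takes so long.
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), count = pairs.most_common(1)[0]
        if count < 2:         # nothing left worth merging
            break
        merges.append(((a, b), next_id))
        # Rewrite the corpus, replacing the pair with the new token.
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
                out.append(next_id)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        next_id += 1
    return merges, seq
```

An incremental trainer instead keeps the pair counts as a mutable structure and, after merging `(a, b)`, only adjusts counts for pairs adjacent to the merged occurrences, turning each update from O(corpus) into O(occurrences).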
-
## Description
I'm trying to set up tiktoken locally with the help of this guide - https://stackoverflow.com/questions/76106366/how-to-use-tiktoken-in-offline-mode-computer. But the problem I'm faci…
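The offline trick in that guide relies on how tiktoken names its cache files: in current versions, `tiktoken/load.py` keys each cached BPE file by the SHA-1 hex digest of its download URL (this is an implementation detail and could change). A small sketch that reproduces the cache path, so a `.tiktoken` file downloaded on a connected machine can be pre-seeded into `TIKTOKEN_CACHE_DIR` on an offline one:

```python
import hashlib
import os

def tiktoken_cache_path(cache_dir: str, blob_url: str) -> str:
    # tiktoken names cached BPE files after the SHA-1 of the source
    # URL; copying a downloaded .tiktoken file to this exact path lets
    # get_encoding() succeed with no network access.
    key = hashlib.sha1(blob_url.encode()).hexdigest()
    return os.path.join(cache_dir, key)
```

Usage: download the encoding file on a connected machine, copy it to the path this returns for its original URL, and set `TIKTOKEN_CACHE_DIR` to `cache_dir` before using tiktoken.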
-
I am trying to chunk a huge document, but it runs forever. Did I miss something in my code?
[File here](https://drive.google.com/file/d/1Xnp5jJhjIWNA6R5u9w96L9WO_Hb61Jmh/view?usp=sharing)
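A frequent cause of chunkers that "run forever" is re-encoding the remaining text for every chunk, which is quadratic in document length. A sketch that encodes once and slices the token list instead — the `encode`/`decode` callables are stand-ins for any tokenizer (e.g. a tiktoken encoding's methods), not the poster's actual code:

```python
from typing import Callable, Iterator, List

def chunk_by_tokens(
    text: str,
    encode: Callable[[str], List[int]],
    decode: Callable[[List[int]], str],
    chunk_size: int,
    overlap: int = 0,
) -> Iterator[str]:
    """Encode the whole document ONCE, then slice the token list.

    Re-encoding the tail of the text for each chunk makes chunking
    quadratic; this version is one encode pass plus cheap slices.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    tokens = encode(text)
    step = chunk_size - overlap
    for start in range(0, len(tokens), step):
        yield decode(tokens[start:start + chunk_size])
```

With a real tiktoken encoding this would be called as `chunk_by_tokens(text, enc.encode, enc.decode, 512, overlap=64)`.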
```pyt…
-
https://github.com/neulab/prompt2model/pull/335#issuecomment-1711578817
Another thing is that I am not sure all API-based models use tiktoken to compute input tokens. Thus, the num of inpu…
-
For every message submitted to the Cat, we can store in `working_memory`:
- tokens used (input and output)
- prompts used
- replies for each prompt
Info can be sent back to the client in the `why`…
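The bookkeeping proposed above could look roughly like the following sketch; the `working_memory` and `why` names come from the proposal itself, while the class name and exact field shapes are hypothetical illustrations:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MessageUsage:
    """Per-message bookkeeping: token counts, prompts, and replies."""
    prompt_tokens: int = 0
    completion_tokens: int = 0
    prompts: List[str] = field(default_factory=list)
    replies: List[str] = field(default_factory=list)

    def as_why(self) -> Dict:
        # Shape that could be sent back to the client in the `why`
        # field alongside the answer.
        return {
            "tokens": {
                "input": self.prompt_tokens,
                "output": self.completion_tokens,
            },
            "prompts": self.prompts,
            "replies": self.replies,
        }
```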
-
### Bug Description
Llama is not working with the latest langchain via the llama-index-llms-langchain package, and because of this there is an issue using tiktoken 0.7.0, which is required for gpt-4o.
…
-
Currently none of the gpt-4-turbo variants is [included](https://github.com/timoklimmer/powerproxy-aoai/blob/main/app/helpers/tokens.py#L37); maybe a prefix-based approach similar to that used in [ti…
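tiktoken itself handles dated and fine-tuned model names with a `MODEL_PREFIX_TO_ENCODING` table in `tiktoken/model.py`. A sketch of the same longest-prefix idea — the table contents below are illustrative, not powerproxy-aoai's actual code:

```python
# Illustrative prefix table; tiktoken keeps a similar mapping so that
# e.g. "gpt-4-turbo-2024-04-09" resolves without an exact-match entry.
MODEL_PREFIX_TO_ENCODING = {
    "gpt-4o": "o200k_base",
    "gpt-4-turbo": "cl100k_base",
    "gpt-4": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}

def encoding_for_model(model: str, default: str = "cl100k_base") -> str:
    # Longest-prefix match, so "gpt-4-turbo-2024-04-09" hits the
    # "gpt-4-turbo" entry rather than the shorter "gpt-4" one.
    best = ""
    for prefix in MODEL_PREFIX_TO_ENCODING:
        if model.startswith(prefix) and len(prefix) > len(best):
            best = prefix
    return MODEL_PREFIX_TO_ENCODING.get(best, default)
```

Longest-prefix rather than first-match matters here: with first-match, dictionary iteration order would decide whether `gpt-4-turbo-preview` fell into the `gpt-4` or `gpt-4-turbo` bucket.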