simonw / ttok
Count and truncate text based on tokens
Apache License 2.0 · 242 stars · 7 forks
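To make the repo description concrete, here is a minimal sketch of token-based counting and truncation. The whitespace tokenizer is a hypothetical stand-in (ttok itself uses OpenAI's tiktoken BPE encodings), but the shape of the logic is the same: encode, count or slice, re-decode.

```python
# Illustration only: a whitespace split stands in for a real BPE
# tokenizer. The function names below are invented for this sketch,
# not ttok's actual API.

def tokenize(text: str) -> list[str]:
    """Hypothetical tokenizer: one token per whitespace-separated word."""
    return text.split()

def count_tokens(text: str) -> int:
    """Count how many tokens the text encodes to."""
    return len(tokenize(text))

def truncate_tokens(text: str, limit: int) -> str:
    """Keep only the first `limit` tokens, then re-join them."""
    return " ".join(tokenize(text)[:limit])

if __name__ == "__main__":
    sample = "one two three four five"
    print(count_tokens(sample))        # 5
    print(truncate_tokens(sample, 3))  # one two three
```

With a real tokenizer the decode step rejoins byte sequences rather than words, but counting and slicing the token list works the same way.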
Issues
#13 Options for special token handling — grantjenks, closed 2 months ago, 3 comments
#12 Add allowed-special — FergusFettes, closed 2 months ago, 1 comment
#11 Does `ttok` need an internet connection? — NightMachinery, opened 6 months ago, 3 comments
#10 ttok adds a 20 second delay when using llm-gpt4all when offline — learning4life, opened 7 months ago, 1 comment
#9 Support for other models via Hugging Face tokenizers — simonw, opened 10 months ago, 2 comments
#8 Ability to count tokens for models other than OpenAI — simonw, opened 10 months ago, 3 comments
#7 Option to turn token integers back into text — simonw, closed 1 year ago, 6 comments
#6 Truncate => trim tokens — jefftriplett, opened 1 year ago, 0 comments
#5 Verbose option — faroukfaiz10, opened 1 year ago, 0 comments
#4 An additional option to view the actual tokenized text — oostopitre, closed 1 year ago, 11 comments
#3 Mechanism for splitting rather than truncating — simonw, opened 1 year ago, 21 comments
#2 Add split option to split large input into smaller parts instead of truncating — c4pt0r, opened 1 year ago, 2 comments
#1 Initial design — simonw, closed 1 year ago, 3 comments