transitive-bullshit / agentic
AI agent stdlib that works with any LLM and TypeScript AI SDK.
MIT License · 16.07k stars · 2.11k forks
Issues
All issues listed below are closed.

| # | Title | Author | Closed | Comments |
|---|---|---|---|---|
| #598 | SyntaxError: Invalid regular expression ERROR | josephshenjs | 1 month ago | 1 |
| #597 | How to grab total token usage from response | nchan1994 | 1 month ago | 1 |
| #596 | Feature xt zzz | TZZack | 1 month ago | 0 |
| #595 | @dqbd/tiktoken no such file or directory, t_tiktoken_bg.wasm | kiraraty | 1 month ago | 1 |
| #594 | Chinese documents? | quinn-getty | 1 month ago | 2 |
| #592 | Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'Window': Illegal invocation | cuongdnv2307 | 1 month ago | 6 |
| #591 | how to send last 5 message by user_id ? | mtariqsajid | 1 month ago | 3 |
| #590 | Support 'continue' action | ArneBouillon | 1 month ago | 1 |
| #589 | Error: The text contains a special token that is not allowed: <\|endofprompt\|> | yokebtc | 1 month ago | 1 |
| #588 | function call support | jtaox | 1 year ago | 0 |
| #587 | How can I create pre-prompts? | cmastro | 1 month ago | 1 |
| #586 | Support disabling storing message context | kibertoad | 1 month ago | 1 |
| #585 | When sending a request, how can I not include the 'max_tokens' parameter by default? | quzard | 1 month ago | 1 |
| #584 | Support for Function calling | ChoqueCastroLD | 1 month ago | 9 |
| #583 | Add ai project management tool using this chatgpt api | maxlibin | 1 month ago | 0 |
| #582 | The proxy api ever had any Embeddings endpoints for use ? | madruga8 | 1 year ago | 1 |
| #581 | Feature/limit fix: maximum context length limit #580 | CooperJiang | 1 month ago | 1 |
| #580 | fix: maximum context length limit | CooperJiang | 1 year ago | 0 |
| #579 | Token Usage Counter? | HarveyLijh | 1 month ago | 2 |
| #576 | Conversations cannot be tracked | inannan423 | 1 year ago | 1 |
| #575 | "You did not write down any '// @ai' comment lines | dupiesdupreez | 1 month ago | 1 |
| #574 | fix: utilize the param 'name' in sendMessage | zhouhan760503 | 6 months ago | 0 |
| #573 | Why does the returned content often contain '�' | walker-peng22 | 1 month ago | 1 |
| #572 | fetch is not defined | didoee | 1 year ago | 4 |
| #571 | switch tokenizer implementation with pure js and more compatible js-tiktoken | masterkain | 1 year ago | 4 |
| #570 | Error: Missing tiktoken_bg.wasm issue with next.js 13 | masterkain | 1 year ago | 8 |
| #569 | Need help setting up. | H4RRY-B4WLS | 1 month ago | 1 |
| #568 | TypeError: Class extends value #<Object> is not a constructor or null | nayan27 | 1 month ago | 11 |
| #567 | Model gpt 4 don't work | photoongit | 1 month ago | 3 |
| #566 | Incorrent value of maxModelTokens for gpt-3.5-turbo | yaojingguo | 1 month ago | 1 |
| #565 | Update rate limit details | PawanOsman | 8 months ago | 0 |
| #564 | docs: update README for access-token library | moonrailgun | 1 month ago | 0 |
| #563 | Feat: Support proxy | bumu | 1 year ago | 2 |
| #562 | Unexpected token < in JSON at position 0 | ElenaChes | 2 months ago | 0 |
| #561 | Add link to PromptsZone | hao1300 | 4 months ago | 0 |
| #560 | ChatGPTUnofficialProxyAPI GPT-4 | joseph27 | 1 month ago | 4 |
| #559 | feat: support open ai message array | toxic-johann | 1 month ago | 9 |
| #558 | Hope to add the ability to clear historical conversations | Jxells | 1 year ago | 1 |
| #557 | SyntaxError: Unexpected number in JSON at position 4 | kirito141211 | 1 year ago | 2 |
| #556 | https://api.pawan.krd/backend-api/conversation Not working | seckingyo | 1 year ago | 1 |
| #555 | SyntaxError: Unexpected non-whitespace character after JSON at position 4 | seckingyo | 1 month ago | 2 |
| #554 | Uncatchable Exception | nhuethmayr | 1 year ago | 0 |
| #553 | parentMessageId is not valid,How to use parentMessageId | lwm98 | 1 month ago | 16 |
| #552 | how to enhance networking | halower | 1 year ago | 1 |
| #551 | feat: support multiple system messages | toxic-johann | 1 year ago | 4 |
| #550 | How can you use the promptPrefix in the ChatGPTUnofficialProxyAPI? | ashuvax | 1 month ago | 1 |
| #549 | offer free proxy | gcslaoli | 1 year ago | 3 |
| #548 | Fix prompt length calculation | alxmiron | 1 month ago | 5 |
| #547 | Are conversations that exceed 4000 tokens automatically culled by oldest messages? | strich | 1 year ago | 1 |
| #546 | try to fix prompt token count | zhujunsan | 1 year ago | 1 |