simonw/llm
Access large language models from the command-line
https://llm.datasette.io
Apache License 2.0 · 4.8k stars · 266 forks
Issues
#646 · Fix windows bug where llm doesn't run `llm chat` on Windows (issue #495) · sukhbinder · opened 1 hour ago · 1 comment
#645 · Add support for Duckduckgo AI chat models (llama3, claude3, gpt4o, mixtral)? · zoobab · opened 1 day ago · 0 comments
#644 · Utility method for getting usage details from Response · simonw · closed 2 days ago · 2 comments
#643 · Python API documentation for Response objects · simonw · opened 2 days ago · 0 comments
#642 · Log input tokens, output tokens and token details · simonw · closed 2 days ago · 4 comments
#641 · Log async responses to the database when run using llm --async · simonw · closed 2 days ago · 1 comment
#640 · llm.get_models() and .get_async_models() documented functions · simonw · closed 1 day ago · 1 comment
#639 · Feature request: Copilot Chat support · Uninen · opened 3 days ago · 0 comments
#638 · WIP: Fragments · simonw · opened 4 days ago · 2 comments
#637 · Add `editor` command (and make it the default when no args provided) · oeo · opened 4 days ago · 0 comments
#636 · llm install llm-sentence-transformers results in error · axel-at-drom · opened 5 days ago · 0 comments
#635 · llm chat -m mistral-7b-instruct-v0 led to OSError: [Errno 9] Bad file descriptor · raybellwaves · opened 5 days ago · 0 comments
#634 · Enhancement Proposal: Track Head (for backtracking) · FergusFettes · opened 5 days ago · 1 comment
#633 · chat - coloring tokens by logprob · mlaugharn · opened 6 days ago · 0 comments
#632 · Replying to an async conversation does not work · simonw · closed 1 week ago · 5 comments
#631 · Add logprobs for openai chat models · irthomasthomas · closed 2 days ago · 0 comments
#630 · Add command-name settings for --help · web-sst · opened 1 week ago · 4 comments
#629 · docs: add llm-grok · Hiepler · closed 1 week ago · 1 comment
#628 · async support for embeddings · simonw · opened 1 week ago · 0 comments
#627 · register_embedding_models() missing from plugin-hooks docs page · simonw · opened 1 week ago · 0 comments
#626 · Plugins incorrectly loaded during test runs · simonw · closed 5 days ago · 2 comments
#625 · Report estimated cost / token usage in the end of response · Joilence · opened 1 week ago · 1 comment
#624 · Add guidance supports customized openai url and the key · MonolithFoundation · opened 1 week ago · 0 comments
#623 · `llm keys get name` command · simonw · closed 1 week ago · 2 comments
#622 · Update default model information to 4o-mini · tnorthcutt · closed 1 week ago · 1 comment
#621 · tiny typo on README.md · jpita · closed 1 week ago · 1 comment
#620 · Supporting composable prompts and templates · mhalle · opened 1 week ago · 0 comments
#619 · Should LLM be tightly coupled to SQLite? · simonw · opened 1 week ago · 0 comments
#618 · Keys in environment should precede static configuration · kfet · opened 2 weeks ago · 0 comments
#617 · Option for passing in longer context fragments, stored in a deduped table · simonw · opened 2 weeks ago · 45 comments
#616 · Run cog -r in PRs, use that to update logging.md with new tables · simonw · closed 2 weeks ago · 0 comments
#615 · Add attachments tables to schema documentation · simonw · closed 2 weeks ago · 1 comment
#614 · OpenAI token usage stored incorrectly · simonw · closed 2 weeks ago · 1 comment
#613 · llm.get_async_model(), llm.AsyncModel base class and OpenAI async models · simonw · closed 1 week ago · 14 comments
#612 · `llm models --options` should show supported attachment types, too · simonw · closed 2 weeks ago · 2 comments
#611 · Make it possible to send one or more attachments with no accompanying prompt · simonw · closed 2 weeks ago · 5 comments
#610 · Abstract out token usage numbers · simonw · closed 2 days ago · 12 comments
#609 · [FR] `--json`: Save metadata as JSON · NightMachinery · opened 2 weeks ago · 0 comments
#608 · Support `gpt-4o-audio-preview` for input (not output) · NightMachinery · closed 2 weeks ago · 8 comments
#607 · Tool usage research · simonw · opened 2 weeks ago · 15 comments
#606 · Chaining `llm chat` with stdin · aud · opened 2 weeks ago · 3 comments
#605 · Error: Completions.create() got an unexpected keyword argument 'stream_options' · xezpeleta · closed 2 weeks ago · 0 comments
#604 · Added comments and improved code readability · AyhamJo7 · opened 2 weeks ago · 0 comments
#603 · `audio/wave` `.wav` files not supported · NightMachinery · closed 2 weeks ago · 11 comments
#602 · Ability to configure attachment support for models in extra-openai-models.yaml · NightMachinery · opened 2 weeks ago · 1 comment
#600 · Allow passing of can_stream in openai_models.py · cmungall · closed 2 weeks ago · 1 comment
#599 · Allow setting can_stream in extra-openai-models.yaml to allow for o1 over proxy · cmungall · closed 2 weeks ago · 1 comment
#601 · `llm chat` errors on followup since 0.6 · yorickvP · closed 2 weeks ago · 5 comments
#598 · Installation borked after trying to uninstall gpt4all · ianconsolata · opened 3 weeks ago · 0 comments
#597 · `llm chat` works fine for one followup prompt, then crashes accessing `Response.attachments` (with or without attachments present) · maxwelljoslyn · closed 2 weeks ago · 4 comments