thmsmlr/instructor_ex
Structured outputs for LLMs in Elixir
Docs: https://hexdocs.pm/instructor
432 stars, 49 forks
issues
Sorted by: Newest
#64 Support Bedrock? (cigrainger, opened 1 week ago, 0 comments)
#63 Chat completion inconsistent (Ollama) (jamilabreu, opened 2 weeks ago, 2 comments)
#62 Fix system prompt doubling on retry (srcrip, opened 3 weeks ago, 2 comments)
#61 Debugging (srcrip, closed 3 weeks ago, 0 comments)
#60 Exposing usage of llm tokens (antkim003, closed 3 weeks ago, 0 comments)
#59 Make instructor usable with current `mix phx.new` install (thbar, closed 1 week ago, 1 comment)
#58 Draft: FIx #32 & Add Gemini Adapter (acalejos, opened 1 month ago, 1 comment)
#57 fix(Llamacpp): `Llamacpp.url/0` doesn't return the url that was set in config (alvises, opened 1 month ago, 0 comments)
#56 Add support for issuing chat completions against Azure OpenAI (armanm, opened 2 months ago, 1 comment)
#55 Return raw errors (seanmor5, opened 2 months ago, 2 comments)
#54 use Req.post/2 rather than Req.post!/2 (petrus-jvrensburg, closed 1 week ago, 1 comment)
#53 Return HTTP errors, rather than raise (petrus-jvrensburg, closed 1 week ago, 0 comments)
#52 When should I use the Instructor.Validator? (dfalling, closed 1 month ago, 2 comments)
#51 fix: json schema included twice (petrus-jvrensburg, closed 1 week ago, 1 comment)
#50 Warn on missing docs (petrus-jvrensburg, opened 2 months ago, 0 comments)
#49 Code not being executed during tests (petrus-jvrensburg, opened 2 months ago, 0 comments)
#48 docs: adding pii data sanitization example (themusicman, closed 1 week ago, 1 comment)
#47 Missing schema descriptions in prod (petrus-jvrensburg, opened 2 months ago, 3 comments)
#46 Add callbacks for debugging (petrus-jvrensburg, opened 2 months ago, 4 comments)
#45 Google Gemini support (samrat, opened 2 months ago, 2 comments)
#44 Error when using streaming chat completion (Ali-Kalout, opened 2 months ago, 3 comments)
#43 ReduceJSONFormattingErrorsProbabilityAndFixPassingInAdapter (noozo, opened 3 months ago, 0 comments)
#42 Inconsistency in getting adapter from config via function params? (noozo, opened 3 months ago, 2 comments)
#41 Correct to_date field in the quickstart example (nthock, closed 1 week ago, 1 comment)
#40 Integration with local LLaVA, do we want it? What is the way? (thbar, opened 3 months ago, 3 comments)
#39 SpamPrediction example failing to compile with "Unknown Registry" (medoror, closed 2 months ago, 2 comments)
#38 feat: support for claude (TwistingTwists, opened 3 months ago, 16 comments)
#37 GroqCloud support (torepettersen, opened 3 months ago, 0 comments)
#36 json response error: gpt-4-vision-preview (petrus-jvrensburg, closed 2 months ago, 9 comments)
#35 Add mix release configuration instructions to readme (stevehodgkiss, closed 3 months ago, 1 comment)
#34 docs: update examples to reflect `chat_completion/2` interface (nickgnd, closed 3 months ago, 1 comment)
#33 support claude (TwistingTwists, opened 3 months ago, 17 comments)
#32 `Instructor.echo_response/1` no function clause matching (kevinschweikert, opened 3 months ago, 4 comments)
#31 Mint.HTTPError "the given data exceeds the request window size" when sending a large request (knewter, closed 4 months ago, 7 comments)
#30 Jaxon dependency maintenance status (thbar, opened 4 months ago, 2 comments)
#29 Fix more tests (thbar, closed 4 months ago, 0 comments)
#28 Add first GitHub CI setup (thbar, closed 4 months ago, 3 comments)
#27 Get runtime adapter from params, not from config (maxdrift, opened 4 months ago, 3 comments)
#26 default options for connections (TwistingTwists, closed 3 months ago, 1 comment)
#25 Could not compile instructor in livebook (Munksgaard, opened 4 months ago, 8 comments)
#24 Enable passing through Req opts? (cigrainger, opened 4 months ago, 3 comments)
#23 Make the adapter overridable at runtime (cigrainger, closed 4 months ago, 0 comments)
#22 Runtime switching adapters? (cigrainger, closed 4 months ago, 4 comments)
#21 Quickstart Link in README 404's (dfalling, closed 4 months ago, 2 comments)
#20 Remove dependency on openai client, smart defaults (thmsmlr, closed 4 months ago, 1 comment)
#19 Add CI (thmsmlr, closed 4 months ago, 1 comment)
#18 Overwrite the OpenAI Client HTTP defaults (thmsmlr, closed 4 months ago, 0 comments)
#17 GPT4 Vision (thmsmlr, closed 4 months ago, 1 comment)
#16 feat: support json mode (TwistingTwists, closed 4 months ago, 2 comments)
#15 [WIP] Ollama support (lilfaf, closed 4 months ago, 5 comments)