philschmid / easyllm
https://philschmid.github.io/easyllm/
MIT License · 431 stars · 34 forks
Issues (newest first)
#48 bedrock model support... · mmgxa · opened 4 months ago · 0 comments
#47 Support more Anthropic models from AWS Bedrock · igor-pechersky · closed 3 months ago · 0 comments
#46 Add support for Vertex AI pretrained language models (GCP) · tekumara · opened 10 months ago · 0 comments
#45 Adds support for setting additional client args. · dylan-stark · opened 11 months ago · 0 comments
#44 Need to provide additional args to InferenceClient · dylan-stark · opened 11 months ago · 2 comments
#43 What is the difference between EasyLLM and Langchain? · murdadesmaeeli · closed 11 months ago · 2 comments
#42 ensure `generated_tokens` is an integer · olinguyen · closed 11 months ago · 2 comments
#41 Need to pass custom_attributes='accept_eula=true' when invoking SageMaker endpoint · olinguyen · closed 10 months ago · 1 comment
#40 bedrock.ChatCompletion.create Raises ValidationError for Non-Integer Token Values in Python 3.9 · 5c0rp · closed 11 months ago · 1 comment
#39 Boto dependency shouldn't be forced · giyaseddin · opened 11 months ago · 5 comments
#38 Add bedrock client · philschmid · closed 11 months ago · 0 comments
#37 (Chat)Completion objects cannot generate diverse outputs · KoutchemeCharles · opened 11 months ago · 6 comments
#36 add falcon 180b · philschmid · closed 1 year ago · 0 comments
#35 [Feature] Add support for logit_bias · bitnom · opened 1 year ago · 0 comments
#34 OverloadedError: Model is overloaded · farshidbalan · opened 1 year ago · 1 comment
#33 Datafilter · philschmid · closed 10 months ago · 0 comments
#32 Streaming support in Sagemaker? · AIByteSmith · opened 1 year ago · 1 comment
#31 Issue setting huggingface.prompt_builder = 'llama2' when using sagemaker as client · bcarsley · opened 1 year ago · 2 comments
#30 Fix prompt builder inputs · lewtun · closed 1 year ago · 4 comments
#29 Multiple messages · dayuyang1999 · closed 1 year ago · 2 comments
#28 support pydantic v1 · pacman100 · closed 1 year ago · 0 comments
#27 Adds SageMaker client · philschmid · closed 1 year ago · 0 comments
#26 Add Agent example & Fix stop sequences · philschmid · closed 1 year ago · 0 comments
#25 Fix LLaMA 2 prompt format. · float-trip · closed 1 year ago · 1 comment
#24 rag llama example · philschmid · closed 1 year ago · 0 comments
#23 Pydantic problem · bacoco · opened 1 year ago · 6 comments
#22 Is there a way to enable structured output? · phiweger · opened 1 year ago · 7 comments
#21 Local inference of TGI · bacoco · closed 1 year ago · 4 comments
#20 Add tests for base schemas · chainyo · closed 1 year ago · 0 comments
#19 Move makefile to hatch scripts · chainyo · closed 1 year ago · 3 comments
#18 Extend README · chainyo · closed 1 year ago · 0 comments
#17 Remove base prompt for text · philschmid · closed 1 year ago · 0 comments
#16 release · philschmid · closed 1 year ago · 0 comments
#15 Fix build docs add prompt mapping · philschmid · closed 1 year ago · 0 comments
#14 Bad request: Model requires a Pro subscription · zhuxiaosheng · closed 1 year ago · 5 comments
#13 Another Actions improvement · philschmid · closed 1 year ago · 0 comments
#12 Fix/Makes `model` parameter optional · philschmid · closed 1 year ago · 0 comments
#11 Bug: `model` must be defined despite docs saying that if not provided, it defaults to base url. · tomaarsen · closed 1 year ago · 3 comments
#10 add more actions and limit when those are run · philschmid · closed 1 year ago · 0 comments
#9 docs: typos & slight documentation improvements · tomaarsen · closed 1 year ago · 0 comments
#8 Add more prompt templates and tests · philschmid · closed 1 year ago · 0 comments
#7 enhancement: Reduce reliance on docs by expanding kwargs into parameters · tomaarsen · closed 1 year ago · 2 comments
#6 Chat completion format for empty system content. · viniciusarruda · closed 1 year ago · 3 comments
#5 more docs · philschmid · closed 1 year ago · 0 comments
#4 update docs · philschmid · closed 1 year ago · 0 comments
#3 docs: Resolve indentation on README codeblocks and typos · tomaarsen · closed 1 year ago · 0 comments
#2 Integration with self hosted models through TGI · SupreethRao99 · closed 1 year ago · 2 comments
#1 docs · philschmid · closed 1 year ago · 0 comments