kserve / open-inference-protocol
Repository for open inference protocol specification
Apache License 2.0 · 42 stars · 10 forks
Issues
#22 Add community meet link and calendar (sivanantha321) - closed 3 months ago, 2 comments
#21 publish to buf (majolo) - closed 3 months ago, 1 comment
#20 Adding a GET Interface to Inference Would Allow for Better Performance (fstakem) - opened 8 months ago, 3 comments
#19 Rename branch master to main in CI lint (sivanantha321) - closed 10 months ago, 3 comments
#18 Text Generate REST API schema (gavrissh) - closed 9 months ago, 6 comments
#17 Known error responses (zevisert) - opened 11 months ago, 0 comments
#16 Usable OpenAPI operation names (zevisert) - opened 11 months ago, 0 comments
#15 OpenAPI path parameter format (zevisert) - opened 11 months ago, 0 comments
#14 Optional model version (zevisert) - opened 11 months ago, 0 comments
#13 Open governance (zevisert) - opened 1 year ago, 2 comments
#12 Encoding of BYTES data in JSON? (rehevkor5) - opened 1 year ago, 0 comments
#11 Add linter for OpenApi and grpc proto in github actions CI (sivanantha321) - closed 1 year ago, 3 comments
#10 Add linter github actions for the repo (yuzisun) - closed 1 year ago, 0 comments
#9 Adding Properties field to Model Metadata Response (nnshah1) - closed 1 year ago, 0 comments
#8 GRPC Protocol Update for Inference Parameter Types (nnshah1) - closed 1 year ago, 0 comments
#7 WIP: Add generate API schema (yuzisun) - closed 11 months ago, 5 comments
#6 Add meeting notes (adriangonz) - closed 1 year ago, 0 comments
#5 How open-inference-protocol works in LLMs, any use case? (lizzzcai) - opened 1 year ago, 3 comments
#4 Change single line comment style used for copyright (andyi2it) - closed 1 year ago, 1 comment
#3 Proto file comment section is invalid (andyi2it) - opened 1 year ago, 2 comments
#2 Add link to slides (adriangonz) - closed 1 year ago, 0 comments
#1 Initial commit for open inference protocol (yuzisun) - closed 1 year ago, 0 comments