SqueezeAILab / LLMCompiler
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
https://arxiv.org/abs/2312.04511
MIT License · 1.38k stars · 102 forks
Issues
#22 · Code for WebShop Benchmark · ShayekhBinIslam · opened 1 month ago · 1 comment
#21 · Friendli endpoints support · kssteven418 · closed 2 months ago · 0 comments
#20 · store-path and JSONDecoderError · Vluptronic · opened 3 months ago · 0 comments
#19 · Questions about LLMCompiler · RobinQu · opened 3 months ago · 1 comment
#18 · Double join is right answer? · sangmandu · closed 4 months ago · 1 comment
#17 · api key as a env variable · kssteven418 · closed 5 months ago · 0 comments
#16 · Azure Endpoint Support · kssteven418 · closed 5 months ago · 0 comments
#15 · Streaming Bugfix · kssteven418 · closed 6 months ago · 0 comments
#14 · Error local variable 'full_output' referenced before assignment · tcxia · closed 6 months ago · 2 comments
#13 · Update README.md · kssteven418 · closed 7 months ago · 0 comments
#12 · Kssteven418 patch 2 · kssteven418 · closed 7 months ago · 0 comments
#11 · benchmarking code added · kssteven418 · closed 7 months ago · 0 comments
#10 · Update README.md · kssteven418 · closed 7 months ago · 0 comments
#9 · Great job, when will you update the functionality to support the open source model llama2? · ECHO967 · closed 7 months ago · 2 comments
#8 · Custom model support using vLLM · kssteven418 · closed 8 months ago · 0 comments
#7 · bugfix · kssteven418 · closed 8 months ago · 0 comments
#6 · Support ReAct · kssteven418 · closed 8 months ago · 0 comments
#5 · Update README.md · kssteven418 · closed 8 months ago · 0 comments
#4 · remove unused file and package · SuhongMoon · closed 9 months ago · 0 comments
#3 · Update tools.py · eltociear · closed 9 months ago · 0 comments
#2 · missing requirements added · kssteven418 · closed 9 months ago · 0 comments
#1 · Update README.md to use proper name for stream parameter · hwchase17 · closed 9 months ago · 1 comment