modular-ml / wrapyfi-examples_llama
Inference code for facebook LLaMA models with Wrapyfi support
GNU General Public License v3.0 · 130 stars · 14 forks
Issues
#10 · Cpu/memory requirements for worker nodes · ACiDGRiM · opened 11 months ago · 1 comment
#9 · Running a REST API instead of "example.py" · UmutAlihan · closed 1 year ago · 1 comment
#8 · Running on CPUs? · fedelrick · closed 1 year ago · 1 comment
#7 · What's the total WPS when i use multi-gpu to work for inference? · hailiyidishui · closed 1 year ago · 2 comments
#6 · Error when run torch.load(ckpt_path, map_location="cpu") · elricwan · closed 1 year ago · 7 comments
#5 · Model Parallel Question · sharlec · closed 1 year ago · 1 comment
#4 · where is zeromq_proxy_broker.py and dir standalone? · ChuangLee · closed 1 year ago · 1 comment
#3 · Docker installation guidelines. Running Docker-ed to be added soon · fabawi · closed 1 year ago · 0 comments
#2 · 2PC with 4GPUs each · sophieyl820 · closed 1 year ago · 3 comments
#1 · Added multiple machine support instead of only 2 · fabawi · closed 1 year ago · 0 comments