Closed: ICLXL closed this issue 6 months ago
Feature request

https://llama.meta.com/llama3/

Motivation

No response

Other

No response

Any update? It would be very interesting to have Llama 3 available.

Actually, you can use Llama 3 via the vLLM backend now.

Llama 3 is already supported.
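For anyone landing here later, a minimal sketch of what the vLLM route mentioned above can look like. The model ID and prompt are my own assumptions, not something confirmed in this issue; it presumes vLLM is installed and the gated Llama 3 weights are accessible through Hugging Face:

```python
# Minimal sketch (not from this thread) of loading Llama 3 with vLLM's
# offline inference API. The model ID is an assumption; the gated
# meta-llama weights must already be accessible via Hugging Face.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Meta-Llama-3-8B-Instruct")
sampling = SamplingParams(temperature=0.7, max_tokens=128)

# Generate one completion and print the text of the first candidate.
outputs = llm.generate(["Summarize what Llama 3 is in one sentence."], sampling)
print(outputs[0].outputs[0].text)
```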