Closed chenglu66 closed 5 months ago
Hi, please provide detailed steps to reproduce the issue, e.g. the command you used to start Tabby.
> Hi, please provide detailed steps to reproduce the issue, e.g. the command you used to start Tabby.

OK:

```shell
cargo run serve --device experimental-http \
  --model '{"kind": "openai", "model_name": "codellama/CodeLlama-70b-Instruct-hf", "api_endpoint": "xxxxxx:8080/v1", "prompt_template": ""}'
```
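For illustration, a filled-out invocation might look like the sketch below. Both the endpoint URL and the `prompt_template` value (CodeLlama's infill format) are assumptions for the example, since they were truncated or omitted in the original report:

```shell
# Hypothetical sketch -- the endpoint and prompt_template are assumptions,
# not values taken from the original report.
cargo run serve --device experimental-http \
  --model '{
    "kind": "openai",
    "model_name": "codellama/CodeLlama-70b-Instruct-hf",
    "api_endpoint": "http://localhost:8080/v1",
    "prompt_template": "<PRE> {prefix} <SUF>{suffix} <MID>"
  }'
```

The `{prefix}` and `{suffix}` placeholders are substituted by Tabby with the code around the cursor when it builds a completion request.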
**Describe the bug**
A clear and concise description of what the bug is.

**Information about your version**
Please provide output of `tabby --version`.

**Information about your GPU**
Please provide output of `nvidia-smi`.

**Additional context**
Add any other context about the problem here.