danielgross / localpilot


Update llama_cpp_python to 0.2.29 (Adds StableLM support) #29

Open · limdingwen opened 5 months ago

limdingwen commented 5 months ago

Tested with Stable Code 3B. It is functional, at least.
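For anyone verifying the bump locally, checking the installed version is enough to confirm the upgrade took (a quick sketch, not part of this diff):

import llama_cpp

# StableLM-architecture models such as Stable Code 3B need llama_cpp_python >= 0.2.29.
print(llama_cpp.__version__)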

limdingwen commented 5 months ago

Example config:

import os

# Each entry is either a remote endpoint or a local GGUF model.
models = {
    # Pass-through to the official GitHub Copilot proxy.
    "GitHub": {
        "domain": "https://copilot-proxy.githubusercontent.com",
        "type": "remote",
    },
    # Stable Code 3B, served locally via llama_cpp_python (requires >= 0.2.29).
    "stable-code-3b": {
        "url": "https://huggingface.co/stabilityai/stable-code-3b/resolve/main/stable-code-3b-Q5_K_M.gguf",
        "type": "local",
        "filename": "stable-code-3b-Q5_K_M.gguf",
    },
    # Model selected on startup.
    "default": "stable-code-3b",
}

# Local GGUF files are downloaded into this folder.
model_folder = os.path.expanduser("~/models")
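To sanity-check that the upgraded llama_cpp_python can actually load the quantized file, a standalone snippet along these lines works outside localpilot (it assumes the GGUF from the config above has already been downloaded into ~/models; the prompt is just an illustration):

import os
from llama_cpp import Llama

# Assumes stable-code-3b-Q5_K_M.gguf from the config above is already in ~/models.
model_path = os.path.expanduser("~/models/stable-code-3b-Q5_K_M.gguf")

llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("def fibonacci(n):", max_tokens=64)
print(out["choices"][0]["text"])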