Hukasx0 / character-factory

Generate characters for SillyTavern, TavernAI, TextGenerationWebUI using LLM and Stable Diffusion
GNU Affero General Public License v3.0

Better LLM-models? #11

Closed ghost closed 10 months ago

ghost commented 11 months ago

Would it be possible to use better language models with this? I would prefer using something that is 13B as they are far better than 7B ones.

Hukasx0 commented 11 months ago

It is possible. I only included the models I tested and wrote the prompts for, but you can use any LLM model with these scripts.

You only need to replace these two lines:

model_url = "link_to_llm_model"
model="models/model_name"

The first line is at the top of every script; the second is inside the llm variable:

llm = CTransformers(
        model="models/model_name",    # here
        model_type="llama",
        gpu_layers=gpu_layers,
        config={
            "max_new_tokens": 1024,
            "repetition_penalty": 1.1,
            "top_k": 40,
            "top_p": 0.95,
            "temperature": 0.8,
            "context_length": 8192,
            "gpu_layers": gpu_layers,
            "stop": [
                "/s",
                "</s>",
                "<s>",
                "<|system|>",
                "<|assistant|>",
                "<|user|>",
                "<|char|>",
            ],
        },
    )
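As a concrete sketch of the swap described above (the URL and filename here are hypothetical placeholders, not models the author tested), the two values just have to stay consistent with each other, and the local path follows the `models/<filename>` pattern used in the scripts:

```python
# Hypothetical example: this URL and filename are placeholders,
# not models verified against this project's prompts.
model_url = "https://huggingface.co/SomeUser/some-13b-GGUF/resolve/main/some-13b.Q4_K_M.gguf"

def local_model_path(url: str, models_dir: str = "models") -> str:
    """Derive the 'models/<filename>' path the scripts expect
    from the download URL of the GGUF file."""
    filename = url.rstrip("/").split("/")[-1]
    return f"{models_dir}/{filename}"

# This derived path is what goes into the model= argument above.
print(local_model_path(model_url))  # models/some-13b.Q4_K_M.gguf
```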
ghost commented 11 months ago

I noticed the models inside the scripts end in .gguf. Does that matter, or can I use GPTQ or AWQ models too? And is just putting the folder name enough for it to know what to do?

Hukasx0 commented 11 months ago

In every script available in the repository, I use the CTransformers library for this. Here is the list of models marked as supported in their repository: supported_models
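For context on the .gguf question above: CTransformers runs GGML/GGUF quantized models (the llama.cpp family), while GPTQ and AWQ are different quantization formats that need other loaders (such as AutoGPTQ or AutoAWQ), so pointing these scripts at a GPTQ folder will not work. A minimal pre-flight check might look like this (a hypothetical helper, not part of the repository):

```python
# Hypothetical helper: checks whether a model path looks loadable
# by CTransformers, which supports GGML/GGUF quantized files
# but not GPTQ or AWQ checkpoints.
SUPPORTED_SUFFIXES = (".gguf", ".ggml", ".bin")  # .bin covers older GGML files

def is_ctransformers_compatible(path: str) -> bool:
    lowered = path.lower()
    if "gptq" in lowered or "awq" in lowered:
        # These quantization formats need AutoGPTQ / AutoAWQ loaders.
        return False
    return lowered.endswith(SUPPORTED_SUFFIXES)

print(is_ctransformers_compatible("models/model.Q4_K_M.gguf"))  # True
print(is_ctransformers_compatible("models/TheModel-GPTQ"))      # False
```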