nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

tinyllama-spanish_16bit template problem #2757

Closed: b4zz4 closed this issue 1 month ago

b4zz4 commented 1 month ago

I solved it with the following prompt template:

### Instruction:
Tú eres un asistente 

### Input:
%1

### Response:
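
In case it helps anyone else sideloading this model: the same template can also be passed through the Python bindings. A minimal sketch, assuming the gpt4all package's chat_session API and a hypothetical local GGUF file name (the system line "Tú eres un asistente" means "You are an assistant"); %1 is the GPT4All placeholder for the user's message:

```python
from gpt4all import GPT4All

# Hypothetical file name for a local GGUF conversion of
# biololab/tinyllama-spanish_16bit; adjust to whatever file you sideloaded.
MODEL_FILE = "tinyllama-spanish_16bit.Q4_0.gguf"

# GPT4All substitutes %1 with the user's message at generation time.
PROMPT_TEMPLATE = (
    "### Instruction:\n"
    "Tú eres un asistente\n"
    "\n"
    "### Input:\n"
    "%1\n"
    "\n"
    "### Response:\n"
)

model = GPT4All(MODEL_FILE, allow_download=False)
with model.chat_session(prompt_template=PROMPT_TEMPLATE):
    print(model.generate("Explica qué es un modelo de lenguaje.", max_tokens=200))
```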
ThiloteE commented 1 month ago

Link to the model?

b4zz4 commented 1 month ago

https://huggingface.co/biololab/tinyllama-spanish_16bit

ThiloteE commented 1 month ago

The model author has not provided a prompt template or a tokenizer_config.json, so there is no way to determine what to use. It appears to be an experimental model.

If anything, you might want to try:

### Instruction:
Tú eres un asistente 

### Input:
%1

### Response:
%2

See https://github.com/nomic-ai/gpt4all/wiki/Custom-Models-Sideload-or-Download#drafting-the-system-prompt-and-chat-template for GPT4All specific syntax.
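
To make the placeholder syntax concrete: in a GPT4All prompt template, %1 stands for the user's message and %2 for the model's reply. A plain-Python illustration of that substitution (not GPT4All's actual implementation, just a sketch of how the rendered prompt would look):

```python
# Alpaca-style template from above, with GPT4All-style %1/%2 placeholders.
TEMPLATE = (
    "### Instruction:\n"
    "Tú eres un asistente\n"
    "\n"
    "### Input:\n"
    "%1\n"
    "\n"
    "### Response:\n"
    "%2\n"
)

def render_turn(user_message: str, model_reply: str = "") -> str:
    """Fill %1 with the user's message and %2 with the model's reply."""
    return TEMPLATE.replace("%1", user_message).replace("%2", model_reply)

# At generation time %2 is empty; the model is expected to continue
# the text after "### Response:".
print(render_turn("Resume este texto en una frase."))
```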

Closing this issue, since I believe the authors of this model have not disclosed enough information to resolve it.