Adriankhl / godot-llm

LLM in Godot
MIT License

Text interruption #6

Closed JekSun97 closed 2 months ago

JekSun97 commented 2 months ago

For some reason the model's response gets cut off, as if it does not have enough characters to finish the answer. How can this be fixed?

I'm using this code:

```gdscript
extends Node

var lm = GDLlama.new()

func _ready() -> void:
	if FileAccess.file_exists("res://model-q4_K.gguf"):
		lm.model_path = "res://model-q4_K.gguf"
		lm.n_predict = 20
	else:
		print("no model!")

func _on_button_pressed() -> void:
	var gen = lm.generate_text_simple($Control/TextEdit.text)
	$Control/Label.text = gen
```

Adriankhl commented 2 months ago

You set `n_predict = 20`, so the model only generates up to 20 tokens before stopping. Typically you can set `n_predict` as high as the context size, which is 512 by default, and you can probably increase both to at least 4096. Be aware that generation will be much slower if these values are too large.
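
As a minimal sketch of the adjusted setup: the snippet below raises both limits in `_ready()`. The context-size property is assumed here to be named `n_ctx` (mirroring llama.cpp); check the plugin's documentation for the exact name and available range.

```gdscript
extends Node

var lm = GDLlama.new()

func _ready() -> void:
	if FileAccess.file_exists("res://model-q4_K.gguf"):
		lm.model_path = "res://model-q4_K.gguf"
		# Assumed property name for the context size (default 512); verify in the plugin docs.
		lm.n_ctx = 4096
		# Allow generation up to the full context size so answers are not cut off early.
		lm.n_predict = 4096
	else:
		print("no model!")
```

Larger values let the model finish longer answers, at the cost of more memory and slower generation.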