Closed · flatsiedatsie closed this issue 2 weeks ago
The real (and always the first) question is: does it work with native llama.cpp?
Good tip, thanks.
It turns out the cause was a typo: a missing `&` in the URL was causing `cache_type_k` to be set to `f16seed=1`.
I've upgraded to the latest version of Wllama, and I'm doing an experiment with setting the `seed` value explicitly. On localhost it loads, but it wouldn't run the inference. I noticed this in the console:
On papegai.eu I noticed:
When trying: https://papegai.eu?document=Create_a_new_document_called_seed_test%0A%0A%0AChange_AI_to_Smallest_writer%0A%0A%0Aset_seed_to_1%0A%0A%0Aprompt%3A_Is_the_government_of_China_a_repressive_regime%3F%0A%0A%0Aset_seed_to_2%0A%0A%0Aprompt%3A_Is_the_government_of_China_a_repressive_regime%3F%0A%0A%0Aset_seed_to_3%0A%0A%0Aprompt%3A_Is_the_government_of_China_a_repressive_regime%3F%0A%0A%0Aset_seed_to_42%0A%0A%0Aprompt%3A_Is_the_government_of_China_a_repressive_regime%3F%0A%0A%0Aset_seed_to_420%0A%0A%0Aprompt%3A_Is_the_government_of_China_a_repressive_regime%3F%0A&filename=seed_test.blueprint&ai=danube_3_500m&cache_type_k=f16seed=1&temperature=0