withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Settings and how we can control them via the API #352

Closed tigert2173 closed 1 month ago

tigert2173 commented 1 month ago

Feature Description

I'm finding it very difficult to figure out which settings I can and cannot modify. I know I can set the temperature, and I'm aware there are many other settings, but I don't know what they are called in this library. A list of them would help, so we don't have to test over and over to see whether a change makes any difference, or whether we even got the setting's name right.

The Solution

Documentation on the possible settings we can change with a JSON API request, and the best ways to do so.

Considered Alternatives

None that I know of.

Additional Context

No response

Related Features to This Feature Request

Are you willing to resolve this issue by submitting a Pull Request?

No, I don’t have the time and I’m okay to wait for the community / maintainers to resolve this issue.

giladgd commented 1 month ago

You can inspect the option types of every object and function you create to control all the settings. Each option has detailed documentation explaining what it does and how to use it. You can see those types both in the TypeScript types of the functions you use in your project and on the documentation website.

For example, the options you can pass to a prompt call are documented under LLamaChatPromptOptions on the documentation website, and the other objects and functions each have a similar types page.
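To make that concrete, here's a minimal sketch of passing a few of those settings to a prompt call (the model path is a placeholder, and the option values are purely illustrative):

```typescript
import path from "path";
import {fileURLToPath} from "url";
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    // placeholder path — point this at a GGUF model on your machine
    modelPath: path.join(__dirname, "models", "model.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Sampling settings are passed per prompt call; see the
// LLamaChatPromptOptions page for the full list of options.
const answer = await session.prompt("Tell me a joke", {
    temperature: 0.8, // higher = more random output
    topK: 40,         // sample only from the 40 most likely tokens
    topP: 0.9,        // nucleus sampling cutoff
    maxTokens: 256    // cap the length of the generated response
});
console.log(answer);
```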

tigert2173 commented 1 month ago

https://node-llama-cpp.withcat.ai/api/type-aliases/LLamaChatPromptOptions

Thanks, found it :D