kwaroran / RisuAI

Make your own story. User-friendly software for LLM roleplaying
https://risuai.net
GNU General Public License v3.0

Example connection parameters to API #582

Open Tom-Neverwinter opened 3 months ago

Tom-Neverwinter commented 3 months ago
[screenshot: Capture]

Tested with the default address and port (http://127.0.0.1:11434/) and the models llama3.1 / llama3.1:latest.

Mimicking the oobabooga setup (https://docs.sillytavern.app/usage/api-connections/) did not result in a connection either.

https://youtu.be/SxRiRZu_Jhc?si=2-sG1iUQx0DLbxq4

KoboldAI setup: https://www.youtube.com/watch?v=ksBWKa_30Hc

The Reddit thread https://www.reddit.com/r/RisuAI/comments/1d57aki/does_anyone_know_how_to_connect_ollama_local_with/ was not much help either; I tested these URLs and none worked:

- http://localhost:11434/api/chat
- http://localhost:11434/api/v1/generate
- http://localhost:11434/api/embed
- http://localhost:11434/api/tokenize
- http://localhost:11434/api/complete
- http://127.0.0.1:11434/api/chat
- http://127.0.0.1:11434/api/v1/generate
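As a first sanity check before pointing RisuAI at any of these URLs, it can help to confirm which routes the local Ollama server actually answers on. The sketch below (an assumption about a stock Ollama install on the default port 11434, not anything RisuAI itself does) probes Ollama's native `/api/tags` route and its OpenAI-compatible `/v1/models` route, and reports whether the server is reachable at all:

```python
# Probe a local Ollama server on the stock port 11434.
# /api/tags is Ollama's native route listing installed models;
# /v1/models is its OpenAI-compatible equivalent.
import urllib.request
import urllib.error

BASE = "http://127.0.0.1:11434"

def probe(path: str) -> str:
    """Return a short status string for a GET on BASE + path."""
    try:
        with urllib.request.urlopen(BASE + path, timeout=3) as resp:
            return f"{path}: HTTP {resp.status}"
    except urllib.error.HTTPError as e:
        # The route exists but rejected this request (e.g. POST-only).
        return f"{path}: HTTP {e.code}"
    except (urllib.error.URLError, OSError) as e:
        # Server not running, or not reachable on this address/port.
        return f"{path}: unreachable ({e})"

for path in ("/api/tags", "/v1/models"):
    print(probe(path))
```

If both probes come back unreachable, the problem is the Ollama server (or its bind address), not RisuAI's connection settings.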

Adding an example for the default setup would help a lot of people:

- OpenAI GPT
- Anthropic Claude: https://github.com/kwaroran/RisuAI/commit/d5837e51a676b4ba3b6e4d836e9467ad1bc18b4a
- Custom (OpenAI)
- oobabooga
- Mancer
- OpenRouter: https://github.com/kwaroran/RisuAI/blob/de6c90cbc42daef30dbf0911ae0a554edc968dea/src/ts/model/openrouter.ts#L9
- Mistral API
- Google Gemini
- Kobold: https://github.com/kwaroran/RisuAI/commit/90d16605824aa5283e25d21668b62dac88c684c1
- Novellist
- Cohere
- NovelAI
- Horde
- Ollama: https://github.com/search?q=repo%3Akwaroran%2FRisuAI+ollama&type=code

"sonnet 3.5 for aws and custom" https://github.com/kwaroran/RisuAI/commit/25a60dbde06fe636a9ebb05d3c8fc035d1df6ca8

fal.ai https://github.com/kwaroran/RisuAI/commit/de6c90cbc42daef30dbf0911ae0a554edc968dea

comfyui https://github.com/kwaroran/RisuAI/commit/c6d96d9616dd701db7868d421ed5674f649390b3

remove tos: https://github.com/kwaroran/RisuAI/commit/f7ddc092770bc31e66b7b6b3b8687c759cf50926

underhill-gb commented 1 month ago

I've had some success using RisuAI and Ollama with these settings:

Custom (OpenAI-compatible)

- URL: http://localhost:11434
- Proxy Key/Password: ollama
- Request Model: Custom - fluffy/l3-8b-stheno-v3.2:latest
- Tokenizer: Llama3
- Response Streaming: Enabled
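For anyone who wants to verify these settings outside RisuAI first, the sketch below builds the kind of OpenAI-style chat request that a "Custom (OpenAI-compatible)" client sends to Ollama's `/v1/chat/completions` route. The model name is the one from this comment; substitute whatever your own `ollama list` shows. Ollama does not validate the API key, so a placeholder like "ollama" in the Authorization header is fine:

```python
# Sketch of the OpenAI-style request a "custom OpenAI-compatible"
# client sends to a local Ollama server on the default port.
import json
import urllib.request
import urllib.error

url = "http://localhost:11434/v1/chat/completions"
payload = {
    # Model name from the comment above; replace with a model
    # that is actually pulled on your machine.
    "model": "fluffy/l3-8b-stheno-v3.2:latest",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Ollama ignores the key's value; "ollama" is a placeholder.
        "Authorization": "Bearer ollama",
    },
    method="POST",
)
try:
    with urllib.request.urlopen(req, timeout=30) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
except (urllib.error.URLError, OSError) as e:
    print(f"request failed: {e}")
```

If this prints a completion, the same URL, key, and model name should work in RisuAI's custom OpenAI-compatible settings.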

You also need to import a preset for Context and Instruct, which you can do within RisuAI:

- L3 Instruct Mode.json
- L3 Context Template.json