kwaroran / RisuAI

Make your own story. User-friendly software for LLM roleplaying
https://risuai.net
GNU General Public License v3.0

Can I compile by myself? #20

Closed bolivkazelenayau closed 1 year ago

bolivkazelenayau commented 1 year ago

Hey there! Thank you for this wonderful project and your efforts; I appreciate it a lot.

So, I found an issue in the current build (I've even installed the updated 0.7.3 .msi file) where I can't connect Risu to the oobabooga WebUI. As far as I can tell, it happens because the current code relies on the old API. I've modified the database.ts file to target the new API, but I can't make it work, so I've decided to try compiling it myself, hence this issue.
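For context, this is roughly the kind of request I was trying to make database.ts send to the newer text-generation-webui API. The port, endpoint path, and parameter names here are assumptions from my own local setup, not RisuAI's actual code, so treat it as a sketch:

```ts
// Hypothetical sketch of a call to text-generation-webui's blocking API.
// The port, path, and parameter names are assumptions from my local setup,
// not taken from RisuAI's database.ts.
async function generateText(prompt: string): Promise<string> {
  const res = await fetch("http://127.0.0.1:5000/api/v1/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      max_new_tokens: 200,
      temperature: 0.7,
      stopping_strings: ["\nYou:"],
    }),
  });
  if (!res.ok) {
    throw new Error(`textgen webui returned HTTP ${res.status}`);
  }
  const data = await res.json();
  // The blocking API appears to wrap the completion in results[0].text.
  return data.results[0].text as string;
}
```

Even with a change along these lines, I can't get the connection to work.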

Could you please tell me which dependencies and libraries I need? Or maybe which Visual Studio extensions? Thank you.

kwaroran commented 1 year ago

Install Node.js, then install pnpm.

Then run

pnpm i
pnpm tauri build

to build it yourself.

bolivkazelenayau commented 1 year ago

> Install Node.js, then install pnpm.
>
> Then run
>
> pnpm i
> pnpm tauri build
>
> to build it yourself.

[screenshot of the build error]

What am I doing wrong?

blvckclxud commented 1 year ago

Hello. I was attempting to compile on my machine and got the same error! Impressive work otherwise.

asm6788 commented 1 year ago

You must install Rust, too. https://www.rust-lang.org/tools/install

bolivkazelenayau commented 1 year ago

Hmm. I've installed Rust and the Visual C++ Build Tools in Visual Studio, and after that I was able to build the app, thank you.

I still get this error, though:

[screenshot of the error]

And if I change the API address, then I get this error (even if I rebuild the app with another API address):

[screenshot of the error shown after changing the API address]

I don't know what's causing it :(

bolivkazelenayau commented 1 year ago

> It looks like you are using API incorrect way. Changing the API model or key will be helpful.

Could you please explain further? I'm using oobabooga's WebUI in API mode.

blvckclxud commented 1 year ago

I have also managed to compile the app, but I am getting the exact same error as the user above. I am also using oobabooga and sending an API key locally, as in TavernAI. While running the model, I noticed red text saying "You must use textgen webui with --no-stream and without --cai-chat or --chat" about the launch flags. Perhaps the error is there? It looks like the model and the interface are not exchanging messages correctly.

I am also getting: 127.0.0.1 - - [09/May/2023 22:06:34] code 404, message Not Found

I'm trying to figure out how to fix it. Are there particular LLaMA models that this program works best with?

[screenshot of the console output]
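If it helps anyone narrow this down, a quick probe along these lines (Node 18+) at least shows which paths answer and which return 404. The port and paths are guesses about a default --api launch of text-generation-webui, not anything taken from RisuAI:

```ts
// Hypothetical probe: print the HTTP status of a few candidate API paths.
// Port 5000 and the paths are assumptions about a default --api launch,
// not taken from RisuAI's code.
const base = "http://127.0.0.1:5000";

async function probe(path: string, init?: RequestInit): Promise<void> {
  try {
    const res = await fetch(base + path, init);
    console.log(path, "->", res.status);
  } catch (err) {
    console.log(path, "-> unreachable:", err);
  }
}

await probe("/api/v1/model"); // GET: should answer if the API extension is up
await probe("/api/v1/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ prompt: "Hi", max_new_tokens: 8 }),
});
```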

kwaroran commented 1 year ago

Have you put in the URL correctly, as specified in the help button?

bolivkazelenayau commented 1 year ago

> Have you put in the URL correctly, as specified in the help button?

Yes, I did. I've tried running it via /v1/generate and via other addresses. Still no luck. Yet TavernAI works without any issues.

kwaroran commented 1 year ago

Did you put 127.0.0.1 or localhost in the URL? That will not work.

By the way, since this is getting off topic, I'm closing the issue.