MaggotHATE / Llama_chat

A chat UI for Llama.cpp
MIT License

I'm impressed #1

Open Succubyss opened 7 months ago

Succubyss commented 7 months ago

I found this piece of software by accident. It's very easy to use and made it incredibly simple to load a local AI model. I just opened it to see what it was, saw that it was asking for a specific model filename, so I put that filename into a search engine, downloaded the model, and presto. I didn't even know my card could handle a 13B model.

Needless to say I'm very impressed.

Would you please consider releasing a new beta with the latest commits? I know you included build instructions, but I don't really have the time right now to set up a new compilation environment.

I'd also love to see a GitHub workflow that builds artifacts for each commit, so I can keep up with your progress conveniently. That seems especially fitting since your README states you're on an old CPU and that others should compile the project themselves.
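For reference, the requested per-commit artifact workflow could look something like the sketch below. Everything here is an assumption on my part: the workflow filename, the build command, and the output path would all need to match the project's actual Makefile targets.

```yaml
# .github/workflows/build.yml — hypothetical sketch, not the project's real CI.
name: build-artifacts
on: [push]
jobs:
  windows-build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build
        run: make demo_vk_mini   # assumed target name; adjust to the Makefile
      - uses: actions/upload-artifact@v4
        with:
          name: Llama_Chat-${{ github.sha }}
          path: Llama_Chat_gguf*.exe   # assumed output filename
```

With `on: [push]`, every commit to any branch produces a downloadable artifact on its Actions run page, which is exactly the "keep up with progress" use case described above.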

I'll be letting others know about this, thank you for putting it out there.

MaggotHATE commented 7 months ago

@Succubyss Hi! Thank you for the kind words and for testing my program!

I've uploaded the latest version of the UI; it comes with the latest commits (including MoE fixes). Note that I've moved to a new setup, so:

I will update the README later - I didn't really have time for that, and a lot of things have changed. I'm also not familiar with CI, so it'll take me some time to figure it out - but I will do it eventually! Again, thanks for testing, and I'm glad you like it.

Edit: added a new config example - if you're familiar with JSON, I highly recommend looking at it and playing with the settings...

Succubyss commented 7 months ago

> I will update the README later - I didn't really have time for that, and a lot of things have changed. I'm also not familiar with CI, so it'll take me some time to figure it out - but I will do it eventually! Again, thanks for testing, and I'm glad you like it.

Did the compilation process change? I did attempt it, since that w64devkit thing looked interesting, but after jumping through a few hoops I ran into an error:

x86_64-w64-mingw32-g++ -I. -Ibase -Iinclude -Itinyfiledialogs -o Llama_Chat_gguf main_vk2.cpp o/imgui/imgui.o o/imgui/imgui_demo.o o/imgui/imgui_draw.o o/imgui/imgui_tables.o o/imgui/imgui_widgets.o o/imgui/imgui_impl_sdl2.o o/imgui/imgui_impl_vulkan.o o/imgui/imgui_stdlib.o o/tinyfiledialogs.o o/ggml.o o/ggml-alloc.o o/ggml-backend.o o/llama.o o/sampling.o o/common.o o/ggml-quants.o o/grammar-parser.o o/unicode.o o/unicode-data.o o/sgemm.o chat_plain.h thread_chat.h UI.h llama_chat1.res -O3 -std=c++2a -fPIC -DNDEBUG -march=native -mtune=native -DGGML_USE_K_QUANTS -DLOG_DISABLE_LOGS -w -Iimgui_f6836ff -Iimgui_f6836ff/backends -g -Wall -Wformat -pipe -Xassembler -muse-unaligned-vector-move -D_WIN32_WINNT=0x602 `pkg-config --cflags sdl2` -DIMGUI_USE_WCHAR32   -static -lgdi32 -lopengl32 -limm32 `pkg-config --static --libs sdl2` -lshell32 -lvulkan
In file included from main_vk2.cpp:20:
UI.h:104:13: error: redefinition of 'void sanitizePath(std::string&)'
  104 | static void sanitizePath(std::string& path){
      |             ^~~~~~~~~~~~
In file included from ./chat_plain.h:24,
                 from ./thread_chat.h:20,
                 from UI.h:10:
include/jsonParams.h:651:13: note: 'void sanitizePath(std::string&)' previously defined here
  651 | static void sanitizePath(std::string& path){
      |             ^~~~~~~~~~~~
main_vk2.cpp: In function 'int SDL_main(int, char**)':

One of the hoops I jumped through was setting CPLUS_INCLUDE_PATH to the VulkanSDK Include directory, even though I'd already copied it into the directory the instructions indicate. I also had to download SDL2 (which wasn't mentioned in the README), move its pkg-config files into w64devkit, and add its Include\SDL2 directory to that same variable.
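The workaround described above amounts to environment configuration like the following sketch. The SDK paths here are invented examples and would need to be adjusted to your own installs:

```shell
# Hypothetical paths — adjust to your Vulkan SDK and SDL2 locations.
# Make both sets of headers visible to g++ inside w64devkit:
export CPLUS_INCLUDE_PATH="/c/VulkanSDK/Include:/c/SDL2/include"
# Let pkg-config find SDL2's .pc files:
export PKG_CONFIG_PATH="/c/SDL2/lib/pkgconfig:$PKG_CONFIG_PATH"
# Sanity check — should print -I flags rather than a "not found" error:
pkg-config --cflags sdl2
```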

> Edit: added a new config example - if you're familiar with JSON, I highly recommend looking at it and playing with the settings...

I'll be sure to do that!

MaggotHATE commented 7 months ago

> UI.h:104:13

You are trying to compile an old version - the new ones are _mini and use UI_simple.h. So for Vulkan it is demo_vk_mini, and for CLBlast it is demo_cl_mini. I haven't used CPU-only builds for a while, but I do now and will update the Makefile for that.

> I also had to download SDL2 (which wasn't mentioned in the README), move its pkg-config files into w64devkit, and add its Include\SDL2 directory to that same variable.

Yes, sorry about that - I assumed it would be apparent from the ImGui setup.

As for the Vulkan SDK, I just used LunarG's installer, and it works just fine on my new system.