-
With the default (black) color and a dark theme, the buttons are almost invisible. I modified deco-theme.cpp line 210 to draw the lines in white, and it looks great for now, almost matching my GTK Adwaita-Da…
-
Hi, awesome work on this project!
I'm building some Swift apps using llama.cpp, and I'd love to try getting clip.cpp running on my app too.
I'm curious if you're going to support running clip.cp…
-
A lot of us competitive programmers have to solve multiple questions fast. Rather than creating multiple .cpp files, it would be awesome if we could use the polyglot notebook for C++. Does it already support…
-
Hello, awesome addon, which has saved me from tons of shader code more than once. Today I saw the new version and decided to update, but I did it unwisely ... I just replaced the files of the old vers…
-
Hi team, I checked LocalLLaMA and found that Gemma can work well with the Self-Extend method. It would be awesome if this technique could be added to gemma.cpp.
References:
- [locallama](http…
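For context, Self-Extend stretches a model's usable context by remapping relative positions: tokens within a neighbor window keep their exact positions, while farther tokens are bucketed into groups so they fall back inside the trained range. Below is a simplified, illustrative sketch of that remapping; the `group` and `window` values are placeholders, not gemma.cpp parameters, and the formula is a continuity-preserving simplification rather than the paper's exact definition.

```python
def self_extend_rel_pos(q: int, k: int, group: int = 8, window: int = 512) -> int:
    """Simplified Self-Extend relative-position mapping (illustrative sketch).

    Nearby keys (within `window` of the query) keep their exact relative
    position; farther keys are bucketed into groups of size `group`,
    shifted so the mapping stays continuous at the window boundary.
    """
    r = q - k  # ordinary relative position
    if r < window:
        return r
    return window + (r - window) // group

# A 4096-token lookback is compressed into a much smaller position range,
# so it stays within what the model saw during training:
print(self_extend_rel_pos(4096, 0))  # prints 960
```

The key property is that attention to recent tokens is untouched, so local fluency is preserved, while distant tokens remain reachable at coarser positional resolution without any fine-tuning.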
-
**Use OpenAI compatible servers**
A lot of recent frameworks (llama.cpp, vLLM, and others) make their models available through an OpenAI-compatible API.
I think it would be awesome if we could us…
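Since those servers all speak the same wire format, supporting them mostly comes down to letting the user set a base URL and POSTing standard chat-completion JSON. A minimal sketch of building such a request (the endpoint path follows the OpenAI convention; the base URL and model name are placeholders):

```python
import json

def build_chat_request(base_url: str, model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions request (sketch).

    The same payload works against llama.cpp's server, vLLM, Ollama, and
    other compatible backends -- only `base_url` differs between them.
    """
    return {
        "url": f"{base_url.rstrip('/')}/v1/chat/completions",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

req = build_chat_request("http://localhost:8080", "local-model", "Hello!")
print(req["url"])  # prints http://localhost:8080/v1/chat/completions
```

Because the request shape is identical everywhere, swapping backends is a configuration change rather than a code change, which is exactly what makes a user-settable base URL so useful.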
-
### Search before asking
- [X] I had searched in the [issues](https://github.com/apache/doris/issues?q=is%3Aissue) and found no similar issues.
### Version
Latest master code.
### What's Wrong?
…
-
**Description**
Please consider adding Core ML model package format support to utilize the Apple Silicon Neural Engine + GPU.
**Success Criteria**
Utilize both ANE & GPU, not just GPU on Apple Sili…
-
This is an awesome project!
Ollama is based on llama.cpp, and many OpenAI-compatible backends exist, as long as the user can choose a different base URL.
-
Would it be possible to add a flag to disable printing "NetHogs version x.x.x" at the top of the window? I like to run nethogs in an itty-bitty tmux pane, and it would be 2 lines more space efficient i…