pgosar / ChatGDB

Harness the power of ChatGPT inside the GDB or LLDB debugger!
MIT License

Possibility of using a local self-hosted model API? (locallama, text-generation-webui?) #10

Closed shodanx2 closed 11 months ago

shodanx2 commented 11 months ago

I'd rather have an entirely open-source software stack if possible, even if it is not as good as the SaaS offerings from OpenAI.

pgosar commented 11 months ago

I'm probably not going to implement this, but feel free to make a fork; you'd just need to replace the API call to ChatGPT on ChatGDB's side.
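
For anyone attempting that fork, here is a minimal sketch of the idea, not ChatGDB's actual code: the only OpenAI-specific part is the URL the request is sent to, so making it configurable is enough. The environment variable names and the `chat_completion` helper are made up for illustration.

```python
# Sketch only -- not ChatGDB's actual implementation. The environment
# variable names below are invented for illustration.
import os
import requests

# Default to OpenAI, but allow pointing at any OpenAI-compatible server.
BASE_URL = os.environ.get("CHATGDB_BASE_URL", "https://api.openai.com/v1")
API_KEY = os.environ.get("CHATGDB_API_KEY", "")


def chat_completion(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """POST a chat completion request to whatever BASE_URL points at."""
    response = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```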

Green-Sky commented 5 months ago

All this would need is the ability to define your own endpoint URL, with OpenAI as the default.

e.g. the llama.cpp server supports a subset of the OpenAI API, so it would be interesting to see if this just works.
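
A quick way to check is to point the same OpenAI-style request at a local llama.cpp server. A sketch, assuming a server started with something like `./llama-server -m model.gguf --port 8080` (the exact binary name and flags depend on the llama.cpp version), with the port and prompt chosen arbitrarily:

```python
# Smoke test against a locally running llama.cpp server that exposes the
# OpenAI-compatible /v1/chat/completions route.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        # The server answers for whatever model it was loaded with,
        # so the name here is mostly a placeholder.
        "model": "local",
        "messages": [{"role": "user", "content": "What does a GDB breakpoint do?"}],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```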

Green-Sky commented 5 months ago

aaaand it looks like the last commit enabled this :trollface: #11 3b445fd22170f34987c263efa1076414607e9a0e

shodanx2 commented 5 months ago

Awesome, as a batch file fiend, this is what I was waiting for to jump into AI!
