PLangHQ / plang

The code repository for the plang programming language
GNU Lesser General Public License v2.1

Using plang with a local llm #55

Open mercurial-moon opened 5 days ago

mercurial-moon commented 5 days ago

Hi, Looking into https://github.com/PLangHQ/plang/issues/14

Is there a possibility of getting plang working with an LLM running locally?

ingig commented 4 days ago

@mercurial-moon I haven't done it myself, but it should work in theory. If the local model follows the OpenAI scheme with system, user, and assistant messages, then it should work.
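To illustrate, a request in the OpenAI scheme looks roughly like the sketch below. The prompt contents are made up for illustration; only the shape (a `model` field plus a list of system/user/assistant messages) is what a local server would need to accept.

```python
import json

# Illustrative payload in the OpenAI chat-completions scheme: a list of
# system / user / assistant messages. The exact prompts plang sends will
# differ; this only shows the shape a local endpoint needs to handle.
payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "Map the user's step to a plang instruction."},
        {"role": "user", "content": "- read file.txt into %content%"},
        {"role": "assistant", "content": '{"FunctionName": "ReadTextFile"}'},
    ],
}

print(json.dumps(payload, indent=2))
```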

You can try adding an environment variable to point to your localhost: https://github.com/PLangHQ/plang/blob/main/Documentation/PlangOrOpenAI.md#local-llm-development

I haven't tried it myself. The model name sent in the request would be gpt-4o; if you can get the localhost server to accept that, it should work.
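If the local server refuses the hard-coded model name, one workaround is a small proxy that rewrites the `model` field before forwarding the request. This is a minimal sketch of the rewrite step only, not anything plang ships; the local model name is an assumed example.

```python
import json

def rewrite_model(request_body: bytes, local_model: str) -> bytes:
    """Replace the model name in an OpenAI-style request body.

    A proxy sitting between plang and a local server could apply this
    before forwarding, so the server sees a model it actually hosts.
    """
    payload = json.loads(request_body)
    payload["model"] = local_model  # assumed local model name, e.g. a llama variant
    return json.dumps(payload).encode("utf-8")

# plang would send "gpt-4o"; the proxy swaps it for the local model.
original = json.dumps({"model": "gpt-4o", "messages": []}).encode("utf-8")
rewritten = json.loads(rewrite_model(original, "llama-3.1-8b-instruct"))
print(rewritten["model"])
```

Many OpenAI-compatible local servers simply ignore the `model` field, in which case no rewrite is needed at all.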

ingig commented 4 days ago

@mercurial-moon

I wanted to add that the local LLM has to have GPT-4-level capabilities. I couldn't get gpt-3.5 to work reliably, and gpt-4o-mini is not working well either. I think what is needed is to fine-tune an LLM on the data responses; then I think we could get to a very small and fast LLM. The good news is that the data is very structured and very predictable, so I think a small LLM could handle it well.
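For what it's worth, fine-tuning tools in the OpenAI style usually take training data as JSONL, one chat example per line. A sketch of how the structured step-to-response pairs could be laid out, with hypothetical example pairs (the real plang response format will differ):

```python
import json

# Hypothetical training pairs: a natural-language plang step mapped to the
# structured JSON response the runtime expects. Contents are illustrative;
# only the JSONL layout (one chat example per line) is the point.
examples = [
    ("- write 'hello' to out.txt", '{"FunctionName": "WriteToFile"}'),
    ("- read file.txt into %content%", '{"FunctionName": "ReadTextFile"}'),
]

lines = []
for step, answer in examples:
    record = {
        "messages": [
            {"role": "system", "content": "Translate the plang step to a function call."},
            {"role": "user", "content": step},
            {"role": "assistant", "content": answer},
        ]
    }
    lines.append(json.dumps(record))

jsonl = "\n".join(lines)
print(f"{len(lines)} training examples")
```

Because the responses are structured and predictable, even a few thousand such pairs might be enough to specialize a small model, though that remains to be tested.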

I don't have experience in fine-tuning models and I haven't used the small models; this is all just a theory of mine, based on what I have gathered from reading and monitoring what is going on.

mercurial-moon commented 4 days ago

Hi, thanks for your comments. Is the LLM only doing the "matching" part, that is, selecting the right code snippet for that particular English line? Or is it doing something more?