Open DiAifU opened 8 months ago
Thank you for reporting!
The error appears to be related to the OkHttp library and how it processes event streams. For some reason, LM Studio doesn't append the trailing empty line at the end of the final response, which causes OkHttp's event-stream parsing to fail. I am not yet sure what the fix is.
Maybe @lmstudio-ai can sort this out on their end...
Related issue: https://github.com/langchain4j/langchain4j/issues/670
The error `java.lang.IllegalArgumentException: byteCount < 0: -1` can be reproduced by removing the empty newlines from the mocked response: LocalCallbackServer.java#L111
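The failure mode is inherent to the server-sent events format: an event is only dispatched once a blank line follows it, so a stream whose final event lacks the trailing blank line silently drops that event (or, as here, makes the client error out). A minimal parser sketch (Python, with hypothetical payloads) illustrates the framing rule:

```python
def parse_sse(stream: str) -> list[str]:
    """Split a raw SSE stream into events. Per the SSE format,
    an event is only complete once a blank line follows it."""
    events: list[str] = []
    buffer: list[str] = []
    for line in stream.splitlines():
        if line == "":  # blank line terminates the current event
            if buffer:
                events.append("\n".join(buffer))
                buffer = []
        else:
            buffer.append(line)
    # Anything left in `buffer` is an event with no trailing blank
    # line (the LM Studio case) and is never dispatched.
    return events

well_formed = 'data: {"a":1}\n\ndata: [DONE]\n\n'
truncated   = 'data: {"a":1}\n\ndata: [DONE]'  # missing final blank line
print(len(parse_sse(well_formed)))  # 2 events
print(len(parse_sse(truncated)))    # only 1 — the final event is lost
```

OkHttp's SSE reader applies the same framing, which is why a response without the final blank line trips it up.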
Same problem here, any suggestions on how to temporarily fix it?
Hello everyone, and especially @lucacri & @DiAifU. If it's still relevant: because LM Studio has an OpenAI-like server API, CodeGPT partially supports it via the "Custom OpenAI" provider. I just checked it.
[Screenshot: the "Custom OpenAI" provider in CodeGPT, with the URL pointed to localhost]
And don't forget to enter the correct model name as loaded in LM Studio. Important: for code completions, your model should support the FIM (fill-in-the-middle) pattern.
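For reference, LM Studio's local server listens on port 1234 by default and exposes an OpenAI-compatible chat completions endpoint, so the "Custom OpenAI" provider only needs the base URL and the loaded model's name. A hedged sketch of the equivalent raw request (the model name is a placeholder; substitute whatever is loaded in LM Studio):

```python
import json
import urllib.request

# Assumptions: LM Studio's server is running on the default port 1234
# and serves the OpenAI-compatible /v1/chat/completions endpoint.
# "loaded-model-name" is a placeholder for the model loaded in LM Studio.
payload = {
    "model": "loaded-model-name",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
}
request = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) would stream the SSE response;
# left out here since it requires a running LM Studio instance.
```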
Awesome, thank you!
We could add a preset template for it, similar to how others are done: https://github.com/carlrobertoh/CodeGPT/blob/master/src/main/kotlin/ee/carlrobert/codegpt/settings/service/custom/template/CustomServiceTemplate.kt
@carlrobertoh Ok, let me try to find time for this. Will add it — wait for a PR from me :)
What happened?
Hi,
LM Studio, with a locally running model, works when using the OpenAI provider with the Ollama or Llama.cpp preset and changing the port to LM Studio's default (1234). The prompts are generated but always end with the error "Unknown API response. Code: 200, Body:".
Could you please tell me how to fix this, or add support for responses generated by LM Studio?
Thanks for the great plugin!
Nicolas
Relevant log output or stack trace
Steps to reproduce
CodeGPT version
2.4.0
Operating System
Windows