Closed zanyatta closed 2 months ago
It only seems to affect the "Test LLM Connection" feature.
I'm posting my successful LM Studio setup here for others to reuse. Note: since AutoDev currently only reads the content of the first message returned, SSE mode is not supported, and "stream" in the request body must be set to false.
llm server: Custom
server: http://localhost:1234/v1/chat/completions
response (JSON path): $.choices[0].message.content
request body (JSON):

```json
{
  "customHeaders": { "Content-Type": "application/json" },
  "customFields": {
    "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
    "temperature": 0.7,
    "max_tokens": -1,
    "stream": false
  }
}
```
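As a minimal sketch of how this config fits together (assuming LM Studio's OpenAI-compatible `/v1/chat/completions` response shape): the payload below mirrors the `customFields` above, and `extract_content` applies the configured JSON path `$.choices[0].message.content` to a hypothetical sample response. The sample response is illustrative only, not captured from a real server.

```python
# Request body matching the "customFields" in the config above.
# The model name is the one from the original config; use whatever
# model is actually loaded in LM Studio.
payload = {
    "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
    "temperature": 0.7,
    "max_tokens": -1,
    "stream": False,  # AutoDev only reads the first message, so SSE must be off
}

def extract_content(response: dict) -> str:
    """Apply the configured JSON path: $.choices[0].message.content"""
    return response["choices"][0]["message"]["content"]

# Hypothetical sample of an OpenAI-compatible chat completion response,
# used here only to show what the JSON path selects.
sample = {"choices": [{"message": {"role": "assistant", "content": "Hello!"}}]}
print(extract_content(sample))  # → Hello!
```

If you POST this payload to http://localhost:1234/v1/chat/completions with `Content-Type: application/json`, the same extraction applies to the real response body.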
Already fixed; once the build finishes you can download and test it: https://github.com/unit-mesh/auto-dev/actions/runs/9218399014
nice
Feature Proposal: Allow customization of initial system prompt
like https://github.com/unit-mesh/auto-dev/issues/175
AutoDev is currently unable to work with LM Studio:
![image](https://github.com/unit-mesh/auto-dev/assets/29851115/8fe2189a-3a97-45fe-a59a-fbc964e7e33c)