THUDM / CodeGeeX4

CodeGeeX4-ALL-9B, a versatile model for all AI software development scenarios, including code completion, code interpreter, web search, function calling, repository-level Q&A and much more.
https://codegeex.cn
Apache License 2.0

set plugin to use local api #4

Closed clude closed 3 days ago

clude commented 2 weeks ago

Do you have a plan to make the VS Code/JetBrains plugins support a locally deployed CodeGeeX service?

ImJoyed commented 2 weeks ago

The latest plugins now support this.

clude commented 2 weeks ago

cooooooool!!!!
Found the latest VS Code plugin working with a local AI model now! Will the next JetBrains plugin release have the same feature?

ShaoboZhang commented 2 weeks ago

> cooooooool!!!! Found the latest VS Code plugin working with a local AI model now! Will the next JetBrains plugin release have the same feature?

Yes, but JetBrains has a stricter review process, so it takes a bit longer than VS Code.

agdkgg commented 1 week ago

Local Ollama connection fails with a 403 error. I tried several different URL suffixes. Has the code not been updated yet?

limingchina commented 1 week ago

I've seen these settings in VS Code:

```
"Codegeex.Local": {
    "apiURL": "",
    "useChatGLM": true,
    "chatGLM": {
        "apiKey": "",
        "model": ""
    }
}
```

What should be set for them? @clude, have you tried it successfully?

clude commented 1 week ago
[screenshot]

You just need to configure the OpenAI-style API address and the model name. By the way, if you use Ollama as a service, don't forget to set the environment variable OLLAMA_ORIGINS="*".
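For illustration, the `Codegeex.Local` settings quoted earlier in the thread might then be filled in like this (a sketch only: the URL assumes Ollama's OpenAI-compatible endpoint on its default port, the model name assumes `ollama pull codegeex4`, and the placement of the model name under `chatGLM` is an assumption, since the fields' semantics aren't documented in this thread):

```
"Codegeex.Local": {
    "apiURL": "http://localhost:11434/v1/chat/completions",
    "useChatGLM": true,
    "chatGLM": {
        "apiKey": "",
        "model": "codegeex4"
    }
}
```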

Choumingzhao commented 1 week ago

Got the same error with the CodeGeeX VS Code plugin.
Using the Ollama API: http://localhost:11434/api/generate. The Ollama 0.2.1 API test passed with curl, both locally and remotely. I also tried other Ollama code-completion plugins in VS Code, and they worked.
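The curl check can also be reproduced with a short script that posts the same chat request the plugin sends (a sketch: the payload shape mirrors the plugin's log output below, while the endpoint URL and model name are assumptions for a default Ollama install):

```python
# Sketch: reproduce the request the CodeGeeX VS Code plugin sends to a
# local OpenAI-style endpoint. The payload shape mirrors the plugin's log;
# the URL and model name are assumptions for a default Ollama install.
import json
import urllib.error
import urllib.request

URL = "http://localhost:11434/v1/chat/completions"


def build_chat_payload(prompt, model="codegeex4"):
    """Build the body the plugin posts: one user message, no streaming."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "model": model,
        "stream": False,
        "temperature": 0.9,
        "top_p": 0.9,
        "stop": [],
    }


def probe(url=URL):
    """POST the payload; return the HTTP status, or None if unreachable."""
    body = json.dumps(build_chat_payload("Hi")).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except (urllib.error.URLError, OSError):
        return None  # server not running, or blocked (e.g. OLLAMA_ORIGINS unset)


if __name__ == "__main__":
    print(json.dumps(build_chat_payload("Hi"), sort_keys=True))
    print("probe:", probe())
```

If `probe()` returns None while curl succeeds from a shell, the problem is likely the webview's origin being rejected rather than the server itself.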

The screenshots of local API usage by @Stanislas0 in the local mode tutorials only show the OpenAI-style API, so the Ollama 0.2.1 native API (api/generate, api/chat) may not work in the current plugin. Edit: I tested the OpenAI-style API and still get the same network error.

Output from CodeGeeX Local in VS Code:

```
data: { "messages": [ { "role": "user", "content": "Hi" } ], "model": "codegeex4", "stream": false, "temperature": 0.9, "top_p": 0.9, "stop": [] }
headers: {}
[error] [Model Config] error: { "message": "Network Error", "name": "AxiosError", "stack": "AxiosError: Network Error\n at u.onerror (https://vscode-remote+ssh-002dremote-002bjss.vscode-resource.vscode-cdn.net/home/user/.vscode-server/extensions/aminer.codegeex-2.12.2/webview-ui/dist/assets/index.js:55:1051909)\n at XMLHttpRequest.r (https://vscode-remote+ssh-002dremote-002bjss.vscode-resource.vscode-cdn.net/home/user/.vscode-server/extensions/aminer.codegeex-2.12.2/webview-ui/dist/assets/index.js:35:854177)", "config": { "transitional": { "silentJSONParsing": true, "forcedJSONParsing": true, "clarifyTimeoutError": false }, "adapter": ["xhr", "http"], "transformRequest": [null], "transformResponse": [null], "timeout": 0, "xsrfCookieName": "XSRF-TOKEN", "xsrfHeaderName": "X-XSRF-TOKEN", "maxContentLength": -1, "maxBodyLength": -1, "env": {}, "headers": { "Accept": "application/json, text/plain, /", "Content-Type": "application/json", "Exp-Id": "", "Variant-Id": "", "Is-Exp-Active": "false" }, "baseURL": "https://codegeex.cn/prod", "method": "post", "url": "http://localhost:11434/api/generate", "data": "{\"messages\":[{\"role\":\"user\",\"content\":\"Hi\"}],\"model\":\"codegeex4\",\"stream\":false,\"temperature\":0.9,\"top_p\":0.9,\"stop\":[]}" }, "code": "ERR_NETWORK", "status": null }
```

* OpenAI-style compatible API, also via Ollama:

```
2024-07-16 10:12:45.105 [info] [Model Config] try to connect: http://localhost:11434/v1/chat/completions

  data: {"messages":[{"role":"user","content":"Hi"}],"model":"codegeex4","stream":false,"temperature":0.9,"top_p":0.9,"stop":[]}
  headers: {}

2024-07-16 10:12:45.144 [error] [Model Config] error: {"message":"Network Error","name":"AxiosError","stack":"AxiosError: Network Error\n    at u.onerror (https://vscode-remote+ssh-002dremote-002bjss.vscode-resource.vscode-cdn.net/home/user/.vscode-server/extensions/aminer.codegeex-2.12.2/webview-ui/dist/assets/index.js:55:1051909)\n    at XMLHttpRequest.r (https://vscode-remote+ssh-002dremote-002bjss.vscode-resource.vscode-cdn.net/home/user/.vscode-server/extensions/aminer.codegeex-2.12.2/webview-ui/dist/assets/index.js:35:854177)","config":{"transitional":{"silentJSONParsing":true,"forcedJSONParsing":true,"clarifyTimeoutError":false},"adapter":["xhr","http"],"transformRequest":[null],"transformResponse":[null],"timeout":0,"xsrfCookieName":"XSRF-TOKEN","xsrfHeaderName":"X-XSRF-TOKEN","maxContentLength":-1,"maxBodyLength":-1,"env":{},"headers":{"Accept":"application/json, text/plain, */*","Content-Type":"application/json"},"baseURL":"https://codegeex.cn/prod","method":"post","url":"http://localhost:11434/v1/chat/completions","data":"{\"messages\":[{\"role\":\"user\",\"content\":\"Hi\"}],\"model\":\"codegeex4\",\"stream\":false,\"temperature\":0.9,\"top_p\":0.9,\"stop\":[]}"},"code":"ERR_NETWORK","status":null}
```
For those who are eager to try the CodeGeeX4 model in Ollama now, I recommend Continue. I managed to use the model with that plugin.
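As a reference point, Continue is configured through its `config.json`; a minimal sketch of an entry for an Ollama-served model might look like the following (the model name assumes `ollama pull codegeex4`, and the title is just a label):

```
{
  "models": [
    {
      "title": "CodeGeeX4 (local)",
      "provider": "ollama",
      "model": "codegeex4"
    }
  ]
}
```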

clude commented 3 days ago

For me, I tried CodeQwen1.5 on Ollama 0.1.44 and CodeGeeX4-9B on Ollama 0.2.5; both are working.

Currently, if VS Code is your primary IDE, Continue may be the best choice: it is stable, customizable, and has many more features than the CodeGeeX plugin in local mode. But Continue's JetBrains plugin will disappoint you; it is not stable and crashes often.

So I still have high hopes for the CodeGeeX plugin's local mode, and I hope it gets the full feature set of the online version.