techee / geany-lsp

LSP plugin for the Geany editor
GNU General Public License v2.0

[Feature request] LSP-AI #56

Open · Johnmcenroyy opened this issue 1 week ago

Johnmcenroyy commented 1 week ago

Hi @techee. I found an interesting project, LSP-AI (an open-source language server that brings Copilot-like power to all editors, designed to assist and empower software engineers, not replace them): https://github.com/SilasMarvin/lsp-ai. It doesn't seem to work out of the box, but I think it just needs more configuration on its side. If you have the time/interest, please take a look. Thanks.

P.S. I will try to run it and post logs and any other info I can find here.

techee commented 1 week ago

I tried something like this in Xcode (Apple's IDE) and didn't like the result at all, so I don't want to spend much time trying to set it up. In any case, let me know what you learn, and if there's some problem with the plugin, I'll try to fix it.

Johnmcenroyy commented 2 days ago

Hi @techee. So, it took some time :) I figured out how to configure Geany for lsp-ai and tested some AI models. What can I say: for me, chatting with an AI inside the editor was not very convenient (I tested it in Helix); I think it is better to have a separate AI chat window in the Geany terminal. Completion quality of course depends on the LLM and needs more careful configuration, but technically it works :) Maybe it will be interesting for you or somebody else.

Main configuration

  1. First install ollama (to run LLMs locally: no internet, no external services, etc.)
  2. Run the ollama server
    # for localhost
    ollama serve
    or
    # for localhost and remote
    OLLAMA_HOST=0.0.0.0 ollama serve
  3. Pull an LLM
    ollama pull qwen2.5:0.5b
    # a very light and fast model for testing
    # the full list of models is here: https://ollama.com/library
    # for me, gemma2:2b and gemma2:9b worked rather well
  4. Run the ollama chat for testing

    ollama run qwen2.5:0.5b
    # to test it, write to the chat: hello world in lua
    # the AI will print a hello-world function in Lua
    # for faster inference, ollama must be configured with Vulkan or ROCm/CUDA
    # ollama run is a very simple chat; for a more advanced one I suggest
    # https://github.com/dustinblackman/oatmeal or
    # https://github.com/ggozad/oterm
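
Before wiring lsp-ai into Geany, it is worth checking that the ollama HTTP API answers on the port the config below uses. A minimal check, assuming the default port 11434 and the model pulled above (the prompt text is just an example):

    # list the models the server has pulled
    curl http://127.0.0.1:11434/api/tags
    # request a one-shot completion (stream disabled so the reply is a single JSON object)
    curl http://127.0.0.1:11434/api/generate \
      -d '{"model": "qwen2.5:0.5b", "prompt": "hello world in lua", "stream": false}'

If both calls return JSON, the server side is ready and any remaining problems are in the editor configuration.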

LSP-AI in Geany

  1. Install the lsp-ai server, see https://github.com/SilasMarvin/lsp-ai/wiki/Installation (on Arch Linux: yay -Sy lsp-ai)
  2. In lsp.conf of the geany-lsp plugin write (for Python files, for example):
    [Python]
    cmd=lsp-ai
    initialization_options={"memory": {"file_store": {}}, "models": {"model1": {"type": "ollama", "model": "qwen2.5:0.5b", "chat_endpoint": "http://127.0.0.1:11434/api/chat", "generate_endpoint": "http://127.0.0.1:11434/api/generate", "max_requests_per_second": 1}}, "completion": {"model": "model1", "parameters": {"max_context": 2000, "options": {"num_predict": 32}}}, "chat": [{"trigger": "!C", "action_display_name": "Chat", "model": "model1", "parameters": {"max_context": 4096, "max_tokens": 1024, "system": "You are a code assistant chatbot. The user will ask you for assistance coding and you will do your best to answer succinctly and accurately"}}]}
    # the main challenge here was not to get tangled in the brackets :)
    # change 127.0.0.1 to the IP address of the ollama server if it is on the local network
    # also see https://github.com/SilasMarvin/lsp-ai/wiki/Configuration
    # and https://github.com/SilasMarvin/lsp-ai/wiki/In‐Editor-Chatting
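
Since lsp.conf requires initialization_options to stay on a single line, the JSON above is hard to read; here is the same configuration pretty-printed (content unchanged, whitespace only):

    {
      "memory": {"file_store": {}},
      "models": {
        "model1": {
          "type": "ollama",
          "model": "qwen2.5:0.5b",
          "chat_endpoint": "http://127.0.0.1:11434/api/chat",
          "generate_endpoint": "http://127.0.0.1:11434/api/generate",
          "max_requests_per_second": 1
        }
      },
      "completion": {
        "model": "model1",
        "parameters": {"max_context": 2000, "options": {"num_predict": 32}}
      },
      "chat": [{
        "trigger": "!C",
        "action_display_name": "Chat",
        "model": "model1",
        "parameters": {
          "max_context": 4096,
          "max_tokens": 1024,
          "system": "You are a code assistant chatbot. The user will ask you for assistance coding and you will do your best to answer succinctly and accurately"
        }
      }]
    }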

I also wanted to ask you some questions.

In Helix, to chat in the editor you must send a textDocument/codeAction request to the LSP server by pressing space+a near the trigger code !C. Here is a video of how it works in Helix: https://github.com/SilasMarvin/lsp-ai?tab=readme-ov-file#in-editor-chatting. As I understand it, textDocument/codeAction is supported by the geany-lsp plugin, but no commands show up for this server (lsp-ai): https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_codeAction
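
For reference, what Helix sends there is an ordinary codeAction request from the LSP 3.17 spec; a minimal sketch of the message, with an illustrative file URI and a range placed on the line containing the !C trigger:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "textDocument/codeAction",
      "params": {
        "textDocument": {"uri": "file:///home/user/example.py"},
        "range": {
          "start": {"line": 10, "character": 0},
          "end": {"line": 10, "character": 2}
        },
        "context": {"diagnostics": []}
      }
    }

lsp-ai should answer with its "Chat" action for that range, and executing the returned action is what starts the chat, so the editor only needs a generic "show code actions at cursor" command.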

Also, what do you think about supporting textDocument/inlineCompletion for some future AI interaction? https://microsoft.github.io/language-server-protocol/specifications/lsp/3.18/specification/#textDocument_inlineCompletion https://www.tabnine.com/blog/introducing-inline-code-completions/
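
For context, a sketch of the proposed message shapes from the 3.18 draft (field names from the draft spec; the URI, position, and completion text are illustrative):

    # request: sent as the user types, like completion but for ghost text;
    # context.triggerKind distinguishes automatic vs. explicitly invoked
    # requests (numeric values per the draft)
    {
      "jsonrpc": "2.0",
      "id": 2,
      "method": "textDocument/inlineCompletion",
      "params": {
        "textDocument": {"uri": "file:///home/user/example.py"},
        "position": {"line": 10, "character": 4},
        "context": {"triggerKind": 1}
      }
    }
    # response: items the editor can render inline as ghost text
    {"items": [{"insertText": "print(\"hello world\")"}]}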

Thanks again for such a great project. Regards.

P.S. An example of AI in Geany, with an oatmeal chat in the terminal using the ollama backend and the qwen2.5:0.5b LLM: [screenshot: qwen2.5:0.5b]