gergap / vim-ollama

Vim plugin for integrating Ollama-based LLMs (large language models)
GNU General Public License v3.0

Using let g:ollama_host="http://127.0.0.1:11434" doesn't resolve the message 'Check if g:ollama_host=http://127.0.0.1:11434 is correct.' #5

Closed ampandey-AMD closed 4 hours ago

ampandey-AMD commented 9 hours ago

Screenshot and log file attached: vim-ollama.log

ampandey-AMD commented 9 hours ago

My .vimrc contents:

call plug#begin()
Plug 'gergap/vim-ollama'
call plug#end()

let g:ollama_debug = 1
let g:ollama_logfile = '/tmp/vim-ollama.log'
let g:ollama_host = 'http://127.0.0.1:11434'
let g:ollama_chat_model = 'llama3'

" Codellama models
"let g:ollama_model = 'codellama:13b-code'
"let g:ollama_model = 'codellama:7b-code'
let g:ollama_model = 'codellama:code'

set laststatus=2
set noshowmode   " to get rid of things like --INSERT--
set noshowcmd    " to get rid of display of last command
set shortmess+=F " to get rid of the file name displayed in the command line bar

ampandey-AMD commented 9 hours ago

I checked that my Ollama server is listening at http://127.0.0.1:11434 using the curl command:

~$ curl  'http://127.0.0.1:11434'
Ollama is running

So I am not sure what the problem is here. I have also attached the log file vim-ollama.log.

gergap commented 7 hours ago

The plugin prints this message if the Python process exits with an error. It's best to test it standalone in a console:

cd path/to/vim-ollama/python
echo "int main(" | ./ollama.py -m codellama:code -u http://127.0.0.1:11434

This way you can see any Python errors. Other people have already mentioned that they had issues because of differing Python library versions. I'm working on Debian, so your Python libs may be different.

ampandey-AMD commented 4 hours ago

> Other people have already mentioned that they had issues because of differing Python library versions. I'm working on Debian, so your Python libs may be different.

Right, actually I see the problem was that the **requests** library was not found. I installed it using pip3 install requests, but it warned of a breaking change, and a system-wide installation using **apt install python3-requests** was required instead.
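
For anyone hitting the same issue, here is a minimal check (my own sketch, not part of the plugin) to confirm that requests is visible to the interpreter the plugin uses, assuming that is the default python3 on your PATH:

# check_deps.py - run with the same python3 that the plugin invokes
import sys
print("interpreter:", sys.executable, sys.version.split()[0])
try:
    import requests
    print("requests:", requests.__version__)
except ImportError as err:
    print("requests is missing:", err)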

I wish the Python errors could be emitted to the vim-ollama.log file automatically, but anyway, thanks for the quick response. I am closing this ticket as it is now working properly. For reference, the standalone test now responds:

echo "int main(" | ./ollama.py -m codellama -u http://127.0.0.1:11434
[PYTHON]
def main():
    print("Hello, world!")
[/PYTHON]
[TESTS]
# Test case 1:
assert main() == None
# Test case 2:
assert main() == None
[/TESTS]
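
On the wish above to get Python errors into vim-ollama.log automatically: that is not what the plugin currently does, but here is a minimal sketch of one way a helper script could route uncaught exceptions (including a failed requests import) into the file that g:ollama_logfile points to, assuming the /tmp/vim-ollama.log path from this thread:

import logging
import sys

# Assumption: hard-coded to the path g:ollama_logfile points to in this thread.
LOGFILE = "/tmp/vim-ollama.log"

logging.basicConfig(filename=LOGFILE, level=logging.DEBUG,
                    format="%(asctime)s %(levelname)s %(message)s")

def log_uncaught(exc_type, exc_value, exc_tb):
    # Write the full traceback to the logfile before the process exits,
    # so errors like "ModuleNotFoundError: No module named 'requests'"
    # show up there and not only as a generic message inside Vim.
    logging.critical("uncaught exception", exc_info=(exc_type, exc_value, exc_tb))
    sys.__excepthook__(exc_type, exc_value, exc_tb)

# Install the hook before any third-party imports so that import
# failures are captured as well.
sys.excepthook = log_uncaught

import requests  # example third-party import; a failure here would now be logged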
gergap commented 2 hours ago

Python is a problematic language that is not very reliable in terms of the interface stability of certain libraries. I will look into alternatives, possibly Rust or C/C++, so that we can get binaries that "just work." I have used Python just for rapid prototyping of this plugin. Meanwhile, Ollama has published its own Python library, which would also be an alternative. It is still Python, but hopefully better supported. However, you would still face the problem of not being able to install all dependencies directly using apt. Instead, you need to install them in a virtual environment, which has its own drawbacks, such as the lack of automatic system updates. Additionally, Python scripts usually require a whole lot of libraries...
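
For reference, the official Ollama Python library mentioned above would replace the hand-rolled HTTP handling. A rough sketch of what a completion call could look like (the host and model values are just the ones from this thread, and the exact signatures should be checked against the library's documentation):

from ollama import Client

# Host and model taken from the .vimrc earlier in this thread.
client = Client(host="http://127.0.0.1:11434")

# Ask the model to complete a code fragment; the generated text is
# returned under the 'response' key.
result = client.generate(model="codellama:code", prompt="int main(")
print(result["response"])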