SilasMarvin / lsp-ai

LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
MIT License
1.82k stars · 55 forks

support api_endpoint for ollama #15

Closed · luixiao0 closed this 2 weeks ago

SilasMarvin commented 2 weeks ago

This is really awesome thank you!

What do you think about explicitly adding a chat_endpoint and a generate_endpoint in the config options for Ollama instead of just endpoint? We do this for every other API backend. If a user uses Ollama remotely are we sure they will have chat at /api/chat and generate at /api/generate?

luixiao0 commented 2 weeks ago

> This is really awesome thank you!
>
> What do you think about explicitly adding a chat_endpoint and a generate_endpoint in the config options for Ollama instead of just endpoint? We do this for every other API backend. If a user uses Ollama remotely are we sure they will have chat at /api/chat and generate at /api/generate?

I think adding separate `chat_endpoint` and `generate_endpoint` options for Ollama isn't strictly necessary for now, because once Ollama is running it serves both chat and generate at `/api/chat` and `/api/generate`. But I'm fine with having two config options; it gives better consistency across providers.
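As a sketch of what the two explicit options might look like in a user's configuration, assuming Ollama's default port and paths (the surrounding keys and the model name here are illustrative assumptions, not taken from the merged code; only `chat_endpoint`, `generate_endpoint`, and the default URLs come from this thread):

```json
{
  "model": "example-model",
  "chat_endpoint": "http://localhost:11434/api/chat",
  "generate_endpoint": "http://localhost:11434/api/generate"
}
```

A user running Ollama remotely, or behind a reverse proxy that rewrites paths, could then override either endpoint independently.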

SilasMarvin commented 2 weeks ago

If you can make the suggested changes and resolve the merge conflict I will merge it in! Thank you!

luixiao0 commented 2 weeks ago

Resolved: 3a351ff, https://github.com/SilasMarvin/lsp-ai/pull/15/commits/237f4cf179b79f74b4ad2e169e8e4f855f96a6b4

SilasMarvin commented 2 weeks ago

> resolved, 3a351ff, 237f4cf

Awesome thank you. If you can implement the changes I suggested above I will merge it

luixiao0 commented 2 weeks ago

Sorry, I misunderstood your suggestion. The suggested changes have been committed.

SilasMarvin commented 2 weeks ago

> Sorry, I misunderstood your suggestion. The suggested changes have been committed.

Don't apologize you are totally fine. I think we are having some issues communicating. If you scroll to the top of the page, you will see I have added some suggestions for changes. You can click commit on those directly, or you can add them yourself. Please verify they work and I will merge the PR. Thank you for your contributions!

SilasMarvin commented 2 weeks ago

Thanks for testing it locally. These are the suggestions I am talking about:

[Screenshot 2024-06-12 at 12:14:49 PM: suggested changes]

We shouldn't call it `completions_endpoint` as Ollama calls it `generate`. I liked the defaults you had before.

If you can commit those suggestions and test it, I will merge it!

luixiao0 commented 2 weeks ago

Is GitHub not working as intended? I cannot see them on my side, nor in my mailbox. I will make the changes shown in your screenshot.

> I liked the defaults you had before.

I will revert them in the next commit.

Suggested by gpt-4o: try "submit review".

SilasMarvin commented 2 weeks ago

> Is github not working as intended? i cannot see them on my side, nor my mailbox, i will make the change shown in your screenshot
>
> > I liked the defaults you had before.
>
> i will revert them at next commit
>
> suggested by gpt-4o: try "submit review"

Oh super strange. I have attached the two other suggestions that I made.

[Screenshot 2024-06-12 at 12:38:12 PM] [Screenshot 2024-06-12 at 12:38:16 PM]

luixiao0 commented 2 weeks ago

> > Is github not working as intended? i cannot see them on my side, nor my mailbox, i will make the change shown in your screenshot
> >
> > i will revert them at next commit
>
> Oh super strange. I have attached the two other suggestions that I made.
>
> [Screenshot 2024-06-12 at 12:38:12 PM] [Screenshot 2024-06-12 at 12:38:16 PM]

I've addressed the suggested modifications, and it's ready for your feedback.

SilasMarvin commented 2 weeks ago

Looks pretty good!

You should be able to change:

```rust
.post(self
    .configuration
    .generate_endpoint
    .as_ref()
    .unwrap_or(&"http://localhost:11434/api/generate".to_string())
```

To:

```rust
.post(self.configuration.generate_endpoint.as_deref().unwrap_or("http://localhost:11434/api/generate"))
```

luixiao0 commented 2 weeks ago

My bad, it's done. I'd never heard of `as_deref` before.

SilasMarvin commented 2 weeks ago

No worries! Awesome work, I'm going to test locally and merge it tonight!

Thanks for contributing!