JudiniLabs / code-gpt-docs

Docusaurus page
https://code-gpt-docs.vercel.app
MIT License
553 stars, 58 forks

Add a customizable URL for Ollama calls - Ollama can run locally #227

Open farlistener opened 5 months ago

farlistener commented 5 months ago

For those who have a personal cloud with a personal Ollama server running (not only on localhost), please provide a URL setting for the AI calls.
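Until such a setting exists, a quick way to confirm a remote Ollama server is reachable is to query its /api/tags endpoint (Ollama's model-listing route). The sketch below is just an illustration; the 192.168.0.200 address is a placeholder for your own server.

```python
import json
import urllib.request

def tags_url(base_url: str) -> str:
    # Build the model-listing endpoint from an arbitrary base URL
    return base_url.rstrip('/') + '/api/tags'

def list_models(base_url: str) -> list:
    # Query a (possibly remote) Ollama server for its installed models
    with urllib.request.urlopen(tags_url(base_url), timeout=5) as resp:
        data = json.load(resp)
    return [m['name'] for m in data.get('models', [])]

# Example (placeholder address):
# print(list_models('http://192.168.0.200:11434'))
```

If this call fails from the machine running the extension, the problem is network reachability, not the extension itself.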

davila7 commented 5 months ago

You can use this with the Custom connection

Screenshot 2024-01-31 at 15:26:18

Kramins commented 4 months ago

Do you have an example on how to use the Custom provider with a remote ollama?

igor-elbert commented 3 months ago

> You can use this with the Custom connection
>
> Screenshot 2024-01-31 at 15:26:18

I tried but could not get it to work. It gives "Connection Error" all the time; I don't even see the calls in the Ollama log.
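A common cause of that symptom is Ollama on the remote box binding only to 127.0.0.1 (it needs to be started with OLLAMA_HOST=0.0.0.0 to accept outside connections). A quick TCP check from the client side, sketched here, distinguishes "server unreachable" from an extension problem:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    # True if a TCP connection to host:port succeeds within the timeout
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. can_connect('192.168.0.200', 11434)  # placeholder address
```

If this returns False, fix the server's binding or firewall first; no extension setting will help.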

ZhengRui commented 1 month ago

It's sad that so many issues here are about enabling a custom host and port, and the responses are kind of dismissive. Custom does not work at all, as many people have reported.

davila7 commented 1 month ago

Hi 👋, we're about to finish an article explaining step by step how to connect the extension locally. It's taken us a while to write the article, but it's almost ready.

Sorry for the delay, we're working full-time to improve the extension... a new update is coming soon that will work much better and be more intuitive.

ZhengRui commented 1 month ago

Nice, truly appreciated, looking forward to the article :)

pawlakus commented 1 month ago

Use this proxy on your machine: you can run Ollama on any remote machine and also swap models on the fly. Just edit the code below and run it on your localhost:

# Requires: pip install aiohttp
import asyncio
from aiohttp import ClientSession, ClientResponseError
from aiohttp.web import Response, StreamResponse, run_app, Application
import json
import sys

TARGET_URL = 'http://192.168.0.200:11434'  # Replace with your target server URL

# Map model names sent by the client to the models actually installed remotely
MODEL_SWAP = {
    'codestral': 'codestral:latest',
    'phi3': 'phi3:14b-medium-4k-instruct-q4_K_M',
    'llama3': 'llama3:8b-instruct-q8_0',
    'llama3:8b': 'llama3:8b-instruct-q8_0',
    'codeqwen:code': 'codeqwen:7b-code-v1.5-q8_0',
    'starcoder2': 'starcoder2:15b'
}

async def handle_request(request):
    # Forward the incoming request to the Ollama server, optionally rewriting the model name
    try:
        method = request.method
        path = request.path_qs
        headers = {key: value for key, value in request.headers.items()}

        if request.can_read_body:
            post_data = await request.read()
        else:
            post_data = None

        if post_data:
            try:
                json_data = json.loads(post_data)
                if "model" in json_data:
                    swap_key = json_data["model"]
                    print(f"json model: {swap_key}")
                    if swap_key in MODEL_SWAP:
                        print("replacing {} with {}".format(swap_key, MODEL_SWAP[swap_key]))
                        json_data["model"] = MODEL_SWAP[swap_key]
                        post_data = json.dumps(json_data).encode('utf-8')
                        headers['Content-Length'] = str(len(post_data))
            except json.JSONDecodeError:
                return Response(status=400, text="Bad Request: Invalid JSON")

        async with ClientSession() as session:
            async with session.request(method, TARGET_URL + path, headers=headers, data=post_data) as response:
                stream_response = StreamResponse(status=response.status, reason=response.reason)
                for key, value in response.headers.items():
                    if key.lower() != 'transfer-encoding':
                        stream_response.headers[key] = value
                await stream_response.prepare(request)

                async for chunk in response.content.iter_any():
                    await stream_response.write(chunk)
                await stream_response.write_eof()

                return stream_response

    except ClientResponseError as e:
        return Response(status=e.status, text=str(e))
    except Exception as e:
        return Response(status=500, text=str(e))

async def create_app():
    app = Application()
    app.router.add_route('*', '/{path_info:.*}', handle_request)  # Match all routes
    return app

if __name__ == '__main__':
    app = create_app()
    # Listen locally on the default Ollama port so clients need no configuration change
    run_app(app, host='127.0.0.1', port=11434)
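The model-swap step above can be sanity-checked without running the server by extracting the rewrite into a pure function (same logic as in handle_request, isolated here purely for illustration):

```python
import json

def rewrite_body(raw: bytes, swap: dict) -> bytes:
    # Replace the "model" field using the swap table, mirroring handle_request
    data = json.loads(raw)
    if data.get("model") in swap:
        data["model"] = swap[data["model"]]
    return json.dumps(data).encode('utf-8')

body = rewrite_body(b'{"model": "llama3", "prompt": "hi"}',
                    {'llama3': 'llama3:8b-instruct-q8_0'})
```

Run the proxy on the same machine as the editor, point the extension at http://127.0.0.1:11434, and requests are forwarded transparently to the remote server.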