continuedev / ggml-server-example

An example of running local models with GGML

Use a remote Linux server for the LLM and a local machine with VS Code and continuedev #5

Open colindaven opened 8 months ago

colindaven commented 8 months ago

Cool extension.

Is it possible to use a remote Linux server (on the internal network) for the LLM while coding on a local machine with VS Code and continuedev? Or might this be supported in the future?
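For reference, what I have in mind is roughly the sketch below: the model server runs on the remote Linux box and the local machine just talks to it over HTTP. This assumes the server exposes an OpenAI-style completion endpoint; the host, port, and endpoint path are placeholders, not the extension's actual configuration.

```python
# Sketch: query a GGML model server running on a remote internal-network host.
# The address and endpoint below are placeholders (hypothetical), not the
# extension's real settings.
import requests

REMOTE_SERVER = "http://192.168.1.50:8000"  # hypothetical internal-network address


def complete(prompt: str) -> str:
    """Send a prompt to the remote server and return the generated text."""
    response = requests.post(
        f"{REMOTE_SERVER}/v1/completions",
        json={"prompt": prompt, "max_tokens": 128},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["text"]


if __name__ == "__main__":
    print(complete("def fibonacci(n):"))
```

If direct access to the port isn't possible, an SSH tunnel (e.g. `ssh -L 8000:localhost:8000 user@server`) could make the remote port appear local, so the extension would only ever see `localhost`.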

Thanks, Colin