Closed: sestinj closed this 5 months ago
Hi @sestinj
I don't think the link above works anymore; could you point me to where the CLI app initialization lives?
Also, have you considered using LangChain under the hood for more complicated requests from users?
@Sushanti99 thanks for the heads-up; I just updated the link. We're currently avoiding LangChain because of the extra size it would add to our binary, but if you're using headless mode you can certainly import LangChain separately.
An interesting idea would be to expose an OpenAI-compatible endpoint, so you could consume the Continue API through an endpoint definition already supported by nearly all LLM agent implementations. That could make headless mode a bit less involved.
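To make the idea concrete, here is a minimal sketch of what a client would send to such an endpoint. Everything below is an assumption for illustration: Continue does not currently expose this endpoint, and the base URL and model name are made up; only the request shape follows the OpenAI chat completions convention.

```python
import json

# Assumed local endpoint; NOT a real Continue API today.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "continue-default") -> dict:
    """Build the JSON body used by OpenAI-compatible chat completion APIs."""
    return {
        "model": model,  # hypothetical model name
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

# A client (or any existing LLM agent framework) would POST this body
# to f"{BASE_URL}/chat/completions".
print(json.dumps(build_chat_request("Run the nightly refactor task")))
```

Because this shape is what nearly every agent framework already speaks, any tool with a configurable base URL could point at it without Continue-specific glue.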
Since Continue is built to work with any IDE, it can also work with no IDE. This would be useful if you want to run tasks in the background, for example from the CLI or in CI/CD.
Implementation of headless mode is complete at a basic level, but there isn't yet a sophisticated CLI application. Here you can find the beginnings of a simple CLI app, but much more can probably be done, for example:
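As one hedged illustration of what such a CLI could grow into, here is a small argument-parsing sketch. The program name and every flag below are hypothetical, chosen only to show the shape of a headless invocation; they are not the linked app's actual interface.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical surface for running a task with no IDE attached,
    # e.g. from a terminal or a CI/CD job.
    parser = argparse.ArgumentParser(prog="continue-headless")
    parser.add_argument("prompt",
                        help="task to run without an IDE attached")
    parser.add_argument("--config", default="config.json",
                        help="path to a config file (assumed name)")
    parser.add_argument("--stream", action="store_true",
                        help="stream model output as it arrives")
    return parser

args = build_parser().parse_args(["Fix the failing test", "--stream"])
print(args.prompt, args.config, args.stream)
```

A CI pipeline could then call something like `continue-headless "Fix the failing test" --config ci.json` as a single step, which is the kind of background use the comment above describes.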