Closed rodfer0x80 closed 7 months ago
I would like to see a fork where we could run our own custom LLaMA (or whatever) model locally instead of being API-dependent on a massive private corporation
I'm working on adding an open-source LLM and an open-source vector DB in the coming releases, so OpenChat will operate 100% offline. In the end, that's the product's vision XD
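Running fully offline usually means pointing the app at a locally hosted model server instead of OpenAI. As a minimal sketch (not OpenChat's actual code), here's how a request could be built against a local server that exposes an OpenAI-compatible chat endpoint, such as llama.cpp's server mode or Ollama; the host, port, and model name are assumptions:

```python
import json
import urllib.request

# Assumed local endpoint, e.g. llama.cpp started with `llama-server`;
# Ollama would typically be http://localhost:11434/v1 instead.
LOCAL_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="llama-2-7b-chat"):
    """Build an OpenAI-compatible chat completion request for a local server."""
    body = json.dumps({
        "model": model,  # hypothetical local model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{LOCAL_BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello, local model!")
# urllib.request.urlopen(req) would send it once a local server is running
```

Because the request shape matches OpenAI's chat completions API, swapping the base URL is often all it takes to go from the hosted API to a local model.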
@gharbat Check out this repo, I think it's a great alternative to the ChatGPT API