Closed by auxon 12 months ago
@auxon given the popularity of openai, the library started with supporting it first. This is the first issue towards non-openai solutions. Do you have a particular LLM provider in mind? I can then make openai optional, similar to langchain.
@ajndkr I'm using Llama2 7b-Chat right now, but may switch models over time.
@auxon Right now, the implementation is tightly coupled with langchain's openai integration. So until there's support for more LLM providers, I don't want to prematurely work towards making openai optional.

If you are interested in contributing to this, maybe you can propose a plan and we can work towards supporting more LLM providers.
@ajndkr I'll take a look, but things seem to work fine as is, so it may be low priority.
@auxon I've removed the openai_aiosession decorator after the openai_v1 migration. See #149.

The changes will come into effect with the v0.8 release. I have other features/bugfixes in mind, so it will take a week or so until I make the new release.

P.S. I checked, and openai was always an optional dependency. See: https://github.com/ajndkr/lanarky/blob/90bd4bf2fe3b66b0c708f1ebb6c73006cc372627/pyproject.toml#L19
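For context, a common pattern for keeping a package like openai optional is to declare it as an extra in pyproject.toml and guard the import at runtime. The sketch below is illustrative only (the helper names `optional_import` and `require_openai` are hypothetical, not lanarky's actual API):

```python
import importlib

def optional_import(name: str):
    """Return the module if it is installed, else None (no hard dependency)."""
    try:
        return importlib.import_module(name)
    except ImportError:
        return None

# resolved once at import time; stays None if the extra isn't installed
openai = optional_import("openai")

def require_openai():
    """Raise a helpful error only when an openai-specific feature is used."""
    if openai is None:
        raise ImportError(
            "this feature requires the 'openai' extra: pip install lanarky[openai]"
        )
    return openai
```

With this pattern, users of other LLM providers never trigger the import, and openai users get a clear install hint instead of a bare ImportError.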
Currently, the openai module is required, even if I'm using another LLM. Why is it required, and can it be removed?