OoriData / OgbujiPT

Client-side toolkit for using large language models, including where self-hosted
Apache License 2.0

Eliminate use of langchain #12

Closed uogbuji closed 1 year ago

uogbuji commented 1 year ago

#11 started with an investigation into what langchain is up to, and whether things could be made transparent. The conclusion was that we should just replace LC with its upstream building blocks. That's now done, and OgbujiPT is independent of LC, both in the core libraries and in the demos.

uogbuji commented 1 year ago

First step is a broad code review. Starting this now.

uogbuji commented 1 year ago

@choccccy I notice demo/chat_pdf_streamlit_ui.py doesn't use ogbujipt.async_helper.schedule_llm_call and instead just directly calls openai_api.Completion.create. Is that because you already have the throbber running in parallel anyway (since it's just an animated GIF), and you're not also running a throbber coroutine on the command line? In other words, is it OK if openai_api.Completion.create blocks, or did you actually find that it doesn't block?

I'm definitely OK with not including a multiprocessing component in this demo, but I just wanted to make sure I understand, so I can update any comments (e.g. main docstring).
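For context, the general pattern being discussed, keeping a command-line throbber animating while a blocking LLM call runs, can be sketched roughly as below. This is a minimal illustration, not the actual schedule_llm_call implementation; blocking_llm_call is a hypothetical stand-in for a blocking client call such as openai_api.Completion.create.

```python
import asyncio
import itertools
import time


def blocking_llm_call(prompt):
    # Hypothetical stand-in for a blocking API call
    # (e.g. openai_api.Completion.create); sleeps briefly and echoes.
    time.sleep(0.2)
    return f'response to: {prompt}'


async def throbber(stop_event, interval=0.05):
    # Simple command-line spinner; runs until stop_event is set
    for frame in itertools.cycle('|/-\\'):
        if stop_event.is_set():
            break
        print(frame, end='\r', flush=True)
        await asyncio.sleep(interval)


async def main():
    stop = asyncio.Event()
    spinner = asyncio.create_task(throbber(stop))
    # Run the blocking call in a worker thread so the event loop
    # (and thus the spinner) keeps running in the meantime
    result = await asyncio.to_thread(blocking_llm_call, 'hello')
    stop.set()
    await spinner
    return result


print(asyncio.run(main()))
```

In a Streamlit demo this machinery is arguably unnecessary, since the throbber GIF animates in the browser regardless of whether the Python process is blocked, which is the point of the question above.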

choccccy commented 1 year ago

> @choccccy I notice demo/chat_pdf_streamlit_ui.py doesn't use ogbujipt.async_helper.schedule_llm_call and instead just directly calls openai_api.Completion.create. Is that because you already have the throbber running in parallel anyway (since it's just an animated GIF), and you're not also running a throbber coroutine on the command line? In other words, is it OK if openai_api.Completion.create blocks, or did you actually find that it doesn't block?
>
> I'm definitely OK with not including a multiprocessing component in this demo, but I just wanted to make sure I understand, so I can update any comments (e.g. main docstring).

Nope, I was just trying to make it work. I was gonna put async back in, but I was just happy to have it working at all

uogbuji commented 1 year ago

> Nope, I was just trying to make it work. I was gonna put async back in, but I was just happy to have it working at all

If it worked well enough for the demo, I'm suggesting we leave it that way. It doesn't have to include async, since we already have demos that cover that.