Closed — uogbuji closed this 1 year ago.
First step is a broad code review. Starting this now.
@choccccy I notice demo/chat_pdf_streamlit_ui.py doesn't use ogbujipt.async_helper.schedule_llm_call and instead just calls openai_api.Completion.create directly. Is that because you already have the throbber running in parallel anyway (since it's just an animated GIF), and you're not also running a throbber coroutine on the command line? In other words, is it OK if openai_api.Completion.create blocks, or did you actually find that it doesn't block?

I'm definitely OK with not including a multiprocessing component in this demo, but I just wanted to make sure I understand, so I can update any comments (e.g. the main docstring).
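For context, the distinction above matters on the command line: a spinner implemented as a coroutine only animates if the blocking LLM call is pushed off the event loop. Below is a minimal sketch of that pattern using plain asyncio; the `blocking_llm_call` function is a hypothetical stand-in for `openai_api.Completion.create`, and the spinner is an assumption, not the actual `ogbujipt.async_helper.schedule_llm_call` implementation. (In the Streamlit demo the throbber is an animated GIF rendered by the browser, so it keeps spinning even while the Python script blocks.)

```python
import asyncio
import time


async def throbber():
    # Hypothetical command-line spinner coroutine. It only animates if the
    # event loop is free, i.e. if the LLM call is not blocking it.
    frames = "|/-\\"
    i = 0
    try:
        while True:
            print(f"\r{frames[i % len(frames)]}", end="", flush=True)
            i += 1
            await asyncio.sleep(0.1)
    except asyncio.CancelledError:
        print("\rdone")


def blocking_llm_call():
    # Stand-in for a synchronous, blocking API call such as
    # openai_api.Completion.create
    time.sleep(0.5)
    return "response"


async def main():
    spinner = asyncio.create_task(throbber())
    # Run the blocking call in a worker thread so the spinner keeps animating
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(None, blocking_llm_call)
    spinner.cancel()
    return result


if __name__ == "__main__":
    print(asyncio.run(main()))
```

If the blocking call were awaited directly on the event loop thread instead, the spinner coroutine would never get scheduled until the call returned, which is the behavior the comment above is asking about.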
Nope, I was just trying to make it work. I was gonna put async back in, but I was just happy to have it working at all.
If it worked well enough for the demo, I'm suggesting we leave it that way. It doesn't have to include async, since we already have demos that cover that.
#11 started with an investigation into what LangChain is up to, and whether things could be made transparent. The conclusion was that we should just replace LC with its upstream building blocks. That's now done, and OgbujiPT is independent of LC, both in the core libraries and in the demos.