mneedham / LearnDataWithMark

Code and scripts behind the @LearnDataWithMark YouTube channel
https://learndatawithmark.com

Improvement: use ollama's own `AsyncClient` instead of OpenAI `AsyncOpenAI` #74

Open MrCsabaToth opened 4 weeks ago

MrCsabaToth commented 4 weeks ago

I was looking for Streamlit + Ollama parallel calls and found your blog post and video. I noticed that in the source https://github.com/mneedham/LearnDataWithMark/blob/main/ollama-parallel/app.py you use AsyncOpenAI. I don't know if there's a specific reason for that, but I'll try to get rid of the extra dependency and use ollama's own AsyncClient: https://github.com/ollama/ollama-python/blob/ebe332b29d5c65aeccfadd4151bf6059ded7049b/examples/async-chat-stream/main.py#L27C19-L27C30
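
For reference, here's a minimal sketch of what the swap might look like. It assumes a local Ollama server on the default port and models already pulled; `stream_chat` is a hypothetical helper and the model names are just placeholders, not what the app actually uses:

```python
import asyncio

from ollama import AsyncClient


# Hypothetical helper: stream one chat completion and collect the chunks.
async def stream_chat(client: AsyncClient, model: str, prompt: str) -> str:
    chunks = []
    # With stream=True, chat() resolves to an async iterator of partial responses.
    async for part in await client.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    ):
        chunks.append(part["message"]["content"])
    return "".join(chunks)


async def main():
    client = AsyncClient()  # defaults to http://localhost:11434
    # Run two models concurrently, mirroring the app's side-by-side comparison.
    answers = await asyncio.gather(
        stream_chat(client, "llama3", "Why is the sky blue?"),
        stream_chat(client, "mistral", "Why is the sky blue?"),
    )
    for answer in answers:
        print(answer, "\n---")


asyncio.run(main())
```

The upside is that the OpenAI-compatible endpoint and the extra `openai` dependency are no longer needed; the chat call goes through Ollama's native API instead.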