Open MrCsabaToth opened 4 weeks ago
I'm looking for a way to make parallel Ollama calls from Streamlit and found your blog post and video. I noticed that in the source https://github.com/mneedham/LearnDataWithMark/blob/main/ollama-parallel/app.py you use AsyncOpenAI. I don't know if there's a specific reason for that, but I'll try to get rid of the extra dependency and use ollama's AsyncClient instead: https://github.com/ollama/ollama-python/blob/ebe332b29d5c65aeccfadd4151bf6059ded7049b/examples/async-chat-stream/main.py#L27C19-L27C30
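For reference, here is a minimal sketch of what the swap might look like: two chat requests streamed concurrently with ollama's AsyncClient via asyncio.gather, which is the same concurrency pattern the app uses with AsyncOpenAI. The model name "llama3", the prompts, and a local Ollama server at the default address are all assumptions for illustration, not taken from app.py.

```python
# Sketch: parallel streaming chats with ollama's AsyncClient instead of
# AsyncOpenAI. Assumes a local Ollama server with the "llama3" model
# pulled (both are assumptions, not from the original app.py).
import asyncio


def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat message format both clients share."""
    return [{"role": "user", "content": prompt}]


async def stream_chat(client, model: str, prompt: str) -> str:
    """Stream one chat completion and return the concatenated reply."""
    reply = ""
    # As in the linked async-chat-stream example: await the call, then
    # iterate the resulting async generator of streamed chunks.
    async for part in await client.chat(
        model=model, messages=build_messages(prompt), stream=True
    ):
        reply += part["message"]["content"]
    return reply


async def main() -> None:
    from ollama import AsyncClient  # replaces the AsyncOpenAI dependency

    client = AsyncClient()  # defaults to http://localhost:11434
    # asyncio.gather runs both requests concurrently, mirroring the
    # parallel-calls behaviour of the Streamlit app.
    answers = await asyncio.gather(
        stream_chat(client, "llama3", "Why is the sky blue?"),
        stream_chat(client, "llama3", "Why is grass green?"),
    )
    for answer in answers:
        print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```

One caveat worth checking: depending on the ollama-python version, streamed chunks may be plain dicts (as in the pinned example) or response objects accessed as part.message.content.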