morphy76 opened 1 year ago
I guess new issues are required to bind:
Please forgive me, I'm just trying to understand the plan
Yes, you're totally right @morphy76, the idea is to have this extension support memories, native functions and so on. But first I want to focus on testing (I need to understand how Semantic Kernel uses their Mock to test against OpenAI services) and documentation so we can release a first 0.1 version of this extension.
Nice, I'll stay on this page, then... it should also help to create a better spec than the one I did :)
Teamwork!!! 😄
I'm sorry for being a bit absent, it's quite a busy period, but today I did some checks on mocking OpenAI; here are some notes, hopefully useful:
- there is a `DefaultKernelTest` in the semantic-kernel sources, but I'm not sure about its adoption... not sure if it is feasible nor recommendable
- `quarkiverse-mockserver`, introducing an override-endpoint configuration for the OpenAI client (see the sketch at the end of this comment)
- the `OpenAIClientProvider` connector is not flexible enough to override the endpoint
- `OpenAIClientBuilder` (in `SemanticKernelClientProducer.produceOpenAIAsyncClient`) creates a wide range of configurable attributes or a wide range of default assumptions; not an issue, because after all it's more or less what `OpenAIClientProvider` is already doing
- `OpenAIClientBuilder` also requires providing a non-default HTTP pipeline due to a non-configurable check against HTTPS; I was running mockserver over HTTP
- `OpenAIClientBuilder` creates an Azure client, which should not be an issue in terms of testing: it's just a matter of mocked requests/responses, we're not testing OpenAI or the quality of the answers
- with `quarkiverse-mockserver`, `OpenAIClientProvider` cannot be used anyway; due to my "own" issue about running mockserver with TLS, I failed to run mockserver using HTTPS due to:
Looking forward to hints/suggestions/ideas/recommendations/corrections/whatever; in my next time slot I'll take a better look at the semantic-kernel sources, starting from `DefaultKernelTest`, but I'm quite pessimistic :/ I think I'll push more on mockserver
(edit: workplace/lab https://github.com/morphy76/quarkus-semantic-kernel/tree/introducing_openai_server_mock)
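To make the override-endpoint idea above more concrete, here is a minimal sketch (not the extension's current code) of pointing the Azure OpenAI async client at a local mockserver; the URL, dummy key and factory class are placeholders, and depending on the client version a custom HTTP pipeline may still be needed to get past the HTTPS check mentioned above:

```java
// Minimal sketch, assuming mockserver is reachable at the given URL; the factory
// class name and the dummy API key are illustrative placeholders.
import com.azure.ai.openai.OpenAIAsyncClient;
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

public class MockedOpenAIClientFactory {

    public static OpenAIAsyncClient createAgainstMock(String mockServerUrl) {
        return new OpenAIClientBuilder()
                // e.g. "https://localhost:1080" exposed by quarkiverse-mockserver
                .endpoint(mockServerUrl)
                // any non-empty key: the mock does not validate credentials
                .credential(new AzureKeyCredential("test-api-key"))
                .buildAsyncClient();
    }
}
```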
@morphy76 I know @geoand has created a LangChain4j extension and he uses OpenAI. I haven't had a look at it yet, but I remember Giorgios mentioning something about mocking OpenAI calls. I'll double check
We don't provide any OOTB support for mocking the calls, but it's not hard to do, and I definitely do want to have some testing utilities in soon
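For what it's worth, a first testing utility along those lines could stub the chat-completions call with plain MockServer (org.mock-server:mockserver-netty); the path and JSON body below are illustrative placeholders rather than the exact payloads the OpenAI client exchanges:

```java
// Rough sketch of an OpenAI stub built on MockServer; the path and response body
// are simplified placeholders, not the full Azure OpenAI wire format.
import org.mockserver.integration.ClientAndServer;

import static org.mockserver.model.HttpRequest.request;
import static org.mockserver.model.HttpResponse.response;

public class OpenAIMockStub {

    public static ClientAndServer startChatCompletionStub(int port) {
        ClientAndServer mockServer = ClientAndServer.startClientAndServer(port);
        mockServer
                .when(request()
                        .withMethod("POST")
                        .withPath("/openai/deployments/test-deployment/chat/completions"))
                .respond(response()
                        .withStatusCode(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"choices\":[{\"message\":{\"role\":\"assistant\","
                                + "\"content\":\"mocked answer\"}}]}"));
        return mockServer;
    }
}
```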
The kernel binds and executes all the Semantic Kernel components, in particular AI services and memories.
The kernel provider needs to create named instances, each preconfigured accordingly with the available/default AI services and memory.
Multiple kernels can be configured.
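Purely for illustration, consuming such named kernels from application code might look roughly like this; the `@Identifier` qualifier and the kernel names are assumptions, not the extension's actual API:

```java
// Hypothetical sketch of injecting two differently configured kernels; the
// qualifier annotation and the kernel names are assumptions for illustration only.
import com.microsoft.semantickernel.Kernel;
import io.smallrye.common.annotation.Identifier;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class SummarizationService {

    // a kernel preconfigured with the default AI service and memory store
    @Inject
    Kernel defaultKernel;

    // a second, named kernel bound to a different AI service configuration
    @Inject
    @Identifier("embeddings")
    Kernel embeddingsKernel;
}
```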