It looks like something went wrong while building the config/add_completion_service.py file. The completion service gets added to the Kernel when defining the LLM_SERVICE as AzureOpenAI, but not when using OpenAI.
The wrong environment variable was used: it was still referencing the AzureOpenAI deployment ID even when the OpenAI LLM service was selected.
The way the chat/completion service was created also didn't work, as it constructed a new kernel instead of extending the existing one.
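A minimal sketch of the fix I applied. Note the environment variable names (`OPENAI_API_KEY`, `AZURE_OPENAI_*`, etc.) and the helper/function names here are my own illustrative choices, and the exact Semantic Kernel connector constructors and `add_chat_service` signature vary between library versions, so treat this as a sketch rather than a drop-in replacement:

```python
import os

# Hypothetical helper: pick the credentials matching the chosen service.
# The first bug was that the AzureOpenAI-specific env var was still read
# even when LLM_SERVICE was set to "OpenAI".
def select_service_config(llm_service: str) -> dict:
    if llm_service == "AzureOpenAI":
        return {
            "deployment": os.environ.get("AZURE_OPENAI_DEPLOYMENT_NAME"),
            "endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT"),
            "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),
        }
    if llm_service == "OpenAI":
        # Use the OpenAI variables, not the AzureOpenAI ones.
        return {
            "model_id": os.environ.get("OPENAI_MODEL_ID", "gpt-3.5-turbo"),
            "api_key": os.environ.get("OPENAI_API_KEY"),
        }
    raise ValueError(f"Unknown LLM_SERVICE: {llm_service!r}")


def add_completion_service(kernel, llm_service: str):
    """Attach a chat-completion service to the EXISTING kernel
    instead of constructing a new one (the second bug)."""
    cfg = select_service_config(llm_service)
    if llm_service == "AzureOpenAI":
        # Import here so the OpenAI-only path doesn't require Azure config.
        from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

        kernel.add_chat_service(
            "chat_completion",
            AzureChatCompletion(cfg["deployment"], cfg["endpoint"], cfg["api_key"]),
        )
    else:
        from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

        kernel.add_chat_service(
            "chat_completion",
            OpenAIChatCompletion(cfg["model_id"], cfg["api_key"]),
        )
    return kernel  # same kernel object, now extended
```

The key point is that `add_completion_service` receives and returns the caller's kernel rather than building a fresh one, and the env-var lookup branches on `LLM_SERVICE` before any credentials are read.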
Hope this helps ;) For me at least it now works with both LLM services. Thanks for the great tutorials!
Cedric