I've validated the following commands when running the Create an LLM-powered Chatbot using OpenVINO notebook with OpenVINO™ Notebooks 2023.3:
model_configuration = SUPPORTED_LLM_MODELS[model_id.value]
print(f"Selected model {model_id.value}")
Result:
Please re-install OpenVINO™ Notebooks 2023.3 with the steps from the Installation Guide and re-launch Create an LLM-powered Chatbot using OpenVINO.
@Wan-Intel It is not about which model is selected. Can you reproduce the error during the inference phase?
I'm able to run inference in the Create an LLM-powered Chatbot using OpenVINO notebook with OpenVINO™ Notebooks 2023.3.
Please re-install OpenVINO™ Notebooks 2023.3 with the steps from the Installation Guide and re-launch Create an LLM-powered Chatbot using OpenVINO.
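For reference, the inference phase of 254-llm-chatbot loads the converted model through optimum-intel's OVModelForCausalLM and calls generate(). A minimal sketch of that step, assuming the default TinyLlama selection, that its INT4 conversion directory is named as below, and that the tokenizer was saved alongside the converted model (the directory name and prompt here are placeholders, not the notebook's exact values):

from pathlib import Path
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

# Placeholder directory: substitute the folder produced by the notebook's
# conversion cells for the model you selected.
model_dir = Path("tiny-llama-1b-chat/INT4_compressed_weights")

# Assumes the tokenizer files were exported next to the OpenVINO IR model.
tokenizer = AutoTokenizer.from_pretrained(model_dir)
ov_model = OVModelForCausalLM.from_pretrained(model_dir, device="CPU")

# Run a short generation to confirm the inference phase works end to end.
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = ov_model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))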
Running the example from 254-llm-chatbot shows the following error. Can you reproduce it? Thanks for the instructions.