Open radiantone opened 3 months ago
To resolve the issue with the WebLLM example code in Google Chrome, where the code throws an exception indicating that the model is not loaded before calling `chatCompletion()`, you need to ensure that the model is properly loaded before invoking the `chatCompletion()` method. This can be done by calling the `reload` method on the `Engine` instance and awaiting its completion before proceeding.

Here is an example of how you can modify the code to ensure the model is loaded:
```typescript
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatWebLLM({
  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
  chatOptions: {
    temperature: 0.5,
  },
});

// Ensure the model is loaded before any chat operations.
await model.engine.reload(model.model);

const response = await model.invoke([
  new HumanMessage({ content: "What is 1 + 1?" }),
]);

console.log(response);
/*
  AIMessage {
    content: ' 2\n',
  }
*/
```
In this example, the `reload` method is called on the `engine` instance of the `ChatWebLLM` model to ensure the model is loaded before any chat operations are performed. This should resolve the issue of the model not being loaded before calling `chatCompletion()` [1].
Additionally, ensure that you are running the code in a web environment, such as a web worker, since WebLLM targets the browser.
I suspect related to #5776
We will lock the peer dep to a specific version from here on out
@jacoblee93 Actually for this specific issue it's not related to the webllm version, but to wrong sample code in the documentation. As suggested by the Dosubot above, either `model.engine.reload` or the wrapper `model.initialize()` needs to be called before calling completion.
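The pattern the thread converges on — await asynchronous model loading before any completion call — can be sketched generically. This is a minimal standalone illustration with hypothetical class and method names (`FakeEngine`, `SafeChatModel`), not the actual WebLLM or LangChain API; it mirrors what `engine.reload()` / `initialize()` guard against:

```typescript
// An engine that throws if chatCompletion() runs before the model is loaded,
// and a wrapper that always awaits loading first. All names are hypothetical.
class FakeEngine {
  private loaded = false;

  async reload(modelId: string): Promise<void> {
    // Simulate the async model download/compile step.
    await Promise.resolve();
    this.loaded = true;
  }

  async chatCompletion(prompt: string): Promise<string> {
    if (!this.loaded) {
      throw new Error("Model not loaded before calling chatCompletion()");
    }
    return `echo: ${prompt}`;
  }
}

class SafeChatModel {
  readonly engine = new FakeEngine();
  private ready: Promise<void> | null = null;

  constructor(private modelId: string) {}

  // Lazily load the model exactly once, like an initialize() wrapper.
  initialize(): Promise<void> {
    if (!this.ready) this.ready = this.engine.reload(this.modelId);
    return this.ready;
  }

  async invoke(prompt: string): Promise<string> {
    await this.initialize(); // guard: never hit chatCompletion() unloaded
    return this.engine.chatCompletion(prompt);
  }
}
```

Calling `invoke()` repeatedly reuses the same load promise, so the model is loaded only once even under concurrent calls — the same idea as calling `initialize()` up front.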
Ah I see - thank you! That PR should fix the example as well then.
Thank you all. I will make all the updates described here and try again.
Checked other resources
Example Code
Tried the example code from langchain and it throws an exception in Google Chrome. web-llm 0.2.46
Error Message and Stack Trace (if applicable)
Description
WebLLM Example Code does not Work in Google Chrome
System Info
```
langchain@0.2.5 | MIT | deps: 16 | versions: 277
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.2.5.tgz
.shasum: fc23848c20244a1d0a3dd5dd4663af9a41017ccb
.integrity: sha512-H5WL0NanCdQ+tzoeEt7Fyz9YGdR3wbfDvfQrJvxAO95istKo5JraRh24dzyvqxM9439xwRMNaMIpMwsyqtWDtQ==
.unpackedSize: 4.0 MB

dependencies:
@langchain/core: ~0.2.0
@langchain/openai: ~0.1.0
@langchain/textsplitters: ~0.0.0
binary-extensions: ^2.2.0
js-tiktoken: ^1.0.12
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langchainhub: ~0.0.8
langsmith: ~0.1.30
ml-distance: ^4.0.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^9.0.0
yaml: ^2.2.1
zod-to-json-schema: ^3.22.3
zod: ^3.22.4

maintainers:

dist-tags:
latest: 0.2.5
next: 0.2.3-rc.0

published a week ago by jacoblee93 jacoblee93@gmail.com
```