brianpetro / obsidian-smart-connections

Chat with your notes & see links to related content with AI embeddings. Use local models or 100+ via APIs like Claude, Gemini, ChatGPT & Llama 3
https://smartconnections.app
GNU General Public License v3.0

WebGPU Error: MatMul Kernel Failure in Attention Layer (Transformers 3.0.0-alpha.15) #810

Status: Open · yariyotche opened this issue 1 month ago

yariyotche commented 1 month ago

```
2024-09-28 23:22:56.902299 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running MatMul node. Name:'/encoder/layer.0/attention/self/MatMul' Status Message: Failed to run JSEP kernel
or @ transformers@3.0.0-alpha.15:100

transformers@3.0.0-alpha.15:175 An error occurred during model execution: "Error: [WebGPU] Kernel "[MatMul] /encoder/layer.0/attention/self/MatMul" failed. Error: Failed to generate kernel's output[0] with dims [10,12,4079,4079]. If you are running with pre-allocated output, please make sure the output type/dims are correct. Error: 8703528".
E @ transformers@3.0.0-alpha.15:175

transformers@3.0.0-alpha.15:175 Inputs given to model: Object
E @ transformers@3.0.0-alpha.15:175

about:srcdoc:287 error_embedding_batch Error: [WebGPU] Kernel "[MatMul] /encoder/layer.0/attention/self/MatMul" failed. Error: Failed to generate kernel's output[0] with dims [10,12,4079,4079]. If you are running with pre-allocated output, please make sure the output type/dims are correct. Error: 8703528
    at Object._OrtRun (transformers@3.0.0-alpha.15:100:26047)
    at async Uu (transformers@3.0.0-alpha.15:100:356741)
    at async md.run (transformers@3.0.0-alpha.15:100:362331)
    at async e.run (transformers@3.0.0-alpha.15:100:16509)
    at async E (transformers@3.0.0-alpha.15:175:16210)
    at async Function.O [as _forward] (transformers@3.0.0-alpha.15:175:17906)
    at async Function.forward (transformers@3.0.0-alpha.15:175:22824)
    at async Function._call (transformers@3.0.0-alpha.15:175:22778)
    at async Function._call (transformers@3.0.0-alpha.15:187:8013)
    at async SmartEmbedTransformersAdapter.embed_batch (about:srcdoc:280:20)
embed_batch @ about:srcdoc:287

about:srcdoc:331 Error processing message: Error: Not implemented
    at SmartEmbedTransformersAdapter.embed (about:srcdoc:207:11)
    at about:srcdoc:288:61
    at Array.map (<anonymous>)
    at SmartEmbedTransformersAdapter.embed_batch (about:srcdoc:288:42)
    at async _SmartEmbedModel.embed_batch (about:srcdoc:185:12)
    at async processMessage (about:srcdoc:319:18)
    at async about:srcdoc:341:38
processMessage @ about:srcdoc:331

plugin:smart-connections:9382 Uncaught (in promise) Error: Not implemented
    at SmartEmbedTransformersIframeAdapter._handle_message (plugin:smart-connections:9382:39)
```
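For context: the failing MatMul output shape `[10,12,4079,4079]` looks like the attention-score tensor for a batch of 10 chunks, 12 heads, and a ~4079-token sequence. A rough back-of-the-envelope calculation (assuming float32 elements; the exact dtype and the device's actual WebGPU limits are assumptions, not confirmed by the log) suggests why allocating it as a single WebGPU buffer would fail:

```ts
// Rough size of the attention-score tensor the failing MatMul tries to produce.
// Assumption: float32 elements (4 bytes each).
const [batch, heads, seqLen] = [10, 12, 4079];
const elements = batch * heads * seqLen * seqLen;   // ≈ 2.0e9 elements
const bytes = elements * 4;                         // ≈ 8.0e9 bytes as float32
console.log((bytes / 2 ** 30).toFixed(1), 'GiB');   // ≈ 7.4 GiB
// Typical WebGPU maxStorageBufferBindingSize / maxBufferSize limits are on the
// order of hundreds of MiB to ~2 GiB, so a buffer this large cannot be bound.
```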

brianpetro commented 1 month ago

Try the "legacy transformers" setting 🌴
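For anyone reproducing this outside the plugin: the analogous fallback in raw transformers.js 3.x is to request a non-WebGPU backend when building the embedding pipeline. This is only an illustrative sketch, not the plugin's actual code; the model id is a placeholder and the `device` option values are assumptions about the 3.0.0-alpha API.

```ts
import { pipeline } from '@huggingface/transformers';
// (earlier 3.0.0 alphas may ship as '@xenova/transformers' instead)

// Illustrative only: ask transformers.js for the WASM backend instead of
// WebGPU, so the attention MatMul runs in CPU memory rather than a GPU buffer.
// 'TaylorAI/bge-micro-v2' is a placeholder model id, not necessarily what the plugin uses.
const extractor = await pipeline('feature-extraction', 'TaylorAI/bge-micro-v2', {
  device: 'wasm', // 'webgpu' is the failing path in the log above
});

const output = await extractor(['example note text'], { pooling: 'mean', normalize: true });
console.log(output.dims); // e.g. [1, embedding_dim]
```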