microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai

GetElementType is not implemented after updating onnxruntime #22075

Open · theolivenbaum opened this issue 2 months ago

theolivenbaum commented 2 months ago

Describe the issue

Just saw this error in our logs; I still have to investigate how to reproduce it:

[E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running BiasGelu node. Name:'BiasGelu_token_31' Status Message: GetElementType is not implemented
[E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running If node. Name:'optimum::if' Status Message: Non-zero status code returned while running BiasGelu node. Name:'BiasGelu_token_31' Status Message: GetElementType is not implemented

To reproduce

Happened while using https://github.com/curiosity-ai/sentence-transformers-sharp to OCR an image on an Ubuntu machine.
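For context, a minimal sketch of how a model like this is typically driven through the ONNX Runtime C# API (the model path, input names, and shapes below are assumptions, not details from the issue); the errors quoted above are raised from inside session.Run() while individual kernels execute:

```csharp
// Hedged sketch only: model path, input names, and shapes are assumptions.
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var session = new InferenceSession("model.onnx");

// Typical transformer-encoder inputs; the reported "GetElementType is not
// implemented" errors surface during Run() (sequential_executor ExecuteKernel).
var inputIds = new DenseTensor<long>(new[] { 1, 16 });
var attentionMask = new DenseTensor<long>(new[] { 1, 16 });

var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input_ids", inputIds),
    NamedOnnxValue.CreateFromTensor("attention_mask", attentionMask),
};

using var results = session.Run(inputs);
```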

Urgency

No response

Platform

Linux

OS Version

Ubuntu 22.04.4 LTS

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.19.2

ONNX Runtime API

C#

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

skottmckay commented 2 months ago

What data type were you trying to run the model with?
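One way to answer this from the C# API is to dump the element types the session expects for each input and compare them with the tensors actually passed to Run(); a minimal sketch (the model path is an assumption):

```csharp
// Hedged sketch: print each model input's expected CLR element type and shape.
using System;
using Microsoft.ML.OnnxRuntime;

using var session = new InferenceSession("model.onnx");

foreach (var input in session.InputMetadata)
{
    // ElementType is the CLR type the model expects, e.g. System.Single or System.Int64.
    Console.WriteLine($"{input.Key}: {input.Value.ElementType} " +
                      $"[{string.Join(",", input.Value.Dimensions)}]");
}
```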

github-actions[bot] commented 1 month ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

kcoul commented 1 month ago

Also came across this issue upon updating to 1.19.2:

[E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running Cast node. Name:'/dec/m_source/l_sin_gen/Cast_5' Status Message: GetElementType is not implemented