onnx / onnx-mlir

Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure
Apache License 2.0

Next Ops to work on #922

Open AlexandreEichenberger opened 2 years ago

AlexandreEichenberger commented 2 years ago

Idea: put a quick comment here to claim the operations that you are currently working on, so that we do not duplicate work. You can also add a request for a new op.

AlexandreEichenberger commented 2 years ago

Working on Compress

tungld commented 2 years ago

Working on NonMaxSuppression

etiotto commented 2 years ago

I am working on SpaceToDepth (#926) and DepthToSpace (#927).

AlexandreEichenberger commented 2 years ago

FYI, here are some of the benchmarks we are focusing on that contain ops that are not yet working.

high priority (from the model zoo):

high priority: support compiling models decomposed to their lowest-level components (e.g., RNNs not exported as high-level ONNX ops), without crashing.

medium priority: Hugging Face GBERTQnA

A list of ops that appear in the Model Zoo but are not yet supported is given at the end of issue #128.

etiotto commented 2 years ago

I am going to look at CategoryMapper (#941).

AlexandreEichenberger commented 2 years ago

Working on OneHot to support multiple types.

tungld commented 2 years ago

Working on Hardmax to support BiDAF. PR #950 (merged).

chentong319 commented 2 years ago

Working on Resize.

mmoldawsky commented 2 years ago

Working on the IsNaN op

etiotto commented 2 years ago

Working on ScatterElements (needed by fasterrcnn-10.onnx, maskrcnn-10.onnx). PR is https://github.com/onnx/onnx-mlir/pull/1352

Scatter is deprecated but we map it to ScatterElements. PR is https://github.com/onnx/onnx-mlir/pull/1337
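
For reference, here is a NumPy sketch of the ScatterElements semantics for the axis=0 case (illustrative reference code, not the onnx-mlir sources); the deprecated Scatter op takes the same inputs and attributes, which is why the mapping is one-to-one.

```python
import numpy as np

def scatter_elements_axis0(data, indices, updates):
    # ONNX ScatterElements, axis=0: out[indices[i][j]][j] = updates[i][j]
    out = data.copy()
    for i in range(indices.shape[0]):
        for j in range(indices.shape[1]):
            out[indices[i, j], j] = updates[i, j]
    return out

data = np.zeros((3, 3), dtype=np.float32)
indices = np.array([[1, 0, 2], [0, 2, 1]])
updates = np.array([[1.0, 1.1, 1.2], [2.0, 2.1, 2.2]], dtype=np.float32)
print(scatter_elements_axis0(data, indices, updates))
# [[2.  1.1 0. ]
#  [1.  0.  2.2]
#  [0.  2.1 1.2]]
```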

etiotto commented 2 years ago

Working on ScatterND. PR is https://github.com/onnx/onnx-mlir/pull/1370

etiotto commented 2 years ago

Implemented GatherElements. PR is https://github.com/onnx/onnx-mlir/pull/1375.

etiotto commented 2 years ago

Working on GatherND. PR is https://github.com/onnx/onnx-mlir/pull/1382.

AlexandreEichenberger commented 2 years ago

The status of implemented ops is now listed here: https://github.com/onnx/onnx-mlir/blob/main/docs/SupportedONNXOps-cpu.md

airMeng commented 1 year ago

Hi, thank you for your excellent work! I am quite new to MLIR, so my questions may be naive. I see that ArgMax is supported in onnx-mlir but ArgMin is not; is there any particular issue with ArgMin? If not, can I open a PR for ArgMin based on ArgMax with only small modifications?
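
For reference, the two ops share the same attributes (axis, keepdims, select_last_index) and differ only in the reduction; a NumPy sketch of the shared shape logic (illustrative, not onnx-mlir code):

```python
import numpy as np

def argmax_ref(x, axis=0, keepdims=1):
    # ONNX ArgMax: index of the maximum along `axis`
    r = np.argmax(x, axis=axis)
    return np.expand_dims(r, axis) if keepdims else r

def argmin_ref(x, axis=0, keepdims=1):
    # ONNX ArgMin: identical attribute and shape handling;
    # only the comparison changes
    r = np.argmin(x, axis=axis)
    return np.expand_dims(r, axis) if keepdims else r
```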

tungld commented 1 year ago

@airMeng please go ahead with a PR for ArgMin. Thank you!

Ris-Bali commented 1 year ago

Hi, can I work on the Celu op?
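
For context, the ONNX spec defines Celu as a function op, so it can also be expanded into primitive ops; a NumPy reference of the formula (alpha defaults to 1.0):

```python
import numpy as np

# Celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
def celu_ref(x, alpha=1.0):
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x / alpha) - 1.0))
```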

muzafferkal commented 1 year ago

Could somebody please add support for the QuantizeLinear/DequantizeLinear ops, for quantized networks?
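
For reference, the spec formulas are simple; a NumPy sketch for the uint8 case (illustrative reference code, not the onnx-mlir sources):

```python
import numpy as np

def quantize_linear(x, scale, zero_point):
    # ONNX QuantizeLinear: y = saturate(round(x / scale) + zero_point),
    # rounding half to even; the saturation range for uint8 is [0, 255]
    q = np.rint(x / scale) + zero_point
    return np.clip(q, 0, 255).astype(np.uint8)

def dequantize_linear(y, scale, zero_point):
    # ONNX DequantizeLinear: x = (y - zero_point) * scale
    return ((y.astype(np.int32) - zero_point) * scale).astype(np.float32)

x = np.array([-1.0, 0.0, 0.5, 2.0], dtype=np.float32)
q = quantize_linear(x, scale=np.float32(0.02), zero_point=50)
print(q)                                           # [  0  50  75 150]
print(dequantize_linear(q, np.float32(0.02), 50))  # [-1.   0.   0.5  2. ]
```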

srcarroll commented 3 months ago

Is anyone working on extending the decompose-onnx pass (or a similar pass) to support more onnx.Custom ops? In particular, I am trying to compile a basic backward graph, and I get an InPlaceAccumulatorV2 custom op after converting ONNX to MLIR with onnx-mlir --EmitONNXIR.

AlexandreEichenberger commented 2 months ago

@srcarroll CustomOp is a non-standard op that @chentong319 added to easily convert a custom op into a function call. While Tong knows best, my recollection (and I may be wrong here) is that custom ops are mainly generated within onnx-mlir, not parsed in from an ONNX protobuf.

Tong is away for a bit; if you want to add support for more custom ops, we would certainly be interested in taking in the changes. There are also ONNX functions; maybe those could help too.

srcarroll commented 2 months ago

@AlexandreEichenberger thanks for the response. I'd be happy to add support, but I can't find any info on the definition of InPlaceAccumulatorV2. Do you know where I can find that?

Could you also point me to the ONNX functions you are referring to, and to how to emit them? Thanks.

AlexandreEichenberger commented 2 months ago

I could not find info about your "InPlaceAccumulatorV2"; it does not appear in the ONNX specs or in what I could find about the ONNX Runtime extensions, though I may have overlooked something in ORT as I am not very familiar with it. How did you create the ONNX graph?

As for creating custom ops, the ORT documentation describes the preferred way to make new ones: https://onnxruntime.ai/docs/reference/operators/add-custom-op.html

The ONNX specs also have a section on ONNX functions: https://onnx.ai/onnx/intro/concepts.html#functions
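
For example, here is a minimal sketch of defining and attaching an ONNX function with the onnx Python helpers (assuming a recent onnx package; the "custom.domain" and "AddRelu" names are illustrative):

```python
import onnx
from onnx import helper, TensorProto

# Function body: Y = Relu(X + B), expressed with standard ONNX ops.
fn = helper.make_function(
    domain="custom.domain",          # illustrative domain name
    fname="AddRelu",                 # illustrative function name
    inputs=["X", "B"],
    outputs=["Y"],
    nodes=[
        helper.make_node("Add", ["X", "B"], ["T"]),
        helper.make_node("Relu", ["T"], ["Y"]),
    ],
    opset_imports=[helper.make_opsetid("", 17)],
    attributes=[],
)

# Call the function like any other op, and attach its definition to the
# model so consumers can expand it into primitive ops.
node = helper.make_node("AddRelu", ["x", "b"], ["y"], domain="custom.domain")
graph = helper.make_graph(
    [node], "g",
    [helper.make_tensor_value_info("x", TensorProto.FLOAT, [4]),
     helper.make_tensor_value_info("b", TensorProto.FLOAT, [4])],
    [helper.make_tensor_value_info("y", TensorProto.FLOAT, [4])],
)
model = helper.make_model(
    graph,
    functions=[fn],
    opset_imports=[helper.make_opsetid("", 17),
                   helper.make_opsetid("custom.domain", 1)],
)
onnx.checker.check_model(model)
```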