Open p-wysocki opened 1 year ago
Hi, I wanted to do some work on OpenVino as OS contributor, could you assign this issue to me?
And you are assigned now. Let us know if you need any guidance :)
I'm returning the task to open status due to the current assignee's inactivity.
Hello! Can I take this task if it is still available? @p-wysocki
Hello @bagrorg, thanks for taking a look! I assigned you.
@bagrorg could you please confirm whether you're still working on this issue?
@p-wysocki Hi!
Yes, I've done everything except the tests. I got distracted and haven't had time to come back to it yet; I hope to complete it this week. Sorry =(
No worries, I'm just updating the task statuses in the main good first issue list. :)
Hi @bagrorg, are you still working on this issue or can I return it to be picked up by other contributors?
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hello @vbayeva, thanks for taking a look! Please let us know if you have any questions or need any help. :)
You can use our WIP roadmap for contributors at https://github.com/openvinotoolkit/openvino/pull/21322. If you do, please let us know whether it helped and give us feedback on how it could be improved. :)
I am happy to announce that we have created a channel dedicated to Good First Issues support on our Intel DevHub Discord server! Join it to receive support, engage in discussions, ask questions and talk to OpenVINO developers.
Hi, I'm interested in doing this task. I'm pretty new to this, though; would it be possible to brief me on what to do? I could proceed from there.
Hi @adityashibu! I assigned you. If I were you I would start with:
- Go through the technical guide linked in the contribution guidelines in Resources
- Find a similar PR that has already been merged into OpenVINO to use as a reference - you can search for the phrase "Extend ONNX Frontend" in the OpenVINO issues to find a completed one.
- Start writing code by following the steps in the To Do list from issue description.
- Join Intel DevHub Discord server (link is in the CONTRIBUTING.md) and ask questions, or you can also ask them here.
Alright, I'll start working on it then. Thank you so much for the support
Hi, I've created files called range.cpp and range.hpp under "src/frontends/onnx/frontend/src/op/com.microsoft", is that the first task?
Hi, that's the first part of the task :) Now you need to go to point 2:
Prepare an implementation of this operator in the form of a function. It should be placed in the opset 1 namespace. The implementation should be the same as the standard ONNX Range operator.
I believe you can more or less copy the existing implementation from the second link, but I'm not certain. You can find more information about ONNX Range under https://onnx.ai/onnx/operators/onnx__Range.html.
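For intuition, the standard ONNX Range semantics linked above boil down to a simple formula: the output has max(ceil((limit - start) / delta), 0) elements, and element i equals start + i * delta. A minimal pure-Python sketch of that reference behavior (this is just an illustration of the spec, not OpenVINO frontend code):

```python
import math

def range_op(start, limit, delta):
    # Reference semantics of ONNX Range:
    # number of elements = max(ceil((limit - start) / delta), 0),
    # element i = start + i * delta.
    n = max(math.ceil((limit - start) / delta), 0)
    return [start + i * delta for i in range(n)]

print(range_op(1, 10, 2))    # ascending range
print(range_op(10, 4, -2))   # negative delta counts down
print(range_op(5, 5, 1))     # empty when start == limit
```

This is handy as a mental model when writing the C++ implementation and its tests.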
Alright gotcha, working on it rn
@p-wysocki I have created a PR with the requested changes, can you have a look at it and check if there's anything else that needs to be improved upon?
could you please link PR to ticket or ticket to PR?
could you please link PR to ticket or ticket to PR?
I linked it, please check
Context
Neural networks are graphs consisting of nodes called operators. Each operator corresponds to a mathematical function, usually described in the framework's documentation or in an AI standard such as ONNX.
The OpenVINO ONNX Frontend is the component responsible for working with ONNX graphs; it requires implementations of the individual ONNX operators in order to run ONNX models.
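To make the "operator = mathematical function" idea concrete: for example, the ONNX Add operator is just element-wise addition, and a frontend's job is to map each such graph node onto the runtime's implementation of that function. A tiny pure-Python illustration (not OpenVINO code; broadcasting is omitted for simplicity):

```python
def add_op(a, b):
    # The math behind the ONNX "Add" operator, element-wise,
    # shown without broadcasting for simplicity.
    return [x + y for x, y in zip(a, b)]

print(add_op([1, 2, 3], [4, 5, 6]))
```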
This task requires extending the OpenVINO ONNX Frontend with the com.microsoft.Range operator. Necessary help will be provided by the ONNX Frontend team.
Operator specification
Operator details can be found in the official ONNX Runtime docs.
To do list
- Create .hpp and .cpp files for com.microsoft.Range here
- More details in the adding operators to ONNX Frontend guide
Resources
Example PRs
Contact points
@tomdol @mbencer @p-wysocki Don't hesitate to reach out, we're here to help!