webonnx / wonnx

A WebGPU-accelerated ONNX inference run-time written 100% in Rust, ready for native and the web

creating parent workspace #49

Closed haixuanTao closed 2 years ago

haixuanTao commented 2 years ago

Creating a new Workspace to hold wonnx and py-wonnx.

That is it.

We can merge the #48 first.
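For reference, a minimal root manifest for such a Cargo workspace might look like the sketch below (the member directory names are assumptions based on the description, not necessarily the actual layout):

```toml
# Root Cargo.toml of the workspace (sketch; member paths are assumed)
[workspace]
members = [
    "wonnx",    # the core WebGPU ONNX runtime crate
    "wonnx-py", # the Python bindings (py-wonnx)
]
```

With this in place, `cargo build` from the workspace root builds all member crates together and shares a single `target` directory and lockfile.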

pixelspark commented 2 years ago

Seems pretty much straightforward, will have a closer look tonight/tomorrow.

PS, now that we have this workspace, would it be an idea to also add a small command line utility (like this one)?

haixuanTao commented 2 years ago

> Seems pretty much straightforward, will have a closer look tonight/tomorrow.
>
> PS, now that we have this workspace, would it be an idea to also add a small command line utility (like this one)?

Definitely! It would be great! I think we can move this project to an organisation as we would have equal legitimacy in that case.

I tried to create one named wonnx, but another user has already taken that namespace.

We can probably take something like: wonnx.io / wonnx.ai / wonnx.rs

pixelspark commented 2 years ago

@haixuanTao not sure how attached you are to the name wonnx, but a while back I came up with bionnx (pronounce: 'bionics') and thought that could be cool as well. Only downside seems to be the possible association with something 'bio' related. It was free however so I claimed it (https://github.com/bionnx). The repository name could be either bionnx/bionnx or bionnx/wonnx. A domain name would be cool as well. The .ai ones are quite expensive but I think something like wonnx.ml or bionnx.ml would be pretty cool too. Let me know what you think.

haixuanTao commented 2 years ago

Well, wonnx is a reference to WebONNX, just as wasm references WebAssembly, wgpu -> WebGPU, webrtc -> WebRTC.

I think bionnx sounds great, but its roots might be harder to understand and may confuse people who don't look into it. In my experience, high-level executives can get pretty confused by such a name, even though it is a cool one.

For the same reason, wonnx.ml might create confusion, as we are not really doing ML in the purest sense. I would be more comfortable with wonnx.rs.

Otherwise we can choose WebONNX, which is pretty straightforward; I have reserved the name.

We might not be affiliated with the web at the moment, but it is similar to wasm or wgpu.

pixelspark commented 2 years ago

> Well, wonnx is a reference to WebONNX, just as wasm references WebAssembly, wgpu -> WebGPU, webrtc -> WebRTC.

I agree it is a good name!

> For the same reason, wonnx.ml might create confusion, as we are not really doing ML in the purest sense. I would be more comfortable with wonnx.rs.

Yeah we can debate this for a long time (wonnx.rs ties the library to the Rust language but we also already have Python bindings, and maybe want to run in the browser through WASM and have a JS interface..) but anyway, let's be pragmatic and go with wonnx.rs.

> Otherwise we can choose WebONNX, which is pretty straightforward; I have reserved the name.

The extra advantage of this is that the org name explains the repository name! WebONNX/wonnx.

> We might not be affiliated with the web at the moment, but it is similar to wasm or wgpu.

Yeah, I was wondering about the WebAssembly work. I tried to compile to WebAssembly a while back, but it wasn't really usable yet... not sure if you are working on this? It would be very cool to have as soon as WebGPU becomes more widespread; I believe we're actually the first to do ONNX-on-WebGPU.

haixuanTao commented 2 years ago

> Yeah we can debate this for a long time (wonnx.rs ties the library to the Rust language, but we also already have Python bindings, and maybe want to run in the browser through WASM and have a JS interface...) but anyway, let's be pragmatic and go with wonnx.rs.

Yeah, that's true, wonnx.rs is tied to the language, so another extension would probably be better.

> Yeah, I was wondering about the WebAssembly work. I tried to compile to WebAssembly a while back, but it wasn't really usable yet... not sure if you are working on this? It would be very cool to have as soon as WebGPU becomes more widespread; I believe we're actually the first to do ONNX-on-WebGPU.

So, I believe WGSL is not yet supported on the web by wgpu, but I have to double-check: https://docs.rs/wgpu/0.12.0/src/wgpu/lib.rs.html#756-785. For me, the web has always been one of the end goals, as the offerings for doing fast DL on the web are limited, and as you said, we're among the only ones doing computational DL on WebGPU. But I agree wasm is also one of the top priorities.

pixelspark commented 2 years ago

@haixuanTao this looks good to me. I noticed that `cargo run --example squeeze` would not work from the workspace root (model file not found) but worked from the `wonnx` directory. Hence I changed the code to load the model relative to `CARGO_MANIFEST_DIR`. Another option would be to use `include_bytes!` to embed the model into the binary.
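To illustrate the fix described above, here is a small sketch (not the actual wonnx code; the model file name is hypothetical) of resolving a path relative to the crate's manifest directory so the example works regardless of the working directory:

```rust
use std::path::PathBuf;

/// Resolve a path relative to the crate's manifest directory, so that
/// `cargo run --example squeeze` works from the workspace root as well
/// as from the crate directory.
fn model_path(relative: &str) -> PathBuf {
    // Cargo sets CARGO_MANIFEST_DIR for processes it spawns; fall back
    // to the current directory when run outside of Cargo.
    let base = std::env::var("CARGO_MANIFEST_DIR").unwrap_or_else(|_| ".".to_string());
    PathBuf::from(base).join(relative)
}

fn main() {
    // Hypothetical model file name, for illustration only.
    let path = model_path("examples/data/models/squeezenet.onnx");
    println!("loading model from {}", path.display());

    // Alternative mentioned above: embed the model into the binary at
    // compile time (path resolved relative to the source file):
    // static MODEL: &[u8] = include_bytes!("data/models/squeezenet.onnx");
}
```

The `env!("CARGO_MANIFEST_DIR")` compile-time macro is another option when the binary is always built by Cargo; the runtime lookup above degrades more gracefully.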

haixuanTao commented 2 years ago

Perfect, thanks! This looks great as is. So, I will merge it.

haixuanTao commented 2 years ago

So, just to close the loop on the name of the org, shall we go with webonnx?

I am open to other propositions if this one doesn't fit you.

pixelspark commented 2 years ago

> So, just to close the loop on the name of the org, shall we go with webonnx?
>
> I am open to other propositions if this one doesn't fit you.

For me this is fine! So just to be clear:

Thank you by the way for being so considerate of my input. Much appreciated!

haixuanTao commented 2 years ago

Perfect!

All the above sounds perfect!

No problem. I will put everything on an equal footing in the org. Ping me if anything seems wrong.

pixelspark commented 2 years ago

@haixuanTao nice. Now all we need is a logo :-) (the generated one does look a bit like a robot, which is kind of applicable, but we should probably replace it to look a bit more professional...).

(image: the auto-generated organization avatar)

Might want to use one of those AI-based logo generators. I don't really care as long as it is recognizable (I like the yellow-black theme on the front page; it should probably have a nice icon/image added).

haixuanTao commented 2 years ago

Ahah, yeah right ;)

In that case, I have kept the old README logo for convenience.