calebwin / emu

The write-once-run-anywhere GPGPU library for Rust
https://calebwin.github.io/emu
MIT License
1.59k stars · 53 forks

Example fails to compile, because of the quote crate #31

Closed · palmada closed this issue 4 years ago

palmada commented 4 years ago

Hello,

Your project looks very interesting and I wanted to give it a go. I copied the example code from the README but couldn't compile it. I first worked around the issue in ticket #27 by using: em = { git = "https://github.com/calebwin/emu", branch = "dev" }
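For reference, this is how that workaround sits in my `Cargo.toml` (the branch name is the one suggested in #27):

```toml
[dependencies]
em = { git = "https://github.com/calebwin/emu", branch = "dev" }
```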

I get the output below when running `cargo check`:

error[E0433]: failed to resolve: could not find `__rt` in `quote`
   --> C:\Users\Pedro\.cargo\git\checkouts\emu-7973979264d9dc07\095942b\emu_macro\src\accelerating.rs:123:66
    |
123 | ...                   .is_ident(&Ident::new("load", quote::__rt::Span::call_site()))
    |                                                            ^^^^ could not find `__rt` in `quote`

error[E0433]: failed to resolve: could not find `__rt` in `quote`
   --> C:\Users\Pedro\.cargo\git\checkouts\emu-7973979264d9dc07\095942b\emu_macro\src\accelerating.rs:169:66
    |
169 | ...                   .is_ident(&Ident::new("read", quote::__rt::Span::call_site()))
    |                                                            ^^^^ could not find `__rt` in `quote`

error[E0433]: failed to resolve: could not find `__rt` in `quote`
   --> C:\Users\Pedro\.cargo\git\checkouts\emu-7973979264d9dc07\095942b\emu_macro\src\accelerating.rs:193:68
    |
193 | ...                   .is_ident(&Ident::new("launch", quote::__rt::Span::call_site()))
    |                                                              ^^^^ could not find `__rt` in `quote`

error[E0433]: failed to resolve: could not find `__rt` in `quote`
   --> C:\Users\Pedro\.cargo\git\checkouts\emu-7973979264d9dc07\095942b\emu_macro\src\accelerating.rs:259:64
    |
259 |                     let ident = Ident::new(&param.name, quote::__rt::Span::call_site());
    |                                                                ^^^^ could not find `__rt` in `quote`

I tried pinning the `quote` crate to a specific version (1.0.1) and got the following message:

error: failed to select a version for `quote`.
    ... required by package `emu_macro v0.1.0 (https://github.com/calebwin/emu?branch=dev#095942ba)`
    ... which is depended on by `em v0.3.0 (https://github.com/calebwin/emu?branch=dev#095942ba)`
    ... which is depended on by `emu-test v0.1.0 (D:\Code\Rust\emu-test)`
versions that meet the requirements `^1.0.2` are: 1.0.3

all possible versions conflict with previously selected packages.

  previously selected package `quote v1.0.1`
    ... which is depended on by `emu-test v0.1.0 (D:\Code\Rust\emu-test)`

failed to select a version for `quote` which could resolve this conflict

Setting `quote` to 1.0.3 doesn't solve the issue either, but it's interesting that 1.0.2 seems to have been yanked from crates.io. Is it possible emu_macro depends on code that is no longer present in 1.0.3?
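For completeness, the pin can be expressed with Cargo's `=` requirement, which demands exactly that version rather than anything semver-compatible:

```toml
[dependencies]
quote = "=1.0.1"
```

This is what triggers the conflict above: emu_macro asks for `^1.0.2`, and the only unyanked version satisfying that is 1.0.3, which can't coexist with an exact pin to 1.0.1.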

palmada commented 4 years ago

I compared what changed between 1.0.2 and 1.0.3 of `quote`, and it turns out `__rt` was renamed to `private`. The author seems unhappy about other developers using `__rt`, as per the comment on that commit:

https://github.com/dtolnay/quote/commit/41543890aa76f4f8046fffac536b9445275aab26

Make it clearer that __rt isn't public

This was doc(hidden) and commented "Not public API" but occasionally people still decide they should refer to it.
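For anyone patching a local checkout: `quote::__rt` was only re-exporting items from `proc_macro2`, so a likely fix (a sketch I haven't verified against emu's build, and it assumes `proc_macro2` is already available as a dependency, which `syn` and `quote` pull in anyway) is to call `proc_macro2::Span::call_site()` directly:

```diff
- .is_ident(&Ident::new("load", quote::__rt::Span::call_site()))
+ .is_ident(&Ident::new("load", proc_macro2::Span::call_site()))
```

The same substitution would apply to the other three call sites in `accelerating.rs`.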

matveyryabov commented 4 years ago

I am having the same issue. When I tried to pin the `quote` package to version 1.0.2, it started giving errors that emu_macro requires version 1.0.3. Did you get any further with this issue?

sezna commented 4 years ago

It is unfortunate that they didn't at least make that a minor version bump, as this broke a lot of crates overnight. Additionally, 1.0.2, which is the version this crate uses, was yanked.

I'm going to take a whack at fixing this.

calebwin commented 4 years ago

@sezna @matveyryabov @palmada

Thanks for the interest, everyone. I just released a new version that totally shifts the focus of this project (for maybe the third time?). The details are in the README, but basically I created a compute-specific abstraction over WebGPU that provides a simple API for doing GPU compute.

Currently it still uses GLSL as the kernel language, but the design lets you swap out the compiler for a new one (like maybe RLSL?). Anyway, if you guys could check it out and let me know your thoughts, that would be great!

sezna commented 4 years ago

@calebwin awesome! This is a really impressive project, and I am excited to use the new version. It looks like a more fully featured API that is simple in a good way, without as much "magic", which is something I value.

As for feedback, I am not as familiar with WebGPU as I am with OpenCL or CUDA, so a lot of the new version isn't usable for me until I learn how to rewrite my simulation (my primary use case is n-body simulation) with it. That said, I do have a preliminary question that I couldn't find an answer to on WebGPU's website: does the performance of WebGPU match that of OpenCL? I was under the impression that it provides a web-based API to the GPU but lacks the performance of OpenCL, and certainly of CUDA.

Also, would you consider releasing the previous macro-based version as a feature, a branch, or a separate repo, for those few who depend on it in other projects? For the time being I will be pinning to my fixed fork, since I will need to learn WebGPU before I can update. If you'd like, I could help maintain it.

calebwin commented 4 years ago

Thank you for your kind words!

I don't think you need to know WebGPU so much as the GLSL language (to use Emu, that is). In terms of performance, while I haven't done any benchmarking, WebGPU shouldn't be significantly worse than OpenCL: most of what it does is straightforward forwarding to Vulkan/Metal/DX functions. You can also optimize your code in familiar ways, using shared memory, wide data types, swizzles, register blocking, etc. And even if performance is slightly lacking, I think having reproducible code that anyone can open and run on their integrated or discrete GPU is useful.

So, given that the macro-based version supports a pretty small subset of Rust, I would recommend learning GLSL compute shaders and using this new version of Emu. That said, the old version is still up on crates.io and in the same location on GitHub, and you're more than welcome to work with it.
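For a taste, here is a trivial GLSL compute shader that doubles every element of a buffer. This is a generic sketch of the shader shape, not something taken from the Emu docs; the binding layout in particular is an assumption:

```glsl
#version 450

// 64 invocations per workgroup along the x dimension.
layout(local_size_x = 64) in;

// A storage buffer of floats shared with the host.
layout(set = 0, binding = 0) buffer Data {
    float data[];
};

void main() {
    // Each invocation handles one element.
    uint i = gl_GlobalInvocationID.x;
    data[i] *= 2.0;
}
```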

sezna commented 4 years ago

Oh, that makes sense. I'm a little new to this space in general, so I'll go take a look at GLSL. Thanks for the link to the resources.

What I meant by releasing the previous macro-based version was publishing a patch release on crates.io that includes the quote crate fixes, since the version currently on crates.io doesn't compile because of that breaking change. A rev-pinned git dependency should do the trick, though.
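For anyone else wanting to stay on the old version, a rev-pinned git dependency looks like this (the commit hash is a placeholder for whichever revision you want to freeze on):

```toml
[dependencies]
em = { git = "https://github.com/calebwin/emu", rev = "<commit-hash>" }
```

Unlike a branch pin, a rev pin won't break when the branch moves.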

Again, thanks for contributing this great tool.