dsyme opened this issue 9 years ago
For GPUs, we would have to look at how best to take advantage of the parallelism they offer.
Maybe on a quantum computer... I think that for the Nothing abstraction, the primordial quantum “nothingness” is the right place to be.
I suggest we fork off a SchrodingerNothing for the quantum scenario; otherwise it will just bloat the NuGet package.
Can we run this on FPGAs? Or at least GPUs?