p0nce closed this issue 1 year ago
It all stems from the decision to make it work all the time. If the intrinsics forced instruction set usage locally, instruction set support would have to be enforced externally, at the application level, thus allowing divergent code paths (sketched below).
At the same time, one wouldn't want to mixin the whole intrinsics definition at the point where the needed instruction set is known; that would be a really big mixin.
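To make the "divergent code path" idea concrete, here is a minimal D sketch of what application-level enforcement would look like, assuming `core.cpuid` from druntime for runtime feature detection; `addFloats` is a hypothetical helper, not part of intel-intrinsics:

```d
// A sketch of externally-enforced instruction set support:
// the application detects the CPU at runtime and branches.
import core.cpuid : sse;     // druntime runtime CPUID query
import inteli.xmmintrin;     // intel-intrinsics SSE-style API

/// Hypothetical helper: dst[] = a[] + b[], picking a path at runtime.
void addFloats(float[] dst, const(float)[] a, const(float)[] b)
{
    assert(dst.length == a.length && a.length == b.length);

    if (sse())  // instruction set enforced at application level
    {
        size_t i = 0;
        for (; i + 4 <= dst.length; i += 4)  // SIMD path: 4 floats at a time
        {
            __m128 va = _mm_loadu_ps(a.ptr + i);
            __m128 vb = _mm_loadu_ps(b.ptr + i);
            _mm_storeu_ps(dst.ptr + i, _mm_add_ps(va, vb));
        }
        foreach (j; i .. dst.length)  // scalar remainder
            dst[j] = a[j] + b[j];
    }
    else
    {
        foreach (j; 0 .. dst.length)  // scalar fallback path
            dst[j] = a[j] + b[j];
    }
}
```

Note that with intel-intrinsics as it is designed, the SIMD branch above compiles even when the build doesn't target SSE (the intrinsics fall back to emulation); with locally-enforcing intrinsics, the application itself would have to guard every such call like this.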
Perhaps this is why Walter thinks badly of "emulating vectors".
This makes intel-intrinsics somewhat biased against consumer software and in favor of server software: servers typically target a known minimum instruction set, while consumer software has to run on whatever CPU the user owns.