Closed: sukamenev closed this issue 4 months ago.
If you write documentation on how to implement new functions, maybe I can try to fix it myself.
But I'm afraid it would take me 1-3 months just to understand your code without any hints.
Actually, this one is trivial to implement: look for the operator in ATen/RegistrationDeclarations.h:
Tensor & clamp_min_out(const Tensor & self, const Scalar & min, Tensor & out); // {"schema": "aten::clamp_min.out(Tensor self, Scalar min, *, Tensor(a!) out) -> Tensor(a!)", "dispatch": "True", "default": "False"}
Then look at the function clamp_out in src/pointwise_ops.cpp.
It is really a trivial one; it's strange that clamp_min.out does not simply call clamp_out.
UserWarning: The operator 'aten::clamp_min.out' is not currently supported on the ocl backend. Please open an issue at for requesting support https://github.com/artyom-beilis/pytorch_dlprim/issues (Triggered internally at /home/insdf/Ksdfsd/programming/ZenDnn/pytorch_dlprim/src/tensor_ops.cpp:311.)
return torch.binary_cross_entropy_with_logits(input, target, weight, pos_weight, reduction_enum)