Closed dasmysh closed 1 year ago
Hi, and thanks for the report. :+1:
The problem with `std::result_of` should be solved by switching to the latest version of FunctionalPlus. See this thread for details.
Regarding `std::tanh`: I'm not using `tanhf`, because it's intended for `float`s. By default, however, `FDEEP_FLOAT_TYPE` is `double`, so that's why I'm using the overloaded `tanh`.
Can you try replacing `tanh` with `std::tanh` in `attention_layer.hpp` locally, and let me know if this resolves your compiler error?
The above suggestion might not work, but I just pushed a commit that should fix it. Can you please try compiling again with the latest version from `master` and let me know if that helped?
> Can you try replacing `tanh` with `std::tanh` in `attention_layer.hpp` locally, and let me know if this resolves your compiler error?
That is still ambiguous. But what does help is specifying the template arguments of `transform_tensor` by explicitly calling `transform_tensor<float_type(float_type)>(tanh, [...])`.
And how about the latest version from `master`, i.e., the one with `tanh_typed`?
`master` also works.
So yes, the issue is fixed. Thanks for the fast replies.
Thanks for the error report, helping to make the library better, and to make the experience smoother for future users! :heart:
I encountered several compiler errors with your code in VS2022 and C++20. The first is definitely a C++20 issue; the second might be specific to VS2022.
First, FunctionalPlus does not compile: `std::result_of` was removed in favor of `std::invoke_result`, and they are still using it. I'll write a GitHub issue there as well, but I'll also describe the fix here: any instance of `std::result_of` becomes `std::invoke_result`.
Another issue is in `attention_layer.hpp` and `additive_attention_layer.hpp`: the `tanh` used in one of the `transform_tensor` calls seems to be ambiguous. Replacing it with `tanhf` fixes it.