Closed by jzhoulon 6 years ago
I have not experimented much with F(7,3). I would probably have started with the same points you did; you might try 3 or 3/2 as alternatives to 1/3.
Do you really need such a large transform? With state-of-the-art neural networks I usually find that a smaller transform can be implemented more efficiently. Could F(4,3) be a good fit for your application? F(6,3) is also reasonably accurate numerically.
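For context, the theoretical multiply savings behind that suggestion are easy to tabulate: a 1D F(m,3) computes m outputs with m+2 multiplies, so the square 2D nestings compare as below. This is just the standard arithmetic-complexity count, not tied to any particular implementation:

```python
# Theoretical multiply reduction of 2D Winograd F(m x m, 3 x 3) over direct
# convolution: direct uses (m*3)^2 multiplies per m x m output tile,
# Winograd uses (m + 3 - 1)^2 transform-domain multiplies.
def speedup_2d(m, r=3):
    tile = m + r - 1  # 1D transform size
    return (m * r) ** 2 / tile ** 2

for m in (2, 4, 6, 7):
    print(f"F({m}x{m},3x3): tile {m + 2}x{m + 2}, speedup {speedup_2d(m):.2f}x")
# prints:
# F(2x2,3x3): tile 4x4, speedup 2.25x
# F(4x4,3x3): tile 6x6, speedup 4.00x
# F(6x6,3x3): tile 8x8, speedup 5.06x
# F(7x7,3x3): tile 9x9, speedup 5.44x
```

So dropping from F(7,3) to F(6,3) or even F(4,3) gives up relatively little theoretical speedup in exchange for much better numerical conditioning.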
Another technique to consider: if you are actually nesting F(7,3) in two dimensions to form F(7x7,3x3), you could try F(7x1,3x3) instead, which nests the 1D algorithm F(7,3) with a 1D direct convolution. The theoretical speedup is only the square root of what you get with F(7x7,3x3), but it will be much more numerically stable, and probably much easier to implement efficiently.
You could also try F(7x2,3x3). All of this assumes that "7" is somehow an important choice; maybe you are better off using a smaller tile, as suggested in the previous comment, even if it means a little bit of over-compute.
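The arithmetic for these mixed tiles can be sketched the same way as for the square ones. Here the assumption is that the second dimension is handled either by direct convolution (F(7x1,3x3)) or by a smaller 1D Winograd transform (F(7x2,3x3)):

```python
# Multiply-count speedup of mixed 2D tiles built from 1D F(m,3) pieces.
# A 1D F(m,3) uses m+2 transform-domain multiplies for m outputs; a direct
# 1D convolution with a 3-tap filter uses 3 multiplies per output.
def speedup_mixed(m1, m2, r=3):
    # m2 == 1 means the second dimension is done by direct convolution.
    mults = (m1 + r - 1) * (m2 + r - 1 if m2 > 1 else r)
    return (m1 * m2 * r * r) / mults

print(f"F(7x1,3x3): {speedup_mixed(7, 1):.2f}x")  # 63/27  -> 2.33x
print(f"F(7x2,3x3): {speedup_mixed(7, 2):.2f}x")  # 126/36 -> 3.50x
print(f"F(7x7,3x3): {speedup_mixed(7, 7):.2f}x")  # 441/81 -> 5.44x
```

Note that 2.33 is exactly the square root of 5.44 (both are powers of 7/3), which is the square-root relationship mentioned above.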
thanks very much
Hi, which polynomial points did you finally choose? I used 0, 1, -1, 2, -2, Rational(1,2), -Rational(1,2) as the first seven points and tried 3, 4, Rational(2,3), and Rational(3,2) as the last one, but the accuracy does not seem very good.
Hi, I am not familiar with choosing polynomial points. Do you know from experience which points work best for F(7,3)? I tried v="(0, 1,-1,2,-2,Rational(1,2),-Rational(1,2), -Rational(1,3)), 7,3" and v="(0, 1,-1,Rational(1,2),-Rational(1,2), Rational(5,4),2,-2), 7,3", and also tried various scalings, but there always seems to be a precision issue. Thanks very much.
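One way to see why the point choice matters: the precision loss comes from the conditioning of the Vandermonde-style evaluation/interpolation matrices the points define. Below is a minimal, self-contained sketch using plain Toom-Cook linear convolution with nine finite points (wincnn's F(7,3) uses eight points plus the point at infinity, so the numbers are only indicative, and the "good"/"bad" point sets are illustrative choices, not from this thread):

```python
import numpy as np

def toom_cook_conv(u, v, points):
    """Linear convolution of u and v by polynomial evaluation at the given
    points, pointwise multiplication, and interpolation (Toom-Cook).
    Requires len(points) == len(u) + len(v) - 1."""
    n = len(u) + len(v) - 1
    V = np.vander(np.asarray(points, dtype=np.float64), n, increasing=True)
    w_vals = (V[:, :len(u)] @ u) * (V[:, :len(v)] @ v)  # products at each point
    return np.linalg.solve(V, w_vals)  # recover product-polynomial coefficients

rng = np.random.default_rng(0)
u, v = rng.standard_normal(7), rng.standard_normal(3)
exact = np.convolve(u, v)

# Small, symmetric points vs. one-sided integer points.
good = [0, 1, -1, 0.5, -0.5, 2, -2, 1.5, -1.5]
bad = [0, 1, 2, 3, 4, 5, 6, 7, 8]
for name, pts in [("symmetric", good), ("one-sided", bad)]:
    err = np.max(np.abs(toom_cook_conv(u, v, pts) - exact))
    cond = np.linalg.cond(np.vander(pts, 9))
    print(f"{name:10s} cond={cond:.1e}  max err={err:.1e}")
```

Keeping the points small in magnitude and symmetric about zero keeps the system better conditioned, but with nine points some error is hard to avoid, which is why the earlier comments suggest smaller tiles.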