jdnie / Winograd_study

Understanding the principles of the Winograd algorithm

Derivation of matrices G and B #1

Open buttercutter opened 4 years ago

buttercutter commented 4 years ago

Could anyone advise on how to derive matrices G and B from matrix A, as described in "Fast Algorithms for Convolutional Neural Networks"?

[image: G_B]

[image: winograd_GB_matrices]

jdnie commented 4 years ago

G, B, and A can be calculated with the Toom-Cook algorithm or the Chinese remainder theorem. As an exercise, you can try to calculate F(4,3) and F(6,3).
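As a reference point, here is a minimal NumPy sketch of the Toom-Cook construction for the small case F(2,3). It uses the common evaluation points {0, 1, -1} plus the point at infinity; the resulting matrices can differ from the published ones by row scalings and signs, but the factorization is still exact:

```python
import numpy as np

# Finite evaluation points; the point at infinity is handled separately
# by a row that picks out the highest-degree coefficient.
alphas = [0, 1, -1]

def vandermonde(points, n_coeffs):
    """Rows evaluate a polynomial with n_coeffs coefficients at each
    finite point; a final row handles the point at infinity."""
    V = np.array([[p**k for k in range(n_coeffs)] for p in points], dtype=float)
    inf_row = np.zeros(n_coeffs)
    inf_row[-1] = 1.0
    return np.vstack([V, inf_row])

V2 = vandermonde(alphas, 2)   # 4x2: evaluates a length-2 polynomial
V3 = vandermonde(alphas, 3)   # 4x3: evaluates the length-3 filter
V4 = vandermonde(alphas, 4)   # 4x4: evaluates the length-4 product

M = np.linalg.inv(V4)         # interpolation: point values -> coefficients

# By the transposition principle, the correlation form F(2,3) uses
#   A^T = V2^T,  G = V3,  B^T = M^T
At, G, Bt = V2.T, V3, M.T

g = np.array([1.0, 2.0, 3.0])        # 3-tap filter
d = np.array([4.0, 5.0, 6.0, 7.0])   # 4-sample input tile

y = At @ ((G @ g) * (Bt @ d))        # Winograd F(2,3): 4 multiplies, 2 outputs
```

The result `y` matches `np.correlate(d, g)`, the direct 2-output valid correlation, so the derived transforms are a valid (if differently scaled) F(2,3) factorization.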

buttercutter commented 4 years ago

But in your own screenshot, aren't matrices G and B derived from matrix A?

jdnie commented 4 years ago

Your G is my B, your A is my G, and your B is my A.

jdnie commented 4 years ago

G transforms g from 3 elements to 4, B transforms d from 4 elements to 4, and A relates the 2 outputs Y to the 4-element transform domain; convolution is then replaced by an elementwise product. It is the same idea as the FFT, which maps a picture from the spatial domain to the frequency domain so that an elementwise product there replaces spatial convolution. G, A, and B are the transform bases.
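To make those dimensions concrete, here is a small check of F(2,3) using the transform matrices as published in Lavin & Gray, "Fast Algorithms for Convolutional Neural Networks" (g has 3 elements, d has 4, Y has 2, and the elementwise product happens in the 4-element transform domain):

```python
import numpy as np

# Published F(2,3) transforms (Lavin & Gray).
Bt = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
At = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

g = np.array([1.0, 2.0, 3.0])        # filter: 3 elements, G g has 4
d = np.array([4.0, 5.0, 6.0, 7.0])   # input tile: 4 elements, B^T d has 4

U = G @ g                            # filter in the transform domain
V = Bt @ d                           # data in the transform domain
y = At @ (U * V)                     # 2 outputs from 4 elementwise multiplies
```

Here `y` equals the direct valid correlation `np.correlate(d, g)`, computed with 4 multiplies instead of the 6 that direct correlation needs.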

buttercutter commented 4 years ago

> G transforms g from 3 elements to 4, B transforms d from 4 elements to 4, and A relates the 2 outputs Y to the 4-element transform domain

This above sentence is confusing.

> Your G is my B, your A is my G, and your B is my A.

OK, but how do you derive your G and your B from your A, as shown in your screenshot?

jdnie commented 4 years ago

In Toom-Cook, you can set up a system of equations and solve it.
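One way to read "a set of equations" for F(2,3): a degree-3 product polynomial m(x) = c0 + c1·x + c2·x² + c3·x³ is determined by its values at four points. With the common choice x ∈ {0, 1, -1} plus the point at infinity (which reads off c3), the equations and their solution look like this sketch (row scalings of the result may differ from the published B^T):

```python
import numpy as np

# The equations, one row per evaluation point:
#   m(0)   = c0
#   m(1)   = c0 + c1 + c2 + c3
#   m(-1)  = c0 - c1 + c2 - c3
#   m(inf) -> c3   (leading coefficient)
V = np.array([[1,  0, 0,  0],
              [1,  1, 1,  1],
              [1, -1, 1, -1],
              [0,  0, 0,  1]], dtype=float)

# Solving the system (inverting V) gives the interpolation matrix;
# its transpose is one valid B^T, up to row-scaling conventions.
Bt = np.linalg.inv(V).T
```

The four rows of `V` are exactly the "set of equations"; different choices of evaluation points give different (but equally valid) transform matrices, which is why published F(4,3) and F(6,3) matrices pick points that keep the entries simple.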

buttercutter commented 4 years ago

@jdnie Could you advise about https://github.com/NervanaSystems/neon/issues/224#issuecomment-605602605 ?

buttercutter commented 4 years ago

@jdnie

What are β0 , β1 and β2 ?

[image: winograd_antkillerfarm]