FrusqueGaetan / Learnable-Wavelet-Transform

Code to use learnable wavelet transforms (L-WPT and DeSPaWN methods) in PyTorch

Perfect reconstruction + alternating filter positions #1

Open RamiMatar opened 1 year ago

RamiMatar commented 1 year ago

Hi @FrusqueGaetan, thank you for the excellent work on the papers and for sharing this code! I had two questions I hope you can help me with:

  1. Lines 221-234 (https://github.com/FrusqueGaetan/Learnable-Wavelet-Transform/blob/7b9982da87ba8c4978dd217edb1e23bfa3994217/Code/NeuralDWAV.py#L221): why do you have the low-pass and high-pass order switched for even and odd positions? Is there some other detail in the implementation where this matters?

  2. When initialized, shouldn't the L-WPT be able to perfectly reconstruct signals, given the filter properties that are selected? From the paper, my understanding was that perfect reconstruction cannot be guaranteed after learning, because the required kernel property is unlikely to be preserved once the kernels are updated by backpropagation. However, when I try a simple example with input [1, 2, ..., 16] and 2 levels of the L-WPT before training, this is the reconstruction I get:

    import torch
    from NeuralDWAV import NeuralDWAV  # class defined in Code/NeuralDWAV.py

    lwpt = NeuralDWAV(2 ** 4, 2)  # 16-sample input, 2 decomposition levels
    x = torch.tensor([[list(range(1, 17))]], dtype=torch.double)
    y = lwpt(x)
    #  y = [ 0.3453,  0.0740, -1.1725, -1.5650, -1.8428, -0.1822,  3.2033, 5.9611,  8.4227, 10.1521, 10.3272, 11.6218, 16.0963, 15.5719, 11.4356,  7.9666]
FrusqueGaetan commented 1 year ago

Hello Rami Matar,

Thanks a lot for taking the time to look at this work.

  1. Nice catch! This is due to the aliasing effect: after the high-pass filter and subsampling, the frequency content of that branch is mirrored, so the frequency order becomes decreasing. For example, without this change the node order at level two (by increasing frequency) becomes AA-AD-DD-DA instead of AA-AD-DA-DD; a short illustration follows after this list. Here is a paper that discusses this effect: "Wavelet packet feature extraction for vibration monitoring", https://ieeexplore.ieee.org/abstract/document/847906

  2. I think this is due to a border effect: you are using a db4 filter, which has 8 coefficients, over 2 WPT levels, so all 2^4 elements of your signal are affected by the borders. When I was using the WPT with 5 levels on signals of size 2**13, roughly the first and last 100 elements were affected by the border effect. A quick way to check this is sketched below, after the first snippet.
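Regarding point 1, if it helps, here is a quick way to see this ordering with PyWavelets, independent of this repository (just a small illustration, not the L-WPT code itself):

```python
import numpy as np
import pywt

# Decompose a random signal over 2 wavelet-packet levels.
x = np.random.randn(1024)
wp = pywt.WaveletPacket(data=x, wavelet='db4', mode='periodization')

# "Natural" (Paley) order vs. frequency-increasing order of the level-2 nodes.
natural = [node.path for node in wp.get_level(2, order='natural')]
by_freq = [node.path for node in wp.get_level(2, order='freq')]
print(natural)  # ['aa', 'ad', 'da', 'dd']
print(by_freq)  # ['aa', 'ad', 'dd', 'da']  <- DD/DA swapped because of aliasing
```

This DD/DA swap is what the alternating low-pass/high-pass positions in the code account for, so that the outputs come out directly in frequency order.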
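Regarding point 2, you could rerun your example on a longer signal and look at the reconstruction error away from the edges. A rough sketch, assuming the same constructor arguments and forward call as in your snippet (i.e. the forward pass returns the reconstruction):

```python
import torch
from NeuralDWAV import NeuralDWAV  # class from Code/NeuralDWAV.py

N = 2 ** 13            # long signal, so the borders are a small fraction of it
lwpt = NeuralDWAV(N, 2)

x = torch.arange(1, N + 1, dtype=torch.double).reshape(1, 1, -1)
y = lwpt(x)            # reconstruction, as in your snippet

err = (y - x).abs().flatten()
margin = 100           # ignore a band of samples at each border
print("max error, full signal:", err.max().item())
print("max error, interior:   ", err[margin:-margin].max().item())
# If the mismatch is a border effect, the interior error should be close to
# zero while the full-signal error is concentrated near the edges.
```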

If you have any other questions do not hesitate to ask!

Best regards, Gaëtan

