Closed: Drone-Banks closed this 5 years ago
Thank you for your PR. Sorry for the late response.
Your DCN code looks great to me :) About PNN, there is an extended version of the original PNN paper. original: https://arxiv.org/pdf/1611.00144.pdf extended: https://arxiv.org/pdf/1807.00311.pdf
The original version says:
The first hidden layer is fully connected with the product layer. The inputs to it consist of linear signals lz and quadratic signals lp. With respect to lz and lp inputs, separately, the formulation of l1 is: l1 = relu(lz + lp + b1)
But the extended version says:
The n embeddings of v, and the n(n − 1)/2 interactions of p are flattened and fully connected with the successive hidden layer
I looked at the authors' code and found that they also concatenate the linear signals and the inner-product signals, then feed them into a multi-layer perceptron.
So I think my IPNN/OPNN code may be correct. Your PNN code looks similar to Product-network In Network (PIN) in the extended paper. Could you check it? Thank you!
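For reference, here is a minimal PyTorch sketch contrasting the two readings above; it is not the repository's actual code, and the class names, shapes, and layer sizes are my own assumptions. `concat=True` follows the extended paper / author's code (flattened embeddings and pairwise inner products concatenated before the MLP), while `concat=False` follows the original equation `l1 = relu(lz + lp + b1)`.

```python
# Sketch only: illustrates the two first-layer formulations discussed above.
import itertools
import torch


class InnerProductLayer(torch.nn.Module):
    """Pairwise inner products of field embeddings: n*(n-1)/2 scalars per sample."""

    def forward(self, x):                     # x: (batch, num_fields, embed_dim)
        num_fields = x.shape[1]
        rows, cols = zip(*itertools.combinations(range(num_fields), 2))
        p, q = x[:, list(rows)], x[:, list(cols)]
        return torch.sum(p * q, dim=2)        # (batch, num_fields*(num_fields-1)/2)


class PNNFirstLayer(torch.nn.Module):
    """First hidden layer of IPNN under the two interpretations (hypothetical sketch)."""

    def __init__(self, num_fields, embed_dim, hidden_dim, concat=True):
        super().__init__()
        self.concat = concat
        self.product = InnerProductLayer()
        num_pairs = num_fields * (num_fields - 1) // 2
        lz_dim = num_fields * embed_dim
        if concat:
            # Extended paper / author's code: concat lz and lp, then feed the MLP.
            self.fc = torch.nn.Linear(lz_dim + num_pairs, hidden_dim)
        else:
            # Original paper: l1 = relu(lz + lp + b1), with separate projections.
            self.wz = torch.nn.Linear(lz_dim, hidden_dim, bias=False)
            self.wp = torch.nn.Linear(num_pairs, hidden_dim, bias=False)
            self.b1 = torch.nn.Parameter(torch.zeros(hidden_dim))

    def forward(self, emb):                   # emb: (batch, num_fields, embed_dim)
        lz = emb.flatten(start_dim=1)         # linear signal: flattened embeddings
        lp = self.product(emb)                # quadratic signal: pairwise inner products
        if self.concat:
            return torch.relu(self.fc(torch.cat([lz, lp], dim=1)))
        return torch.relu(self.wz(lz) + self.wp(lp) + self.b1)
```

Either way the first hidden layer consumes both the linear signal lz and the product signal lp; the two readings only differ in whether the signals are summed after projection or concatenated before a single linear layer.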
Thanks for your response. I have checked the extended version and changed the PNN code back. My PNN code is based on the original version. There is another implementation of the original paper: https://github.com/nzc/dnn_ctr/blob/master/model/PNN.py @rixwew
@Drone-Banks Thank you! Since Atomu2014 is the corresponding author of the PNN paper, we should give weight to his implementation: https://github.com/Atomu2014/product-nets
Increase the embedding matrix dimension by one
Fix the DCN and PNN structures