
Dense Tensor Layer for Keras
MIT License

dense_tensor

Dense Tensor Layer for Keras, supporting both Keras 1 and Keras 2. Implements tensor networks (second-order networks).

Where an ordinary dense layer is purely linear in its input, this layer is quadratic.

There is an additional weight tensor V. The layer output is xVx^T + xW + b instead of simply xW + b.

See the analysis of neural tensor networks by Socher et al.

See also Fenglei Fan, Wenxiang Cong, and Ge Wang (2017), "A New Type of Neurons for Machine Learning".

Normal dense layer: `f_i = a(W_i x^T + b_i)`

Dense tensor layer: `f_i = a(x V_i x^T + W_i x^T + b_i)`
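To make the formula concrete, here is a minimal NumPy sketch of the forward pass (not the library's implementation; shapes and the `dense_tensor_forward` name are illustrative):

```python
import numpy as np

def dense_tensor_forward(x, V, W, b, activation=np.tanh):
    """Illustrative forward pass of a dense tensor layer.

    x: input vector, shape (d,)
    V: per-unit quadratic weights, shape (units, d, d)
    W: linear weights, shape (units, d)
    b: biases, shape (units,)
    """
    quadratic = np.einsum('j,ijk,k->i', x, V, x)  # x V_i x^T for each unit i
    linear = W @ x + b                            # the usual W_i x^T + b_i
    return activation(quadratic + linear)

d, units = 4, 3
rng = np.random.default_rng(0)
x = rng.standard_normal(d)
V = rng.standard_normal((units, d, d))
W = rng.standard_normal((units, d))
b = rng.standard_normal(units)
y = dense_tensor_forward(x, V, W, b)
print(y.shape)  # → (3,)
```

Setting V to zero recovers an ordinary dense layer, which is the sense in which this is a strict generalization.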

DenseTensor: same usage as the Keras Dense layer

Variations

I provide several examples of different parameterizations of V, including a low-rank V, a symmetric V, and V restricted to positive-definite matrices. Please explore the examples and ask any questions.

Simple parameterization

```python
from keras.layers import Input
from dense_tensor import DenseTensor

x = Input(shape=(input_dim,))     # input_dim: dimensionality of the input
layer = DenseTensor(units=units)  # units: number of output units
y = layer(x)
```

Low-rank parameterization

```python
from dense_tensor import DenseTensor, tensor_factorization_low_rank

factorization = tensor_factorization_low_rank(q=10)  # rank-10 factorization of V
layer = DenseTensor(units=units, factorization=factorization)
```
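As a sketch of what the low-rank and symmetric variants buy, here is the underlying idea in plain NumPy (the library's actual parameterizations may differ in detail):

```python
import numpy as np

d, units, q = 8, 3, 2  # q << d is what makes the low-rank variant cheap
rng = np.random.default_rng(0)

# Full V costs d * d parameters per unit.
# Low rank: V_i = A_i @ B_i.T costs only 2 * d * q parameters per unit.
A = rng.standard_normal((units, d, q))
B = rng.standard_normal((units, d, q))
V_low_rank = np.einsum('idq,ieq->ide', A, B)  # shape (units, d, d), rank <= q

# Symmetric (and positive semi-definite): tie the factors, V_i = A_i @ A_i.T.
V_symmetric = np.einsum('idq,ieq->ide', A, A)
```

Since x V x^T only depends on the symmetric part of V, the symmetric variant loses no expressive power in the quadratic term while halving the free parameters.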

Comments

Please feel free to open issues or pull requests. I'm always interested in improvements and bug reports.

Compatibility

Travis tests a matrix covering Theano and TensorFlow, Python 2.7 and 3.5, and Keras 1 and 2. The code should work on most configurations, but please let me know if you run into issues.