MarsTechHAN / keras2ncnn

A Keras HDF5 to ncnn model converter
MIT License

Lambda support #10

Closed ankandrew closed 3 years ago

ankandrew commented 3 years ago

I was wondering if tf.keras.layers.Lambda can be supported. My model looks like:

Imports & utility:

from tensorflow.keras import regularizers
from tensorflow.keras.activations import relu, softmax
from tensorflow.keras.models import Model
from tensorflow.keras.layers import (
    Activation, BatchNormalization, Concatenate, Conv2D, Dense, Dropout,
    GlobalAveragePooling2D, Input, MaxPool2D, Lambda, Reshape
)

# block_no_activation is the same, minus the Activation (and returns a single tensor)
def block_bn(i, k=3, n_c=64, s=1, padding='same'):
    x1 = Conv2D(kernel_size=k, filters=n_c, strides=s, padding=padding,
                kernel_regularizer=regularizers.l2(0.01), use_bias=False)(i)
    x2 = BatchNormalization()(x1)
    x2 = Activation(relu)(x2)
    return x2, x1

Model:

def model():
    h, w = 70, 140
    input_tensor = Input((h, w, 1))
    x, _ = block_bn(input_tensor)
    # ... More Conv2D -> BatchNorm -> Activation ...
    x, _ = block_bn(x, k=1, n_c=1024, s=1, padding='same')
    x = block_no_activation(x, k=1, n_c=259, s=1, padding='same')
    x = GlobalAveragePooling2D()(x)
    x = Reshape((7, 37, 1))(x)
    x = Lambda(lambda x: softmax(x, axis=-2))(x)
    return Model(inputs=input_tensor, outputs=x)
MarsTechHAN commented 3 years ago

Lambda layers are stored as a Python blob in the HDF5 file, so there is no way to convert them to ncnn. I would suggest not using Lambda in the backbone, and instead implementing the Lambda part separately in C/C++. (For your case, I think you can just use Keras's Softmax layer.)
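For reference, a minimal sketch of that suggestion applied to the model tail above: swapping the Lambda for the built-in tf.keras.layers.Softmax layer, so the graph stores a plain layer config instead of a pickled Python function. This is an illustration of the suggestion, not something confirmed to convert in this thread; the axis mirrors the original Lambda.

from tensorflow.keras.layers import Softmax

x = Reshape((7, 37, 1))(x)
# Same math as Lambda(lambda x: softmax(x, axis=-2)), but expressed as a
# regular Keras layer rather than an opaque Python blob in the HDF5 file.
x = Softmax(axis=-2)(x)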

ankandrew commented 3 years ago

Yes, chopping the model off just after the GlobalAveragePooling2D exports fine, without problems. I just need to apply the softmax separately 7 times in C++, which is no problem. Thanks for the response!
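For anyone landing here later, a minimal sketch of that C++ post-processing step, assuming the exported model ends at the GlobalAveragePooling2D and its output is a flat buffer of 259 floats laid out row-major as 7 groups of 37 (the function name and layout here are assumptions, not something confirmed in this thread):

#include <algorithm>
#include <cmath>

// Apply softmax over each of the 7 groups of 37 logits, in place.
// Mirrors Lambda(lambda x: softmax(x, axis=-2)) after Reshape((7, 37, 1)),
// assuming group g occupies logits[g * 37 .. g * 37 + 36].
void softmax_groups(float* logits, int groups = 7, int classes = 37)
{
    for (int g = 0; g < groups; ++g)
    {
        float* p = logits + g * classes;
        // Subtract the max for numerical stability before exponentiating.
        const float max_val = *std::max_element(p, p + classes);
        float sum = 0.f;
        for (int c = 0; c < classes; ++c)
        {
            p[c] = std::exp(p[c] - max_val);
            sum += p[c];
        }
        for (int c = 0; c < classes; ++c)
            p[c] /= sum;
    }
}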