tensorlayer / TensorLayer

Deep Learning and Reinforcement Learning Library for Scientists and Engineers
http://tensorlayerx.com

[RoadMap] - TensorLayer 2.0 #860

Closed DEKHTIARJonathan closed 5 years ago

DEKHTIARJonathan commented 6 years ago

NETWORK API REFACTORING - TO DO LIST

Finished work:

- **activation.py**
  - PReluLayer: [x] refactored / [x] tested
  - PRelu6Layer: [x] refactored / [x] tested
  - PTRelu6Layer: [x] refactored / [x] tested
- **convolution/**
  - AtrousConv1dLayer: [ ] refactored (needs to be implemented) / [ ] tested
  - AtrousConv2dLayer: [x] refactored / [x] tested
  - AtrousDeConv2dLayer: [x] refactored / [x] tested
  - BinaryConv2d: [x] refactored / [x] tested
  - Conv1d: [x] refactored / [x] tested
  - Conv2d: [x] refactored / [x] tested
  - Conv1dLayer: [x] refactored / [x] tested
  - Conv2dLayer: [x] refactored / [x] tested
  - Conv3dLayer: [x] refactored / [x] tested
  - DeConv2d: [x] refactored / [x] tested
  - DeConv3d: [x] refactored / [x] tested
  - DeConv2dLayer: [x] refactored / [x] tested
  - DeConv3dLayer: [x] refactored / [x] tested
  - DeformableConv2d: [x] refactored / [x] tested
  - DepthwiseConv2d: [x] refactored / [x] tested
  - DorefaConv2d: [x] refactored / [x] tested
  - GroupConv2d: [x] refactored / [x] tested
  - QuantizedConv2d: [x] refactored / [x] tested
  - QuantizedConv2dWithBN: [x] refactored / [x] tested
  - SeparableConv1d: [x] refactored / [x] tested
  - SeparableConv2d: [x] refactored / [x] tested
  - SubpixelConv1d: [x] refactored / [x] tested
  - SubpixelConv2d: [x] refactored / [x] tested
  - TernaryConv2d: [x] refactored / [x] tested
- **dense/**
  - BinaryDenseLayer: [x] refactored / [x] tested
  - DenseLayer: [x] refactored / [x] tested
  - DorefaDenseLayer: [x] refactored / [x] tested
  - DropconnectDenseLayer: [x] refactored / [x] tested
  - QuantizedDense: [x] refactored / [x] tested
  - QuantizedDenseWithBN: [x] refactored / [x] tested
  - TernaryDenseLayer: [x] refactored / [x] tested
- **dropout.py**
  - DropoutLayer: [x] refactored / [x] tested
- **extend.py**
  - ExpandDimsLayer: [x] refactored / [x] tested
  - TileLayer: [x] refactored / [x] tested
- **image_resampling.py**
  - UpSampling2dLayer: [x] refactored / [x] tested
  - DownSampling2dLayer: [x] refactored / [x] tested
- **importer.py**
  - SlimNetsLayer: [x] refactored / [x] tested
  - KerasLayer: [x] refactored / [x] tested
- **inputs.py**
  - InputLayer: [x] refactored / [x] tested
  - OneHotInputLayer: [x] refactored / [x] tested
  - Word2vecEmbeddingInputlayer: [x] refactored / [x] tested
  - EmbeddingInputlayer: [x] refactored / [x] tested
  - AverageEmbeddingInputlayer: [x] refactored / [x] tested
- **lambda_layers.py**
  - ElementwiseLambdaLayer: [x] refactored / [x] tested
  - LambdaLayer: [x] refactored / [x] tested
- **merge.py**
  - ConcatLayer: [x] refactored / [x] tested
  - ElementwiseLayer: [x] refactored / [x] tested
- **noise.py**
  - GaussianNoiseLayer: [x] refactored / [x] tested
- **normalization.py**
  - BatchNormLayer: [x] refactored / [x] tested
  - GroupNormLayer: [x] refactored / [x] tested
  - InstanceNormLayer: [x] refactored / [x] tested
  - LayerNormLayer: [x] refactored / [x] tested
  - LocalResponseNormLayer: [x] refactored / [x] tested
  - SwitchNormLayer: [x] refactored / [x] tested
- **padding.py**
  - PadLayer: [x] refactored / [x] tested
  - ZeroPad1d: [x] refactored / [x] tested
  - ZeroPad2d: [x] refactored / [x] tested
  - ZeroPad3d: [x] refactored / [x] tested
- **pooling/**
  - MaxPool1d: [x] refactored / [x] tested
  - MaxPool2d: [x] refactored / [x] tested
  - MaxPool3d: [x] refactored / [x] tested
  - MeanPool1d: [x] refactored / [x] tested
  - MeanPool2d: [x] refactored / [x] tested
  - MeanPool3d: [x] refactored / [x] tested
  - GlobalMaxPool1d: [x] refactored / [x] tested
  - GlobalMaxPool2d: [x] refactored / [x] tested
  - GlobalMaxPool3d: [x] refactored / [x] tested
  - GlobalMeanPool1d: [x] refactored / [x] tested
  - GlobalMeanPool2d: [x] refactored / [x] tested
  - GlobalMeanPool3d: [x] refactored / [x] tested
  - PoolLayer: [x] refactored / [x] tested
- **quantize_layers.py**
  - SignLayer: [x] refactored / [x] tested
- **recurrent/**
  - BiDynamicRNNLayer: [x] refactored / [x] tested
  - BiRNNLayer: [x] refactored / [x] tested
  - ConvLSTMLayer: [x] refactored / [x] tested
  - DynamicRNNLayer: [x] refactored / [x] tested
  - RNNLayer: [x] refactored / [x] tested
  - Seq2Seq: [x] refactored / [x] tested
- **reshape.py**
  - FlattenLayer: [x] refactored / [x] tested
  - ReshapeLayer: [x] refactored / [x] tested
  - TransposeLayer: [x] refactored / [x] tested
- **scale.py**
  - ScaleLayer: [x] refactored / [x] tested

Unittests Status:

Work to be done

Layers

TensorLayer Hub

Examples

More TODO

```python
# Functional-style usage: two DenseLayer calls under the same variable
# scope, the second with reuse=True so it shares the first call's weights.
plh = tf.placeholder(tf.float16, (100, 32))
net = tl.layers.InputLayer(name='in')(plh)
with tf.variable_scope('test'):
    net = tl.layers.DenseLayer(n_units=50, act=tf.nn.relu, name="dense")(net)
with tf.variable_scope('test', reuse=True):
    net = tl.layers.DenseLayer(n_units=50, act=tf.nn.relu, name="dense")(net)
print(net['test/dense'])
print(net['test/dense_2'])
assert len(net.all_weights) == 2  # only W and b: the second call reuses them
print(net.all_params)  # gives a warning, but still works
```
```python
# Sequential-style usage: the same reuse semantics with model.add().
with tf.variable_scope('test'):
    model.add(tl.layers.DenseLayer(n_units=50, act=tf.nn.relu, name="seq_layer_9"))
with tf.variable_scope('test', reuse=True):
    model.add(tl.layers.DenseLayer(n_units=50, act=tf.nn.relu, name="seq_layer_9"))
plh = tf.placeholder(tf.float16, (100, 32))
net = model.build(plh)
print(net['test/seq_layer_9'])
```

Raise an exception with a migration hint when users use an old layer name.

```python
def BatchNormLayer(*args, **kwargs):
    raise Exception(
        "BatchNormLayer(net, is_train=True, name='bn') --> BatchNorm(name='bn')(net, is_train=True)"
    )
```
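The shim above can be generalized so every renamed layer gets the same treatment. A minimal sketch in plain Python; the `_deprecated` helper and its wording are assumptions for illustration, not TensorLayer's actual implementation:

```python
# Illustrative generic "old name -> migration hint" shim; the helper name
# and message format are assumptions, not TensorLayer's real code.

def _deprecated(old_name, hint):
    """Return a callable that raises with a migration hint when invoked."""
    def stub(*args, **kwargs):
        raise Exception(
            "{} has been removed in TensorLayer 2.0. {}".format(old_name, hint)
        )
    return stub

BatchNormLayer = _deprecated(
    "BatchNormLayer",
    "Use: BatchNorm(name='bn')(net, is_train=True)",
)

try:
    BatchNormLayer(None, is_train=True, name='bn')
except Exception as e:
    print(e)  # prints the migration hint
```

One stub per removed name keeps old user code failing loudly with an actionable message instead of an opaque `AttributeError`.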

zsdonghao commented 5 years ago

https://github.com/tensorlayer/tensorlayer/issues/900