Hi, you can feed your tf.Tensor into tl.layers.InputLayer and then feed that layer into Dense and Conv layers. If you look into the source code, InputLayer just creates a Layer whose outputs attribute holds your tensor. More details can be found in the documentation: http://tensorlayer.readthedocs.io/en/latest/modules/layers.html#understand-layer
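A minimal sketch of this, assuming TensorLayer 1.x on TensorFlow 1.x (the variable names, layer names, and shapes below are illustrative only):

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

# An intermediate tf.Tensor built with plain TensorFlow ops.
W = tf.Variable(tf.truncated_normal([784, 256], stddev=0.1), name='W_pre')
b = tf.Variable(tf.zeros([256]), name='b_pre')
h = tf.nn.relu(tf.matmul(x, W) + b)

# Wrap the tensor so subsequent TensorLayer layers receive a Layer instance.
network = tl.layers.InputLayer(h, name='input_layer')
network = tl.layers.DenseLayer(network, n_units=128, act=tf.nn.relu, name='dense1')
network = tl.layers.DenseLayer(network, n_units=10, act=tf.identity, name='output')

y = network.outputs  # the result is a plain tf.Tensor again
```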
@zsdonghao Thanks, that solution is the same as mine. But with this solution, newly created layers cannot track the components located before tl.layers.InputLayer through attributes such as layer.all_params and layer.all_drop.
@llan-ml Yes, indeed. Layer cannot track parameters created outside TensorLayer. I suggest using train_params = layer.all_params + [your parameters] if you want to update the parameters defined outside TensorLayer.
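For example (a sketch assuming TensorFlow 1.x; `cost`, `network`, and the external variables `W` and `b` stand for whatever you defined outside TensorLayer, e.g. in the snippet above):

```python
# Combine TensorLayer-tracked params with the external tf.Variables,
# then hand the full list to the optimizer.
train_params = network.all_params + [W, b]
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(cost, var_list=train_params)
```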
For layer.all_drop, if you control the keep probabilities with placeholders, you will need to feed their values through feed_dict. In that case, to keep everything simple, I suggest creating separate graphs for training and evaluation, e.g. https://github.com/zsdonghao/tensorlayer/blob/master/tutorial_ptb_lstm_state_is_tuple.py
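As an illustration of the feed_dict approach (a sketch assuming the network uses tl.layers.DropoutLayer, and that `sess`, `x`, `y_`, `train_op`, `cost`, and the data arrays already exist):

```python
# Training: feed the stored keep probabilities so dropout is active.
feed_dict = {x: X_train, y_: y_train}
feed_dict.update(network.all_drop)
sess.run(train_op, feed_dict=feed_dict)

# Evaluation: set every keep probability to 1 to disable dropout.
dp_dict = tl.utils.dict_to_one(network.all_drop)
feed_dict = {x: X_val, y_: y_val}
feed_dict.update(dp_dict)
val_cost = sess.run(cost, feed_dict=feed_dict)
```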
The input of each layer (e.g., a Dense or Conv layer), except the Input layer, must be a Layer instance. I think this behavior is not very transparent to plain TensorFlow code. If I want to use an intermediate tf.Tensor as the input to a Layer instance, does TensorLayer provide an existing way to do this?