buriburisuri / sugartensor

A slim TensorFlow wrapper that provides syntactic sugar for tensor variables. This library will be helpful for practical deep learning researchers, not beginners.
MIT License

confused about functions sg_aconv1d() and sg_conv1d()... #41

Open suenyandan opened 6 years ago

suenyandan commented 6 years ago

I have a problem when working with the functions sg_conv1d() and sg_aconv1d(). As the example in the README shows, we can call them like this: `out = input.sg_aconv1d(size=opt.size, rate=opt.rate, causal=opt.causal, act='relu', bn=(not opt.causal), ln=opt.causal)`. But here is the problem: I can't find the parameters 'act' and 'bn' in the function definition code, for example in sg_aconv:

    @tf.sg_layer_func
    def sg_aconv(tensor, opt):
        r"""Applies a 2-D atrous (or dilated) convolution.

        Args:
          tensor: A 4-D `Tensor` (automatically passed by decorator).
          opt:
            size: A tuple/list of positive integers of length 2 representing
              `[kernel height, kernel width]`. Can be an integer if both values
              are the same. If not specified, (3, 3) is set automatically.
            rate: A positive integer. The stride with which we sample input values
              across the `height` and `width` dimensions. Default is 2.
            in_dim: A positive `integer`. The size of input dimension.
            dim: A positive `integer`. The size of output dimension.
            pad: Either `SAME` (Default) or `VALID`.
            bias: Boolean. If True, biases are added.
            regularizer: A (Tensor -> Tensor or None) function; the result of
              applying it on a newly created variable will be added to the
              collection tf.GraphKeys.REGULARIZATION_LOSSES and can be used
              for regularization.
            summary: If True, summaries are added. The default is True.
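My guess is that the `@tf.sg_layer_func` decorator handles these options itself: parameters common to every layer ('act', 'bn', 'ln', ...) would be consumed by the decorator, which applies activation and normalization after the wrapped op, so they never appear in an individual layer's docstring. Below is a minimal, runnable sketch of that decorator pattern; every name in it (layer_func_sketch, toy_layer, _apply_act) is a hypothetical illustration, not sugartensor's actual source.

    import functools

    def _apply_act(x, name):
        # Resolve an activation name such as 'relu' to a function, the way
        # a string option could be looked up inside the decorator.
        acts = {'relu': lambda v: [max(0.0, e) for e in v],
                'linear': lambda v: v}
        return acts[name](x)

    def layer_func_sketch(func):
        """Pop options common to all layers, then post-process the output."""
        @functools.wraps(func)
        def wrapper(tensor, **kwargs):
            act = kwargs.pop('act', None)  # activation name, e.g. 'relu'
            bn = kwargs.pop('bn', False)   # normalization flag (a no-op in this toy)
            out = func(tensor, **kwargs)   # the layer's own computation
            if bn:
                pass                       # real code would insert batch norm here
            if act is not None:
                out = _apply_act(out, act)
            return out
        return wrapper

    @layer_func_sketch
    def toy_layer(tensor, scale=1.0):
        # The layer itself only knows about 'scale'; 'act' and 'bn' never
        # reach it because the decorator pops them first.
        return [scale * v for v in tensor]

    print(toy_layer([-1.0, 2.0], scale=3.0, act='relu'))  # -> [0.0, 6.0]

If something like this is what the library does, 'act' and 'bn' are perfectly valid; they are just consumed one level above the layer function.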

So I think maybe these two parameters are invalid. I tried passing some arbitrary parameters to test this, and it turned out to work normally without raising any exceptions.
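That lack of exceptions would also make sense if the options end up in a permissive, dict-like container that silently stores unknown keys and returns None for missing ones (note how the docstring above reads everything off an `opt` object). A toy version, purely for illustration (the class Opt below is hypothetical, not the library's):

    class Opt(dict):
        # Toy permissive option container: unknown keys are simply stored,
        # and missing keys read back as None instead of raising.
        def __getattr__(self, key):
            return self.get(key)

    opt = Opt(size=3, totally_made_up=42)   # unknown key accepted silently
    print(opt.size)             # 3
    print(opt.rate)             # None -- no AttributeError for missing keys
    print(opt.totally_made_up)  # 42 -- arbitrary keys are just stored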

So I'm wondering: how do these two functions work with these two parameters ('act' and 'bn')?

Looking forward to any reply. Thanks a lot!