torch / nngraph

Graph Computation for nn

How can I access parameters of a node in nngraph #131

mingstupid opened this issue 8 years ago

Hello! I am creating an nngraph which combines two embeddings together. One of the embeddings is a word embedding and the other is not. For the word embedding, I would like to initialize its weights with the pre-trained word2vec. Is there a way to do so? My nngraph looks like the following:

input1 = nn.Identity()()
input2 = nn.Identity()()
embed1 = nn.LookupTableMaskZero(nIndex1, embeddingSize)(input1)
embed2 = nn.LookupTableMaskZero(nIndex2, embeddingSize)(input2)
madd = nn.CAddTable()({embed1, embed2})
madd_t = nn.SplitTable(2)(madd)
embed = nn.gModule({input1, input2}, {madd_t})

Could you suggest a way to, say, set embed1.weight to pretrained word2vec? Thank you!

taineleau-zz commented 8 years ago

Hi,

nngraph follows the design of nn.Module.

Hence you just need to call the :parameters() method (or :getParameters()) to obtain the weights in order.

Try:

require 'nn'
require 'rnn'
require 'nngraph'

nIndex1 = 20
embeddingSize = 128
nIndex2 = 30

input1 = nn.Identity()()
input2 = nn.Identity()()
embed1 = nn.LookupTableMaskZero(nIndex1, embeddingSize)(input1)
embed2 = nn.LookupTableMaskZero(nIndex2, embeddingSize)(input2)
madd = nn.CAddTable()({embed1, embed2})
madd_t = nn.SplitTable(2)(madd)
embed = nn.gModule({input1, input2}, {madd_t})

print(embed:parameters())

The output is:

{
  1 : DoubleTensor - size: 21x128
  2 : DoubleTensor - size: 31x128
}
{
  1 : DoubleTensor - size: 21x128
  2 : DoubleTensor - size: 31x128
}

The first table contains the weights and the second the corresponding gradients. (Each lookup table has one extra row, 21 instead of 20, because LookupTableMaskZero reserves a row for the zero/padding index.)
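To answer the original question, you can then copy pretrained vectors into the first weight tensor. A minimal sketch, where word2vec_weights is a placeholder for an nIndex1 x embeddingSize tensor you have loaded yourself (remember that row 1 is the reserved zero-mask row, so the vocabulary starts at row 2):

-- word2vec_weights: hypothetical nIndex1 x embeddingSize tensor loaded elsewhere
-- row 1 of the lookup table is reserved for the masked index 0, so skip it
local params = embed:parameters()
params[1]:narrow(1, 2, nIndex1):copy(word2vec_weights)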

Hope this helps!

chithangduong commented 7 years ago

I believe you can access it using embed1.data.module.weight.
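
For example, the same initialization through the node reference (word2vec_weights is again a hypothetical pretrained tensor):

-- write pretrained vectors directly through the node's module,
-- skipping the zero-mask row that LookupTableMaskZero prepends
embed1.data.module.weight:narrow(1, 2, nIndex1):copy(word2vec_weights)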

brisker commented 7 years ago

How can I copy a particular layer's weights from a pretrained nn.Sequential model to a particular module of an nngraph?

haanvid commented 7 years ago

@brisker I'm dealing with the same problem, and I'm thinking of annotating the layer whose weights and biases I want to check/initialize.

haanvid commented 7 years ago

@brisker

If you want to check/initialize the parameters of a specific layer, you can use annotation ( annotate() ) to give that layer a name, then find the layer by its name and check/initialize its parameters.

In the code below, I've named the batch normalization layer 'BN_1' and initialized that layer's weights and biases.

require 'torch'
require 'nn'
require 'nngraph'

function make_net()
  local x = nn.Identity()()
  local next_act1 = nn.Linear(2,3)(x)
  -- annotate the batch normalization node so it can be looked up by name
  local next_act2 = nn.BatchNormalization(3,1e-5,0.1,true)(next_act1):annotate{
    name = 'BN_1', description = 'batch normalization'}
  local next3 = nn.Linear(3,1)(next_act2)

  local module = nn.gModule({x}, {next3})
  return module
end

function main()
  local mlp = make_net()

  print('mlp.forwardnodes : (press enter)')
  print(mlp.forwardnodes)
  io.read()

  -- walk the graph once and find the annotated node by name
  for _, node in ipairs(mlp.forwardnodes) do
    if node.data.annotations.name == 'BN_1' then
      node.data.module.weight:fill(1.0)
      print('BN weight: (press enter)')
      print(node.data.module.weight)
      io.read()
      node.data.module.bias:fill(0)
      print('BN bias: (press enter)')
      print(node.data.module.bias)
      io.read()
    end
  end
end

main()
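
To connect this back to @brisker's question: once a node is annotated, you can copy a layer's weights out of a pretrained nn.Sequential into it. A sketch under stated assumptions: pretrained.t7 is a placeholder path, and :get(3) is a hypothetical index of the matching layer inside the Sequential:

-- pretrained: a hypothetical nn.Sequential saved earlier with torch.save
local pretrained = torch.load('pretrained.t7')
local src = pretrained:get(3)  -- placeholder index of the matching layer

for _, node in ipairs(mlp.forwardnodes) do
  if node.data.annotations.name == 'BN_1' then
    -- copy the pretrained layer's parameters into the annotated node
    node.data.module.weight:copy(src.weight)
    node.data.module.bias:copy(src.bias)
  end
end
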
mamun58 commented 6 years ago

How can I add a module with only a weight and a bias term to the neural net graph?