Closed grt1st closed 7 years ago
In Chapter 8's notebook `3. 使用循环神经网络实现语言模型.ipynb`, reading the code we can see that a class is defined which first sets

```python
self.initial_state = cell.zero_state(batch_size, tf.float32)
state = self.initial_state
```

then, inside the time-step loop, calls

```python
cell_output, state = cell(inputs[:, time_step, :], state)
```

and finally stores

```python
self.final_state = state
```

However, the training step is

```python
cost, state, _ = session.run(
    [model.cost, model.final_state, train_op],
    {model.input_data: x, model.targets: y, model.initial_state: state})
```

so the feed dict is `{model.input_data: x, model.targets: y, model.initial_state: state}`. My question is: can we really feed a value to `model.initial_state` here? It was never defined as a `tf.placeholder`.
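For context, in TensorFlow 1.x `feed_dict` is not restricted to `tf.placeholder`: it can override the value of any feedable tensor in the graph, which is why feeding `model.initial_state` works. A minimal sketch of this behavior (assuming TF 1.x semantics, via `tf.compat.v1` on TF 2.x; the names `a` and `b` are illustrative, not from the notebook):

```python
# Sketch: feed_dict can override any feedable tensor, not just placeholders.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

a = tf.constant(3.0)   # an ordinary tensor, NOT a tf.placeholder
b = a * 2.0

sess = tf.Session()
print(sess.run(b))                      # graph value of a is used: 6.0
print(sess.run(b, feed_dict={a: 5.0}))  # a is overridden by the feed: 10.0
```

The same mechanism lets the training loop feed the previous batch's `final_state` back in as `initial_state`; without the feed, `initial_state` simply falls back to the zero state produced by `cell.zero_state`.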