PaddlePaddle / Paddle

PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (core framework of PaddlePaddle ('飞桨'): high-performance single-machine and distributed training and cross-platform deployment for deep learning & machine learning)
http://www.paddlepaddle.org/
Apache License 2.0

fluid C API error: what(): Attribute 'sub_block' is required! #11186

Closed seanxh closed 6 years ago

seanxh commented 6 years ago

I have a network. When I extract one of its sub-networks, I can call it through the C API and run predict normally.

But when I export the complete network, inference fails with:

 what():  Attribute 'sub_block' is required! at [Paddle/paddle/fluid/framework/attribute.h:241]
PaddlePaddle Call Stacks: 
0             0x421c08p paddle::platform::EnforceNotMet::EnforceNotMet(std::__exception_ptr::exception_ptr, char const*, int) + 576

The code is as follows:

    paddle::framework::LoDTensor output1;
    std::map<std::string, paddle::framework::LoDTensor *> fetch_targets;
    fetch_targets[fetch_target_names[0]] = &output1;

    executor.CreateVariables(*inference_program, scope, 0);
    std::unique_ptr<paddle::framework::ExecutorPrepareContext> ctx;
    ctx = executor.Prepare(*inference_program, 0);

    executor.Run(*inference_program, scope, &feed_targets, &fetch_targets,
                 false);

It core-dumps at this line:

ctx = executor.Prepare(*inference_program, 0);

If I change it to ctx = executor.Prepare(*inference_program, 1);, the core dump moves to the executor.Run line instead, with the same error as above.

My model has two blocks when loaded. Below is the Python code that loads it and runs inference:

program, feed_target_names, fetch_targets = fluid.io.load_inference_model(
    'fluid_model_partial', exe, params_filename='param')
for i in range(len(program.blocks)):
    print i, program.blocks[i].vars.keys()

Output:

0 ['array_to_lod_tensor_0.tmp_0', 'blstm_1.b_0', 'fill_constant_0.tmp_0', 'fc_1.w_0', 'input.w_0', 'fc_0.w_0', 'output.w_1', 'blstm_0.tmp_0', 'dynamic_rnn_0_output_array_elementwise_mul_3.tmp_0_0', 'c.w_0', 'dynamic_rnn_input_array_0', 'sequence_expand_0.tmp_0', 'blstm_0.w_0', 'attention_output.w_0', 'lod_rank_table_0', 'blstm_1.tmp_1', 'dynamic_rnn_mem_init_reordered_0', 'blstm_0.tmp_1', 'blstm_1.tmp_3', 'attention_fc.w_0', 'feed', 'c.w_1', 'input.b_0', 'forget.w_1', 'forget.w_0', 'forget.b_0', 'attention_fc.b_0', 'c.b_0', 'fill_constant_1.tmp_0', 'cell_init', 'fetch', 'blstm_1.w_0', '_generated_var_0', 'output.b_0', 'fc_1.tmp_0', 'lod_reset_0.tmp_0', 'blstm_1.tmp_2', 'attention_output.b_0', 'blstm_0.b_0', 'dynamic_rnn_mem_array_1', 'blstm_0.tmp_2', 'blstm_1.tmp_0', 'dynamic_rnn_0.tmp_0', 'concat_0.tmp_0', 'data', 'data_lod_attention', 'dynamic_rnn_max_seq_len_0', 'fc_0.tmp_0', 'hidden_init', 'output.w_0', 'input.w_1', 'blstm_0.tmp_3', 'dynamic_rnn_mem_array_0']

1 ['sequence_pool_0.tmp_0', 'attention_output.tmp_0', 'attention_output.tmp_1', 'c.tmp_1', 'input.tmp_2', 'output.tmp_0', 'attention_fc.tmp_0', 'output.tmp_1', 'elementwise_mul_2.tmp_0', 'tanh_0.tmp_0', 'sum_0.tmp_0', 'shrink_memory_1.tmp_0', 'shrink_memory_0.tmp_0', 'sequence_softmax_0.tmp_0', 'c.tmp_2', 'sequence_expand_1.tmp_0', 'concat_1.tmp_0', 'elementwise_mul_0.tmp_0', 'elementwise_mul_1.tmp_0', 'c.tmp_3', 'forget.tmp_0', 'forget.tmp_1', 'attention_fc.tmp_1', 'forget.tmp_3', 'forget.tmp_2', 'input.tmp_0', 'sigmoid_0.tmp_0', 'input.tmp_1', 'array_read_0.tmp_0', 'input.tmp_3', 'array_read_1.tmp_0', 'c.tmp_0', 'sigmoid_2.tmp_0', 'array_read_2.tmp_0', 'elementwise_mul_3.tmp_0', 'sigmoid_1.tmp_0', 'tanh_1.tmp_0', 'output.tmp_2', 'sequence_pool_0.tmp_1', 'output.tmp_3']

How should I change my C API code to make this work?

jacquesqiao commented 6 years ago

Please print out the loaded program and take a look: print(str(program))
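The error says that a BLOCK-type attribute named `sub_block` (which a control-flow op such as `while` uses to point at its body block) could not be found when the op's attributes were read. As a quick sanity check on the printed program text, you can scan the output of `print(str(program))` for these attributes. This is a minimal sketch using plain string processing, not a Paddle API:

```python
import re

def find_sub_blocks(program_text):
    """Return the block_idx of every BLOCK-type 'sub_block' attribute
    found in the textual ProgramDesc (output of print(str(program)))."""
    pattern = re.compile(
        r'name:\s*"sub_block"\s*\n\s*type:\s*BLOCK\s*\n\s*block_idx:\s*(\d+)')
    return [int(m) for m in pattern.findall(program_text)]

# A fragment in the same shape as the dump above (hypothetical sample input).
sample = '''
    attrs {
      name: "sub_block"
      type: BLOCK
      block_idx: 1
    }
'''
print(find_sub_blocks(sample))  # -> [1]
```

If this returns an empty list for a program that contains a `while` op, the `sub_block` attribute was lost during save/load, which would explain the C API error.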

seanxh commented 6 years ago
blocks {
  idx: 0
  parent_idx: -1
  vars {
    name: "feed"
    type {
      type: FEED_MINIBATCH
    }
    persistable: true
  }
  vars {
    name: "input.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "forget.w_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 30
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "forget.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "forget.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "attention_fc.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 1
        }
      }
    }
    persistable: true
  }
  vars {
    name: "c.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "fill_constant_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: INT64
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "fetch"
    type {
      type: FETCH_LIST
    }
    persistable: true
  }
  vars {
    name: "cell_init"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "attention_fc.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 45
          dims: 1
        }
      }
    }
    persistable: true
  }
  vars {
    name: "dynamic_rnn_mem_init_reordered_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
  }
  vars {
    name: "blstm_0.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "blstm_1.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "lod_rank_table_0"
    type {
      type: LOD_RANK_TABLE
    }
  }
  vars {
    name: "blstm_1.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "sequence_expand_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "dynamic_rnn_input_array_0"
    type {
      type: LOD_TENSOR_ARRAY
      tensor_array {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
      }
    }
  }
  vars {
    name: "c.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "dynamic_rnn_0_output_array_elementwise_mul_3.tmp_0_0"
    type {
      type: LOD_TENSOR_ARRAY
      tensor_array {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
  }
  vars {
    name: "blstm_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "blstm_0.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "attention_output.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 1
          dims: 1
        }
      }
    }
    persistable: true
  }
  vars {
    name: "blstm_1.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "_generated_var_0"
    type {
      type: STEP_SCOPES
    }
  }
  vars {
    name: "c.w_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 30
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "output.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "fc_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 60
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "lod_reset_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
      }
    }
    persistable: false
  }
  vars {
    name: "blstm_1.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 60
        }
      }
    }
    persistable: false
  }
  vars {
    name: "attention_output.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 1
        }
      }
    }
    persistable: true
  }
  vars {
    name: "blstm_0.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 1
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "dynamic_rnn_mem_array_1"
    type {
      type: LOD_TENSOR_ARRAY
      tensor_array {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
  }
  vars {
    name: "blstm_0.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 60
        }
      }
    }
    persistable: false
  }
  vars {
    name: "blstm_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "dynamic_rnn_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: BOOL
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "concat_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "data"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 39
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "data_lod_attention"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 39
        }
        lod_level: 2
      }
    }
    persistable: false
  }
  vars {
    name: "dynamic_rnn_max_seq_len_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: INT64
          dims: 1
        }
      }
    }
  }
  vars {
    name: "fc_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 60
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "hidden_init"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "output.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "input.w_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 30
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "blstm_0.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "dynamic_rnn_mem_array_0"
    type {
      type: LOD_TENSOR_ARRAY
      tensor_array {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
  }
  vars {
    name: "output.w_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 30
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "fc_0.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 39
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "input.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 15
          dims: 15
        }
      }
    }
    persistable: true
  }
  vars {
    name: "fc_1.w_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 39
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "fill_constant_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: INT64
          dims: 1
        }
      }
    }
    persistable: false
  }
  vars {
    name: "blstm_1.b_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: 1
          dims: 60
        }
      }
    }
    persistable: true
  }
  vars {
    name: "array_to_lod_tensor_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "feed"
    }
    outputs {
      parameter: "Out"
      arguments: "cell_init"
    }
    type: "feed"
    attrs {
      name: "col"
      type: INT
      i: 3
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "feed"
    }
    outputs {
      parameter: "Out"
      arguments: "hidden_init"
    }
    type: "feed"
    attrs {
      name: "col"
      type: INT
      i: 2
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "feed"
    }
    outputs {
      parameter: "Out"
      arguments: "data_lod_attention"
    }
    type: "feed"
    attrs {
      name: "col"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "feed"
    }
    outputs {
      parameter: "Out"
      arguments: "data"
    }
    type: "feed"
    attrs {
      name: "col"
      type: INT
      i: 0
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "data"
    }
    inputs {
      parameter: "Y"
      arguments: "fc_0.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "fc_0.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "Bias"
      arguments: "blstm_0.b_0"
    }
    inputs {
      parameter: "C0"
    }
    inputs {
      parameter: "H0"
    }
    inputs {
      parameter: "Input"
      arguments: "fc_0.tmp_0"
    }
    inputs {
      parameter: "Weight"
      arguments: "blstm_0.w_0"
    }
    outputs {
      parameter: "BatchCellPreAct"
      arguments: "blstm_0.tmp_3"
    }
    outputs {
      parameter: "BatchGate"
      arguments: "blstm_0.tmp_2"
    }
    outputs {
      parameter: "Cell"
      arguments: "blstm_0.tmp_1"
    }
    outputs {
      parameter: "Hidden"
      arguments: "blstm_0.tmp_0"
    }
    type: "lstm"
    attrs {
      name: "candidate_activation"
      type: STRING
      s: "tanh"
    }
    attrs {
      name: "cell_activation"
      type: STRING
      s: "tanh"
    }
    attrs {
      name: "gate_activation"
      type: STRING
      s: "sigmoid"
    }
    attrs {
      name: "is_reverse"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "use_peepholes"
      type: BOOLEAN
      b: false
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "data"
    }
    inputs {
      parameter: "Y"
      arguments: "fc_1.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "fc_1.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "Bias"
      arguments: "blstm_1.b_0"
    }
    inputs {
      parameter: "C0"
    }
    inputs {
      parameter: "H0"
    }
    inputs {
      parameter: "Input"
      arguments: "fc_1.tmp_0"
    }
    inputs {
      parameter: "Weight"
      arguments: "blstm_1.w_0"
    }
    outputs {
      parameter: "BatchCellPreAct"
      arguments: "blstm_1.tmp_3"
    }
    outputs {
      parameter: "BatchGate"
      arguments: "blstm_1.tmp_2"
    }
    outputs {
      parameter: "Cell"
      arguments: "blstm_1.tmp_1"
    }
    outputs {
      parameter: "Hidden"
      arguments: "blstm_1.tmp_0"
    }
    type: "lstm"
    attrs {
      name: "candidate_activation"
      type: STRING
      s: "tanh"
    }
    attrs {
      name: "cell_activation"
      type: STRING
      s: "tanh"
    }
    attrs {
      name: "gate_activation"
      type: STRING
      s: "sigmoid"
    }
    attrs {
      name: "is_reverse"
      type: BOOLEAN
      b: true
    }
    attrs {
      name: "use_peepholes"
      type: BOOLEAN
      b: false
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "blstm_0.tmp_0"
      arguments: "blstm_1.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "concat_0.tmp_0"
    }
    type: "concat"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "concat_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "concat_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "sequence_expand_0.tmp_0"
    }
    type: "sequence_expand"
    attrs {
      name: "ref_level"
      type: INT
      i: 0
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sequence_expand_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "data_lod_attention"
    }
    outputs {
      parameter: "Out"
      arguments: "lod_reset_0.tmp_0"
    }
    type: "lod_reset"
    attrs {
      name: "target_lod"
      type: INTS
    }
    is_target: false
  }
  ops {
    outputs {
      parameter: "Out"
      arguments: "fill_constant_0.tmp_0"
    }
    type: "fill_constant"
    attrs {
      name: "force_cpu"
      type: BOOLEAN
      b: true
    }
    attrs {
      name: "value"
      type: FLOAT
      f: 0.0
    }
    attrs {
      name: "shape"
      type: INTS
      ints: 1
    }
    attrs {
      name: "dtype"
      type: INT
      i: 3
    }
    is_target: false
  }
  ops {
    outputs {
      parameter: "Out"
      arguments: "fill_constant_1.tmp_0"
    }
    type: "fill_constant"
    attrs {
      name: "force_cpu"
      type: BOOLEAN
      b: true
    }
    attrs {
      name: "value"
      type: FLOAT
      f: 0.0
    }
    attrs {
      name: "shape"
      type: INTS
      ints: 1
    }
    attrs {
      name: "dtype"
      type: INT
      i: 3
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "lod_reset_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "lod_rank_table_0"
    }
    type: "lod_rank_table"
    attrs {
      name: "level"
      type: INT
      i: 0
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_max_seq_len_0"
    }
    type: "max_sequence_len"
    is_target: false
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "dynamic_rnn_max_seq_len_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_0.tmp_0"
    }
    type: "less_than"
    attrs {
      name: "axis"
      type: INT
      i: -1
    }
    attrs {
      name: "force_cpu"
      type: BOOLEAN
      b: true
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    inputs {
      parameter: "X"
      arguments: "lod_reset_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_input_array_0"
    }
    type: "lod_tensor_to_array"
    is_target: false
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_0.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "cell_init"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_mem_array_0"
    }
    type: "write_to_array"
    is_target: false
  }
  ops {
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    inputs {
      parameter: "X"
      arguments: "hidden_init"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_mem_init_reordered_0"
    }
    type: "reorder_lod_tensor_by_rank"
    is_target: false
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_0.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "dynamic_rnn_mem_init_reordered_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_mem_array_1"
    }
    type: "write_to_array"
    is_target: false
  }
  ops {
    inputs {
      parameter: "Condition"
      arguments: "dynamic_rnn_0.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "attention_output.b_0"
      arguments: "c.w_0"
      arguments: "c.w_1"
      arguments: "fill_constant_1.tmp_0"
      arguments: "output.w_1"
      arguments: "input.w_0"
      arguments: "forget.b_0"
      arguments: "lod_rank_table_0"
      arguments: "attention_fc.b_0"
      arguments: "attention_fc.w_0"
      arguments: "dynamic_rnn_input_array_0"
      arguments: "dynamic_rnn_mem_array_0"
      arguments: "dynamic_rnn_mem_array_1"
      arguments: "output.b_0"
      arguments: "input.w_1"
      arguments: "c.b_0"
      arguments: "output.w_0"
      arguments: "input.b_0"
      arguments: "forget.w_1"
      arguments: "forget.w_0"
      arguments: "attention_output.w_0"
      arguments: "dynamic_rnn_max_seq_len_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_0_output_array_elementwise_mul_3.tmp_0_0"
      arguments: "fill_constant_1.tmp_0"
      arguments: "dynamic_rnn_0.tmp_0"
      arguments: "dynamic_rnn_mem_array_0"
      arguments: "dynamic_rnn_mem_array_1"
    }
    outputs {
      parameter: "StepScopes"
      arguments: "_generated_var_0"
    }
    type: "while"
    attrs {
      name: "sub_block"
      type: BLOCK
      block_idx: 1
    }
    is_target: false
  }
  ops {
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    inputs {
      parameter: "X"
      arguments: "dynamic_rnn_0_output_array_elementwise_mul_3.tmp_0_0"
    }
    outputs {
      parameter: "Out"
      arguments: "array_to_lod_tensor_0.tmp_0"
    }
    type: "array_to_lod_tensor"
    is_target: true
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "array_to_lod_tensor_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "fetch"
    }
    type: "fetch"
    attrs {
      name: "col"
      type: INT
      i: 0
    }
  }
}
blocks {
  idx: 1
  parent_idx: 0
  vars {
    name: "input.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sigmoid_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "input.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "forget.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "forget.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "forget.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "c.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "elementwise_mul_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "elementwise_mul_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "forget.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "attention_fc.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "array_read_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
      }
    }
    persistable: false
  }
  vars {
    name: "input.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "array_read_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "c.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "array_read_2.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "sigmoid_2.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "elementwise_mul_3.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sigmoid_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "tanh_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "output.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "output.tmp_3"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sequence_pool_0.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
        }
      }
    }
    persistable: false
  }
  vars {
    name: "concat_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 45
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sequence_expand_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "c.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sequence_softmax_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "shrink_memory_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "shrink_memory_1.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
      }
    }
    persistable: false
  }
  vars {
    name: "sum_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "tanh_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "elementwise_mul_2.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "output.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "attention_fc.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "output.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "input.tmp_2"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "c.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 15
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "attention_output.tmp_1"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "attention_output.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 1
        }
        lod_level: 0
      }
    }
    persistable: false
  }
  vars {
    name: "sequence_pool_0.tmp_0"
    type {
      type: LOD_TENSOR
      lod_tensor {
        tensor {
          data_type: FP32
          dims: -1
          dims: 30
        }
      }
    }
    persistable: false
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "dynamic_rnn_input_array_0"
    }
    outputs {
      parameter: "Out"
      arguments: "array_read_0.tmp_0"
    }
    type: "read_from_array"
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "dynamic_rnn_mem_array_0"
    }
    outputs {
      parameter: "Out"
      arguments: "array_read_1.tmp_0"
    }
    type: "read_from_array"
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    inputs {
      parameter: "X"
      arguments: "array_read_1.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "shrink_memory_0.tmp_0"
    }
    type: "shrink_rnn_memory"
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "dynamic_rnn_mem_array_1"
    }
    outputs {
      parameter: "Out"
      arguments: "array_read_2.tmp_0"
    }
    type: "read_from_array"
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "RankTable"
      arguments: "lod_rank_table_0"
    }
    inputs {
      parameter: "X"
      arguments: "array_read_2.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "shrink_memory_1.tmp_0"
    }
    type: "shrink_rnn_memory"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "shrink_memory_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "array_read_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "sequence_expand_1.tmp_0"
    }
    type: "sequence_expand"
    attrs {
      name: "ref_level"
      type: INT
      i: 0
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "array_read_0.tmp_0"
      arguments: "sequence_expand_1.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "concat_1.tmp_0"
    }
    type: "concat"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "concat_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "attention_fc.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_fc.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_fc.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "attention_fc.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_fc.tmp_1"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_fc.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_fc.tmp_1"
    }
    type: "relu"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_fc.tmp_1"
    }
    inputs {
      parameter: "Y"
      arguments: "attention_output.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_output.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_output.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "attention_output.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_output.tmp_1"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_output.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "attention_output.tmp_1"
    }
    type: "relu"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "attention_output.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "sequence_softmax_0.tmp_0"
    }
    type: "sequence_softmax"
    attrs {
      name: "data_format"
      type: STRING
      s: "AnyLayout"
    }
    attrs {
      name: "use_cudnn"
      type: BOOLEAN
      b: true
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "array_read_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "sequence_softmax_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "elementwise_mul_0.tmp_0"
    }
    type: "elementwise_mul"
    attrs {
      name: "axis"
      type: INT
      i: 0
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "elementwise_mul_0.tmp_0"
    }
    outputs {
      parameter: "MaxIndex"
      arguments: "sequence_pool_0.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "sequence_pool_0.tmp_0"
    }
    type: "sequence_pool"
    attrs {
      name: "pooltype"
      type: STRING
      s: "SUM"
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "shrink_memory_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "forget.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "forget.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sequence_pool_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "forget.w_1"
    }
    outputs {
      parameter: "Out"
      arguments: "forget.tmp_1"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "forget.tmp_0"
      arguments: "forget.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "forget.tmp_2"
    }
    type: "sum"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "forget.tmp_2"
    }
    inputs {
      parameter: "Y"
      arguments: "forget.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "forget.tmp_3"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "forget.tmp_3"
    }
    outputs {
      parameter: "Out"
      arguments: "sigmoid_0.tmp_0"
    }
    type: "sigmoid"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "shrink_memory_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "input.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "input.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sequence_pool_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "input.w_1"
    }
    outputs {
      parameter: "Out"
      arguments: "input.tmp_1"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "input.tmp_0"
      arguments: "input.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "input.tmp_2"
    }
    type: "sum"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "input.tmp_2"
    }
    inputs {
      parameter: "Y"
      arguments: "input.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "input.tmp_3"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "input.tmp_3"
    }
    outputs {
      parameter: "Out"
      arguments: "sigmoid_1.tmp_0"
    }
    type: "sigmoid"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "shrink_memory_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "output.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "output.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sequence_pool_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "output.w_1"
    }
    outputs {
      parameter: "Out"
      arguments: "output.tmp_1"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "output.tmp_0"
      arguments: "output.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "output.tmp_2"
    }
    type: "sum"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "output.tmp_2"
    }
    inputs {
      parameter: "Y"
      arguments: "output.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "output.tmp_3"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "output.tmp_3"
    }
    outputs {
      parameter: "Out"
      arguments: "sigmoid_2.tmp_0"
    }
    type: "sigmoid"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "shrink_memory_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "c.w_0"
    }
    outputs {
      parameter: "Out"
      arguments: "c.tmp_0"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sequence_pool_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "c.w_1"
    }
    outputs {
      parameter: "Out"
      arguments: "c.tmp_1"
    }
    type: "mul"
    attrs {
      name: "y_num_col_dims"
      type: INT
      i: 1
    }
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
    attrs {
      name: "x_num_col_dims"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "c.tmp_0"
      arguments: "c.tmp_1"
    }
    outputs {
      parameter: "Out"
      arguments: "c.tmp_2"
    }
    type: "sum"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "c.tmp_2"
    }
    inputs {
      parameter: "Y"
      arguments: "c.b_0"
    }
    outputs {
      parameter: "Out"
      arguments: "c.tmp_3"
    }
    type: "elementwise_add"
    attrs {
      name: "axis"
      type: INT
      i: 1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "c.tmp_3"
    }
    outputs {
      parameter: "Out"
      arguments: "tanh_0.tmp_0"
    }
    type: "tanh"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sigmoid_0.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "shrink_memory_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "elementwise_mul_1.tmp_0"
    }
    type: "elementwise_mul"
    attrs {
      name: "axis"
      type: INT
      i: -1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sigmoid_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "tanh_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "elementwise_mul_2.tmp_0"
    }
    type: "elementwise_mul"
    attrs {
      name: "axis"
      type: INT
      i: -1
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "elementwise_mul_1.tmp_0"
      arguments: "elementwise_mul_2.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "sum_0.tmp_0"
    }
    type: "sum"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sum_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "tanh_1.tmp_0"
    }
    type: "tanh"
    attrs {
      name: "use_mkldnn"
      type: BOOLEAN
      b: false
    }
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "sigmoid_2.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "tanh_1.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "elementwise_mul_3.tmp_0"
    }
    type: "elementwise_mul"
    attrs {
      name: "axis"
      type: INT
      i: -1
    }
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "elementwise_mul_3.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_0_output_array_elementwise_mul_3.tmp_0_0"
    }
    type: "write_to_array"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "fill_constant_1.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "fill_constant_1.tmp_0"
    }
    type: "increment"
    attrs {
      name: "step"
      type: FLOAT
      f: 1.0
    }
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "elementwise_mul_3.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_mem_array_1"
    }
    type: "write_to_array"
  }
  ops {
    inputs {
      parameter: "I"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "X"
      arguments: "sum_0.tmp_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_mem_array_0"
    }
    type: "write_to_array"
  }
  ops {
    inputs {
      parameter: "X"
      arguments: "fill_constant_1.tmp_0"
    }
    inputs {
      parameter: "Y"
      arguments: "dynamic_rnn_max_seq_len_0"
    }
    outputs {
      parameter: "Out"
      arguments: "dynamic_rnn_0.tmp_0"
    }
    type: "less_than"
    attrs {
      name: "axis"
      type: INT
      i: -1
    }
    attrs {
      name: "force_cpu"
      type: BOOLEAN
      b: true
    }
  }
}
jacquesqiao commented 6 years ago

export GLOG_v=3
export GLOG_logtostderr=1

Check what happens right before and after the failure. Also, did you build this code yourself, or download the official release?
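The two glog variables above are set in the shell that launches the inference binary; with `GLOG_logtostderr=1` the verbose logs go to stderr, so they can be redirected to a file. A minimal sketch (the binary name is a placeholder):

```shell
# Enable verbose glog output from the Paddle inference runtime.
export GLOG_v=3            # verbosity level for VLOG(...) messages
export GLOG_logtostderr=1  # write logs to stderr instead of log files

# Confirm the variables are visible to child processes before launching.
env | grep '^GLOG_'

# ./your_infer_binary 2> infer.log   # placeholder binary; verbose logs land in infer.log
```
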

seanxh commented 6 years ago

The code is my own. The lib and headers were built from the official source.

Here is the error:

I0605 18:51:39.577098 26396 executor.cc:127] Create variable hidden_init, which pointer is 0xb3a270
I0605 18:51:39.577101 26396 executor.cc:127] Create variable output.w_0, which pointer is 0xade030
I0605 18:51:39.577109 26396 executor.cc:127] Create variable input.w_1, which pointer is 0xb3bc20
I0605 18:51:39.577116 26396 scope.cc:56] Create variable blstm_0.tmp_3
I0605 18:51:39.577122 26396 executor.cc:127] Create variable blstm_0.tmp_3, which pointer is 0xb3af90
I0605 18:51:39.577127 26396 scope.cc:56] Create variable dynamic_rnn_mem_array_0
I0605 18:51:39.577134 26396 executor.cc:127] Create variable dynamic_rnn_mem_array_0, which pointer is 0xb3a3b0
~~~
terminate called after throwing an instance of 'paddle::platform::EnforceNotMet'
  what():  Attribute 'sub_block' is required! at [/home/map/git/Paddle/paddle/fluid/framework/attribute.h:241]
PaddlePaddle Call Stacks: 
0             0x421c08p paddle::platform::EnforceNotMet::EnforceNotMet(std::__exception_ptr::exception_ptr, char const*, int) + 576
Superjomn commented 6 years ago

Does this model use an RNN?

jacquesqiao commented 6 years ago

Please check for potential protobuf issues first.

Superjomn commented 6 years ago

Reference https://github.com/PaddlePaddle/Paddle/issues/11279

I built a .so that prints detailed debug information, but swapping it in caused linking problems; I am asking the relevant colleagues to resolve them.

luotao1 commented 6 years ago
executor.CreateVariables(*inference_program, scope, 0);
std::unique_ptr<paddle::framework::ExecutorPrepareContext> ctx;
ctx = executor.Prepare(*inference_program, 0);

Looking at the code, the two functions `CreateVariables` and `Prepare` are being called manually.

Superjomn commented 6 years ago

https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/fluid/layers/io.py#L651

@JiayiFeng Can you explain how this works? The C++ side has no such logic; could that be the cause of the problem in this issue?

JiayiFeng commented 6 years ago

Are you using the latest code? An earlier version of Paddle had a bug in this area; it was fixed last week.

Superjomn commented 6 years ago

Is it a Python bug or a C++ one? If it is a Python bug, does the model need to be retrained and regenerated? @JiayiFeng

JiayiFeng commented 6 years ago

It is a C++ bug. The trained model should not need to be regenerated.

Superjomn commented 6 years ago

@seanxh Please try the latest inference library.

seanxh commented 6 years ago

Good. Confirmed: switching to the latest [lib and headers](http://www.paddlepaddle.org/docs/develop/documentation/fluid/zh/howto/inference/build_and_install_lib_cn.html#id1) fixed it.

The fix appears to have landed in a build from around 2018-06-10. The code I am calling now:

    paddle::framework::LoDTensor output1;
    std::map<std::string, paddle::framework::LoDTensor *> fetch_targets;
    fetch_targets[fetch_target_names[0]] = &output1;

    executor.Run(*inference_program, scope, &feed_targets, &fetch_targets,
                 false);