kamalkraj / ALBERT-TF2.0

ALBERT model Pretraining and Fine Tuning using TF2.0
Apache License 2.0

How to do online inference with TensorFlow 2.0? #24

Open freefuiiismyname opened 4 years ago

freefuiiismyname commented 4 years ago

I am trying to do online inference with TensorFlow 2.0. My code is as follows:

    self.graph = tf.Graph()

    with self.graph.as_default() as g:
        self.input_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                             FLAGS.max_seq_length], name="input_ids")
        self.input_mask = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                              FLAGS.max_seq_length], name="input_mask")
        self.p_mask = tf.compat.v1.placeholder(tf.float32, [FLAGS.batch_size,
                                                            FLAGS.max_seq_length], name="p_mask")
        self.segment_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size,
                                                               FLAGS.max_seq_length], name="segment_ids")
        self.cls_index = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size], name="cls_index")
        self.unique_ids = tf.compat.v1.placeholder(tf.int32, [FLAGS.batch_size], name="unique_ids")

        # unpacked_inputs = tf_utils.unpack_inputs(inputs)
        self.squad_model = ALBertQAModel(
            albert_config, FLAGS.max_seq_length, init_checkpoint, FLAGS.start_n_top, FLAGS.end_n_top,
            FLAGS.squad_dropout)

        learning_rate_fn = tf.keras.optimizers.schedules.PolynomialDecay(initial_learning_rate=1e-5,
                                                                         decay_steps=10000,
                                                                         end_learning_rate=0.0)
        optimizer_fn = AdamWeightDecay
        optimizer = optimizer_fn(
            learning_rate=learning_rate_fn,
            weight_decay_rate=0.01,
            beta_1=0.9,
            beta_2=0.999,
            epsilon=1e-6,
            exclude_from_weight_decay=['layer_norm', 'bias'])

        self.squad_model.optimizer = optimizer
        graph_init_op = tf.compat.v1.global_variables_initializer()

        y = self.squad_model(
            self.unique_ids, self.input_ids, self.input_mask, self.segment_ids, self.cls_index,
            self.p_mask, training=False)
        self.unique_ids, self.start_tlp, self.start_ti, self.end_tlp, self.end_ti, self.cls_logits = y

        self.sess = tf.compat.v1.Session(graph=self.graph, config=gpu_config)
        self.sess.run(graph_init_op)
        with self.sess.as_default() as sess:
            self.squad_model.load_weights(FLAGS.model_dir)

This code is executable, but it produces bad results. It looks like the trained parameters are never loaded. I suspect this is because I am not restoring the weights into the session the way TF1 code does with `saver.restore(sess, tf.train.latest_checkpoint(init_checkpoint))`. I have tried several ways to do this, but none of them has worked, and there are very few examples of online inference with TensorFlow 2.0 on the internet, so I am having trouble finding a solution. :(((( May I get some help here? Thanks very much!
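For reference, the eager TF2-style restore I was expecting to need looks roughly like this (just a sketch: the checkpoint path, the dummy inputs, and the assumption that the fine-tuned weights were saved as a TF2 object-based checkpoint are my guesses, not taken from the repo):

    import tensorflow as tf

    # Build the model eagerly, with the same constructor arguments as above.
    squad_model = ALBertQAModel(
        albert_config, FLAGS.max_seq_length, init_checkpoint,
        FLAGS.start_n_top, FLAGS.end_n_top, FLAGS.squad_dropout)

    # Restore trained weights from a TF2 object-based checkpoint
    # (assumes the weights were written with tf.train.Checkpoint or
    # model.save_weights; the directory is a placeholder).
    ckpt = tf.train.Checkpoint(model=squad_model)
    ckpt.restore(tf.train.latest_checkpoint(FLAGS.model_dir)).expect_partial()

    # Run a single forward pass eagerly with dummy inputs of the right shapes.
    seq_len = FLAGS.max_seq_length
    unique_ids = tf.constant([0], dtype=tf.int32)
    input_ids = tf.zeros([1, seq_len], dtype=tf.int32)
    input_mask = tf.zeros([1, seq_len], dtype=tf.int32)
    segment_ids = tf.zeros([1, seq_len], dtype=tf.int32)
    cls_index = tf.zeros([1], dtype=tf.int32)
    p_mask = tf.zeros([1, seq_len], dtype=tf.float32)

    outputs = squad_model(
        unique_ids, input_ids, input_mask, segment_ids, cls_index, p_mask,
        training=False)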

Bidek56 commented 4 years ago

This code works for running inference on a single value from a saved model; hopefully it helps.
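A minimal sketch of that pattern, assuming the model was exported with `tf.saved_model.save` (the export directory, signature name, and input feature names/shapes below are placeholders, not the exact original snippet):

    import tensorflow as tf

    # Load the exported SavedModel (directory is a placeholder).
    loaded = tf.saved_model.load("export/albert_squad/1")
    infer = loaded.signatures["serving_default"]

    # Build one example; the feature names must match the serving signature.
    max_seq_length = 384
    example = {
        "unique_ids": tf.constant([0], dtype=tf.int32),
        "input_ids": tf.zeros([1, max_seq_length], dtype=tf.int32),
        "input_mask": tf.zeros([1, max_seq_length], dtype=tf.int32),
        "segment_ids": tf.zeros([1, max_seq_length], dtype=tf.int32),
    }

    # Run a single prediction; the result is a dict of output tensors.
    outputs = infer(**example)
    print({name: t.numpy() for name, t in outputs.items()})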