Hironsan / anago

Bidirectional LSTM-CRF and ELMo for Named-Entity Recognition, Part-of-Speech Tagging and so on.
https://anago.herokuapp.com/
MIT License

Using a trained model in a server #65

Closed dsalfran closed 6 years ago

dsalfran commented 6 years ago

Has anyone tried to use a model trained with anago together with TensorFlow Serving or Flask? I'm trying to create an application that uses the model to predict entities.

dsalfran commented 6 years ago

I'm still open to an answer on how to connect anago models with TensorFlow Serving. Nevertheless, I was able to set it up with Flask. Here is the code in case someone is interested:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

from flask import Flask, request, jsonify
import json
import tensorflow as tf
import anago

app = Flask(__name__)

def model_predict(text):
    # Tokenize the text and return the predicted entities.
    # model.analyze must run under the graph that was active when the model
    # was loaded, because Flask handles requests in separate threads.
    with graph.as_default():
        tokens = text.split()
        pred = model.analyze(tokens)

    return pred

@app.route("/ner", methods=['POST'])
def get_entities():
    data = json.loads(request.get_data().decode('utf-8'))
    text = data.get('text')
    if text is None:
        # Bail out early; otherwise model_predict(None) would crash on .split()
        return jsonify({'error': "Text to analyze can't be missing"}), 400

    tagged_text = model_predict(text)

    return jsonify(tagged_text)

if __name__ == "__main__":
    # The model has to be loaded in the main server process, and the default
    # graph captured right after loading so request threads can reuse it.
    # Lesson learned the hard way.
    model = anago.Sequence.load("models")
    graph = tf.get_default_graph()
    app.run(host='0.0.0.0', debug=True)
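
For anyone who wants to try the endpoint, here is a minimal client sketch. It assumes the server above is running locally on Flask's default port 5000 and that the third-party requests package is installed; the URL, port, and sample sentence are illustrative, not part of the original setup.

import requests

# Hypothetical test request against the /ner endpoint defined above.
resp = requests.post(
    'http://localhost:5000/ner',
    json={'text': 'Barack Obama visited Berlin in 2016 .'},
)
print(resp.json())  # the dict produced by anago's analyze(), serialized by jsonify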