edhenry / chexnet

Implementation and fullstack pipeline for CheXNet classifier
MIT License

TensorFlow Serving Client #4

Closed edhenry closed 5 years ago

edhenry commented 5 years ago

Currently the CheXNet model is trained and published to a local file store for serving, but we will need to extend the Flask app with a client for querying the model being served.

Some thought is being put into whether we should publish model inference requests to a message queue and maintain session state on the message bus, or have the front end query the model directly. In the interest of time, it might make sense to start by allowing the 'front end' Flask app to make direct REST calls to the TFServing server.
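As a rough illustration of the direct-REST approach, the sketch below builds a request against TensorFlow Serving's standard REST predict endpoint. The host, port, and model name (`chexnet`) are assumptions for illustration, not values confirmed in this repo.

```python
# Hedged sketch: front-end Flask app querying TF Serving directly over REST.
# The endpoint URL and model name are assumed, not taken from the repo config.
import json
import urllib.request

SERVING_URL = "http://localhost:8501/v1/models/chexnet:predict"  # assumed endpoint


def build_predict_request(instances):
    """Build the JSON body for TF Serving's REST predict API."""
    return json.dumps({"instances": instances}).encode("utf-8")


def query_model(instances):
    """POST preprocessed image tensors to TF Serving and return predictions."""
    req = urllib.request.Request(
        SERVING_URL,
        data=build_predict_request(instances),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]
```

Keeping the request construction in its own function makes it easy to reuse unchanged if the transport later moves from direct REST calls to a message bus.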

In the future it might make more sense to move the model query architecture to a message bus model, to allow for multiplexing requests and to meet potential scaling requirements.

edhenry commented 5 years ago

The client has been created. It retrieves messages from the configured Kafka topic, pre-processes the images, and queries the TensorFlow Serving instance with the transformed image. There is currently no support for keys within the Kafka messages; that support will be added in the future. For demo purposes, a 1:1 mapping of the service is sufficient.