aws / sagemaker-tensorflow-serving-container

A TensorFlow Serving solution for use in SageMaker. This repo is now deprecated.
Apache License 2.0

Adding pip dependencies in the pre/post processing #163

Closed samueleresca closed 4 years ago

samueleresca commented 4 years ago

Hi,

I would like to use Redis in the pre/post-processing hooks of the serving container. In more detail, I have in mind something similar to the following implementation:

import json
import os

import redis

# Example only: the Redis endpoint is read from an environment variable set on the endpoint.
redis_host = os.environ.get("REDIS_HOST", "localhost")
r = redis.StrictRedis(host=redis_host, port=6379, db=0)

def input_handler(data, context):
    if context.request_content_type == 'application/json':
        # parse the incoming json (assumes it's correctly formed)
        d = json.loads(data.read().decode('utf-8'))
        # look up the pre-computed features in Redis instead of receiving them in the request
        return r.get(f"features-id-{d['id']}")

    raise ValueError('{{"error": "unsupported content type {}"}}'.format(
        context.request_content_type or "unknown"))

The approach above would avoid passing the full set of features (about 3 MB) as part of the REST request. Therefore, I need a way to make redis available as a dependency of the pre/post-processing code. I saw that it is possible to do that by overriding the Dockerfile definition, roughly as sketched below.
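A minimal sketch of the Dockerfile-override approach, assuming pip is available in the base image; the image URI and tag are placeholders, not values taken from this repo:

# Hypothetical Dockerfile: extend the stock serving image and add the extra pip package.
# The FROM line is a placeholder; substitute the real sagemaker-tensorflow-serving image URI.
FROM <account>.dkr.ecr.<region>.amazonaws.com/sagemaker-tensorflow-serving:<tag>

# Install the dependency needed by the pre/post-processing handlers.
RUN pip install redis

The rebuilt image would then be pushed to ECR and referenced as the inference image when creating the SageMaker model.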

I was wondering whether there is (or is on the roadmap) any other way to include pip packages without overriding the Dockerfile definition, for example something along the lines of the sketch below.
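For illustration only, the kind of mechanism I have in mind: dependencies declared in a requirements.txt shipped next to the pre/post-processing code inside the model archive and installed by the container at startup. The layout below is an assumption on my part, not something I have verified against this container:

model.tar.gz
|-- <SavedModel version directory>/
`-- code/
    |-- inference.py        # input_handler / output_handler shown above
    `-- requirements.txt    # e.g. "redis" pinned to a known-good version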