ShannonAI / service-streamer

Boosting your Web Services of Deep Learning Applications.
Apache License 2.0

Problem using flask_multigpu_example.py with gunicorn #64

Open miangangzhen opened 4 years ago

miangangzhen commented 4 years ago

Could you please give an example showing how to use the multi-GPU Flask service streamer with gunicorn?

I wrote the following gunicorn config, but it does not work:

# coding=utf-8
# gunicorn config file for flask_multigpu_example
from gevent import monkey; monkey.patch_all()
from flask_multigpu_example import app

def post_fork(server, worker):
    # create the multi-GPU Streamer after each gunicorn worker process forks
    from service_streamer import RedisStreamer, Streamer
    import flask_multigpu_example
    from bert_model import ManagedBertModel, TextInfillingModel as Model
    flask_multigpu_example.streamer = Streamer(ManagedBertModel, batch_size=64, max_latency=0.1, worker_num=4, cuda_devices=(0, 1, 2, 3))
    model = Model()  # note: this local model is not attached to the flask_multigpu_example module

bind = '0.0.0.0:5005'
workers = 4
worker_class = 'gunicorn.workers.ggevent.GeventWorker'
proc_name = "redis_streamer"
rubby33 commented 4 years ago

@miangangzhen Did you solve this in the end? It would be great if the redis streamer + flask_multigpu_example usage were documented in more detail in the README. This example is a bit complicated and many people don't fully understand it. @Meteorix Thanks.