Open miangangzhen opened 4 years ago
Please, could you give an example showing how to use the multi-GPU Flask service streamer with gunicorn?
I wrote this code, but it does not work:
```python
# coding=utf-8
from gevent import monkey; monkey.patch_all()

from flask_multigpu_example import app


def post_fork(server, worker):
    from service_streamer import RedisStreamer, Streamer
    import flask_multigpu_example
    from bert_model import ManagedBertModel, TextInfillingModel as Model
    flask_multigpu_example.streamer = Streamer(
        ManagedBertModel, batch_size=64, max_latency=0.1,
        worker_num=4, cuda_devices=(0, 1, 2, 3))
    model = Model()


bind = '0.0.0.0:5005'
workers = 4
worker_class = 'gunicorn.workers.ggevent.GeventWorker'
proc_name = "redis_streamer"
```
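One likely problem with a config like this is that `post_fork` runs in every gunicorn worker, so each of the 4 web workers would try to spawn its own 4 GPU workers. For multi-GPU deployments behind gunicorn, service_streamer's README pattern is to use `RedisStreamer`: the GPU workers run in one separate process and every gunicorn worker only holds a lightweight Redis client. A minimal sketch, assuming the library's `run_redis_workers_forever` helper, a local Redis on the default port, and the `ManagedBertModel` class from the project's example code:

```python
# gpu_worker.py -- run once, in its own process (NOT inside gunicorn):
#     python gpu_worker.py
# Starts one GPU worker per listed device; all of them batch requests
# pulled from the same Redis queue.
from service_streamer import run_redis_workers_forever
from bert_model import ManagedBertModel  # from the project's example code

if __name__ == "__main__":
    run_redis_workers_forever(ManagedBertModel, batch_size=64, max_latency=0.1,
                              worker_num=4, cuda_devices=(0, 1, 2, 3))
```

The gunicorn config file then creates only the client side in each web worker:

```python
# gunicorn_conf.py -- no GPU code here, just a RedisStreamer client per worker
from gevent import monkey; monkey.patch_all()

bind = '0.0.0.0:5005'
workers = 4
worker_class = 'gunicorn.workers.ggevent.GeventWorker'
proc_name = "redis_streamer"


def post_fork(server, worker):
    from service_streamer import RedisStreamer
    import flask_multigpu_example
    # The client only pushes requests to Redis and waits for results;
    # batching and inference happen in the separate gpu_worker.py process.
    flask_multigpu_example.streamer = RedisStreamer()
```

This is a sketch of the README's redis-worker pattern, not a tested deployment; exact parameter names of `run_redis_workers_forever` should be checked against the service_streamer version in use.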
@miangangzhen Did you solve this in the end? It would be great if the redis stream + flask_multigpu_example example were explained in more detail in the README. This example is a bit complicated and many people don't quite understand it. @Meteorix Thanks.