Open arshpreetsingh opened 6 years ago
Sure. Each function could have a decorator through which the user controls the number of threads for that function, something like:
@fire_resource(threads=4)  # it could use the threading API, or maybe joblib as in sklearn
def function_name():
    return "running multiple threads"
The same applies if we have to run functions concurrently:
@fire_resource(threads=4, concurrent=True)  # it could use the multiprocessing API
def function_name():
    return "running concurrent processes"
I just found that when training an ML model we obviously need multiple threads, but when making predictions it is really good to run them as concurrent processes. Please feel free to ping me for a more detailed elaboration.
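To make the idea concrete, here is a minimal sketch of what such a decorator could look like, built on Python's standard concurrent.futures module. The fire_resource name and its parameters come from the proposal above and are purely hypothetical; nothing like this exists in Firefly's API today.

from concurrent.futures import ThreadPoolExecutor
from functools import wraps

def fire_resource(threads=1):
    # Hypothetical sketch -- fire_resource is NOT part of Firefly's API.
    # At most `threads` calls of the wrapped function execute at the same
    # time; additional callers wait for a free worker. A process-based
    # variant (the concurrent=True idea above) could swap in
    # ProcessPoolExecutor, but the wrapped function would then have to be
    # picklable, which needs more care than this sketch covers.
    executor = ThreadPoolExecutor(max_workers=threads)

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Hand the call to the pool and block until its result is ready.
            return executor.submit(func, *args, **kwargs).result()
        return wrapper

    return decorator

@fire_resource(threads=4)
def function_name():
    return "running multiple threads"

With this sketch, calling function_name() from many request-handling threads would be throttled to four concurrent executions of the underlying function.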
I think this is a documentation issue. Firefly is a Python WSGI application, so it can be run with multiple threads or processes under any WSGI server. The recommended approach is gunicorn.
To run it using 4 processes:
gunicorn --workers 4 firefly.main:app -e FIREFLY_FUNCTIONS="funcs.square,funcs.cube"
To run it using 4 threads:
gunicorn --threads 4 firefly.main:app -e FIREFLY_FUNCTIONS="funcs.square,funcs.cube"
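For completeness, the FIREFLY_FUNCTIONS value in both commands assumes a module along these lines (a minimal sketch; the funcs.py name and the function bodies are only inferred from the funcs.square and funcs.cube paths):

# funcs.py -- minimal example module matching FIREFLY_FUNCTIONS above.

def square(x):
    # Return x squared.
    return x * x

def cube(x):
    # Return x cubed.
    return x ** 3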
@arshpreetsingh does it answer your question?
Yes, thanks!
Hi @arshpreetsingh
Can you please elaborate on the issue?