Credit: originally a fork of OurNemanja/YOLOv5-fastapi-celery-redis-rabbitm
Application exposing a YOLOv5 model through FastAPI. Inference requests are submitted to a Celery task queue, and an asynchronous API is available for polling for results. This approach is described in detail in the excellent article *Serving ML Models in Production with FastAPI and Celery* by @jonathanreadshaw.
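The submit-then-poll pattern described above can be illustrated with the standard library alone. In this sketch a thread pool stands in for the Celery worker and broker, and the endpoint names in the comments are only analogies to this project's API, not its actual implementation:

```python
import uuid
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for the Celery broker/worker: a thread pool and a task registry.
_executor = ThreadPoolExecutor(max_workers=2)
_tasks = {}

def submit(fn, *args):
    """Analogous to POST /api/process: enqueue work, return an opaque task_id."""
    task_id = str(uuid.uuid4())
    _tasks[task_id] = _executor.submit(fn, *args)
    return task_id

def status(task_id):
    """Analogous to GET /api/status/: report whether the worker has finished."""
    return "SUCCESS" if _tasks[task_id].done() else "PENDING"

def result(task_id):
    """Analogous to GET /api/result/: block for (or fetch) the task's output."""
    return _tasks[task_id].result()
```

The key design point, which Celery provides out of the box, is that the client holds only an opaque `task_id` while the heavy inference runs elsewhere, so the HTTP request that submitted the work returns immediately.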
Run all containers:

```shell
docker-compose up
```
This will start all the services: the FastAPI application, the Celery worker, Redis, RabbitMQ, and the demo webapp.
Perform API requests using the integrated Swagger UI at http://localhost:8000/docs.
API services available

| Endpoint | Method | Description |
|---|---|---|
| http://localhost:8000/api/process | POST | Send one or more pictures to be processed by YOLOv5. Returns the task_id of each task. |
| http://localhost:8000/api/status/ | GET | Retrieve the status of a given task. |
| http://localhost:8000/api/result/ | GET | Retrieve the results of a given task. |
| http://localhost:8000/docs | GET | Auto-generated documentation for each endpoint. |
| http://localhost:15672 | GET | RabbitMQ monitor. User: guest, password: guest. |
| http://localhost | GET | Simple webapp showing how to call the API and display its results. |
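A client for the endpoints above might look like the following sketch. The polling helper is generic; the `requests` call in the docstring assumes the status endpoint returns a JSON body with a `status` field holding a Celery state name (PENDING, STARTED, SUCCESS, FAILURE), which is an assumption about this API rather than something this README specifies:

```python
import time

API_BASE = "http://localhost:8000/api"  # base path taken from the table above

def poll_task(fetch_status, task_id, interval=0.5, timeout=30):
    """Poll a task's status until it reaches a terminal Celery state.

    `fetch_status` is any callable mapping a task_id to a status string,
    for example (field name assumed):
        lambda tid: requests.get(f"{API_BASE}/status/{tid}").json()["status"]
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = fetch_status(task_id)
        if state in ("SUCCESS", "FAILURE"):
            return state
        time.sleep(interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")
```

Polling with a fixed interval keeps the client simple; for production use, an exponential backoff or a webhook/callback would reduce load on the API.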
Set up a local development environment:

```shell
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Format, lint, and run the tests:

```shell
isort .
black .
pytest .  # none implemented
```

Overview of the code