emettely opened this issue 5 years ago
Microservices poll the queue directly to receive jobs
+-------+
Poll | Video |
+----+ Pre |
| +-------+
|
+----------+ AWS Queue | +-------+
| | PUSH +-+-+-+-+-+-+-+-+-+-+ | | Audio |
| API +------>+ | | | | | | | | | +<--------+ Strip +----+
| | +---+-+-+-+-+-+-+-+-+ | +-------+ |
| | ^ | |
+----------+ | | +-------+ |
| | | STT | |
| +----+ | |
| +-------+ |
| |
| |
+----------------------------------------+
PUSH
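A worker in this setup might look like the following sketch. This is a minimal illustration, not the real implementation: an in-process `queue.Queue` stands in for the AWS queue, and `handler` is a placeholder for each service's actual job logic.

```python
import queue

def poll_loop(q, handler, max_polls=100):
    """Keep polling until the queue is empty (stand-in for SQS long polling)."""
    for _ in range(max_polls):
        try:
            msg = q.get_nowait()  # a real worker would long-poll with a timeout
        except queue.Empty:
            break
        handler(msg)

# Each microservice (Video Pre, Audio Strip, STT) runs its own poll_loop
# against the shared queue.
q = queue.Queue()
for i in range(3):
    q.put({"job": i})

seen = []
poll_loop(q, seen.append)
```

Each service owns its polling loop, so the API never needs to know which services exist.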
A message-processing microservice fronts the API microservices: it polls the queue and calls each service's API.
+-------+
| Video |
+--->+ Pre |
| +-------+
|
+----------+ AWS Queue +------+ | +-------+
| | PUSH +-+-+-+-+-+-+-+-+-+-+ POLL | msg | POST | | Audio |
| API +------>+ | | | | | | | | | +<------+ proc +---------->+ Strip +---+
| | +---+-+-+-+-+-+-+-+-+ +------+ | +-------+ |
| | ^ | |
+----------+ | | +-------+ |
| | | STT | |
| +--->+ | |
| +-------+ |
| |
| |
+--------------------------------------------------------+
PUSH
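The message processor reduces to a small dispatch loop: poll the queue, then POST each job to the service that owns it. A sketch under stated assumptions follows; the route table and endpoint paths are hypothetical, and `post` is injected so the sketch runs without a network (in practice it would be an HTTP client call).

```python
import queue

# Hypothetical mapping from job type to the owning service's endpoint.
ROUTES = {
    "video-pre": "http://video-pre/jobs",
    "audio-strip": "http://audio-strip/jobs",
    "stt": "http://stt/jobs",
}

def dispatch(q, post):
    """Drain the queue, POSTing each message to its service's API."""
    sent = []
    while True:
        try:
            msg = q.get_nowait()
        except queue.Empty:
            return sent
        sent.append(post(ROUTES[msg["type"]], msg))

q = queue.Queue()
q.put({"type": "stt", "id": 7})
calls = dispatch(q, lambda url, body: (url, body["id"]))
```

Only the message processor knows the routing; the services themselves stay plain HTTP APIs with no queue awareness.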
Alternatively, get rid of all the queues.
+-------+
| Video |
+--->+ Pre |
| +-------+
+----------+ |
| | | +-------+
| API | POST | | Audio |
| +---------->+ Strip |
| | | +-------+
+----------+ |
| +-------+
| | STT |
+--->+ |
+-------+
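Without queues, the API simply calls each service in sequence and blocks until all of them answer. A rough sketch (service URLs and the `post` callable are placeholders, not real endpoints):

```python
def process_upload(upload_id, post):
    """Call each service in turn; the caller blocks until all three finish."""
    post("http://video-pre/jobs", {"id": upload_id})
    audio_ref = post("http://audio-strip/jobs", {"id": upload_id})
    return post("http://stt/jobs", {"id": upload_id, "audio": audio_ref})

log = []
def fake_post(url, body):
    log.append(url)
    return len(log)

result = process_upload(42, fake_post)
```

The trade-off is visible in the sketch: the flow is simpler, but a slow or failing service stalls the whole request, which is exactly the latency problem the async pattern below addresses.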
This is a very good read on the Microservices architecture style.
But to flesh this out a bit more we should also look into other architecture styles.
Also worth reading: this API design guide about RESTful APIs,
in particular the section about dealing with asynchronous operations.
Sometimes a POST, PUT, PATCH, or DELETE operation might require processing that takes a while to complete. If you wait for completion before sending a response to the client, it may cause unacceptable latency. If so, consider making the operation asynchronous. Return HTTP status code 202 (Accepted) to indicate the request was accepted for processing but is not completed.
You should expose an endpoint that returns the status of an asynchronous request, so the client can monitor the status by polling the status endpoint. Include the URI of the status endpoint in the Location header of the 202 response. For example:
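That pattern can be sketched as two handlers, shown here framework-free as plain functions returning `(status_code, body)`. The in-memory job store and the `/api/...` paths are assumptions for illustration only.

```python
import uuid

jobs = {}  # job_id -> "pending" | "done" (in-memory store for the sketch)

def submit(payload):
    """Accept the work and answer 202 with a status URI, per the guide."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = "pending"
    # Location header tells the client where to poll for progress.
    return 202, {"Location": f"/api/status/{job_id}"}

def status(job_id):
    """The status endpoint the client polls via the Location header."""
    state = jobs.get(job_id)
    if state is None:
        return 404, {}
    if state == "done":
        # Processing finished: point the client at the result resource.
        return 303, {"Location": f"/api/videos/{job_id}"}
    return 200, {"status": state}

code, headers = submit({"video": "raw.mp4"})
job_id = headers["Location"].rsplit("/", 1)[-1]
```

The client POSTs, gets 202 plus a Location, and polls that URI until the status endpoint redirects it to the finished resource.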
~~Also see Simple Microservices Architecture on AWS~~
10 Design Principles for AWS Cloud Architecture
IT systems should ideally be designed in a way that reduces inter-dependencies. Your components need to be loosely coupled to avoid changes or failure in one of the components from affecting others.
Your infrastructure also needs well-defined interfaces that allow the various components to interact with each other only through specific, technology-agnostic interfaces. It should be possible to modify any underlying operation without affecting other components.
An application running in the cloud is expected to handle a large number of requests. Rather than process each request synchronously, a common technique is for the application to pass them through a messaging system to another service (a consumer service) that handles them asynchronously. This strategy helps to ensure that the business logic in the application isn't blocked while the requests are being processed.
Queue-Based Load Leveling pattern
Use a queue that acts as a buffer between a task and a service it invokes in order to smooth intermittent heavy loads that can cause the service to fail or the task to time out. This can help to minimize the impact of peaks in demand on availability and responsiveness for both the task and the service.
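The load-leveling effect can be shown in a few lines. In this sketch (stdlib `queue.Queue` standing in for the real message queue, batch size chosen arbitrarily), a burst of requests lands in the queue instantly while the consumer drains it at its own steady pace, so the service never absorbs the peak directly.

```python
import queue

backlog = queue.Queue()

def burst(n):
    """Producers enqueue and return immediately, whatever the load."""
    for i in range(n):
        backlog.put(i)

def drain(batch_size):
    """The consumer service pulls at most batch_size items per cycle."""
    out = []
    while len(out) < batch_size and not backlog.empty():
        out.append(backlog.get())
    return out

burst(100)          # a spike of 100 requests arrives at once
first = drain(10)   # but the service only ever sees 10 per cycle
```

The queue absorbs the spike; the consumer's throughput, not the arrival rate, sets the processing pace.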
TL;DR: we don’t have to use microservices - we could use
Will write up an ADR around this with an updated diagram in infrastructure.
Context
Decide on the best approach for communication between the microservices and the API.
Acceptance Criteria
Decide and create an ADR around this.
Things to consider