lzukanovic opened 10 months ago
This is where most of the work for logging will be done. Some changes can be observed in the `feature/logging` branch.
Currently the whole logging stack (Elasticsearch, Kibana, Logstash, Filebeat, and Metricbeat, the last being optional since we already have Prometheus) has been implemented to work in a local Docker environment using `docker-compose.logging.yml`.
For this assignment we are mainly focused on Elasticsearch, Kibana, and Logstash, the service that accepts log output sent to stdout by the other containers (and can additionally read log files placed in a specific directory). The other two (Metricbeat and Filebeat) are probably not necessary at this point.
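For reference, a minimal Logstash pipeline that accepts GELF input and forwards it to Elasticsearch could look like the sketch below (the port and the Elasticsearch host are assumptions, not copied from the repo):

```
input {
  gelf {
    port => 12201
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```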
The issue that arises is in the configuration (currently just `docker-compose.yml`) of each individual application. For each container (e.g. `app` and `db`) we need to define a logging configuration that specifies what the container does with its logs: which driver/protocol to use, where to send the logs, and what tag to attach as metadata.
Here is an example:
```yaml
services:
  app:
    container_name: cinepik-catalog
    ...
    networks:
      - cinepik-network
    logging:
      driver: gelf
      options:
        gelf-address: udp://host.docker.internal:12201
        tag: cinepik-catalog-app
```
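For context, what the `gelf` driver emits is just a JSON payload (optionally gzip-compressed) sent over UDP. A minimal sketch in Python of what such a message looks like on the wire (the hostname, port, and tag values here are illustrative assumptions):

```python
import gzip
import json
import socket
import time

def send_gelf(host: str, port: int, short_message: str, tag: str) -> bytes:
    """Build and send a minimal GELF 1.1 message over UDP; returns the raw packet."""
    payload = {
        "version": "1.1",               # required by the GELF spec
        "host": tag,                    # source identifier, analogous to the compose "tag" option
        "short_message": short_message,
        "timestamp": time.time(),
        "level": 6,                     # syslog "informational"
    }
    data = gzip.compress(json.dumps(payload).encode("utf-8"))
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(data, (host, port))     # UDP: fire and forget, no delivery guarantee
    sock.close()
    return data

# Example: send one log line to a local Logstash GELF input
packet = send_gelf("127.0.0.1", 12201, "hello from cinepik-catalog", "cinepik-catalog-app")
```

Because it is plain UDP, the sender never learns whether anything is listening, which is also why a wrong `gelf-address` fails silently rather than erroring at container start.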
The issue I encountered when setting everything up in Docker is that this container, even though it is on the same network as the required logstash container (and all the other logging containers), was unable to reach it using hostnames such as `localhost` or the container name. The only address that worked was `host.docker.internal`. My understanding is that although both containers are on the same network, the logging traffic is sent by the Docker daemon itself, which runs on the host and is not attached to that network, so container hostnames do not resolve for it.
Hopefully this is something that will be resolved when deployed to Kubernetes.
To summarise: the logging part pertaining to the common repo is working; it just needs all of the k8s resource files. What needs to be fixed is the log-sending functionality in each of the app repos.
This is the tutorial I followed to set up logging locally: Getting started with the Elastic Stack and Docker Compose: Part 1
Here is a tutorial I found on how to deploy the Elastic Stack to Kubernetes (using Helm charts): Deploy the Elastic Stack on Kubernetes
Also, a helpful tool to automatically generate k8s resource files is kompose. Here is an example command to convert the `docker-compose.logging.yml` file:

```sh
kompose convert -f docker-compose.logging.yml -n logging -o k8s/
```