danionescu0 / docker-flask-mongodb-example

Uses docker-compose with Python Flask microservices and a MongoDB instance to build a sample application
GNU General Public License v3.0

Purpose


A working demo of multiple technologies: Docker, docker-compose, MongoDB, Python 3, the Flask framework, the FastAPI framework, Mosquitto, Swagger, Locust, Grafana, InfluxDB, KrakenD, Kubernetes and Chronograf

Please consider opening issues and suggesting enhancements

If you find this demo useful, give it a star so others will find it quicker :)


Prerequisites

Docker:

How to install docker on Ubuntu: https://docs.docker.com/install/linux/docker-ce/ubuntu/

How to install docker on Centos: https://docs.docker.com/engine/install/centos/

How to install docker on Windows: https://docs.docker.com/docker-for-windows/install-windows-home/

tested with version: 20.10.2

Docker compose

How to install docker compose: https://docs.docker.com/compose/install/

tested with version: 1.28.4

If you're using Docker Desktop (where the command is "docker compose" instead of "docker-compose") and the build fails with an error like:

....failed to build LLB...

Consider disabling buildkit:

export DOCKER_BUILDKIT=0
export COMPOSE_DOCKER_CLI_BUILD=0

Applications:

The applications run using docker-compose. Below are the service descriptions; for more info click on a title:

1 Random service generates random numbers and lists them (port 800). A second container with the same app runs on port 801 using PyPy, which is faster; check the Random service description for details about the test

2 User CRUD service Create, read, update and delete operations over a user collection (port 81)

3 MQTT service uses an MQTT server (Mosquitto) to publish sensor updates over MQTT (port 1883). The updates are saved in MongoDB (/demo/sensors). It also computes a running average for each sensor and publishes it to a separate topic

4 Fulltext service full-text search engine backed by a full-text MongoDB index (port 82)

5 Geolocation search service geospatial search service that supports adding places and querying places by coordinates and distance (port 83)

6 Baesian_average Bayesian average demo (https://en.wikipedia.org/wiki/Bayesian_average) (port 84)

7 Photo process A demo of working with photo uploads, hash searching and docker volumes. Photos are stored on disk, retrieved and resized / rotated. A search-by-image API is also available (port 85)

8 Book collection A virtual book library with API methods for managing books and a book-borrowing mechanism. Users must have "profiles" created using the User CRUD service. This API uses Flask-RESTPlus (https://flask-restplus.readthedocs.io/en/stable/) (port 86)

9 Grafana and InfluxDb Grafana with InfluxDb storage, for graphing sensor data. It is connected to the MQTT service: every datapoint that passes through the MQTT service is saved in InfluxDb and displayed in Grafana (port 3000). Default credentials are: admin / admin

Also available is Chronograf (https://www.influxdata.com/time-series-platform/chronograf/), a tool for exploring the InfluxDb database (port 8888)

10 User CRUD fastapi Create, read, update and delete operations made available with the FastAPI framework (port 88)

11 Tic-tac-toe A tic-tac-toe game written in Flask using flask_session. It has a simple UI (port 89)

12 Krakend API gateway using Krakend (port 8080)

13 Deployment using Kubernetes demo deployment for two services using Kubernetes

14 User CRUD service using GraphQl

Diagram Grafana

Technologies involved

Run the microservice

Before running, check that the required ports are free on your machine!

On Linux run the following command to check that the ports are free:

cd docker-flask-mongodb-example
./check_ports.sh 
port 800 is free
port 801 is free
port 81 is free
port 82 is free
port 83 is free
port 84 is free
port 85 is free
port 86 is free
port 88 is free
port 89 is free
port 1883 is free
port 27017 is free
port 8080 is free
port 3000 is free
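The check_ports.sh script above can be approximated with a stdlib-only Python sketch (the port list is copied from the output above; the `is_port_free` helper is illustrative, not part of the repo):

```python
import socket

def is_port_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is listening on host:port (TCP)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.5)
        # connect_ex returns 0 when something accepted the connection,
        # i.e. the port is already taken.
        return sock.connect_ex((host, port)) != 0

for port in (800, 801, 81, 82, 83, 84, 85, 86, 88, 89, 1883, 27017, 8080, 3000):
    state = "free" if is_port_free(port) else "taken"
    print(f"port {port} is {state}")
```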

Start the microservice architecture:

cd docker-flask-mongodb-example
docker network create project-network
docker-compose up --build

Populate with mock data [optional]

Run the import.sh script; it populates data for the users and fulltext search use cases

  1. start all services with docker compose

  2. run the import script

    chmod +x import.sh
    ./import.sh

Manual testing:

For all HTTP requests we'll be using curl (https://curl.haxx.se/docs/manpage.html). On most UNIX systems curl is already installed. If you're using Debian/Ubuntu and you don't have curl, install it with:

sudo apt-get install curl

As an alternative to manual HTTP testing you can use Swagger. For example, for the users CRUD operations, open http://localhost:81/apidocs in a browser to see the Swagger UI and perform test requests from there!

For the MQTT application we'll use the mosquitto CLI. To install the mosquitto clients on Debian / Ubuntu use:

sudo apt-get install mosquitto-clients

Stress testing using locust.io

Using locust.io

  1. Installation (using conda):

conda activate your_env
conda install -c conda-forge locust
pip install -r python/requirements-dev.txt

Quickstart(optional): https://docs.locust.io/en/stable/quickstart.html

  1. Ensure port 8089 is free

  2. Go to project root in console

  3. Run the following:

Testing random demo microservice:

locust -f stresstest-locusts/random_demo.py --host=http://localhost:800 --web-host localhost

Testing users microservice:

locust -f stresstest-locusts/users.py --host=http://localhost:81 --web-host localhost

Testing fulltext_search microservice:

locust -f stresstest-locusts/fulltext_search.py --host=http://localhost:82 --web-host localhost

Testing geolocation_search microservice:

locust -f stresstest-locusts/geolocation_search.py --host=http://localhost:83 --web-host localhost

Testing baesian microservice:

locust -f stresstest-locusts/baesian.py --host=http://localhost:84 --web-host localhost

After starting any service open http://localhost:8089 to access the testing UI

Services explained:

Random service

This service generates random numbers and stores them in a capped array (the last 5 of them). It can also generate and return a single random number.

The random number collection has a single document with _id "lasts" and an "items" key holding the capped array.

MongoDb capped array: https://www.mongodb.com/blog/post/push-to-sorted-array

Sample data in "random_numbers" collection document:

{
  "_id" : "lasts",
  "items": [3, 9, 2, 1, 2]
}
...

MongoDb official documentation (array operations): https://docs.mongodb.com/manual/reference/operator/update/slice/
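The $slice trick from the documentation above keeps the array capped on every insert. A minimal sketch of the update document (the `capped_push` helper and the collection name in the comment are illustrative, not the repo's actual code):

```python
def capped_push(value, cap=5):
    """MongoDB update document: append `value` to the "items" array,
    keeping only the last `cap` elements ($push with $each + $slice)."""
    return {"$push": {"items": {"$each": [value], "$slice": -cap}}}

update = capped_push(7)
# With pymongo this would be applied as (collection name assumed):
#   random_numbers.update_one({"_id": "lasts"}, update, upsert=True)
print(update)  # {'$push': {'items': {'$each': [7], '$slice': -5}}}
```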

Swagger URL: http://localhost:800/apidocs

API methods using Curl:

curl -i "http://localhost:800/random-list"

Using PyPy for speed

A second container with the same API runs on port 801 using PyPy, which in theory should be faster.

The test system is an i7-4720HQ 2.60GHz with 12 GB RAM and an SSD drive, on which I've run the docker-compose architecture.

The load testing is done using Apache Bench (https://httpd.apache.org/docs/2.4/programs/ab.html): 3000 requests with 20 concurrent requests; the measured result is average requests per second.

Results without PyPy:

ab -n 3000 -c 20 'http://localhost:800/random?lower=10&upper=10000'
...
Requests per second:    692.83 [#/sec] (mean)
Time per request:       28.867 [ms] (mean)
Time per request:       1.443 [ms] (mean, across all concurrent requests)
...

Results with PyPy:

ab -n 3000 -c 20 'http://localhost:801/random?lower=10&upper=10000'
...
Requests per second:    1224.29 [#/sec] (mean)
Time per request:       16.336 [ms] (mean)
Time per request:       0.817 [ms] (mean, across all concurrent requests)
...

The results with PyPy are far better, approximately a 90% increase, BUT I've noticed it needs a warm-up (I ran the benchmark on this service multiple times beforehand) to reach this performance; I'll investigate this further.

These tests were made possible thanks to neelabalan (https://github.com/neelabalan).

User CRUD service

CRUD stands for Create, Read, Update and Delete operations. I've written a demo of these operations on a collection of users. A user has userid, name and email fields.

Sample data in user collection document:

{
  "_id" : 12,
  "email": "some@gmail.com",
  "name": "some name"
}
...

Swagger URL: http://localhost:81/apidocs

API methods using Curl:

example: add a new user with name dan, email test@yahoo.com and userid 189

curl -X POST -d email=test@yahoo.com -d name=dan http://localhost:81/users/189

example: modify the email of the user with id 10

curl -X PUT -d email=test22@yahoo.com  http://localhost:81/users/10

example: get the user with id 1:

curl -i "http://localhost:81/users/1"

example: delete the user with id 1:

curl -X DELETE -i "http://localhost:81/users/1"
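On the server side, a partial update like the PUT above maps naturally onto a MongoDB $set. A sketch of building that update document (the `build_user_update` helper is hypothetical, not the service's actual code):

```python
def build_user_update(email=None, name=None):
    """Build a MongoDB $set document from the optional form fields,
    skipping any field the client did not send (partial-update semantics)."""
    fields = {key: value for key, value in {"email": email, "name": name}.items()
              if value is not None}
    return {"$set": fields}

update = build_user_update(email="test22@yahoo.com")
# With pymongo this would be applied as (collection name assumed):
#   users.update_one({"_id": 10}, update)
print(update)  # {'$set': {'email': 'test22@yahoo.com'}}
```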

MQTT service

This use case uses the MQTT protocol instead of HTTP to communicate. It involves storing the last 5 numerical values of each sensor, running a moving average over them and publishing the average each time on the "averages/{sensor_id}" topic

MQTT official link: http://mqtt.org/

MQTT explained: https://randomnerdtutorials.com/what-is-mqtt-and-how-it-works/

MQTT info, tools: https://github.com/hobbyquaker/awesome-mqtt#tools

MongoDb capped array: https://www.mongodb.com/blog/post/push-to-sorted-array

Sample data in sensors collection document:

{
    "_id" : "some_sensor",
    "items" : [
        {
            "value" : 23,
            "date" : ISODate("2019-03-09T17:49:10.585Z")
        },
        {
            "value" : 5,
            "date" : ISODate("2019-03-09T17:49:08.766Z")
        },
        ... 3 more
    ]
}
...

To publish a sensor reading:

mosquitto_pub -h localhost -u some_user -P some_pass -p 1883 -d -t sensors -m "{\"sensor_id\": \"temperature\", \"sensor_value\": 15.2}"

This publishes the following JSON to the "sensors" topic on Mosquitto: {"sensor_id": "temperature", "sensor_value": 15.2}

Our python script also listens on this topic and saves each sensor's value in the mongo sensors collection, in a capped array.

After writing the new value, it reads the last 5 values, computes a running average and publishes it to the "averages/temperature" topic.

Anyone subscribed to "averages/temperature" gets the running average for the "temperature" sensor: each time someone publishes a new value, the python script computes the average of the last 5 values and publishes it
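The running-average logic described above can be sketched in pure Python (a simplified in-memory model; the real service persists the window in MongoDB rather than in a dict, and the class name is illustrative):

```python
from collections import deque

class SensorAverager:
    """Keep the last `cap` readings per sensor and return the running
    average after each new value, mirroring what gets published on
    "averages/{sensor_id}"."""
    def __init__(self, cap=5):
        self.cap = cap
        self.readings = {}

    def add(self, sensor_id, value):
        # deque(maxlen=cap) behaves like MongoDB's $slice-capped array
        window = self.readings.setdefault(sensor_id, deque(maxlen=self.cap))
        window.append(value)
        return sum(window) / len(window)

avg = SensorAverager()
for value in (23, 5, 17):
    print(avg.add("temperature", value))  # 23.0, then 14.0, then 15.0
```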

Fulltext service

This service exposes a REST API for inserting text into the full-text database and retrieving the last 10 matches

MongoDb official documentation (text search): https://docs.mongodb.com/manual/text-search/

Sample data in fulltext_search collection document:

{
    "_id" : ObjectId("5c44c5104f2137000c9d8cb2"),
    "app_text" : "ana has many more apples",
    "indexed_date" : ISODate("2019-01-20T18:59:28.060Z")
}
...

Swagger URL: http://localhost:82/apidocs

API methods using Curl:
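The underlying query is a standard MongoDB $text search. A sketch of the filter the service might send through pymongo (the `fulltext_query` helper, the sort key and the limit default are illustrative, based on the "last 10 matches" description and the sample document above):

```python
def fulltext_query(term, limit=10):
    """MongoDB filter/sort/limit for a text-index search,
    newest-first by indexed_date."""
    return {
        "filter": {"$text": {"$search": term}},
        "sort": [("indexed_date", -1)],
        "limit": limit,
    }

q = fulltext_query("apples")
# With pymongo (collection name assumed):
#   fulltext_search.find(q["filter"]).sort(q["sort"]).limit(q["limit"])
print(q["filter"])  # {'$text': {'$search': 'apples'}}
```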

Geolocation search service

The service allows inserting locations with coordinates (latitude and longitude) and exposes a REST API to query all locations near a point.

MongoDb official documentation (geospatial index): https://docs.mongodb.com/manual/geospatial-queries/

Sample data in places collection document:

{
    "_id" : ObjectId("5c83f901fc91f1000c29fb4d"),
    "location" : {
        "type" : "Point",
        "coordinates" : [
            -75.1652215,
            39.9525839
        ]
    },
    "name" : "Philadelphia"
}
..

Swagger URL: http://localhost:83/apidocs

API methods using Curl:

curl -X POST "http://localhost:83/login" -H "accept: application/json" -H "Content-Type: application/x-www-form-urlencoded" -d "username=admin&password=secret" 

To make authenticated requests use -H "Authorization: Bearer obtained_token"
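Under the hood, a nearby-places lookup is a $near query against a 2dsphere index. A sketch of the filter document (the helper name and the 50 km default are illustrative, not the service's actual code):

```python
def near_query(longitude, latitude, max_distance_m=50_000):
    """MongoDB $near filter on a 2dsphere-indexed "location" field.
    GeoJSON order is [longitude, latitude]; distance is in meters."""
    return {
        "location": {
            "$near": {
                "$geometry": {"type": "Point", "coordinates": [longitude, latitude]},
                "$maxDistance": max_distance_m,
            }
        }
    }

q = near_query(-75.1652215, 39.9525839)
# With pymongo (collection name assumed):
#   places.find(q)   # places within 50 km of Philadelphia
print(q["location"]["$near"]["$maxDistance"])  # 50000
```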

Baesian average

This is a naive implementation of the Bayesian average (https://en.wikipedia.org/wiki/Bayesian_average).

It's naive because it's not built with scalability in mind.

The Bayesian average is used in rating systems to add weight to the number of votes.

In this example we'll use items and users. A user is represented by an id and rates items from 0 - 10.

An item is represented by an id and a name, and is rated by the users.

A full rating example:

Items:

How to calculate Baesian average for item 2:

avg_num_votes = 2.667   // The average number of votes for all items (1+3+4) / 3

avg_rating = 8.5        // The average rating for all items (10+9+10+9+10+4+8+8) / 8

item_num_votes = 3      // The number of votes for current item (Item 2) 

item_rating = 9.33      // The average rating for current item (Item 2): (9+10+9)/3

bayesian_average = 8.941 // ((avg_num_votes * avg_rating) + (item_num_votes * item_rating)) / (avg_num_votes + item_num_votes)

Averages:

Element 1: 8.909        
Element 2: 8.941
Element 3: 7.9

You can see that although Hamlet has a straight 10, and Cicero has two 9's and one 10, Cicero's Bayesian average is the highest
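The worked example above can be reproduced in a few lines of Python (item 3's individual votes are reconstructed from the stated totals; the function and item names are illustrative):

```python
def bayesian_average(all_items, item):
    """Weigh an item's own rating against the global average,
    per https://en.wikipedia.org/wiki/Bayesian_average."""
    total_votes = sum(len(votes) for votes in all_items.values())
    avg_num_votes = total_votes / len(all_items)
    avg_rating = sum(sum(votes) for votes in all_items.values()) / total_votes
    votes = all_items[item]
    item_num_votes = len(votes)
    item_rating = sum(votes) / item_num_votes
    return ((avg_num_votes * avg_rating) + (item_num_votes * item_rating)) / (
        avg_num_votes + item_num_votes
    )

items = {"Hamlet": [10], "Cicero": [9, 10, 9], "Element 3": [10, 4, 8, 8]}
for name in items:
    print(name, round(bayesian_average(items, name), 3))
# Hamlet 8.909, Cicero 8.941, Element 3 7.9
```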

Sample data in baesian collection document:

{
    "_id" : 2,
    "name" : "Cicero",
    "marks" : [
        {
            "userid" : 5,
            "mark" : 9
        },
        {
            "userid" : 3,
            "mark" : 10
        },
        {
            "userid" : 2,
            "mark" : 9
        }
    ],
    "nr_votes" : 3,
    "sum_votes" : 27
}

Swagger URL: http://localhost:84/apidocs

Using Curl:

To create an item:

curl -X POST -i "http://localhost:84/item/3" -d name=some_name

To vote an item:

e.g. user with id 9 votes item 3 with mark 8

curl -X PUT -i "http://localhost:84/item/vote/3" -d mark=8 -d userid=9

To get an item, along with its average:

curl -i "http://localhost:84/item" 

Photo process

This use case provides a demo for working with images: upload, delete, resize, rotate, brightness adjustment and searching similar images by an uploaded image.

We'll use docker volumes to map the local folder "container-storage" inside the Docker container. The python webserver writes / deletes images in this folder.

One interesting feature of this API is searching for similar images. For example, you can take a photo from the container-storage folder, rename it, modify its brightness or slightly crop it, and then try to find it using the API.

To achieve this the script loads all images from disk, hashes them using a hashing function and compares the hash differences. It's only a naive implementation for demo purposes; its main drawbacks are the memory limit (all hashes must fit in memory) and linear search time: the complexity of the search is linear in the number of photos hashed and saved.
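The hash-and-compare idea can be illustrated with a tiny average-hash in pure Python (an assumed simplification; the service's actual hashing function may differ, and real implementations first downscale the image, e.g. to 8x8 grayscale):

```python
def average_hash(pixels):
    """Naive average hash: one bit per pixel, set when the grayscale
    value is above the image mean. `pixels` is a 2-D list of 0-255 values."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits; a small distance means similar images."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

img = [[200, 200], [10, 10]]
brighter = [[230, 230], [40, 40]]  # a brightness shift keeps the pattern
print(hamming(average_hash(img), average_hash(brighter)))  # 0
```

This is why the search survives brightness changes: shifting all pixels by the same amount shifts the mean too, so the bit pattern is unchanged.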

The API exposes methods for adding and deleting images, along with resizing, rotating and searching by a similar image

Swagger URL: http://localhost:85/apidocs

API methods using Curl:

To get image with id 1 and resize it to height 100:

curl -i "http://localhost:85/photo/1?resize=100"

To delete image with id 1:

curl -X DELETE http://localhost:85/photo/1

To add an image with id 1:

curl -X PUT -F "file=@image.jpg" http://localhost:85/photo/1

To search images similar with a given one:

curl -X PUT -F "file=@image.jpg" http://localhost:85/photo/similar

Book-collection

A book library. Users must be created beforehand using the User CRUD service.

Book profiles are created through the API.

Books can be borrowed, and an accounting mechanism for this is in place.

Uses Flask Restplus: https://flask-restplus.readthedocs.io

Swagger URL: http://localhost:86

API methods using Curl:

Add a book:

curl -X PUT "http://localhost:86/book/978-1607965503" -H "accept: application/json" -H "Content-Type: application/json" -d "{ \"isbn\": \"978-1607965503\", \"name\": \"Lincoln the Unknown\", \"author\": \"Dale Carnegie\", \"publisher\": \"snowballpublishing\", \"nr_available\": 5}"

Get a book:

curl -X GET "http://localhost:86/book/978-1607965503" -H "accept: application/json"

List all books:

curl -X GET "http://localhost:86/book?limit=5&offset=0" -H "accept: application/json" 

Delete a book:

curl -X DELETE "http://localhost:86/book/1" -H "accept: application/json"

Borrow book:

curl -X PUT "http://localhost:86/borrow/1" -H "accept: application/json" -H "Content-Type: application/json" -d "{ \"id\": \"1\", \"userid\": 4, \"isbn\": \"978-1607965503\", \"borrow_date\": \"2019-12-12T09:32:51.715Z\", \"return_date\": \"2020-02-12T09:32:51.715Z\"}"

List a book borrow:

curl -X GET "http://localhost:86/borrow/1" -H "accept: application/json"

List all book borrows:

curl -X GET "http://localhost:86/borrow?limit=2&offset=0" -H "accept: application/json"

Return a book:

curl -X PUT "http://localhost:86/borrow/return/16" -H "accept: application/json" -H "Content-Type: application/json" -d "{ \"id\": \"16\", \"return_date\": \"2019-12-13T08:48:47.899Z\"}"

Tic tac toe

A tic tac toe game written in flask using flask_session. It has a simple UI.

About tic tac toe (https://en.wikipedia.org/wiki/Tic-tac-toe)

This service renders an HTML template located in /python/templates/tictactoe.html

Swagger URL: does not have an API

Grafana and InfluxDb

Grafana with InfluxDb integration for displaying sensor data. The MQTT service sends datapoints to InfluxDb and Grafana displays the metrics.

Default credentials are: admin / admin

The Grafana web interface is available at: http://localhost:3000

The Grafana API methods are available here: https://grafana.com/docs/grafana/latest/http_api/

The Grafana docker image creates a dashboard called "SensorMetrics", so in the interface go to Home and select it. Tip: in the "sensortype" selector type humidity / temperature (or whatever you inserted) and then press Enter. I've occasionally seen a bug where, without pressing Enter, the correct sensortype is not selected as the default

Inserting a humidity datapoint with value 61 directly into the InfluxDb database:

curl -i -XPOST 'http://localhost:8086/write?db=influx' --data-binary 'humidity value=61'

Inserting a humidity datapoint with value 61 using MQTT:

mosquitto_pub -h localhost -u some_user -P some_pass -p 1883 -d -t sensors -m "{\"sensor_id\": \"humidity\", \"sensor_value\": 61}"

After you have inserted some datapoints, to view the graphs: select the "SensorMetrics" dashboard, pick "Last 5 minutes" from the top right corner, and in the sensortype textbox type the name of the sensor you inserted datapoints for (like "humidity"; the default is "temperature")

To connect to your local InfluxDb instance (for debugging or just for fun), first list the running containers:

docker ps

You should get something like:

CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS                                        NAMES
88598a4bf2bc   web-standard-pypy                       "pypy /root/flask-mo…"   32 seconds ago   Up 21 seconds   0.0.0.0:801->5000/tcp                        docker-flask-mongodb-example_web-random-pypy_1
2c15756423dc   web-bookcollection-image                "python /root/flask-…"   5 days ago       Up 19 seconds   0.0.0.0:86->5000/tcp                         docker-flask-mongodb-example_web-book-collection_1
f804e912e512   devopsfaith/krakend                     "/usr/bin/krakend ru…"   5 days ago       Up 20 seconds   8000/tcp, 8090/tcp, 0.0.0.0:8080->8080/tcp   docker-flask-mongodb-example_krakend_1
15ea923f378c   web-photo-image                         "python /root/flask-…"   5 days ago       Up 24 seconds   0.0.0.0:85->5000/tcp                         docker-flask-mongodb-example_web-photo-process_1
90003c4c5ea9   backgorund-mqtt-image                   "python /root/flask-…"   5 days ago       Up 27 seconds   5000/tcp                                     docker-flask-mongodb-example_background-mqtt_1
e9a21902695c   web-geolocation-image                   "python /root/flask-…"   5 days ago       Up 29 seconds   0.0.0.0:83->5000/tcp                         docker-flask-mongodb-example_web-geolocation-search_1
6896e7c793bc   web-standard                            "python /root/flask-…"   5 days ago       Up 28 seconds   0.0.0.0:84->5000/tcp                         docker-flask-mongodb-example_web-baesian_1
32bacfb75aa5   web-standard                            "python /root/flask-…"   5 days ago       Up 27 seconds   0.0.0.0:82->5000/tcp                         docker-flask-mongodb-example_web-fulltext-search_1
0e3c2f07d699   web-standard                            "python /root/flask-…"   5 days ago       Up 25 seconds   0.0.0.0:81->5000/tcp                         docker-flask-mongodb-example_web-users_1
19edd4cd73fa   web-standard                            "python /root/flask-…"   5 days ago       Up 22 seconds   0.0.0.0:800->5000/tcp                        docker-flask-mongodb-example_web-random_1
cb77566c0cd5   web-users-fastapi-image                 "uvicorn users-fasta…"   5 days ago       Up 30 seconds   0.0.0.0:88->5000/tcp                         docker-flask-mongodb-example_web-users-fast-api_1
d220b6da78cf   docker-flask-mongodb-example_grafana    "/app/entrypoint.sh"     5 days ago       Up 23 seconds   0.0.0.0:3000->3000/tcp                       docker-flask-mongodb-example_grafana_1
0f4b9577ce7a   docker-flask-mongodb-example_influxdb   "/app/entrypoint.sh"     5 days ago       Up 28 seconds   0.0.0.0:8086->8086/tcp                       docker-flask-mongodb-example_influxdb_1
af82fc58a992   mongo:latest                            "docker-entrypoint.s…"   5 days ago       Up 32 seconds   0.0.0.0:27017->27017/tcp                     docker-flask-mongodb-example_mongo_1
5aa9566539bc   docker-flask-mongodb-example_mqtt       "/docker-entrypoint.…"   5 days ago       Up 30 seconds   0.0.0.0:1883->1883/tcp                       docker-flask-mongodb-example_mqtt_1

Now launch the influx shell inside the container, replacing 035124f1b665 with your own container id:

docker exec -it 035124f1b665 influx

You're now inside the influx shell, where you can issue commands like:

SHOW DATABASES
-- will print all databases
USE influx
-- selects "influx" database
SHOW MEASUREMENTS
-- shows current measurements: like temperature, humidity etc
SELECT * from "humidity"
-- shows all records from humidity

More examples in the documentation: https://docs.influxdata.com/influxdb/v1.7/introduction/getting-started/

User CRUD fastapi

Same as User CRUD, written with the FastAPI framework (https://fastapi.tiangolo.com/):

Swagger URL: http://localhost:88/docs

API gateway using Krakend

Website: https://www.krakend.io

Web configurator: https://designer.krakend.io/

The API gateway is installed in this project as a docker container in docker-compose.yml; it loads its configuration from krakend.json

For demo purposes I've configured the gateway for two endpoints. Krakend runs on port 8080

  1. the random generator service (web-random), GET request:
    curl -i "http://localhost:8080/random?upper=10&lower=5"

    PUT request:
    curl -X PUT -i "http://localhost:8080/random" -d lower=10 -d upper=20

  2. the users service (web-users), GET all users request:
    curl -i "http://localhost:8080/users?limit=5&offset=0"

All requests can be configured through this gateway using the json file or the web configurator.

Deployment using Kubernetes (an alternative to docker-compose)

Works for full text search and random demo.

Prerequisites:

Steps:

  1. Apply the kubernetes configuration:
    kubectl apply -f ./kubernetes
  2. Activate the minikube tunnel in a separate console
    minikube tunnel
  3. Look up services ips:
    
    kubectl get services

It will show something like:

NAME                      TYPE           CLUSTER-IP       EXTERNAL-IP      PORT(S)         AGE
fulltext-search-service   LoadBalancer   10.108.117.106   10.108.117.106   82:30902/TCP    6s
random-demo-service       LoadBalancer   10.108.246.115   10.108.246.115   800:30858/TCP   6s

Now visit the app in your browser, e.g.: http://external_ip_for_random_demo_service:800/apidocs

User CRUD service using GraphQl

Url: http://localhost:90/graphql

```graphql
# Get a list of all users
query AllUsers {
  listUsers {
    success
    errors
    users {
      userid
      email 
      name
    }
  }
}

# Get a single user
query GetUser {
  getUser(userid: 1) {
    user {
      userid
      name
      email
      birth_date
      country
    }
    success
    errors
  }
}
# Create / update a user
mutation CreateNewUser {
  upsertUser(
    userid: 1, 
    name: "ana",
    email: "ana@gmail.com"    
    birth_date: "2000-01-05 00:00:20"
    country: "Romania"
  ) {
    user {
      userid
      name
      email
      birth_date
      country      
    }
    success
    errors
  }
}

# Delete a user
mutation DeleteUser {
  deleteUser(userid:"2") {
    user {
      userid
      email
    }
    success
    errors
  }
}
```

Contributing

See CONTRIBUTING.md

Folder structure

├── check_ports.sh                           -> Script that checks the free ports
├── container-storage                        -> Contains some demo pictures for the photo process
│ ├── 1.jpg
│ ├── 2.jpg
│ ├── 3.jpg
│ ├── 4.jpg
│ └── 5.jpg
├── CONTRIBUTING.md                          
├── docker-compose.yml 
├── docker-grafana                           -> Grafana docker container files
│ ├── configuration.env
│ ├── datasources
│ │ └── influx.json
│ ├── Dockerfile
│ └── entrypoint.sh
├── docker-influxdb                          -> Influxdb docker container files
│ ├── configuration.env
│ ├── Dockerfile
│ └── entrypoint.sh
├── docker-mosquitto                         -> Mosquitto docker container files
│ └── Dockerfile
├── docker-python                            -> Python docker container files
│ ├── base
│ │ └── Dockerfile
│ ├── Dockerfile
│ └── project
├── docker-python-pypy                       -> Python (with pypy accelerator) docker container files
│ └── Dockerfile
├── docker-redis                             -> Redis docker container files
│ ├── Dockerfile
│ └── redis.conf                                                                           
├── import.sh                                -> Script for importing some demo data into MongoDb                                                       
├── krakend.json                             -> Config file for krakend
├── kubernetes                               -> Kubernetes configuration files
│ ├── fulltext-search-deplyment.yaml          
│ ├── fulltext-serarch-service.yaml
│ ├── mongodb-deplyment.yaml
│ ├── mongodb-service.yaml
│ ├── random-demo-deplyment.yaml
│ └── random-demo-service.yaml
├── LICENSE
├── python                                   -> Python source files, also contains requirements
│ ├── graphql                                -> Graphql demo files
│ │ ├── schema.graphql                       -> Graphql schema
│ │ └── users.py                             -> Users endpoint
│ ├── baesian.py
│ ├── bookcollection.py
│ ├── caching.py
│ ├── diagrams_generator.py
│ ├── fulltext_search.py
│ ├── geolocation_search.py
│ ├── mqtt.py
│ ├── photo_process.py
│ ├── python_app.log
│ ├── random_demo.py
│ ├── requirements-dev.txt
│ ├── requirements-fastapi.txt
│ ├── requirements-mqtt.txt
│ ├── requirements-photo.txt
│ ├── requirements-restplus.txt
│ ├── requirements.txt
│ ├── templates                              -> Html templates
│ │ └── tictactoe.html
│ ├── tictactoe.py
│ ├── users-fastapi.py
│ ├── users.py
│ └── utils.py
├── python_app.log
├── README.md
├── resources                                -> Documentation resources like diagrams
│ ├── autogenerated.png
│ ├── diagram.jpg
│ ├── diagram.odp
│ └── grafana.png
├── secrets                                  -> Docker secrets like user, passwords
│ ├── mqtt_pass.txt
│ ├── mqtt_user.txt
│ └── redis_pass.txt
├── stresstest-locusts                       -> Stresstesting using locusts.io
│ ├── baesian.py
│ ├── fulltext_search.py
│ ├── geolocation_search.py
│ ├── random_demo.py
│ └── users.py
├── tests                                    -> Unit testing
│ ├── conftest.py
│ ├── requirements.txt
│ ├── resources                              -> Unit testing resources like pictures, files
│ │ └── test.jpg
│ ├── test_baesian.py
│ ├── test_bookcollection.py
│ ├── test_fulltext_search.py
│ ├── test_geolocation_search.py
│ ├── test_mqtt.py
│ ├── test_photo.py
│ ├── test_random_demo.py
│ ├── test_users_fastapi.py
│ ├── test_0_users.py
│ └── utils.py                                -> Testing utils
└── test.txt