kshitij10496 / hercules

The mighty hero helping you build projects on top of IIT Kharagpur's academic data
https://hercules-10496.herokuapp.com/api/v1/static/index.html
MIT License

Create a docker-compose file for the application #39

Closed kshitij10496 closed 5 years ago

kshitij10496 commented 5 years ago

It would be wonderful if we can get the application to run via a single command. This would involve the use of docker-compose for managing multiple containers. Ideally, we should have 2 services - one for the Postgres DB and another for the API.

kshitij10496 commented 5 years ago

Here is a simple, non-functioning compose file I could come up with:

version: '3'
services:
  hercules_db:
    image: postgres
    ports:
      - "5432:5432"
    volumes: # Load the SQL dump to the container
      - ./hercules_backup_20181020.sql:/docker-entrypoint-initdb.d/init.sql
    environment:
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_USER=postgres

  hercules_api:
    build: .
    ports:
      - "8080:8080"
    volumes:
      - .:/go/src/github.com/kshitij10496/hercules
    depends_on:
      - hercules_db
    environment:
      - PORT=8080
      - HERCULES_DATABASE=#TODO: Update the connection string

@icyflame Could you help me with this? 😅

kshitij10496 commented 5 years ago

Ping @icyflame

icyflame commented 5 years ago

@kshitij10496 This compose looks fine for two containers. You need to write a Dockerfile for the hercules_api container that will take the Go code and run the server.

A common pattern I have seen is people compile the code and copy only the binary to the container from the host, instead of copying the whole thing.
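That pattern can be sketched as a multi-stage Dockerfile. Everything here is an assumption about the project layout (Go version, working directory, binary name), not the repo's actual Dockerfile:

```dockerfile
# Stage 1: compile the code (Go version is an assumption)
FROM golang:1.12 AS builder
WORKDIR /go/src/github.com/kshitij10496/hercules
COPY . .
# Statically linked binary so it can run in a minimal base image
RUN CGO_ENABLED=0 GOOS=linux go build -o /hercules .

# Stage 2: copy only the binary into a small runtime image
FROM alpine:3.9
COPY --from=builder /hercules /hercules
EXPOSE 8080
CMD ["/hercules"]
```

The final image contains just the binary, so it stays small and the Go toolchain never ships to production.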

kshitij10496 commented 5 years ago

> A common pattern I have seen is people compile the code and copy only the binary to the container from the host, instead of copying the whole thing.

Oh yes! We already have a multi-stage build Dockerfile in the house 🎉

The issues I'm facing are related to PostgreSQL, specifically:

icyflame commented 5 years ago
  1. You can do this by creating a Dockerfile for the postgres container. Put the init.sql file in that folder. Then, in the Dockerfile, copy this file into the container and run whatever postgres command there is to initialize a database/tables using a dump. This should be a CLI command and can be run using the RUN rule in Dockerfiles.
  2. You can connect to containers in the same network using their names if you establish a link in the docker-compose file. You can do it by following the example here. There is some information about this in the Dropbox Paper doc I wrote for the metakgp server overview that we once had in KGP. You should be able to find it (let me know if you can't).
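Point 1 can be sketched as a tiny Dockerfile for the database service. One detail worth noting: the official postgres image automatically runs any `*.sql` file placed in `/docker-entrypoint-initdb.d` on first startup, so a COPY alone can be enough (the dump filename here is taken from the compose file above and is otherwise an assumption):

```dockerfile
# Assumed dump filename, matching the volume mount in the compose file
FROM postgres:11
# The official postgres image executes *.sql files in this directory
# automatically when the data directory is first initialized,
# so no explicit RUN step is required.
COPY hercules_backup_20181020.sql /docker-entrypoint-initdb.d/init.sql
```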

As long as you use docker-compose, you won't even need to worry about the network stuff. Docker-compose does some magic for you: it puts all the services on a shared network where each one is reachable by its service name.

A good way to test it would be:

  1. add the link in the compose file
  2. install the Postgres client (psql) in the api container
  3. start the containers using docker-compose up
  4. get a shell in the api container
  5. run psql to connect to the DB in the other container

This way, you will find out the exact connection string to use inside your app/ORM.
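Once the service name resolves, the connection string can be assembled from the values in the compose file above. This is a hypothetical helper, not code from the hercules repo; the database name "postgres" is the image's default and is an assumption (the TODO in the compose file still needs the real value):

```go
package main

import "fmt"

// buildConnString assembles a Postgres connection URL.
// Hypothetical helper for illustration, not part of the hercules codebase.
func buildConnString(user, password, host string, port int, dbname string) string {
	return fmt.Sprintf("postgres://%s:%s@%s:%d/%s?sslmode=disable",
		user, password, host, port, dbname)
}

func main() {
	// Inside the hercules_api container, the DB host is the compose
	// service name "hercules_db"; credentials match the compose file.
	fmt.Println(buildConnString("postgres", "postgres", "hercules_db", 5432, "postgres"))
	// → postgres://postgres:postgres@hercules_db:5432/postgres?sslmode=disable
}
```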