francoisruty / fruty_trading-automation


Procedure

Make sure all services are up and running. NOTE: pgweb is often down at first because postgres is still starting. Wait a few seconds, re-run `docker-compose up -d`, and pgweb will come online.

```sh
docker-compose exec postgres /bin/bash  # open a shell in the postgres container
psql --username=trading                 # connect as the trading user
\i /init/init.sql
\q
exit
```

You should see a few failed tasks (from before the DB table was created), and the most recent tasks should be successful.

You should see some records in the forex_data_eurusd table.

In Grafana, click on "Add data source"

Name: postgres

Type: PostgreSQL

Host: postgres

Database: trading (see .env file)

User: trading (see .env file)

Password: trading (see .env file)

SSL mode: disable
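If you prefer not to click through the UI, Grafana can also provision this data source from a file mounted under /etc/grafana/provisioning/datasources/ in the grafana container. A sketch, with the values taken from the manual settings above (the mount path must be added to your docker-compose configuration yourself):

```yaml
# datasource.yml -- same values as the manual setup above
apiVersion: 1
datasources:
  - name: postgres
    type: postgres
    access: proxy
    url: postgres:5432        # service name from docker-compose
    database: trading
    user: trading
    secureJsonData:
      password: trading
    jsonData:
      sslmode: disable
```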

Then click on "Create a dashboard" and choose the "Graph" panel type. When your graph panel is created, click on its title and then on "Edit".

Select the Postgres data source, and for the query, enter:

```sql
SELECT $__time(time), value_open FROM forex_data_eurusd WHERE $__timeFilter(time)
```

`$__time` and `$__timeFilter` are Grafana macros: the first aliases the column for the time axis, the second restricts rows to the time range selected in the dashboard.

Click on the "eye" icon on the right to test the query. You should see a graph.

Then click on "Save dashboard" on the top menu.

That's it! You now have a fully functioning automated trading platform!

Functional Notes

The logic is implemented in the Celery Python worker, in worker/tasks.py and script.py.

To update the logic, read the ib_insync package documentation (https://github.com/erdewit/ib_insync). You'll probably first want to fetch more diverse data, and then, ultimately, code a trading strategy and place buy/sell orders. To achieve this, you'll need to understand the ib_insync calls and use them in the Python worker logic.
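The signal logic of a strategy can live in plain Python inside the worker, independent of the ib_insync calls that fetch data and place orders. As an illustration only (none of these names exist in the repo), a minimal moving-average crossover signal might look like:

```python
def sma(values, window):
    """Simple moving average of the last `window` values, or None if too few."""
    if len(values) < window:
        return None
    return sum(values[-window:]) / window

def crossover_signal(closes, fast=3, slow=5):
    """Return "BUY", "SELL", or None based on a fast/slow SMA crossover.

    `closes` is a list of closing prices, oldest first. A "BUY" fires when
    the fast average crosses above the slow one between the previous bar
    and the current bar; "SELL" fires on the opposite cross.
    """
    if len(closes) < slow + 1:
        return None
    fast_now, slow_now = sma(closes, fast), sma(closes, slow)
    fast_prev, slow_prev = sma(closes[:-1], fast), sma(closes[:-1], slow)
    if fast_prev <= slow_prev and fast_now > slow_now:
        return "BUY"
    if fast_prev >= slow_prev and fast_now < slow_now:
        return "SELL"
    return None
```

A Celery task could then call something like `crossover_signal` on the rows it reads from forex_data_eurusd and translate the result into an ib_insync order.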

Of course, you might have to create new tables in the Postgres DB, if you start fetching new kinds of data.
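For example, a table for a second currency pair could mirror the EURUSD one. The column list below is an assumption: only `time` and `value_open` are known from the Grafana query, so check /init/init.sql for the real schema before copying this:

```sql
-- Hypothetical table for a new pair; mirror the real forex_data_eurusd
-- schema from /init/init.sql rather than this sketch.
CREATE TABLE forex_data_gbpusd (
    time       TIMESTAMPTZ NOT NULL,
    value_open DOUBLE PRECISION
);
```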

You can edit the app.conf.beat_schedule dict in worker/tasks.py to change how frequently Celery Beat runs the tasks you created.
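The shape of such an entry is a dict mapping a schedule name to a task path, an interval, and optional arguments. A sketch, where "worker.tasks.fetch_forex_data" is an assumed task name to be replaced with a real task from worker/tasks.py:

```python
from datetime import timedelta

# Sketch of a beat_schedule entry; the task path and args are hypothetical.
beat_schedule = {
    "fetch-eurusd-every-minute": {
        "task": "worker.tasks.fetch_forex_data",           # assumed task path
        "schedule": timedelta(minutes=1).total_seconds(),  # every 60 seconds
        "args": ("EURUSD",),
    },
}
```

In the real file this dict is assigned to `app.conf.beat_schedule`; Celery accepts either a number of seconds or a timedelta/crontab object as the `schedule` value.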

NOTES