Closed — MrOrz closed this issue 4 years ago
@darkbtf
Right now Elasticsearch lives entirely inside a docker container.
Would running `docker commit` every other day
and pushing the image to Docker Hub be a viable backup approach? XD
On second thought, pushing to Docker Hub is not a good idea. I've always liked the idea of being able to `docker pull` the image, hook up Kibana, and analyze a production DB snapshot, but the `users` index contains email addresses (personal data) and should not be published online.
We now have a backup script that generates zip files from the DB's docker volume. Maybe all we need now is a remote backup.
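The zip-from-volume approach could be sketched roughly like this (the volume name `esdata`, the backup path, and the use of a throwaway `alpine` container are all assumptions for illustration, not details from the actual script):

```shell
#!/bin/bash
# Rough sketch of a volume-level backup; VOLUME and BACKUP_DIR are assumptions.
VOLUME=esdata
BACKUP_DIR="${BACKUP_DIR:-$HOME/backups}"
STAMP=$(date --iso-8601)   # e.g. 2017-03-14
mkdir -p "$BACKUP_DIR"
# Mount the volume read-only in a throwaway container and archive it.
# Guarded so the sketch is a no-op on machines without docker.
if command -v docker >/dev/null 2>&1; then
  docker run --rm -v "$VOLUME":/data:ro -v "$BACKUP_DIR":/backup alpine \
    sh -c "cd /data && tar czf /backup/es-$STAMP.tar.gz ."
fi
```

Note that archiving a live data directory can yield an inconsistent copy, which is one reason the snapshot API route discussed in this thread is preferable.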
Actually, it's already done by a cron job running this script:
```bash
#!/bin/bash
# Take a daily snapshot into the "gcs" repository; the ISO date
# (e.g. 2017-03-14) doubles as the snapshot name.
SNAPSHOT_NAME=$(date --iso-8601)
cd ~docker/rumors-deploy
/usr/local/bin/docker-compose exec -T db \
  curl -XPUT "localhost:9200/_snapshot/gcs/$SNAPSHOT_NAME"
```
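For reference, the cron entry could look like the following (the script path, name, and schedule are hypothetical — the actual crontab is not shown in this thread):

```
# Hypothetical crontab entry: take the snapshot daily at 03:00.
0 3 * * * /home/docker/rumors-deploy/es-snapshot.sh >> /var/log/es-backup.log 2>&1
```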
Here `gcs` is a Google Cloud Storage snapshot repository: https://www.elastic.co/guide/en/elasticsearch/plugins/master/repository-gcs.html
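For completeness, the repository has to be registered once before the cron script can snapshot into it. A minimal sketch, assuming a bucket named `rumors-es-backup` (the real bucket name is not given in this thread), with the call guarded so it only fires when Elasticsearch is reachable:

```shell
# One-time registration of the "gcs" snapshot repository.
# The ES address and bucket name are assumptions for illustration.
ES=localhost:9200
BODY='{"type": "gcs", "settings": {"bucket": "rumors-es-backup"}}'
# Only attempt the call when Elasticsearch is actually reachable.
if curl -s "$ES" >/dev/null 2>&1; then
  curl -XPUT "$ES/_snapshot/gcs" -H 'Content-Type: application/json' -d "$BODY"
fi
```

The repository-gcs plugin must already be installed and authorized to write to the bucket, per the linked guide.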
Should back up the Elasticsearch DB to S3 periodically using a backup script triggered by a cron job.
Guide: https://www.elastic.co/guide/en/elasticsearch/guide/current/backing-up-your-cluster.html
Reference: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html
S3 plugin: https://www.elastic.co/guide/en/elasticsearch/plugins/5.1/repository-s3.html (Its installation should be included in the seed script!)
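If the S3 route is taken, the seed script could install the plugin and register the repository along these lines (the `db` service name matches the script above, but the bucket name and repository name `s3` are assumptions; ES 5.x plugin syntax):

```shell
# Sketch of seed-script steps for S3 snapshots.
# Guarded so it is a no-op where docker-compose / Elasticsearch are unavailable.
if command -v docker-compose >/dev/null 2>&1; then
  # Install the repository-s3 plugin inside the db container; Elasticsearch
  # must be restarted afterwards for the plugin to load.
  docker-compose exec -T db elasticsearch-plugin install --batch repository-s3
  docker-compose restart db
fi
ES=localhost:9200
BODY='{"type": "s3", "settings": {"bucket": "rumors-es-backup"}}'
if curl -s "$ES" >/dev/null 2>&1; then
  curl -XPUT "$ES/_snapshot/s3" -H 'Content-Type: application/json' -d "$BODY"
fi
```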