ripper234 opened this issue 8 years ago
We will soon have a data export and import feature. The data shouldn't be very large. Is manual backup of the files generated by the system, using some cloud service, a good solution for the time being?
Well,
People are going to be using the system, adding events and modifying them, right up until the burn itself. So what does "manual" mean? Do you personally go to the server and do an export every week? Every two weeks?
The answer is "yes, for the time being, it's ok". This is not a show-stopper for launch. But I'd like to understand if we can solve this now easily, or if not how much it would cost (time/money) to solve later.
It is possible to SSH into the server, connect to the Mongo instance, dump a backup, and put it in some storage. And if it can be scripted, it can also be automated. As you said, for now we can make do with manual backups, but later it would be prudent to automate the process. I don't think it should be expensive or time-consuming. My point was that we will soon have a decent working solution, and later a good working solution.
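The manual flow above can be scripted roughly as follows. This is only a sketch: the bucket name, paths, and host are placeholders, and the script skips gracefully when the Mongo tools or AWS CLI aren't installed, so it can run unattended.

```shell
#!/usr/bin/env sh
# Sketch of a scriptable Mongo backup. Bucket name, paths, and host
# are hypothetical -- replace them with the real ones.
set -eu

STAMP=$(date +%Y-%m-%d)
OUT="/tmp/mongo-backup-$STAMP"

# Skip gracefully on machines without the required tools installed,
# so the script can be dropped into cron without breaking anything.
if ! command -v mongodump >/dev/null 2>&1 || ! command -v aws >/dev/null 2>&1; then
    echo "mongodump or aws CLI not found; nothing to do"
    exit 0
fi

# Dump all databases from the local Mongo instance.
mongodump --host localhost --port 27017 --out "$OUT"

# Compress the dump and ship it to object storage
# (assumes the AWS CLI is already configured with credentials).
tar -czf "$OUT.tar.gz" -C /tmp "mongo-backup-$STAMP"
aws s3 cp "$OUT.tar.gz" "s3://example-backups/mongo/$STAMP.tar.gz"
```

Once this works manually over SSH, automating it is just a matter of scheduling it.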
No worries, this is not a priority right now. I opened this for tracking, and in order to get some understanding of the devops stack.
Talk to Shimon. He's our servers manager and an expert on devops.
@omerpines can you tag him? Is he on github?
Anyway, I posted all our devops issues in #devops on Slack. I'll just find Shimon on Slack.
(Anyone jumping in - note that this is NOT for the upcoming launch in 2 weeks, but for a later milestone).
Currently our app's data is stored on a MongoDB server on the application server.
This means that if the server crashes, we lose all the data.
We should make the system resilient to this. One way is backups (either from within the application, or AWS-level backups). I'd prefer to just put the data layer on a separate server, e.g. a managed database on Amazon.
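For the backup route, either option can be scheduled with cron. The entries below are only illustrative: the script path and EBS volume ID are placeholders, and the snapshot variant assumes the Mongo data directory lives on a dedicated EBS volume.

```
# Hypothetical crontab entries -- pick one approach.

# Option 1: run an application-level dump script nightly at 03:00.
0 3 * * * /opt/backups/mongo-backup.sh

# Option 2: AWS-level backup -- snapshot the EBS volume backing the Mongo data dir.
0 3 * * * aws ec2 create-snapshot --volume-id vol-0123456789abcdef0 --description "nightly mongo backup"
```

Note that snapshotting a live volume without pausing writes can capture an inconsistent state; a dump-based backup avoids that at the cost of more I/O.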
Questions: