Closed ghost closed 5 years ago
The answer to this question depends on your scenario.
For general use the recommended EC2 instance type is m5.xlarge, which has 16 GB of memory. The memory allocation should be something like 4 GB to Snowstorm and 6 GB to Elasticsearch, with a few GB left free for disk-level caching, which Elasticsearch needs to work well.
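As a sketch, assuming Elasticsearch is installed from the standard distribution, the 6 GB allocation would be set in its JVM options file (the jar name below is illustrative):

```
# Elasticsearch: config/jvm.options (6 GB of the 16 GB)
-Xms6g
-Xmx6g
```

Snowstorm's 4 GB heap would then be set when launching it, e.g. `java -Xms4g -Xmx4g -jar snowstorm.jar`, keeping the initial and maximum heap sizes equal to avoid resizing pauses.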
As always, you should perform load testing for your expected usage and monitor the JVMs in production. The functions that take the most resources are RF2 import and some of the more complex ECL queries.
Snowstorm has been designed to scale horizontally, allowing additional pairs of Elasticsearch nodes and Snowstorm instances to be added to a cluster to deal with demand, but this has not yet been thoroughly tested. We know that some parts of the authoring functionality will not yet work in this configuration; that will be addressed in the first half of this year.
Thank you @kaicode
Hello @kaicode ,
One follow-up question: for just prototyping and internal testing, what's the minimal instance configuration you'd recommend? Very little traffic.
Regards, Vybhav
It may run on an instance with 8 GB of RAM. I would try giving Snowstorm 2560m and Elasticsearch 3g. Use the Elasticsearch configuration file config/jvm.options with the memory options -Xms3g and -Xmx3g. If it's a development machine you can use a spot instance to get a better price.
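Concretely, the Elasticsearch side of that split would look like this in its JVM options file:

```
# Elasticsearch: config/jvm.options on an 8 GB machine
-Xms3g
-Xmx3g
```

Snowstorm would then be started with a matching heap, e.g. `java -Xms2560m -Xmx2560m -jar snowstorm.jar` (jar name illustrative), leaving the remaining couple of GB for the OS and disk cache.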
Great. Thank you @kaicode . I'll give that a shot
Hello Team,
What is the recommended EC2 instance type to run the Snowstorm server?
Thank you, Vybhav