There's no clear answer to your questions. It depends on the amount and kind of data that you ingest, query, and store.
What can be said for sure: one Elasticsearch instance should not have more than 30 GB of heap. Elastic provides a pretty good starting point for sizing:

General: https://www.elastic.co/blog/found-sizing-elasticsearch
Heap sizing: https://www.elastic.co/guide/en/elasticsearch/reference/current/heap-size.html
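If you want to see what your nodes are currently using, the `_cat/nodes` API reports heap and RAM per node. A minimal sketch, assuming Elasticsearch is reachable on localhost:9200 (adjust host and credentials for your setup):

```sh
# List each node's configured max heap, current heap usage, and total RAM,
# so you can confirm no node exceeds the ~30 GB heap guideline.
curl -s 'localhost:9200/_cat/nodes?v&h=name,heap.max,heap.percent,ram.max'
```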
I would not start with nodes that have less than 4 GB of RAM: 2 GB for the heap and the rest for the filesystem cache.
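As a sketch of that 2 GB / 4 GB split, assuming a default install where the heap is set in `config/jvm.options`, you would pin the heap like this (keep `-Xms` and `-Xmx` equal so the heap never resizes at runtime):

```sh
# config/jvm.options (illustrative excerpt)
# 2 GB heap on a 4 GB node; the remaining RAM is left to the OS filesystem cache.
-Xms2g
-Xmx2g
```

Alternatively, the same settings can be passed via the `ES_JAVA_OPTS` environment variable, e.g. `ES_JAVA_OPTS="-Xms2g -Xmx2g"`.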
ok, thx
What kind of node is needed to support this configuration? Can nodes with 8 GB of RAM and 2 CPUs each handle this Elasticsearch configuration, or should I start with 16 GB, 32 GB, or even larger memory requirements?
Also, how much memory does a node require, and what should I expect? Memory use cannot grow to infinity, so will a node just slow down if it has to handle too many requests, or will it crash?
thanks