Open Tonnulus opened 1 year ago
The platform container does not support multi-threading because it's based on Node.js, which is not multi-threaded by design. However, you can deploy multiple instances of the platform container to split the load depending on your needs. You can also activate/deactivate some managers to split the load between platform instances.
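For reference, a multi-instance split like the one described above is typically done in docker-compose by running the platform image twice with different manager toggles. This is only a sketch: the service names are made up, and the `<MANAGER>__ENABLED` environment variable names follow OpenCTI's documented configuration convention but should be checked against your version's configuration reference.

```yaml
# Hypothetical sketch: two platform containers sharing the load.
# Variable names assume OpenCTI's <MANAGER>__ENABLED convention;
# verify them against the configuration docs for your version.
services:
  opencti-api:
    image: opencti/platform:latest
    environment:
      # This instance serves the UI/API only: background managers off.
      - RULE_ENGINE__ENABLED=false
      - HISTORY_MANAGER__ENABLED=false
      - TASK_SCHEDULER__ENABLED=false
  opencti-background:
    image: opencti/platform:latest
    environment:
      # This instance runs the background managers instead.
      - RULE_ENGINE__ENABLED=true
      - HISTORY_MANAGER__ENABLED=true
      - TASK_SCHEDULER__ENABLED=true
```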
Thank you for your answer. How can I activate/deactivate these managers?
About Node.js, are you sure? https://betterprogramming.pub/scaling-node-js-applications-with-multiprocessing-b0c25511832a
Use case
I would like to improve the performance of my OpenCTI platform, as I am facing scalability issues due to the limitations of the OpenCTI container.
Current Workaround
I have consulted the article https://blog.filigran.io/opencti-platform-performances-e3431b03f822, which has helped me optimize the performance of my platform. However, the OpenCTI container is currently not scalable enough to handle the expected increase in clients and connectors.
The OpenCTI container is the only one running on the VM; all other services are hosted on different VMs. For now, I don't have any performance issues on this VM. However, I have already noticed high CPU consumption when connectors are working, and I don't have a solution to improve the performance of the OpenCTI container.
In addition, I have noticed that the container does not seem to support multi-threading, which could be contributing to the performance limitations.
Proposed Solution
I would like to propose the implementation of multi-threading in the OpenCTI container to enhance its scalability and performance. It would also be interesting to explore whether the container could be broken down into smaller modules to improve its flexibility.
Additional Information
I would like to express my gratitude for the quality of the OpenCTI platform ;)