Closed: fandrei closed this issue 12 years ago
Can you quantify this?
(How many files lead to how much of a performance decrease?)
— David Laing, Open source @ City Index (github.com/cityindex), http://davidlaing.com, Twitter: @davidlaing
(via https://github.com/fandrei/AppMetrics/issues/88)
The worst problem occurred when the folder containing the files was open in a file manager: an update to any file triggered a refresh of the file list. With ~1500 files in the folder, this used about twice as much CPU.
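The growth fandrei describes can be checked with a quick timing sketch. This is a minimal, hypothetical benchmark (file counts and names are illustrative, not from the thread) that measures how the cost of listing a directory grows with the number of files in it:

```python
# Hedged sketch: how directory-listing cost grows with file count.
# File names and counts are illustrative, not from the original thread.
import os
import tempfile
import time

def time_listing(n_files, repeats=50):
    """Create n_files empty files in a temp dir and return the
    average time (seconds) of one os.listdir() call."""
    with tempfile.TemporaryDirectory() as d:
        for i in range(n_files):
            open(os.path.join(d, f"log_{i}.txt"), "w").close()
        start = time.perf_counter()
        for _ in range(repeats):
            os.listdir(d)
        return (time.perf_counter() - start) / repeats

if __name__ == "__main__":
    for n in (100, 500, 1500):
        print(f"{n:>5} files: {time_listing(n) * 1e6:.1f} us per listing")
```

A file manager does considerably more work than `os.listdir` (stat calls, icons, sorting), so the effect it shows would be larger, but the trend with file count is the same.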
Part of that performance problem will be to do with the EC2 instance size - we're on m1.small, which has the following specs:
Small Instance – default*
- 1.7 GB memory
- 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit)
- 160 GB instance storage
- 32-bit or 64-bit platform
- I/O Performance: Moderate
- API name: m1.small
In particular, note that I/O performance is throttled.
— David Laing (@davidlaing), 27 June 2012
It will grow with the file count anyway, but storing the archives in subfolders should be enough.
I've made the backup tool store zip archives hierarchically (the same structure as the S3 storage).
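The hierarchical layout fandrei describes amounts to mapping each S3 key onto a nested local path, so that no single directory accumulates thousands of archives. A minimal sketch of that mapping — the function name, base directory, and key format are hypothetical, not taken from the AppMetrics code:

```python
# Hedged sketch: mirror an S3 key's path segments as local subfolders,
# so archives are spread across a directory tree instead of one flat folder.
# Names and key layout are hypothetical, not from the AppMetrics backup tool.
import os

def local_archive_path(base_dir, s3_key):
    """Map an S3 key like 'logs/2012/06/27/node1.zip' to a nested local path."""
    parts = s3_key.split("/")
    return os.path.join(base_dir, *parts)

# Usage: each date component becomes its own subfolder.
path = local_archive_path("/var/backups", "logs/2012/06/27/node1.zip")
# e.g. '/var/backups/logs/2012/06/27/node1.zip' on POSIX systems
```

Because the key's slashes become directory levels, each leaf folder only holds the archives for one key prefix (here, one day), which sidesteps the flat-folder file-count problem discussed above.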