catdad opened this issue 8 years ago
@stezu Do you see the same thing, or is this a stupid Windows thing? `req` is a request for a 10-ish line text file from IIS... I'm sure Apache will work too.
Might be a false alarm. It looks to be an IIS issue. Here are the same results when requesting the same file from a NodeJS server:
```
D:\Git\grandma>grandma run req -d 20s -r 2500 --threads 1 | grandma report

Summary:
  duration  rate  concurrent  total
  20s       2500  null        50000

Latencies:
            mean     50       95       99       max
  fullTest  2.223ms  2.132ms  2.687ms  6.008ms  18.276ms

Successes: 50000
Failures: 0

D:\Git\grandma>grandma run req -d 20s -r 2500 --threads 2 | grandma report

Summary:
  duration  rate  concurrent  total
  20s       2500  null        50000

Latencies:
            mean     50       95       99       max
  fullTest  1.993ms  2.198ms  3.322ms  7.446ms  23.385ms

Successes: 50000
Failures: 0

D:\Git\grandma>grandma run req -d 20s -r 2500 --threads 3 | grandma report

Summary:
  duration  rate  concurrent  total
  20s       2500  null        50000

Latencies:
            mean     50       95       99       max
  fullTest  1.575ms  1.242ms  3.475ms  7.363ms  36.238ms

Successes: 50000
Failures: 0

D:\Git\grandma>grandma run req -d 20s -r 2500 --threads 4 | grandma report

Summary:
  duration  rate  concurrent  total
  20s       2500  null        50000

Latencies:
            mean     50       95       99       max
  fullTest  2.307ms  1.329ms  5.092ms  8.865ms  50.534ms

Successes: 50000
Failures: 0
```
4 threads still yields slightly slower times, but nowhere near as slow as the test running against IIS. CPU usage hit 100% with 2 threads against IIS, but only reached 80% with 4 threads against Node. This just further validates my choice to be a Node developer over working in C#.
For reference, the Node code (using the same text as the original share.txt file):
```javascript
var http = require('http');

http.createServer(function (req, res) {
  res.write('Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean sed ante eleifend, porta diam ut, feugiat felis. Nullam sollicitudin lacus ac tincidunt blandit. Maecenas posuere tellus sed est pellentesque hendrerit. Vestibulum luctus sem et dapibus fermentum. Vivamus non finibus odio, vitae facilisis nulla. Maecenas commodo tempor dolor, nec rutrum nunc. Duis id aliquam nisl, lacinia fermentum mauris. Suspendisse fringilla, augue aliquet dignissim vulputate, mauris metus blandit orci, non tempor mauris nisl a nunc. Vestibulum sodales molestie tempor. Duis vestibulum eros at finibus cursus. Cras commodo placerat augue. Morbi non varius enim, ac pulvinar libero. Cras luctus ullamcorper massa ut feugiat. Suspendisse potenti. Sed tristique vestibulum odio et tristique. Nulla facilisi. Nunc consectetur lacinia velit, sit amet molestie eros placerat sed. Aliquam sed placerat urna. Donec sed maximus ligula, ac hendrerit velit. Vivamus placerat et justo a pretium. Donec in aliquet tortor, consequat condimentum dui. Sed accumsan varius quam. Sed maximus a ex sed elementum. Donec enim odio, consequat quis viverra ac, tristique vel libero. In at gravida neque. Aenean id dolor et elit porta finibus. Quisque quis iaculis neque. Duis et sem neque. Nam consequat tincidunt risus, id lacinia sem luctus sed. Vestibulum accumsan leo vitae tellus tincidunt, ut dictum libero bibendum. Proin tincidunt cursus commodo. Suspendisse consectetur leo euismod nulla efficitur, at scelerisque velit pharetra. Fusce eget accumsan felis. Aenean magna urna, malesuada in magna id, viverra finibus nisl. Sed nec tellus a libero congue ultrices vel ac urna. Aenean hendrerit vulputate massa, sit amet bibendum tellus hendrerit bibendum. Morbi auctor suscipit metus sit amet feugiat. Sed viverra commodo sagittis. Integer maximus ex ac tortor cursus cursus. Cras auctor urna quis sapien tincidunt, a condimentum risus lacinia. Donec vitae aliquam nulla, nec cursus diam. ' +
    'Maecenas nunc justo, condimentum a iaculis cursus, tincidunt id purus. Donec vehicula volutpat neque dictum ultrices. In hac habitasse platea dictumst. Curabitur quis ante pretium, ornare dui id, porttitor arcu. Phasellus vitae velit malesuada, consectetur est non, vestibulum ex. Mauris in quam ut ligula faucibus volutpat sit amet interdum nulla. Nullam in est tincidunt, vulputate risus quis, ullamcorper mi. Etiam consectetur semper felis vitae feugiat. Vivamus auctor aliquam lacinia. Curabitur vitae facilisis mi, quis iaculis ante. Nam bibendum consequat lorem, sed luctus diam pulvinar in. Quisque id facilisis massa, vitae scelerisque leo. Duis dapibus magna nec nibh mollis ullamcorper.');
  res.end();
}).listen(81);
```
Well, using 0.0.22 on my MacBook Air, I get these kinds of results running the fixtures/req.js test against the Node server from your last comment:
```
> grandma run index -d 20s -r 1000 --threads 1 --directory . --out test.log

Summary:
  duration  rate  concurrent  total
  20s       1000  null        16975

Latencies:
            mean        50          95          99          max
  fullTest  1863.967ms  1920.467ms  3541.624ms  3778.187ms  6915.322ms

Successes: 16975
Failures: 0
```
Apache had similar results; the tests just took longer and longer as time went on. The first few were under 30ms and the last few were around 3800ms.
MacBook Potato?
Good point that I should also look at the graph, though. It seems that both the Node and IIS versions (4 threads) mostly hover around 2ms to 10ms, occasionally spiking to around 25ms. However, the IIS version has a bunch of requests at the very start that take 600ms, which skews all the numbers. IIS is stupid.
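That skew is easy to demonstrate with made-up numbers: a handful of slow warm-up requests drag the mean way up while barely moving the median. A quick sketch (all timings here are illustrative, not from the actual test run):

```javascript
// Nearest-rank percentile over a sorted array; a rough sketch, not
// necessarily how grandma computes its report values.
function percentile(sorted, p) {
  var idx = Math.min(sorted.length - 1, Math.floor(p / 100 * sorted.length));
  return sorted[idx];
}

// 95 "normal" requests in the 2-10ms range, plus 5 slow 600ms
// warm-up requests like the ones at the start of the IIS run.
var times = [];
for (var i = 0; i < 95; i++) {
  times.push(2 + (i % 9));
}
for (var j = 0; j < 5; j++) {
  times.push(600);
}
times.sort(function (a, b) { return a - b; });

var mean = times.reduce(function (a, b) { return a + b; }, 0) / times.length;

console.log('mean:', mean.toFixed(2), 'ms');         // dragged way up by the outliers
console.log('median:', percentile(times, 50), 'ms'); // stays in the normal range
```

Only 5% of the requests are slow, yet the mean lands far outside the range where nearly all requests actually fall, which is why the graph (or the 50th percentile) tells a truer story than the mean here.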
Do threads make things slower? I could see the messaging system having enough overhead to slow everything down.
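One cheap way to put a lower bound on that overhead: Node's child-process IPC JSON-serializes every message, so timing the serialize/parse round trip alone gives a rough floor on per-message cost. This micro-benchmark is illustrative only, and the payload shape is made up, not what grandma actually sends between threads:

```javascript
// Illustrative micro-benchmark: cost of JSON-serializing and re-parsing
// a small message, which Node's child_process IPC does for every send().
// The payload is a hypothetical result message, not grandma's real format.
var payload = { type: 'report', name: 'fullTest', latency: 2.223, success: true };
var count = 100000;

var start = process.hrtime();
for (var i = 0; i < count; i++) {
  JSON.parse(JSON.stringify(payload));
}
var diff = process.hrtime(start);
var totalMs = diff[0] * 1e3 + diff[1] / 1e6;

console.log('messages:', count);
console.log('avg serialize+parse (ms):', (totalMs / count).toFixed(6));
```

Serialization itself is usually sub-microsecond per small message, so if threads really are slower, the cost is more likely in the pipe/scheduling side of the IPC than in JSON itself.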