StatisticalMice / performance-tests

Web-based performance tests
MIT License

Genie v1 prod with own renderers #1

Closed essenciary closed 3 years ago

essenciary commented 3 years ago

This setup produced the best performance on my machine.

StatisticalMice commented 3 years ago

I thought about the logging. In my opinion, logging is necessary for a production server, so it should be retained. If it makes this slower, then it is slower.

StatisticalMice commented 3 years ago

I found it easier to combine some of your changes with my local changes and commit them separately.

essenciary commented 3 years ago

Of course re logging. What matters is that: a) there is choice (people can disable logging, log to a file, log to services, etc.); b) when comparing frameworks, they should run under similar conditions (e.g. number of threads, production settings if available, logging, etc.). Taking things further, Genie already provides a lot more features out of the box: for example, html(...) and json(...) both output the correct Content-Type, while HTTP.jl seems to output text for both endpoints. So the HTTP.jl test should be extended to include the correct headers (these should not be optional or omitted when returning HTML or JSON responses).
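To make the requirement concrete, here is a minimal, framework-neutral sketch (a plain WSGI app, not code from this repo) showing what "correct headers" means: the HTML and JSON endpoints each declare their own Content-Type explicitly, which is what a bare handler that defaults to text fails to do. Paths and payloads are illustrative.

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Toy WSGI app: each endpoint sets its own Content-Type explicitly."""
    path = environ.get("PATH_INFO", "/")
    if path == "/json":
        body = json.dumps({"hello": "world"}).encode()
        headers = [("Content-Type", "application/json")]
    else:
        body = b"<h1>Hello</h1>"
        headers = [("Content-Type", "text/html; charset=utf-8")]
    start_response("200 OK", headers)
    return [body]

def call(path):
    """Invoke the app in-process and return (status, headers, body)."""
    environ = {}
    setup_testing_defaults(environ)  # fill in a valid baseline environ
    environ["PATH_INFO"] = path
    captured = {}
    def start_response(status, headers):
        captured["status"], captured["headers"] = status, dict(headers)
    body = b"".join(app(environ, start_response))
    return captured["status"], captured["headers"], body
```

A benchmark harness could run a check like `call("/json")` against every framework under test and assert on the returned Content-Type, so all candidates do the same amount of response-building work.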

StatisticalMice commented 3 years ago

I created a ticket about the difference between Genie.jl and HTTP.jl.

StatisticalMice commented 3 years ago

I'm not sure what to do about the logging in tests. Maybe it should be disabled.

essenciary commented 3 years ago

Oh, and another thing re logging is that in production Genie logs to file (which is the right thing to do). So logging needs to be similar between frameworks (either disable file logging for Genie or enable for others).

Overall, the best approach would be to start from the requirements for the tests and make sure every framework meets them. These could be:

1. running in a production environment (as offered by each framework)
2. HTML, JSON and text responses (with proper Content-Type headers)
3. logging to the same backend (console and/or a text file)

essenciary commented 3 years ago

While set up similarly (same output), logging should not make any difference, given that both use Julia's logger. Disabling it should be fine (it will remove noise from both frameworks). Flask should run under the same conditions in terms of logging, though.

If any framework optimizes its logging (e.g. by batching writes), that would then be relevant, as it would be employing a different/smarter strategy (so maybe Flask?).
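For the Flask side, aligning logging across frameworks could look something like the sketch below: strip the framework's default handlers and either silence logging entirely or route everything to one shared backend. The function name and parameters are illustrative, not from this repo.

```python
import logging

def configure_benchmark_logging(logfile=None, disabled=False):
    """Give every framework under test the same logging backend.

    disabled=True mutes request logging entirely; otherwise logs go to
    `logfile` if given, else to the console. Illustrative sketch only.
    """
    root = logging.getLogger()
    for handler in list(root.handlers):  # drop framework/default handlers
        root.removeHandler(handler)
    if disabled:
        logging.disable(logging.CRITICAL)  # suppress all log records
        return
    handler = logging.FileHandler(logfile) if logfile else logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    root.addHandler(handler)
    root.setLevel(logging.INFO)
```

Calling this once at server startup (with the same arguments for each framework) would keep the logging workload comparable, whether the chosen policy is "disabled" or "file".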

StatisticalMice commented 3 years ago

Removing the logging sounds good, at least for the moment.

As to Flask, it's not a good test target. I spent only 15 minutes coding it, and the reason it exists was debugging problems when running on a Mac. Flask would also be the one benefiting most from an nginx in front, and nginx won't happen for a while at least.

StatisticalMice commented 3 years ago

I removed logging.

My plan is that I run the official benchmarks always on Google Cloud. I created terraform/ansible scripts that set up the environment. (They don't quite work yet; they install things but communication doesn't work possibly due to firewall rules.)
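If the communication failure really is firewall rules, the likely cause is that GCE blocks inbound traffic to VMs on non-default ports. A sketch of the kind of rule that would open the benchmark port (the rule name, tag, port and source range are all guesses to be adapted to the actual setup, e.g. via the terraform scripts rather than by hand):

```shell
# Hypothetical: allow the load-generator VM to reach the server VM on
# the benchmark port (8080 here; use whatever port the servers bind to).
gcloud compute firewall-rules create allow-bench-http \
    --network=default \
    --allow=tcp:8080 \
    --target-tags=bench-server \
    --source-ranges=10.128.0.0/9  # restrict to the internal VPC range

# The server instance must carry the matching network tag for the rule
# to apply, e.g.:
gcloud compute instances add-tags bench-vm --tags=bench-server
```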

Then I'd have shell scripts that start one of the servers, or the goose attack client. They worked locally, but not yet on the cloud.

I'll close this PR as it's no longer serving a useful purpose, but further PRs are welcome.