joedicastro opened 7 years ago
Well, in order to have a clear path forward, I'm starting a round of contacts with the providers about this project, in the open. Initially I will contact the providers currently reviewed in the comparison.
I suppose it works like a waterfall: as soon as a couple of them say yes, it will be easier to get more aboard. But maybe the problem is a lack of visibility: you have a great comparison, but not a shiny website (cheap template + domain) with some data, graphs, an explanation of the methodology, etc., à la http://www.hostbenchmarker.com/ http://www.hostbenchmarker.com/performance-testing
Maybe having something like that, publishing it again @ HN, and doing the mass mailing/contacting at the same time as publishing could help. Apart from that, you can always use affiliate links in the "Offers" section of the website, as long as you clearly state that "Affiliate links used to help pay server expenses".
Just my 2 cents.
@MarcosBL I'm definitely going to put this into a simple website; that's a relatively easy step. I'm going to keep evolving this project until that point for sure, and improve it a bit more. And if that works, I could go the whole way to an interactive website with all the bells and whistles.
But the key here for me is transparency. Getting the providers with the worst results on board would be very hard, I'm afraid, especially if I expose the logs for everyone to see.
Affiliate links would certainly work for the best providers, those that people would choose after seeing the comparison, but there would still be a good number of providers without that kind of support. I predict the distribution would look something like a Gaussian bell curve. And affiliate links would probably be the only option for those providers that won't work with me on this, but I still think they would fall short.
Thanks for sharing your thoughts!
@joedicastro We at @UpCloudLtd would love to be included in these comparisons and are willing to help you out in some way to make it happen. We already have a dynamic Ansible playbook that you could use.
The only caveat is that we don't provide any $5/mo plans. We would be happy to provide any credits you require, but we insist that the tests be run independently and as transparently as possible.
Feel free to drop me an email at jonathan.gabor@upcloud.com if you have any questions!
Hashes ( #3 ) are a good idea, but won't stop anyone from more directly messing with the results. So I'd say go with what science does: Peer review.
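For what it's worth, here's a minimal sketch of how the hash idea from #3 could look in practice. The `logs/` layout, file names, and `SHA256SUMS` manifest are my assumptions, not the project's actual tooling:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks to keep memory flat."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(logs_dir: Path, manifest: Path) -> None:
    """Write 'digest  relative-path' lines for every log file, sorted for stable diffs."""
    lines = [
        f"{sha256_of(p)}  {p.relative_to(logs_dir)}"
        for p in sorted(logs_dir.rglob("*")) if p.is_file()
    ]
    manifest.write_text("\n".join(lines) + "\n")

# Hypothetical usage: write_manifest(Path("logs"), Path("SHA256SUMS"))
```

Publishing the manifest at benchmark time (for example, in the same commit as the logs) is what makes later tampering detectable; the hashes alone don't prove who produced the logs, which is exactly where peer review comes in.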
Additionally one could
As for financing, you could of course also ask for donations to offset the costs, but I have no idea how effective that is.
Oh, and about providers not responding: a lot of them are on GitHub as well, so just include them when you upload their benchmarks; maybe they'll follow @UpCloudLtd's lead.
FYI: since my last comment, we have launched a $5/mo plan. 🙂
Announcement: https://www.upcloud.com/blog/introducing-new-5-mo-plan/
I'm getting a lot of requests to extend the comparison to other plans/providers (here, on HN, on Reddit, by mail, on Twitter, etc.). I would like to extend it, but there are a number of constraints. I'll try to address them here, and I would welcome your ideas and suggestions on this topic.
Currently I'm only benchmarking some providers and plans, all limited by two boundaries:
The reasons for this are mainly two: time and money.
Time is the easier factor to address here, because I can always automate the process further, and only some general data would have to be researched by hand. I'm still restricted by my spare time, but I think I can cover several new plans per week without too much hassle.
But costs, if I'm not wrong, could become a major problem in the short term. Performing those tests currently takes several hours, and sometimes shit happens and you have to repeat them. Also, you need more than one instance if you want to find one that is representative of the median values, with neither too good nor too poor performance. Thus, if you have several instances across several providers, and big instances cost $200 or more per month, this could add up to a pretty big number very soon.
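To put purely illustrative numbers on it (these are not the project's actual figures): keeping three instances each at five providers' $200/mo plans would already mean 3 × 5 × $200 = $3,000 per month, before counting repeated runs.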
At this moment, I only have three ideas to try to find a way to make this possible:
1. That the providers themselves would help me to review their servers, usually by giving me free credits for this. I actually have offers to do this from two different providers. The only concern here is keeping the tests independent, so I would keep running them myself once I have solved issue #3, to ensure that the numbers weren't cooked in any way.
2. Find a way to finance this. Here I'm totally lost; I don't even know whether people are interested in this to that extent.
3. You could run the benchmarks on your own plans and merge the numbers into the project. A pull request would be the best way to do it. Only the logs folder would be necessary (or the generated tables if you wish); I would put the tables and charts in the README.org to save you work. Here issue #3 is again a key factor to ensure that the numbers are truthful (see the sketch after this list).
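On the receiving end, here's a hypothetical sketch of how contributed logs could be checked against a published manifest before merging; again, the file layout and names are my assumptions, not the project's actual structure:

```python
import hashlib
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(logs_dir: Path, manifest: Path) -> bool:
    """Check every 'digest  relative-path' line of the manifest against the files on disk."""
    ok = True
    for line in manifest.read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        if sha256_of(logs_dir / name) != expected:
            print(f"MISMATCH: {name}")
            ok = False
    return ok

if __name__ == "__main__":
    # Hypothetical usage: python verify_logs.py logs/ SHA256SUMS
    sys.exit(0 if verify(Path(sys.argv[1]), Path(sys.argv[2])) else 1)
```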
I actually have several plans in mind to extend the comparison by myself, but beyond those I have nothing planned.
Any suggestion or idea is welcome; I would appreciate it a lot if you shared your thoughts about this with me.
Personally, I have a special interest in keeping this as independent, reproducible, transparent, and trustworthy as possible, mainly because I started this out of the frustration of not finding any service on the web that could guarantee me that.