joedicastro / vps-comparison

A comparison between some VPS providers. It uses Ansible to perform a series of automated benchmark tests over the VPS servers that you specify. It allows anyone who wants to compare these results with their own to reproduce the tests. All the test results are available in order to provide independence and transparency.
https://github.com/joedicastro/vps-comparison#automation
MIT License

Expand providers and plans to cover & future of the project. #22

Open joedicastro opened 7 years ago

joedicastro commented 7 years ago

I'm getting a lot of requests to extend the comparison to other plans/providers (here, on HN, on Reddit, by mail, on Twitter, etc.). I would like to extend it, but there are a series of constraints. I'll try to address them here, and I would welcome your ideas and suggestions about this topic.

Currently I'm only benchmarking against some providers and plans, all limited by two boundaries:

The reasons for this are mainly two: time and costs.

Time is the easier factor to address here, because I can always automate the process further, and only some general data would have to be researched by hand. I'm still restricted by my spare time, but I think that I can cover several new plans per week without too much hassle.

But costs, if I'm not wrong, could become a major problem in the short term. Performing those tests currently takes several hours, and sometimes shit happens and you have to repeat them. You also need more than one instance if you want to find one that is representative of the median values, without too good or too poor performance. Thus, if you have several instances across several providers, with big instances costing $200 or more per month, the bill could grow to a fairly big number very soon.
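To make that concrete, here is a rough back-of-the-envelope sketch; every figure in it is hypothetical, chosen only to illustrate how quickly the numbers multiply:

```python
# Back-of-the-envelope monthly cost estimate. Every number here is
# hypothetical, purely for illustration.
providers = 10          # providers under comparison
plans_per_provider = 3  # e.g. a small, a medium and a big plan each
instances_per_plan = 3  # extra instances to approximate the median performance
avg_monthly_price = 40  # USD; dragged upwards by the $200+ big plans

total = providers * plans_per_provider * instances_per_plan * avg_monthly_price
print(f"Estimated monthly bill: ${total}")  # Estimated monthly bill: $3600
```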

At this moment, I only have three ideas to try to find a way to make this possible:

I already have several plans in mind to extend the comparison myself, but beyond that I have nothing planned.

Any suggestion or idea is welcome; I would appreciate it a lot if you shared your thoughts about this with me.

Personally, I have a special interest in keeping this as independent, reproducible, transparent, and trustworthy as possible, mainly because I started this out of frustration at not finding any service on the web that could guarantee me this.

joedicastro commented 7 years ago

Well, in order to have a clear path forward, I'm starting a round of contacts with the providers, in the open, about this project. Initially I will contact the providers currently reviewed in the comparison.

  1. I contacted one of the current providers about this and the response was... no response. They avoided answering the question. I interpret it as: they are not going to stop me from doing the tests, but they are not going to support me in making them either.
MarcosBL commented 7 years ago

I suppose it's a waterfall: as soon as a couple of them say yes, it will be easier to get more aboard. But maybe the problem is a lack of visibility: you have a great comparison, but not a shiny website (cheap template + domain) with some data, graphs, etc., explaining the methodology, à la http://www.hostbenchmarker.com/ http://www.hostbenchmarker.com/performance-testing

Maybe having something like that, published again @ HN, and doing the mass mailing/contacting at the same time as publishing could help. Apart from that, you can always use affiliate links in the "Offers" section of the website, as long as you clearly state that "affiliate links are used to help pay server expenses".

Just my 2 cents.

joedicastro commented 7 years ago

@MarcosBL I'm definitely going to put this into a simple website; that's a relatively easy step. I'm going to keep evolving this project until that point for sure, and improve it a bit more. And if that works, I could go all the way to an interactive website with all the bells and whistles.

But the key here for me is transparency; getting the providers with the worst results on board would be very hard, I'm afraid, especially if I expose the logs for everyone to see.

Affiliate links would work for sure for the best providers, the ones that people would choose after seeing the comparison, but there would still be a good number of providers without that kind of support. I predict that it would look something like a Gaussian bell curve. Affiliate links would probably be the only solution for those providers that would not work with me on this, but I still think they would fall short.

Thanks for sharing your thoughts!

jgabor commented 7 years ago

@joedicastro We at @UpCloudLtd would love to be included in these comparisons and are willing to help you out in some way to make it happen. We already have a dynamic Ansible playbook that you could use.

The only caveat is that we don't provide any $5/mo plans. We would be happy to provide any credits you would require, but we insist that the tests should be run independently and as transparently as possible.

Feel free to drop me an email at jonathan.gabor@upcloud.com if you have any questions!

lcts commented 6 years ago

Hashes (#3) are a good idea, but they won't stop anyone from messing with the results more directly. So I'd say go with what science does: peer review.
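For illustration, publishing checksums of the raw logs (the idea behind #3) could be as simple as this minimal Python sketch; the `logs/` directory and `*.log` glob are assumptions, not the repo's actual layout:

```python
# Minimal sketch: emit SHA-256 checksums for every raw benchmark log, in the
# same format as the `sha256sum` tool, so anyone can verify the published
# files later. NOTE: the "logs/" directory and "*.log" glob are hypothetical.
import hashlib
from pathlib import Path

def sha256sum(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

for log in sorted(Path("logs").glob("**/*.log")):
    print(f"{sha256sum(log)}  {log}")
```

Of course, a matching checksum only proves a file wasn't altered after publication; it says nothing about how the benchmark itself was run, which is exactly where peer review comes in.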

Additionally, one could:

As for financing, you could of course also ask for donations to offset the costs, but I have no idea how effective that is.

lcts commented 6 years ago

Oh, and about providers not responding: for one thing, a lot of them are on GitHub as well. Just include them when you upload their benchmarks; maybe they'll follow @UpCloudLtd's lead.

jgabor commented 6 years ago

FYI: since my last comment, we have launched a $5/mo plan. 🙂

Announcement: https://www.upcloud.com/blog/introducing-new-5-mo-plan/