iheartradio / kanaloa

Make your service more resilient by providing protection against traffic oversaturation
http://iheartradio.github.io/kanaloa

Implement load shedding feature #101

Open kailuowang opened 8 years ago

kailuowang commented 8 years ago

In some cases load shedding might be more useful than backpressure. We should implement this based on a proven algorithm.

dt-rush commented 5 years ago

@kailuowang what do you take the difference to be between the two? I've only heard them used synonymously.

dt-rush commented 5 years ago

I can hazard a guess that you mean:

backpressure: dropping a proportion of incoming requests

load shedding: dropping a proportion of queued requests

Is that correct?

kailuowang commented 5 years ago

You got it right @dt-rush
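To make the distinction above concrete, here is a minimal sketch (in Python purely for illustration; the class and method names are hypothetical and not kanaloa's API). Both queues are bounded; the backpressure variant rejects the incoming request when full, while the shedding variant accepts it and drops an already-queued one instead:

```python
from collections import deque

class BackpressureQueue:
    """Bounded queue that rejects *incoming* requests when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()

    def offer(self, request):
        if len(self.queue) >= self.capacity:
            return False          # incoming request dropped at the door
        self.queue.append(request)
        return True

class SheddingQueue:
    """Bounded queue that drops *queued* (oldest) requests to make room."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()

    def offer(self, request):
        if len(self.queue) >= self.capacity:
            self.queue.popleft()  # oldest queued request shed
        self.queue.append(request)
        return True

bp, ls = BackpressureQueue(2), SheddingQueue(2)
for r in ("a", "b", "c"):
    bp.offer(r)
    ls.offer(r)
print(list(bp.queue))  # ['a', 'b'] -- newest arrival was rejected
print(list(ls.queue))  # ['b', 'c'] -- oldest queued entry was shed
```

Same drop rate under the same overload, but the two policies serve different populations: backpressure favors requests already waiting, shedding favors fresh ones (which often have more of their client-side timeout budget left).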

dt-rush commented 5 years ago

@kailuowang additionally, by "we need to implement this based on proved algo", do you mean, we should implement this, but only if we can have some proofs around it? Or are you implying that this is part of the IETF paper on bufferbloat which inspired this project? Because I don't see a mention of load shedding in the paper, only backpressure.

Is there a particular instance in which you can imagine that load-shedding would be more useful than the backpressure? I'm guessing you found some kind of edge case where dropping incoming requests alone does not tend to keep the queue within reasonable performance bounds.

kailuowang commented 5 years ago

@dt-rush What I meant is that we should seek one (or possibly more than one) load-shedding algorithm that is already battle-tested and published. I haven't done any research yet.

To be honest, a coworker suggested a use case for load shedding and I've since forgotten the details, but it was something along the lines of allowing users to send a per-request timeout and/or priority with each request. This would make the drop more selective, but obviously at a higher computational cost, so it's probably only useful when the actual workload per request is significantly higher.
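A sketch of that suggested use case, again in Python and with hypothetical `priority`/`deadline` fields (higher priority = more important; deadline is an absolute time). Rather than shedding blindly, it first discards queued requests whose deadlines have passed, then evicts the lowest-priority entry if the newcomer outranks it:

```python
import heapq
import time

class PriorityShedQueue:
    """Bounded queue that sheds expired requests first, then the
    lowest-priority queued request. Per-request `priority` and
    `deadline` are hypothetical fields, not kanaloa's API."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []            # (priority, seq, deadline, request)
        self.seq = 0              # tie-breaker preserving arrival order

    def offer(self, request, priority, deadline, now=None):
        now = time.monotonic() if now is None else now
        # Pass 1: requests past their deadline are worthless; shed them.
        self.heap = [e for e in self.heap if e[2] > now]
        heapq.heapify(self.heap)
        if len(self.heap) >= self.capacity:
            # Pass 2: the heap root is the lowest-priority queued entry.
            # Keep it and drop the newcomer unless the newcomer outranks it.
            if self.heap[0][0] >= priority:
                return False      # newcomer is the least important: shed it
            heapq.heappop(self.heap)
        self.seq += 1
        heapq.heappush(self.heap, (priority, self.seq, deadline, request))
        return True
```

For example, with capacity 2, offering requests with priorities 1, 9, then 5 evicts the priority-1 entry; a later priority-0 request is rejected outright. The extra heap maintenance and expiry scan is the "more computation cost" mentioned above, which is why this only pays off when per-request work dwarfs the queue bookkeeping.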