Open ohadschn opened 9 years ago
I've never thought about rate-limiting the `filteredRows` observable itself, but that's an interesting idea. Currently, in our production applications, if we have a large dataset and want incremental filtering, we create a secondary observable (say, `inputFilter`) and throttle/rate-limit it, using it to write to the data table's filter. Rate-limiting the underlying `filteredRows` observable would certainly make incremental changes easier for the user to implement.
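The secondary-observable workaround can be sketched without Knockout itself. Below is a minimal stand-in (the name `inputFilter` comes from the comment above; the debounce helper and `table` object are purely illustrative, not the library's API) using a manually flushed timer so the behavior is easy to follow:

```javascript
// Minimal sketch of the secondary-observable workaround described above.
// Instead of real Knockout observables and timers, this uses a hand-rolled
// debounce with a manually flushed "timer" so the flow stays synchronous.

function makeDebounced(write) {
  let pending = null;                  // last value seen during the quiet period
  return {
    set(value) { pending = value; },   // keystrokes land here, cheaply
    flush() {                          // "timer fires": forward once
      if (pending !== null) { write(pending); pending = null; }
    },
  };
}

// A stand-in for the data table: counts how often the real filter runs.
const table = {
  filterRuns: 0,
  filter(text) { this.filterRuns++; this.current = text; },
};

// inputFilter is the throttled secondary observable from the comment above.
const inputFilter = makeDebounced(t => table.filter(t));

// Simulate typing "Tokyo" one keystroke at a time...
for (const prefix of ['T', 'To', 'Tok', 'Toky', 'Tokyo']) {
  inputFilter.set(prefix);
}
inputFilter.flush(); // ...then the rate-limit window elapses.

console.log(table.filterRuns, table.current); // 1 'Tokyo'
```

The expensive filter runs once for the whole word instead of once per keystroke, which is exactly what the workaround buys on a 10K-row dataset.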
That is a clever workaround! However, it won't handle row-manipulation scenarios. For example, suppose the 10K city names are additional names you need to add to an existing repository. If the table is sorted, calling `addRecord` N times in a tight loop will result in N sort operations performed on the array, instead of one in the rate-limited case (even if the rate limit is zero).
That's a really great point. I'll get that added in whenever I have a chance.
TL;DR, I propose the following:

- `.extend({rateLimit: 0})` for `filteredRows`
- `.extend({rateLimit: this.options.filterRateLimit})` for `filter`
There is an issue with rate-limiting the `filteredRows` computed, viz. that if the rate limit is set high (for instance, high enough to feasibly handle the Tokyo problem detailed above), it causes a delay equal to the rate-limit interval when changing the other dependencies of `filteredRows` (e.g. sorting a field, thereby changing the `sortField` observable's value).

We can, however, use `.extend({rateLimit: 0})` to handle the tight-loop problem, as detailed here (at the end of Example 3: Avoiding multiple Ajax requests).
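The tight-loop effect of `rateLimit: 0` can be sketched the same way: writes only mark the computed dirty, and the single deferred recomputation happens once the loop yields, so N additions trigger one re-sort instead of N. The names here (`addRecord`, `recompute`, `flush`) are illustrative stand-ins, not the actual DataTable or KO API:

```javascript
// Sketch of the rateLimit-0 idea: each write only marks the computed dirty,
// and one deferred recomputation runs after the synchronous loop finishes.
// (Real KO schedules this with a timer; here flush() plays that role.)

let sortCount = 0;
const rows = [];

const filteredRows = {
  dirty: false,
  value: [],
  recompute() {                      // the expensive part: sort the whole array
    sortCount++;
    this.value = [...rows].sort();
  },
  notify() { this.dirty = true; },   // rateLimit: 0 defers instead of recomputing
  flush() { if (this.dirty) { this.recompute(); this.dirty = false; } },
};

function addRecord(name) {           // hypothetical DataTable-style API
  rows.push(name);
  filteredRows.notify();
}

// 10,000 additions in a tight loop...
for (let i = 0; i < 10000; i++) addRecord('city-' + i);
filteredRows.flush();                // ...one sort once the loop yields.

console.log(sortCount); // 1
```

Without the deferral, `recompute()` would have run 10,000 times, which is the N-sorts problem described earlier in the thread.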
I also propose that we do use a `rateLimit` extender for the `filter` observable, but perhaps it should be configurable. That way, if the dataset is small, the user can choose a small (even `0`) value, and if the dataset is large, the user can adjust accordingly.
Good thinking!
Instead of rate-limiting the `filteredRows` observable, though, perhaps the `rows` observable should be rate-limited? That would solve the tight-loop issue while still preserving immediate sorting changes (not that I expect much of a difference between "immediate" and "rate limit 0" in non-programmatic scenarios). As a bonus, it would prevent the redundant (albeit short) recalculations of `rowAttributeMap` too.
Also note that `rateLimit` was introduced in KO 3.1.0. I don't mind this dependency, just pointing it out in case you do (in which case `throttle` can be used for earlier versions).
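Worth noting when falling back: the two extenders aren't interchangeable. `throttle` (and `rateLimit` with `method: "notifyWhenChangesStop"`) is a debounce, i.e. every change restarts the timer, whereas the default `rateLimit` method (`notifyAtFixedRate`) notifies at most once per interval even while changes keep arriving. Here is a rough simulation of the two strategies driven by a manual clock; none of this is KO code, just the timing logic:

```javascript
// Rough simulation of the two notification strategies. `changeTimes` are the
// timestamps (ms) at which the observable is written; the function returns how
// many notifications subscribers would receive.

function simulate(method, changeTimes, interval) {
  let notifications = 0;
  let deadline = null;                     // when the pending notification fires
  for (const t of changeTimes) {
    if (deadline !== null && t >= deadline) {
      notifications++;                     // a timer elapsed before this change
      deadline = null;
    }
    if (method === 'notifyWhenChangesStop') {
      deadline = t + interval;             // debounce: every change restarts the timer
    } else if (deadline === null) {
      deadline = t + interval;             // fixed rate: first change starts the timer
    }
  }
  if (deadline !== null) notifications++;  // let the final pending timer fire
  return notifications;
}

// Ten changes 100 ms apart, with a 500 ms interval:
const changes = [0, 100, 200, 300, 400, 500, 600, 700, 800, 900];
console.log(simulate('notifyAtFixedRate', changes, 500));      // 2
console.log(simulate('notifyWhenChangesStop', changes, 500));  // 1
```

Under continuous typing, fixed-rate delivers intermediate updates every interval, while debounce stays silent until the user pauses — which is why the debounce-style behavior is usually the one you want for a filter box.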
I'm inclined to think that rate-limiting the `filteredRows` observable is still the better option. Consider an (obviously contrived) example of a synchronous loop of calls to `toggleSort`. We would then need to also rate-limit `toggleSort`. Rate-limiting `filteredRows` attacks the problem at the source and removes the need to rate-limit any other observables (aside from `filter`, for obvious reasons).
In that case, perhaps rate-limit both? I admit it's not that important though; either or both will work well in most real-world scenarios (I still give `rows` the edge for plausibility though ;) )
Binding an input's `textInput` to the table's `filter` property is very useful for incremental filtering. However, it may result in unnecessary processing of intermediate user input. For example, suppose you are bound to a list of 10K city names, and the user types in "Tokyo". On each keystroke ('T', 'o', 'k', and so forth), the filter is re-evaluated, which can be costly, especially on weaker machines.

In such cases it may be desirable to throttle changes to the `filteredRows` computed property. Fortunately, KO makes this extremely easy; it boils down to a single `extend` call. The desired throttling interval could be provided in the `DataTable` constructor (if not specified, no throttling would take place).

Deprecated API (for older KO versions): http://knockoutjs.com/documentation/throttle-extender.html
Current API: http://knockoutjs.com/documentation/rateLimit-observable.html