Describe the change
Optimise filtering of very large sets of choices by not computing the filter value inside the loop, and by caching each result set (clearing the cache completely when items are added).
Why are we doing this?
I'm using very large lists of choices (9000 items), and with an active filter it would take more than a second to redraw the menu when using the arrow keys or typing more of the filter.
Benefits
Large lists will perform better.
Drawbacks
Memory usage will increase due to caching the result sets.
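The change described above can be sketched roughly as follows. This is a hypothetical illustration, not the library's actual internals: the `Choices` class, the `filtered` method, and the substring-matching logic are all assumptions standing in for the real filtering code.

```ruby
# Sketch of the optimisation: hoist the filter-value computation out of
# the per-choice loop, and memoise each filtered result set so repeated
# redraws with the same filter are free.
class Choices
  def initialize(choices)
    @choices = choices
    @filter_cache = {} # filter string => cached result set
  end

  # Adding an item invalidates every cached result set,
  # since any of them could now be stale.
  def <<(choice)
    @choices << choice
    @filter_cache.clear
    self
  end

  def filtered(filter)
    @filter_cache[filter] ||= begin
      # Compute the downcased needle once, outside the loop,
      # instead of once per choice on every redraw.
      needle = filter.downcase
      @choices.select { |c| c.downcase.include?(needle) }
    end
  end
end
```

With this shape, pressing an arrow key redraws from the cached result set instead of re-filtering all 9000 items, at the cost of keeping one array per distinct filter string in memory until the cache is cleared.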
Requirements
Put an X between brackets on each line if you have done the item:
[ ] Tests written & passing locally?
[ ] Code style checked?
[ ] Rebased with master branch?
[ ] Documentation updated?
Coverage increased (+0.007%) to 97.146% when pulling 06ee6f0571322ab9bab11dc40074d9ef4428fc21 on kvs:patch-1 into d7bff789af04946f5f6a9ac8d54b3197e8997136 on piotrmurach:master.