MeanEYE / Sunflower

Small and highly customizable twin-panel file manager for Linux with support for plugins.
GNU General Public License v3.0

Long delay sorting file list with 1000+ items #490

Closed joshas closed 3 years ago

joshas commented 3 years ago

Opening a directory with 1000+ directories (yes, you guessed it, it was node_modules) takes about 20 seconds to sort, and the whole UI just freezes. I started digging through the code and ended up at this strange line: https://github.com/MeanEYE/Sunflower/blob/7225b7992374e736b2e6846d3a962d351e8bed38/sunflower/plugins/file_list/file_list.py#L1744-L1745

I'm not really sure what the intention was, or why such an arbitrary number as "100" was chosen here, but I have a feeling that the _flush_queue function does not work as intended. Once _item_queue reaches 100 items, a new _flush_queue thread is started for every additional item, and if I'm not mistaken, that ends up being 1000+ threads (minus 100).
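
Simplified sketch of how I read the code around those two lines (_item_queue and _flush_queue are the real names, everything else here is paraphrased for illustration):

```python
from threading import Thread


class FileListSketch:
    def __init__(self):
        self._item_queue = []

    def _flush_queue(self):
        # hand the queued items over to the list and clear the queue
        items, self._item_queue = self._item_queue, []
        print('flushing', len(items), 'items')

    def _add_item(self, item):
        self._item_queue.append(item)

        # with ">" this is true for EVERY item past the 100th, so a directory
        # with 1100 entries ends up starting roughly 1000 flush threads
        if len(self._item_queue) > 100:
            Thread(target=self._flush_queue).start()
```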

The next thing I did was increase that number from 100 to 2000. This gets the whole directory with 1000+ items sorted in 2 seconds.

MeanEYE commented 3 years ago

Ooh, it should be == instead of >. The queue should be flushed every 100 items or so; we definitely don't need to queue so many flushes. I'll fix it. The number 100 is arbitrary, something I came up with by feel. I have pushed the change, can you please test it? It should be better now.
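
Against the sketch in your comment, the change amounts to roughly this (not the actual diff):

```python
def _add_item(self, item):
    self._item_queue.append(item)

    # "==" fires once when the 100th item arrives instead of on every
    # item after that, so only a single flush gets queued per batch
    if len(self._item_queue) == 100:
        Thread(target=self._flush_queue).start()
```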

joshas commented 3 years ago

After the fix the _flush_queue function gets called only a single time. With exactly 1061 directories it should be called at least 10 times. While this fixes my issue with slow loading times, I wonder if _item_queue is not actually being cleared when running in a different thread?

MeanEYE commented 3 years ago

Since it's a queue, it's probably being populated while it's being emptied, and since we are looking for an exact number of items in the queue, the condition never triggers again. Probably a sub-optimal solution; handling items in batches is the desired effect.
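
Something along these lines would probably be more robust for the batching. Just a sketch, the _flush_pending flag doesn't exist in the current code:

```python
FLUSH_BATCH_SIZE = 100  # same arbitrary batch size as before

def _add_item(self, item):
    self._item_queue.append(item)

    # flush once a full batch has accumulated, but never stack up more than
    # one pending flush; _flush_queue would clear the flag after draining
    if len(self._item_queue) >= FLUSH_BATCH_SIZE and not self._flush_pending:
        self._flush_pending = True
        Thread(target=self._flush_queue).start()
```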