roym899 opened this issue 3 days ago
I hadn't anticipated navigating such a large directory; I doubt anyone could practically sift through 10,000 entries visually.
While this might not directly solve your issue, a "Fuzzy Finder" tool like fern-mapping-fzf.vim could be helpful.
Please note that a performance issue was previously reported with 2,000 entries (#508) and should be resolved, but I'm unsure if we can reliably handle 10,000+ entries.
@tomtomjhj, any thoughts?
With the script in #508, the current version of fern uses less than 100 MB for 10,000 files. It's not very memory-efficient, but it's not as bad as the OP describes. Please check that you are using the latest version.
However, it still takes several seconds to expand the directory. As briefly mentioned in #508, most of the cost comes from sorting with a vimscript function as the comparator. In general, vimscript is not fast enough to work with 10k+ items.
```
count    total (s)  self (s)  function
120419              2.205655  <SNR>149_compare()
 20000   1.337711   0.499651  <SNR>135_node()
 20000   0.404806   0.368561  <SNR>122_to_slash_unix()
     1   2.513686   0.307956  <SNR>145_sort()
 20000              0.263833  <SNR>145_new()
```
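To illustrate (this is a minimal standalone sketch, not fern's actual code): `sort()` with a Funcref comparator has to cross the vimscript function-call boundary O(n log n) times, while the default `sort()` compares entirely in C. The item names and comparator below are made up for the demonstration.

```vim
" Build 20,000 dummy paths (illustrative data, not fern's node list).
let s:items = map(range(20000), {_, i -> 'dir/file' . i})

function! s:Compare(a, b) abort
  " Called once per comparison; each call pays vimscript's
  " function-call and argument-marshalling overhead.
  return a:a ==# a:b ? 0 : a:a ># a:b ? 1 : -1
endfunction

let s:t = reltime()
call sort(copy(s:items), function('s:Compare'))
echomsg 'funcref comparator: ' . reltimestr(reltime(s:t))

let s:t = reltime()
call sort(copy(s:items))
echomsg 'built-in compare:   ' . reltimestr(reltime(s:t))
```

On my understanding, the first `sort()` is what shows up as `<SNR>149_compare()` dominating the profile above; the second stays in C and is orders of magnitude cheaper.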
One way to improve this would be to rewrite parts of fern in vim9script for Vim 9 and in Lua for Neovim, though the latter needs a bit more care because the Lua<->vimscript bridge incurs a fair amount of overhead.
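For the vim9script side, a comparator could look like the hedged sketch below: `:def` functions are compiled, so a comparator defined this way avoids most of the per-call interpretation overhead of a legacy `:function` (the names here are illustrative, not fern's).

```vim
vim9script
# Compiled comparator: type-checked and compiled once, so the
# per-comparison cost is much lower than a legacy funcref.
def Compare(a: string, b: string): number
  return a ==# b ? 0 : a ># b ? 1 : -1
enddef

var items = range(20000)->map((_, i) => 'dir/file' .. i)
sort(items, Compare)
```

The Neovim/Lua equivalent would sort entirely inside Lua and hand the finished list back once, rather than calling across the bridge per comparison.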
Thanks for making this; it's currently my go-to explorer. However, one thing is bothering me: sometimes I have folders with lots of files (say 10,000 or more). When expanding such a folder, Vim becomes unresponsive for a while and has eaten up a lot of memory (GBs) by the time the folder finally opens. I'm not sure what's going on here, but this doesn't seem to be expected behavior?
Happy to provide more information if this isn't easy enough to reproduce.