Open roarin-roran opened 2 years ago
Reuse is usually better; not creating an object at all is better still. But this seems like something worth profiling before spending time on it. The number of merges is O(n) and the objects are small.
sounds good! will leave it as small fry until we can profile with a range of inputs... or I get bored. could get us a little more speed in the cases which are particularly good for us (longer runs)
not creating an object may help with very small inputs (will probably make an issue about that actually - all of our methods seem to have huge overheads for really short lists), but one object per sort should still be better than O(n) of them
Profile it and see if it is relevant. Use random permutations, where we get the most merges, and a reasonable size, say 100k. We don't care to optimize sorting 1000 numbers.
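To put a rough number on it before touching the real harness, something like the sketch below isolates just the construct-vs-reuse difference for the O(n) merges a 100k-element sort would do. This is a hedged sketch: `TinyMerger` and its fields are placeholders, not our actual Merger, and a proper profile would still run the real sort on random permutations.

```python
# Back-of-the-envelope check: a sort of n random keys does O(n) merges, so
# time ~n constructions of a small throwaway object against ~n field updates
# on one reused object. TinyMerger is a stand-in, not the project's Merger.
import timeit


class TinyMerger:
    __slots__ = ("lo", "mid", "hi")   # keep the object as small as the real one

    def __init__(self, lo, mid, hi):
        self.lo, self.mid, self.hi = lo, mid, hi


N_MERGES = 100_000                    # roughly one merge per element at n = 100k


def fresh_object_per_merge():
    for i in range(N_MERGES):
        TinyMerger(i, i + 1, i + 2)   # construct, use once, discard


def reuse_one_object():
    m = TinyMerger(0, 0, 0)
    for i in range(N_MERGES):
        m.lo, m.mid, m.hi = i, i + 1, i + 2   # just update the run boundaries


print("fresh :", timeit.timeit(fresh_object_per_merge, number=10))
print("reuse :", timeit.timeit(reuse_one_object, number=10))
```

If the gap between the two numbers is negligible next to the sort's total runtime, this whole issue is moot.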
I suppose those can be special-cased at some point if someone cares about performance at low n... but doing that now would actively harm the dev process
currently when we want to merge something, we create a merger, use it once, and then discard it.
@sebawild, do you think it would be better to use one merger to reduce overheads (and just update the values the merger is using every time it's reused), or to create a Merger.merge() method that keeps things neat and tidy? I feel like I should do one or the other.
not a priority either way, but a small speedup from reducing overheads seems worth having - and if it isn't, then making the code cleaner and more uniform seems like a good habit.
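For reference, here is a minimal sketch of the two options on the table. The `Merger` name comes from this thread, but the method names (`merge_runs`, the classmethod `merge`), the buffer layout, and the run-boundary signature are placeholders rather than our actual implementation:

```python
# Option A: one long-lived Merger per sort; callers update run boundaries
#           per merge instead of constructing a new object each time.
# Option B: a classmethod that hides the construct-use-discard pattern, so
#           call sites stay "neat and tidy" even if an object is still made.


class Merger:
    def __init__(self, lst):
        self.lst = lst
        self.buffer = []           # scratch space, reused across merges

    def merge_runs(self, lo, mid, hi):
        """Merge the adjacent sorted runs lst[lo:mid] and lst[mid:hi] in place."""
        lst, buf = self.lst, self.buffer
        buf[:] = lst[lo:mid]       # copy the left run into the scratch buffer
        i, j, k = 0, mid, lo
        while i < len(buf) and j < hi:
            if buf[i] <= lst[j]:
                lst[k] = buf[i]; i += 1
            else:
                lst[k] = lst[j]; j += 1
            k += 1
        lst[k:k + len(buf) - i] = buf[i:]   # flush the rest of the left run

    @classmethod
    def merge(cls, lst, lo, mid, hi):
        # One-shot convenience wrapper: still one object per call, but the
        # construct-use-discard boilerplate lives in exactly one place.
        cls(lst).merge_runs(lo, mid, hi)


if __name__ == "__main__":
    # Option A: reuse one merger for every merge in the sort.
    data = [3, 5, 9, 1, 4, 8]
    m = Merger(data)
    m.merge_runs(0, 3, 6)
    print(data)                    # [1, 3, 4, 5, 8, 9]

    # Option B: one-shot classmethod, same result, one object per call.
    data2 = [2, 7, 0, 6]
    Merger.merge(data2, 0, 2, 4)
    print(data2)                   # [0, 2, 6, 7]
```

Option A is the one that could actually save allocations; Option B only tidies the call sites, which may still be worth doing if the profile says the allocations don't matter.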