Open HaphLife opened 3 years ago
I have a bad feeling this would slow down the search significantly.
Unless my math is wrong, it would take a bit more than double the amount of time, and a slight bit more memory. (It would have the delay of 2x the first search + 3x the second search + 4x the third search + 5x the fourth search, etc., but really only the first and second searches matter much.) I could also just be an idiot and it could take way longer.
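As a sanity check on that arithmetic, here is a minimal sketch of the cost model, assuming (as described later in the thread) that a run excluding input i can reuse every search layer before layer i. The function name is mine, not from the cracker:

```python
def total_runs_per_layer(n):
    """For n inputs, how many times layer k (0-indexed) runs in total:
    once for the normal search, plus once for each excluded input j <= k
    (excluding input j forces re-running layers j..n-1)."""
    return [1 + (k + 1) for k in range(n)]

print(total_runs_per_layer(4))  # [2, 3, 4, 5]
```

This reproduces the 2x/3x/4x/5x pattern above; since each successive layer filters the candidate set down sharply, only the first couple of multipliers dominate the total runtime.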
Memory would also be a significant issue. Also, the same argument applies as here: https://github.com/Earthcomputer/EnchantmentCracker/issues/244#issuecomment-766476473
Yes. In the current design, it only hangs on to the first search's results until the second search happens, but this would require doing two second searches against the same first search.
Ugh. I should write some code to show what I mean.
The worst-case memory usage is 1x first-search results plus 2x second-search results, rather than 1x first-search results plus 1x second-search results.
Search_example.zip Here is a zipped .py file with an example of most of the search structure I am describing, plus code around it to "emulate" the rest of the cracker. The main functions of note are "search_repair_single_error()" and, more importantly, its subfunction "repair_layer".
The way skipping is implemented is a bad, bad hack, because I had an oversight in how my structure worked while writing this file (the concept of error correction as a whole isn't flawed). I'm doing a rewrite to fix it.
C: Perhaps the XP seed cracker could take an extra input or two, and run the search multiple times, excluding each possible input once. (You can re-use the results of the first search for every run except the one where the first input is excluded, and likewise with the second input, the third, and so on, reducing the searching required to only about double that of a normal search.) This way, the user would only have to add a few extra data points when XP seed cracking fails, rather than double-checking every single input.
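To make the idea concrete, here is a hypothetical sketch of the leave-one-out search with prefix reuse. None of this is the cracker's real API: `search_layer` is a stand-in for one filtering pass that narrows the candidate seed set against one user-entered data point (here it just keeps multiples of the observation, purely for illustration):

```python
def search_layer(candidates, observation):
    # Placeholder filter: keep candidates "consistent" with the observation.
    # The real cracker would test each candidate seed against the data point.
    return [c for c in candidates if c % observation == 0]

def leave_one_out_search(initial, observations):
    """Run the search n+1 times: once normally, and once with each
    observation excluded, reusing the shared prefix of layers."""
    results = {}
    # prefix[k] = candidate set after applying observations[0..k-1],
    # computed once and shared by every run that keeps those inputs.
    prefix = [initial]
    for obs in observations:
        prefix.append(search_layer(prefix[-1], obs))
    results["normal"] = prefix[-1]
    for skip in range(len(observations)):
        # Reuse everything before the skipped layer, redo only the rest.
        cands = prefix[skip]
        for obs in observations[skip + 1:]:
            cands = search_layer(cands, obs)
        results[f"skip {skip}"] = cands
    return results
```

If the normal search comes up empty but exactly one "skip i" run succeeds, input i was most likely the erroneous data point, which is the error-correction behavior being discussed.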