[ ] We need data on how much time and memory it takes, on a large setup, to compute the differences between two arbitrary loadouts.
[ ] We need data on how much time and memory it takes, on a large setup, to compute the differences between a loadout state and what is on the filesystem.
We need data on the entire process, including steps such as loading the data from disk and computing the flattened loadout, not just the comparison itself.
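As a rough sketch of how the first measurement could be set up, the snippet below times and profiles a diff between two flattened loadouts. The loadout representation (a mapping of file path to content hash) and the `make_loadout`/`diff_loadouts` helpers are hypothetical stand-ins for illustration, not the app's actual data model or API:

```python
import time
import tracemalloc

def make_loadout(n_files, changed=0):
    # Hypothetical flattened loadout: file path -> content hash.
    loadout = {f"mods/mod_{i}/file_{i}.dat": f"hash_{i}" for i in range(n_files)}
    for i in range(changed):
        loadout[f"mods/mod_{i}/file_{i}.dat"] = f"hash_{i}_changed"
    return loadout

def diff_loadouts(a, b):
    # Files added, removed, or whose hash differs between the two loadouts.
    added = b.keys() - a.keys()
    removed = a.keys() - b.keys()
    changed = {k for k in a.keys() & b.keys() if a[k] != b[k]}
    return added, removed, changed

def measure(n_files, n_changed):
    # Measure only the compare step; building the loadouts would be timed
    # separately when profiling the entire process.
    a = make_loadout(n_files)
    b = make_loadout(n_files, changed=n_changed)
    tracemalloc.start()
    t0 = time.perf_counter()
    added, removed, changed = diff_loadouts(a, b)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed, peak, len(changed)

elapsed, peak, n_changed = measure(100_000, 500)
print(f"{elapsed:.3f}s, peak {peak / 1e6:.1f} MB, {n_changed} changed files")
```

Running `measure` at several sizes (10k, 100k, 1M files) would give the scaling data needed to judge whether continuous recomputation is feasible.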
Impact
The data is required to determine how feasible it would be to compute these continuously.
The outcome will inform the Design team of what data can be available at any moment and what might instead require a delay to compute.
Examples would be:
- Number of file changes that need to be applied
- Size of data to write/delete on disk on apply
- Which mods contain unapplied changes
- Which mods contain unapplied changes after switching profiles
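For the second measurement, comparing a loadout state against what is actually on disk, a minimal sketch could walk the game directory and hash each file. The `diff_against_disk` helper and the path-to-digest representation are assumptions for illustration only; the hashing here dominates the cost, which is exactly the part worth profiling:

```python
import hashlib
from pathlib import Path

def hash_file(path, chunk_size=1 << 20):
    # Stream the file in chunks so memory stays flat for large files.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def diff_against_disk(expected, root):
    # expected: relative path -> sha256 hex digest from the loadout state.
    root = Path(root)
    on_disk = {str(p.relative_to(root)): hash_file(p)
               for p in root.rglob("*") if p.is_file()}
    missing = expected.keys() - on_disk.keys()   # in loadout, not on disk
    extra = on_disk.keys() - expected.keys()     # on disk, not in loadout
    changed = {k for k in expected.keys() & on_disk.keys()
               if expected[k] != on_disk[k]}
    return missing, extra, changed
```

In practice the full-hash walk could be compared against a cheaper first pass (size and mtime checks) to see how much of the cost a change-detection shortcut would avoid.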
Supporting Information
See #421, #918