Closed jeremiah-c-leary closed 3 months ago
Converting my configuration from YAML to JSON took one execution down from 2.768s to 2.219s, a gain of 20%! -- sample size of 1; the plural of 'anecdote' is 'data'.
Now, where did I leave that python profiler...
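For anyone following along, here is a minimal sketch of that kind of profiling with the standard-library cProfile module. The apply_rules stub below is a stand-in for the real vsg call, not the actual source:

```python
import cProfile
import io
import pstats

def apply_rules():
    # Stand-in workload; replace with the real vsg apply_rules invocation.
    return sum(i * i for i in range(100000))

profiler = cProfile.Profile()
profiler.enable()
apply_rules()
profiler.disable()

# Print the 10 most expensive functions by cumulative time.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(10)
print(stream.getvalue())
```

Sorting by cumulative time is what surfaces the kind of call-chain chart described below, since parent functions accumulate the time of everything they call.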
Wow, that is significant. You would think YAML, with fewer characters to process, would be faster.
I profiled the apply_rules function, which contains the majority of the features, and came up with this chart:
rule_list.py(fix) calls rule.py(analyze) and then rule.py(fix)
rule_list.py(check_rules) calls rule.py(analyze)
It is interesting that analyze is not called twice as much as fix.
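A toy model of that call graph shows why one might expect analyze to run roughly twice as often as fix when both paths execute. This is an illustrative sketch, not the actual vsg classes:

```python
# Hypothetical mock of the call relationships described above.
class Rule:
    def __init__(self):
        self.analyze_calls = 0
        self.fix_calls = 0

    def analyze(self, tokens):
        self.analyze_calls += 1

    def fix(self, tokens):
        # fix first analyzes to find violations, then repairs them
        self.analyze(tokens)
        self.fix_calls += 1

class RuleList:
    def __init__(self, rules):
        self.rules = rules

    def fix(self, tokens):
        for rule in self.rules:
            rule.fix(tokens)

    def check_rules(self, tokens):
        for rule in self.rules:
            rule.analyze(tokens)

rule = Rule()
rule_list = RuleList([rule])
rule_list.fix([])          # analyze + fix
rule_list.check_rules([])  # analyze only
print(rule.analyze_calls, rule.fix_calls)  # 2 1
```

In this simplified model each fix pass triggers one analyze, and the check pass adds another, so a different observed ratio suggests some rules take a path this sketch doesn't capture.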
vhdlFile.py(_process_File) parses the file and classifies all the tokens so they can be analyzed.
The rows highlighted in orange are a series of function calls from the first to the last, with the last one taking the majority of the time. These functions attempt to find inner pairs of tokens to ensure nested productions are detected correctly. I have tried several methods to improve the performance, but none have panned out.
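One linear-time approach to that inner-pair problem is a single stack-based pass, assuming the pairs nest properly. The function name and token representation here are made up for illustration; vsg's real token classes differ:

```python
def find_inner_pairs(tokens, open_token, close_token):
    """Return (open_index, close_index) pairs, innermost first,
    using one O(n) pass with a stack instead of rescanning."""
    pairs = []
    stack = []
    for index, token in enumerate(tokens):
        if token == open_token:
            stack.append(index)
        elif token == close_token and stack:
            # The most recent unmatched open is this close's partner.
            pairs.append((stack.pop(), index))
    return pairs

print(find_inner_pairs(list("(a(b)c)"), "(", ")"))  # [(2, 4), (0, 6)]
```

If the current implementation rescans the token list for each candidate pair, replacing that with a single stack pass like this can turn quadratic behavior into linear, which may be worth trying if it fits the token model.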
--Jeremy
There are inefficiencies in the code. If these issues are fixed, then performance will improve.
Consider:
lower()
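If the suggestion is about repeated str.lower() calls in hot loops (plausible for a VHDL tool, since VHDL is case-insensitive), one common fix is to lowercase each string once and reuse the result. The loop below is hypothetical, not taken from vsg:

```python
words = ["Entity", "ARCHITECTURE", "Signal"] * 1000

def repeated_lower():
    # Calls .lower() on every comparison, every iteration.
    return sum(1 for w in words if w.lower() == "signal")

def hoisted_lower():
    # Lowercase once up front, then compare the cached copies.
    lowered = [w.lower() for w in words]
    return sum(1 for w in lowered if w == "signal")

print(repeated_lower(), hoisted_lower())  # 1000 1000
```

For token values that repeat many times across a file, memoizing with a dict (or functools.lru_cache) gives the same effect without restructuring the loop.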