Closed grayaii closed 8 years ago
I've been trying various tools to get the cyclomatic complexity of my project, and I found that radon takes around 2.5 hours to run through it, whereas other tools like mccabe and lizard take around 5 minutes. I'm just curious whether there is an underlying performance issue, or whether you know a way to speed up the execution. Thanks for writing this tool.

The problem is that you are not comparing the same thing. The other tools only compute cyclomatic complexity, so you should compare them with radon cc only.

To compute cyclomatic complexity, Radon uses the Python AST parser, which is written in C and is therefore quite fast. To compute raw metrics, on the other hand, Radon needs the source code tokens. For that it uses the tokenize module from the standard library, which is written in Python and really slow compared to AST parsing. I'm willing to bet that the majority of those 2.5 hours are spent in radon raw. Let me know.

You're right! The cc command is lightning fast, but the raw command is super slow.

Yeah. Unfortunately there's no simple way to work around the slowness of the tokenize module.
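The speed gap between the two approaches can be seen directly by timing them on the same source. This is a rough sketch, not part of radon itself; it builds a synthetic module and compares ast.parse (backed by CPython's C parser) against the tokenize module (pure Python at the time this issue was filed; newer Python versions have since sped it up, so the exact ratio varies):

```python
import ast
import io
import time
import tokenize

# Build a reasonably large synthetic module so timings are measurable.
source = "\n".join(
    f"def f{i}(x):\n    if x > {i}:\n        return x * {i}\n    return x"
    for i in range(2000)
)

# AST parsing: what radon cc relies on.
t0 = time.perf_counter()
tree = ast.parse(source)
ast_time = time.perf_counter() - t0

# Tokenization: what radon raw relies on.
t0 = time.perf_counter()
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))
tok_time = time.perf_counter() - t0

print(f"ast.parse: {ast_time:.4f}s")
print(f"tokenize:  {tok_time:.4f}s")
```

On the Python versions contemporary with this thread, the tokenize pass is typically several times slower than the AST pass, which is consistent with radon raw dominating the runtime.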