A friend told me about pigaios after I had considered implementing something similar. Despite the 4 years since the last commit, I was surprised to see that pigaios almost worked out of the box!
I managed to install the old Clang 5.0 on Ubuntu 18.04, so the "-export" features worked right away. The IDA part, though, needed an update, so I got my hands dirty and tried to keep the code as close as possible to what you did originally.
Two potential issues though:
I had to update this snippet, but I'm not sure what you were trying to do in the first place, so I can't fully test it:
```diff
-    ti = GetTinfo(f)
-    if ti:
-      prototype2 = idc_print_type(ti[0], ti[1], func_name, PRTYPE_1LINE)
+    rv = get_local_tinfo(f)
+    prototype2 = ""
+    if rv is not None:
+      (typei, fields) = rv
+      if typei:
+        prototype2 = idc_print_type(typei, fields, func_name, PRTYPE_1LINE)
```
The pickle file was created with a dependency on an old sklearn version, so I recreated it by re-training the model myself.
I followed the instructions in https://github.com/joxeankoret/pigaios/issues/19, but I expected the output to also mention the other classifiers besides the Decision Tree one.
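On the sklearn point, here is a minimal stdlib-only sketch of why the old pickle breaks and why re-training with the installed version fixes it. The `ModelV1` class is a hypothetical stand-in for sklearn's internal classes, nothing pigaios-specific:

```python
import pickle

class ModelV1:
    """Hypothetical stand-in for a model class whose internals
    change between library releases (as sklearn's do)."""
    def __init__(self):
        self.tree_ = [0, 1, 2]

# Serialize a model; the pickle records the class by module + name.
blob = pickle.dumps(ModelV1())

# Simulate a library upgrade that removed/renamed the class.
saved = ModelV1
del globals()["ModelV1"]
try:
    pickle.loads(blob)
    broke = False
except AttributeError:
    # This is essentially the error you hit with a version-mismatched pickle.
    broke = True

# Restoring the matching class definition (analogous to re-training and
# re-dumping the model with the installed sklearn) makes the bytes load again.
globals()["ModelV1"] = saved
obj = pickle.loads(blob)
print(broke, obj.tree_)
```

So the pickle itself isn't portable across sklearn versions; regenerating it against the installed version is the robust fix.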
Hey @joxeankoret
So, there you go. It's a really nice project btw!