ChatAFLndss / ChatAFL

Large Language Model guided Protocol Fuzzing (NDSS'24)
Apache License 2.0

Reproduction confirmation #9

Closed adcf3016 closed 3 weeks ago

adcf3016 commented 1 month ago

Recently, I have been trying to reproduce your results and encountered a few issues:

After running for a while, the OpenAI token count stops increasing (for example, in a one-hour run, it stops changing around the 40 to 50 minute mark). I would like to ask if this is normal.

Below are the results of executing ./run.sh 5 60 forked-daapd chatafl. Other reproductions show almost the same behavior. I would like to ask whether it is normal that coverage stops increasing afterwards.

[attached plots: cov_over_time_forked-daapd, state_over_time_forked-daapd]

Thank you very much for taking the time to reply.

Marti2203 commented 1 month ago

Hi, the reason for this is that, by default, the fuzzer is set to make at most a fixed number of calls to the LLM when coverage plateaus. We limit this with the variable CHATTING_THRESHOLD. You can increase it to change this behavior.