protectai / vulnhuntr

Zero shot vulnerability discovery using LLMs
GNU Affero General Public License v3.0

Support for local ollama APIs? #9

Closed: claythearc closed this issue 3 weeks ago

claythearc commented 1 month ago

Is there interest in supporting this? I don't mind helping with the implementation. It would be beneficial for commercial customers who want to use it on internal code bases with a 70B model or similar, without leaking secrets to The Man.
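For context, "local ollama API" here means a self-hosted Ollama server listening on localhost, so prompts and code never leave the machine. A minimal sketch of what such a client call looks like, using Ollama's documented `/api/generate` endpoint; the model tag and prompt are placeholders, and this is not vulnhuntr's actual client code:

```python
import json
import urllib.request

# Default local Ollama endpoint; nothing is sent to an external service.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a single (non-streaming) Ollama completion."""
    return {
        "model": model,    # local model tag, e.g. a quantized 70B
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a token stream
    }

def query_ollama(model: str, prompt: str) -> str:
    """POST the request to the local Ollama server and return its text output."""
    data = json.dumps(build_ollama_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the endpoint is local, the same pattern works behind an air gap, which is the whole point of the request above.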

DanMcInerney commented 1 month ago

We can do that. Shouldn't be much of a change.

DanMcInerney commented 3 weeks ago

I added Ollama support and custom model names so you can use local models. Unfortunately, I don't have the resources to run the larger models with large context windows and high parameter counts, like Llama3.2, so I tested on some smaller models. None of them were capable of following the very specific response-formatting instructions, so I don't know how useful this will be until the smaller models improve by a lot or someone hosts a public Llama3.2 with API access. There may also be bugs in the Ollama integration that I can't fully debug, since none of the smaller models get past response validation.