JusticeRage / Gepetto

IDA plugin which queries language models to speed up reverse-engineering
GNU General Public License v3.0
2.87k stars 263 forks

IDA 9.0 doesn't seem to work properly with the CLI #44

Closed zhefox closed 2 months ago

zhefox commented 2 months ago

(two screenshots attached)

JusticeRage commented 2 months ago

I developed this version against IDA 9.0 and can confirm that it works, at least on my machine. Is there any error message in the console?

zhefox commented 2 months ago

It seems the model name I had edited earlier was wrong, which is why the IDC functions weren't registered. I've fixed that and chat works normally now. However, the error below still occurs:

    Exception in thread Thread-2 (do_generate_model_select_menu):
    Traceback (most recent call last):
      File "C:\Users\ZHEFOX\AppData\Local\Programs\Python\Python310\Lib\threading.py", line 1016, in _bootstrap_inner
        self.run()
      File "C:\Users\ZHEFOX\AppData\Local\Programs\Python\Python310\Lib\threading.py", line 953, in run
        self._target(*self._args, **self._kwargs)
      File "C:\Program Files/IDA Professional 9.0/plugins\gepetto\ida\ui.py", line 114, in do_generate_model_select_menu
        self.detach_actions()
      File "C:\Program Files/IDA Professional 9.0/plugins\gepetto\ida\ui.py", line 99, in detach_actions
        for model in provider.supported_models():
      File "C:\Program Files\IDA Professional 9.0\plugins\gepetto\models\local_ollama.py", line 28, in supported_models
        OLLAMA_MODELS = [m["name"] for m in create_client().list()["models"]]
      File "C:\Users\ZHEFOX\AppData\Local\Programs\Python\Python310\lib\site-packages\ollama\_client.py", line 465, in list
        return self._request('GET', '/api/tags').json()
      File "C:\Users\ZHEFOX\AppData\Local\Programs\Python\Python310\lib\site-packages\ollama\_client.py", line 75, in _request
        raise ResponseError(e.response.text, e.response.status_code) from None
    ollama._types.ResponseError

It still seems to load Ollama by default. (screenshot attached)

JusticeRage commented 2 months ago

I'm sorry, the Chinese part of your post is not clear for me (I don't speak the language). Can you clarify which missing setting caused the exception?

zhefox commented 2 months ago

When the plugin loads, it enables the Ollama provider by default. Since I have not deployed Ollama, the request fails, but the try/except only catches connection errors and not the client's ResponseError, so the exception escapes and the error message is displayed.

fixed:

    import httpx as _httpx  # import aliases assumed to match the usage below
    import ollama

    def supported_models():
        global OLLAMA_MODELS

        if OLLAMA_MODELS is None:
            try:
                OLLAMA_MODELS = [m["name"] for m in create_client().list()["models"]]
            except _httpx.ConnectError:
                OLLAMA_MODELS = []
            except ollama._types.ResponseError:  # also catch the client's ResponseError
                OLLAMA_MODELS = []
        return OLLAMA_MODELS
JusticeRage commented 2 months ago

This is weird. Which version of the ollama package for Python are you using? I tested the script on a machine without Ollama and it didn't cause this exception.

EDIT: I cannot reproduce this error, but the fix seems harmless enough. I'll push ASAP, thanks!