onuratakan / gpt-computer-assistant

gpt-4o for windows, macos and linux
MIT License
4.75k stars · 441 forks

error running after clean install #91

Open svankirk opened 3 weeks ago

svankirk commented 3 weeks ago

After ensuring a clean install with the latest Python 3.12.4, I cannot run `computerassistant`. I get the following error stack. I have gone through all the dependencies as well as I am able, but haven't found anything obvious. Any idea what might be wrong?

```
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "M:\Software\python\Scripts\computerassistant.exe\__main__.py", line 4, in <module>
  File "M:\Software\python\Lib\site-packages\gpt_computer_assistant\__init__.py", line 5, in <module>
    from .tooler import Tool
  File "M:\Software\python\Lib\site-packages\gpt_computer_assistant\tooler.py", line 2, in <module>
    from langchain.tools import tool
  File "M:\Software\python\Lib\site-packages\langchain\tools\__init__.py", line 23, in <module>
    from langchain_core.tools import BaseTool, StructuredTool, Tool, tool
  File "M:\Software\python\Lib\site-packages\langchain_core\tools.py", line 34, in <module>
    from langchain_core.callbacks import (
  File "M:\Software\python\Lib\site-packages\langchain_core\callbacks\__init__.py", line 22, in <module>
    from langchain_core.callbacks.manager import (
  File "M:\Software\python\Lib\site-packages\langchain_core\callbacks\manager.py", line 29, in <module>
    from langsmith.run_helpers import get_run_tree_context
  File "M:\Software\python\Lib\site-packages\langsmith\run_helpers.py", line 40, in <module>
    from langsmith import client as ls_client
  File "M:\Software\python\Lib\site-packages\langsmith\client.py", line 52, in <module>
    from langsmith import env as ls_env
  File "M:\Software\python\Lib\site-packages\langsmith\env\__init__.py", line 3, in <module>
    from langsmith.env._runtime_env import (
  File "M:\Software\python\Lib\site-packages\langsmith\env\_runtime_env.py", line 10, in <module>
    from langsmith.utils import get_docker_compose_command
  File "M:\Software\python\Lib\site-packages\langsmith\utils.py", line 31, in <module>
    from langsmith import schemas as ls_schemas
  File "M:\Software\python\Lib\site-packages\langsmith\schemas.py", line 69, in <module>
    class Example(ExampleBase):
  File "M:\Software\python\Lib\site-packages\pydantic\v1\main.py", line 286, in __new__
    cls.try_update_forward_refs()
  File "M:\Software\python\Lib\site-packages\pydantic\v1\main.py", line 807, in try_update_forward_refs
    update_model_forward_refs(cls, cls.__fields__.values(), cls.__config__.json_encoders, localns, (NameError,))
  File "M:\Software\python\Lib\site-packages\pydantic\v1\typing.py", line 554, in update_model_forward_refs
    update_field_forward_refs(f, globalns=globalns, localns=localns)
  File "M:\Software\python\Lib\site-packages\pydantic\v1\typing.py", line 520, in update_field_forward_refs
    field.type_ = evaluate_forwardref(field.type_, globalns, localns or None)
  File "M:\Software\python\Lib\site-packages\pydantic\v1\typing.py", line 66, in evaluate_forwardref
    return cast(Any, type_)._evaluate(globalns, localns, set())
TypeError: ForwardRef._evaluate() missing 1 required keyword-only argument: 'recursive_guard'
```
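For context on the final frame: pydantic's v1 compatibility layer calls `ForwardRef._evaluate(globalns, localns, set())` with every argument positional, so the `TypeError` fires on any interpreter where `recursive_guard` is declared keyword-only. A minimal diagnostic sketch (not part of the project, just an illustration) that inspects the signature on the running interpreter:

```python
import inspect
import typing

# pydantic v1 passes recursive_guard positionally; if the running
# interpreter declares it keyword-only, that call cannot succeed.
params = inspect.signature(typing.ForwardRef._evaluate).parameters
guard = params.get("recursive_guard")

print("has recursive_guard:", guard is not None)
if guard is not None:
    # True on interpreters that reproduce this issue's TypeError.
    print("keyword-only:", guard.kind is inspect.Parameter.KEYWORD_ONLY)
```

Running this under an affected interpreter should report the parameter as keyword-only; under an unaffected one it will not.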

kakakagan110 commented 3 weeks ago

Hi. My English is weak, so put this into Google Translate to translate it for you.

If you use version 3.12.3, it doesn't give this error. Test it; maybe it will solve your problem.

svankirk commented 3 weeks ago

Thank you! I will give it a try.


horace-yip commented 3 weeks ago

Using Python 3.12.3 resolved the problem for me. Thank you.
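Summing up the thread's finding: 3.12.3 works and 3.12.4 does not, so the break sits exactly at that patch boundary. A small hedged helper (the names are mine, not from the project) that encodes only what this thread verified could look like:

```python
import sys

# Per this thread, the error appears starting with CPython 3.12.4 and is
# absent on 3.12.3.  Later pydantic/langsmith releases may also resolve it;
# the threshold below only encodes what commenters here confirmed.
BROKEN_SINCE = (3, 12, 4)

def likely_affected(version=None):
    """Return True if `version` (default: the running interpreter)
    is at or above the first patch level reported to break."""
    if version is None:
        version = sys.version_info[:3]
    return tuple(version) >= BROKEN_SINCE

if likely_affected():
    print("Consider Python 3.12.3 until the pydantic v1 shim is updated.")
```

For example, `likely_affected((3, 12, 3))` returns `False` while `likely_affected((3, 12, 4))` returns `True`, matching the two outcomes reported above.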