Closed taozhiyuai closed 6 months ago
Did you download the zip of the repo from GitHub?
It looks like some tooling expects the git repo to exist; try `git clone`
instead of downloading the zip.
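A minimal sketch of why the zip download breaks git-based tooling: an unpacked zip has no `.git` directory, so any `git` command run inside the tree fails with exactly the "fatal: not a git repository" message seen later in this issue (the temporary directory below stands in for a zip-extracted tree):

```python
import os
import subprocess
import tempfile

# An unpacked zip has no .git directory, so git refuses to operate in it.
with tempfile.TemporaryDirectory() as zip_style_dir:
    result = subprocess.run(
        ["git", "-C", zip_style_dir, "rev-parse", "--git-dir"],
        capture_output=True,
        text=True,
    )
    print(".git present:", ".git" in os.listdir(zip_style_dir))  # False
    # Fails here (assuming no parent directory of the temp dir is a repo):
    print("git command failed:", result.returncode != 0)
```

A proper `git clone` creates the `.git` directory, so `git log -n 1` and similar commands used by the build tooling work.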
LLM_MODEL="qwen:32b-chat-v1.5-fp16" LLM_API_KEY="ollama" LLM_BASE_URL="http://localhost:11434/v1" LLM_EMBEDDING_MODEL="nomic-embed-text:latest" WORKSPACE_DIR="./workspace"
`(opendevin) taozhiyu@603e5f4a42f1 OpenDevin-main % make run Running the app... Starting backend server... Waiting for the backend to start... Connection to localhost port 3000 [tcp/hbci] succeeded! Backend started successfully. Starting frontend with npm...
opendevin-frontend@0.1.0 start vite --port 3001
VITE v5.2.8 ready in 170 ms
➜ Local: http://localhost:3001/ ➜ Network: use --host to expose ➜ press h + enter to show help /Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modellist" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelname" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_groupalias" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelinfo" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelid" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
ERROR:root: File "/Users/taozhiyu/miniconda3/envs/opendevin/bin/uvicorn", line 8, in
ERROR:root:<class 'OSError'>: Can't load the configuration of 'BAAI/bge-small-en-v1.5'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'BAAI/bge-small-en-v1.5' is the correct path to a directory containing a config.json file`
I have downloaded BAAI/bge-small-en-v1.5; where should I save it on my Mac?
You can edit agenthub/monologue_agent/utils/memory.py, line 39, and set the model_name value to your local save path.
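Before pointing model_name at a local save path, it is worth confirming the directory actually contains a config.json at its root, since that is precisely what the OSError in the logs is checking for. A hedged sketch (the path below is simulated; substitute wherever you saved BAAI/bge-small-en-v1.5 on your Mac):

```python
import json
import tempfile
from pathlib import Path

# Simulated local save directory for the embedding model; replace with
# the real path you downloaded BAAI/bge-small-en-v1.5 into.
local_model = Path(tempfile.mkdtemp()) / "bge-small-en-v1.5"
local_model.mkdir()
(local_model / "config.json").write_text(json.dumps({"model_type": "bert"}))

def looks_loadable(model_dir: Path) -> bool:
    """Rough stand-in for the transformers check: config.json at the root."""
    return (model_dir / "config.json").is_file()

print(looks_loadable(local_model))  # True once config.json is in place
```

If this check prints False for your real directory, memory.py will hit the same OSError no matter what path you set model_name to.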
By the way, I'm new here. Could you leave a WeChat ID so we can chat?
@Nash-x9 we only officially communicate over GitHub issues. But you can also try the discord
@taozhiyuai what does `git --version` show?
@rbrn
(opendevin) taozhiyu@192 OpenDevin % git --version git version 2.39.3 (Apple Git-146)
Duplicate
Thanks for the help, but it does NOT work for me.
` (opendevin) taozhiyu@603e5f4a42f1 OpenDevin % wget https://hf-mirror.com/BAAI/bge-small-en-v1.5/raw/main/1_Pooling/config.json -P /tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/ --2024-04-15 13:03:54-- https://hf-mirror.com/BAAI/bge-small-en-v1.5/raw/main/1_Pooling/config.json Resolving hf-mirror.com (hf-mirror.com)... 153.121.57.40, 160.16.199.204, 133.242.169.68 Connecting to hf-mirror.com (hf-mirror.com)|153.121.57.40|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 190 [text/plain] Saving to: "/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json"
config.json 100%[===========================================================================================================================================>] 190 --.-KB/s in 0s
2024-04-15 13:03:55 (5.66 MB/s) - "/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json" saved [190/190])
(opendevin) taozhiyu@603e5f4a42f1 OpenDevin % make run Running the app... Starting backend server... Waiting for the backend to start... Connection to localhost port 3000 [tcp/hbci] succeeded! Backend started successfully. Starting frontend with npm...
opendevin-frontend@0.1.0 start vite --port 3001
VITE v5.2.8 ready in 154 ms
➜ Local: http://localhost:3001/ ➜ Network: use --host to expose ➜ press h + enter to show help /Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modellist" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelname" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "model_groupalias" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelinfo" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
/Users/taozhiyu/miniconda3/envs/opendevin/lib/python3.11/site-packages/pydantic/_internal/_fields.py:151: UserWarning: Field "modelid" has conflict with protected namespace "model".
You may be able to resolve this warning by setting model_config['protected_namespaces'] = ()
.
warnings.warn(
ERROR:root: File "/Users/taozhiyu/miniconda3/envs/opendevin/bin/uvicorn", line 8, in
ERROR:root:<class 'OSError'>: Can't load the configuration of 'BAAI/bge-small-en-v1.5'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'BAAI/bge-small-en-v1.5' is the correct path to a directory containing a config.json file ^[[B^[[A^[[A^Cmake: *** [run] Interrupt: 2 `
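A diagnostic sketch (with simulated paths) of why fetching only 1_Pooling/config.json did not clear the OSError: the loader wants a config.json at the snapshot root, next to the model weights, while the wget above only placed one inside 1_Pooling/. The layout below mirrors the /tmp/llama_index cache path from the logs, with the snapshot hash shortened:

```python
import tempfile
from pathlib import Path

# Simulated HuggingFace cache snapshot, mirroring the layout in the logs.
snapshot = (Path(tempfile.mkdtemp())
            / "models--BAAI--bge-small-en-v1.5" / "snapshots" / "5c38ec7")
(snapshot / "1_Pooling").mkdir(parents=True)
(snapshot / "1_Pooling" / "config.json").write_text("{}")  # what wget fetched

# The loader checks the snapshot ROOT, so this layout still fails:
print("root config.json:", (snapshot / "config.json").is_file())  # False
print("1_Pooling config.json:",
      (snapshot / "1_Pooling" / "config.json").is_file())  # True
```

Under this reading, the remaining fix is to get the model's own config.json (and its weights) into the snapshot root, for example by downloading the full model snapshot rather than individual files.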
573 (comment)
@taozhiyuai Did you try that comment?
Should be fixed with the new Docker installation method!
573 (comment)
@taozhiyuai Did you try that comment?
Yes, I have tried it, as shown in the information I pasted above.
(opendevin) taozhiyu@603e5f4a42f1 OpenDevin % wget https://hf-mirror.com/BAAI/bge-small-en-v1.5/raw/main/1_Pooling/config.json -P /tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/
@taozhiyuai Please continue on #1131
Describe the bug
Setup and configuration
Current version: (opendevin) taozhiyu@603e5f4a42f1 OpenDevin-main % git log -n 1 fatal: not a git repository (or any of the parent directories): .git
My config.toml and environment vars (be sure to redact API keys): LLM_MODEL="qwen:32b-chat-v1.5-fp16" LLM_API_KEY="ollama" LLM_BASE_URL="http://localhost:11434/v1" LLM_EMBEDDING_MODEL="nomic-embed-text:latest" WORKSPACE_DIR="./workspace"
My model and agent (you can see these settings in the UI): LLM_MODEL="qwen:32b-chat-v1.5-fp16" LLM_API_KEY="ollama" LLM_BASE_URL="http://localhost:11434/v1" LLM_EMBEDDING_MODEL="nomic-embed-text:latest" WORKSPACE_DIR="./workspace"
Commands I ran to install and run OpenDevin:
make build
Steps to Reproduce: 1. 2. 3.
Logs, error messages, and screenshots:
Additional Context