Open yh3088 opened 1 year ago
I encountered an error while running the dbgpt_server.py script on Colab. I followed the instructions from the Quickstart Guide, but the following error occurred:
`NotImplementedError: Cannot copy out of meta tensor; no data!`
I am running the code on Colab with an A100 GPU, using vicuna-7b as the model. Can anyone help to identify what could possibly be the issue? The original error message:

2023-07-12 07:50:23 | ERROR | stderr | Exception in thread Thread-2 (init_db_summary):
2023-07-12 07:50:23 | ERROR | stderr | Traceback (most recent call last):
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
2023-07-12 07:50:23 | ERROR | stderr | self.run()
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/lib/python3.10/threading.py", line 953, in run
2023-07-12 07:50:23 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/summary/db_summary_client.py", line 126, in init_db_summary
2023-07-12 07:50:23 | ERROR | stderr | self.db_summary_embedding(dbname)
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/summary/db_summary_client.py", line 50, in db_summary_embedding
2023-07-12 07:50:23 | ERROR | stderr | embedding.source_embedding()
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/embedding_engine/source_embedding.py", line 92, in source_embedding
2023-07-12 07:50:23 | ERROR | stderr | self.index_to_store(text)
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/embedding_engine/source_embedding.py", line 62, in index_to_store
2023-07-12 07:50:23 | ERROR | stderr | return self.vector_client.load_document(docs)
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/vector_store/connector.py", line 19, in load_document
2023-07-12 07:50:23 | ERROR | stderr | return self.client.load_document(docs)
2023-07-12 07:50:23 | ERROR | stderr | File "/content/DB-GPT/pilot/vector_store/chroma_store.py", line 35, in load_document
2023-07-12 07:50:23 | ERROR | stderr | ids = self.vector_store_client.add_texts(texts=texts, metadatas=metadatas)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/langchain/vectorstores/chroma.py", line 115, in add_texts
2023-07-12 07:50:23 | ERROR | stderr | embeddings = self._embedding_function.embed_documents(list(texts))
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/langchain/embeddings/huggingface.py", line 61, in embed_documents
2023-07-12 07:50:23 | ERROR | stderr | embeddings = self.client.encode(texts)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/sentence_transformers/SentenceTransformer.py", line 153, in encode
2023-07-12 07:50:23 | ERROR | stderr | self.to(device)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1145, in to
2023-07-12 07:50:23 | ERROR | stderr | return self._apply(convert)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 797, in _apply
2023-07-12 07:50:23 | ERROR | stderr | module._apply(fn)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 797, in _apply
2023-07-12 07:50:23 | ERROR | stderr | module._apply(fn)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 797, in _apply
2023-07-12 07:50:23 | ERROR | stderr | module._apply(fn)
2023-07-12 07:50:23 | ERROR | stderr | [Previous line repeated 1 more time]
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 820, in _apply
2023-07-12 07:50:23 | ERROR | stderr | param_applied = fn(param)
2023-07-12 07:50:23 | ERROR | stderr | File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1143, in convert
2023-07-12 07:50:23 | ERROR | stderr | return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
2023-07-12 07:50:23 | ERROR | stderr | NotImplementedError: Cannot copy out of meta tensor; no data!
Loading checkpoint shards: 33% 1/3 [00:06<00:12, 6.04s/it]
Loading checkpoint shards: 67% 2/3 [00:39<00:22, 22.12s/it]
Loading checkpoint shards: 100% 3/3 [00:43<00:00, 14.08s/it]
Loading checkpoint shards: 100% 3/3 [00:43<00:00, 14.64s/it]
2023-07-12 07:51:04 | ERROR | stderr |
2023-07-12 07:51:12 | INFO | stdout | LLM Model Loading Success!
2023-07-12 07:51:12 | ERROR | stderr | INFO: Started server process [8703]
2023-07-12 07:51:12 | ERROR | stderr | INFO: Waiting for application startup.
2023-07-12 07:51:12 | ERROR | stderr | INFO: Application startup complete.
2023-07-12 07:51:12 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
It sounds like your embedding model is incomplete. Have you installed git-lfs?
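One way to verify that the weights were actually downloaded (an illustrative sketch, not DB-GPT code; the model directory path is an assumption, adjust it to your layout) is to look for Git LFS pointer stubs, which are tiny text files left in place of the real binaries:

```python
import os

MODEL_DIR = "models/text2vec-large-chinese"  # assumed path; adjust to your setup
LFS_POINTER_PREFIX = b"version https://git-lfs.github.com/spec/v1"

for name in sorted(os.listdir(MODEL_DIR)):
    path = os.path.join(MODEL_DIR, name)
    if not os.path.isfile(path):
        continue
    # LFS pointer stubs start with the spec line instead of binary weight data.
    with open(path, "rb") as f:
        head = f.read(len(LFS_POINTER_PREFIX))
    flag = "  <-- LFS pointer, weights NOT downloaded" if head == LFS_POINTER_PREFIX else ""
    print(f"{name}: {os.path.getsize(path)} bytes{flag}")
```

If any of the large weight files (e.g. pytorch_model.bin) show up as pointers, run `git lfs pull` inside the model directory and retry.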
@Aries-ckt I did install git-lfs. I ran the following:
And this is the structure of my models folder:
And the config:
If I understand correctly, the text2vec-large-chinese model is my embedding_model.
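If the files look complete, a quick way to isolate the embedding model from the rest of the server is to load and run it directly with sentence-transformers (a sketch; the path and device here are assumptions based on the Colab setup above):

```python
from sentence_transformers import SentenceTransformer

# Load the local embedding model directly; if this raises the same
# "Cannot copy out of meta tensor" error, the problem is in the model files
# or the torch/transformers/sentence-transformers install, not in DB-GPT itself.
model = SentenceTransformer("/content/DB-GPT/models/text2vec-large-chinese", device="cuda")
vectors = model.encode(["hello world", "你好，世界"])
print(vectors.shape)  # (2, embedding_dim)
```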
I ran into the same problem. Have you solved it yet?
I ran into the same problem. Have you solved it yet?
Unfortunately no, I'm still stuck on this problem.
I ran into the same problem. Have you solved it yet?
I reran my program, and this error no longer showed up. However, I still couldn't get it to work; this time, there wasn't even an error message.
The new error:
I will open a new issue for this.
I ran into the same problem. Have you solved it yet? I'm using version 0.3.1.
I used the latest repo and it no longer shows up.
I ran into the same problem. Have you solved it yet?
Unfortunately no, I'm still stuck on this problem.
Can you try replacing the LLM model, e.g. with vicuna-13b-v1.5?
I'm using the latest repo, but I still have the problem.
$ python pilot/server/dbgpt_server.py
playsound is relying on another python subprocess. Please use `pip install pygobject` if you want playsound to run more efficiently.
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pymilvus/client/__init__.py:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import get_distribution, DistributionNotFound
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
2023-08-17 17:46:58 | ERROR | stderr | /home/server/codes/DB-GPT/pilot/server/knowledge/space_db.py:12: MovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
2023-08-17 17:46:58 | ERROR | stderr | Base = declarative_base()
2023-08-17 17:46:58 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-17 17:46:58 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-17 17:46:58 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
2023-08-17 17:46:58 | ERROR | stderr | warn("The installed version of bitsandbytes was compiled without GPU support. "
2023-08-17 17:46:58 | INFO | stdout | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cadam32bit_grad_fp32
2023-08-17 17:47:00 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-17 17:47:00 | INFO | stdout | Add file db, db_name: sqlite_default_sqlite, db_type: sqlite, db_path: data/default_sqlite.db
2023-08-17 17:47:00 | INFO | stdout | add db connect info error2!Constraint Error: Duplicate key "db_name: sqlite_default_sqlite" violates unique constraint. If this is an unexpected constraint violation please double check with the known index limitations section in our documentation (docs - sql - indexes).
2023-08-17 17:47:00 | INFO | stdout | /home/server/codes/DB-GPT/pilot
2023-08-17 17:47:01 | INFO | stdout | Model Unified Deployment Mode!
2023-08-17 17:47:01 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-17 17:47:01 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-17 17:47:01 | ERROR | stderr | /home/server/codes/DB-GPT/pilot/model/adapter.py:348: DeprecationWarning: invalid escape sequence '\.'
2023-08-17 17:47:01 | ERROR | stderr | if not re.fullmatch(".*ggml.*\.bin", model_path):
2023-08-17 17:47:01 | INFO | stdout | Loading vicuna-13b-v1.5 LLM ModelServer in cpu from model path /home/server/codes/DB-GPT/models/vicuna-13b-v1.5! Please Wait......
2023-08-17 17:47:01 | INFO | stdout | Found llm model adapter with model name: vicuna-13b-v1.5, <pilot.model.adapter.VicunaLLMAdapater object at 0x7f51fc5424d0>
2023-08-17 17:47:01 | INFO | LOGGER | Found llm model adapter with model name: vicuna-13b-v1.5, <pilot.model.adapter.VicunaLLMAdapater object at 0x7f51fc5424d0>
2023-08-17 17:47:01 | INFO | stdout | model_params:
2023-08-17 17:47:01 | INFO | stdout | ModelParameters(device='cpu', model_name='vicuna-13b-v1.5', model_path='/home/server/codes/DB-GPT/models/vicuna-13b-v1.5', model_type='huggingface', prompt_template=None, max_context_size=4096, num_gpus=None, max_gpu_memory=None, cpu_offloading=False, load_8bit=True, load_4bit=False, quant_type='nf4', use_double_quant=True, compute_dtype=None, trust_remote_code=True, verbose=False)
2023-08-17 17:47:01 | INFO | LOGGER | model_params:
ModelParameters(device='cpu', model_name='vicuna-13b-v1.5', model_path='/home/server/codes/DB-GPT/models/vicuna-13b-v1.5', model_type='huggingface', prompt_template=None, max_context_size=4096, num_gpus=None, max_gpu_memory=None, cpu_offloading=False, load_8bit=True, load_4bit=False, quant_type='nf4', use_double_quant=True, compute_dtype=None, trust_remote_code=True, verbose=False)
2023-08-17 17:47:01 | INFO | stdout | 8-bit quantization and 4-bit quantization just supported by cuda
2023-08-17 17:47:01 | WARNING | LOGGER | 8-bit quantization and 4-bit quantization just supported by cuda
Loading checkpoint shards: 0%| | 0/3 [00:00<?, ?it/s]
2023-08-17 17:47:02 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-17 17:47:02 | INFO | chromadb | Running Chroma using direct local API.
2023-08-17 17:47:02 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-17 17:47:02 | INFO | clickhouse_connect.driver.ctypes | Successfully imported ClickHouse Connect C data optimizations
2023-08-17 17:47:02 | INFO | clickhouse_connect.json_impl | Using orjson library for writing JSON byte strings
2023-08-17 17:47:02 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-17 17:47:02 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-17 17:47:03 | INFO | chromadb | Running Chroma using direct local API.
2023-08-17 17:47:03 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-17 17:47:03 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-17 17:47:03 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-17 17:47:03 | INFO | stdout | ChromaStore load document
2023-08-17 17:47:03 | INFO | LOGGER | ChromaStore load document
2023-08-17 17:47:03 | ERROR | stderr | Exception in thread Thread-1 (init_db_summary):
2023-08-17 17:47:03 | ERROR | stderr | Traceback (most recent call last):
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
2023-08-17 17:47:03 | ERROR | stderr | self.run()
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/threading.py", line 953, in run
2023-08-17 17:47:03 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/summary/db_summary_client.py", line 138, in init_db_summary
2023-08-17 17:47:03 | ERROR | stderr | self.db_summary_embedding(item["db_name"], item["db_type"])
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/summary/db_summary_client.py", line 45, in db_summary_embedding
2023-08-17 17:47:03 | ERROR | stderr | self.init_db_profile(db_summary_client, dbname, embeddings)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/summary/db_summary_client.py", line 160, in init_db_profile
2023-08-17 17:47:03 | ERROR | stderr | embedding.index_to_store(docs)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/embedding_engine/source_embedding.py", line 66, in index_to_store
2023-08-17 17:47:03 | ERROR | stderr | return self.vector_client.load_document(docs)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/vector_store/connector.py", line 32, in load_document
2023-08-17 17:47:03 | ERROR | stderr | return self.client.load_document(docs)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/vector_store/chroma_store.py", line 45, in load_document
2023-08-17 17:47:03 | ERROR | stderr | ids = self.vector_store_client.add_texts(texts=texts, metadatas=metadatas)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/langchain/vectorstores/chroma.py", line 115, in add_texts
2023-08-17 17:47:03 | ERROR | stderr | embeddings = self._embedding_function.embed_documents(list(texts))
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/langchain/embeddings/huggingface.py", line 61, in embed_documents
2023-08-17 17:47:03 | ERROR | stderr | embeddings = self.client.encode(texts)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 153, in encode
2023-08-17 17:47:03 | ERROR | stderr | self.to(device)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1145, in to
2023-08-17 17:47:03 | ERROR | stderr | return self._apply(convert)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 797, in _apply
2023-08-17 17:47:03 | ERROR | stderr | module._apply(fn)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 797, in _apply
2023-08-17 17:47:03 | ERROR | stderr | module._apply(fn)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 797, in _apply
2023-08-17 17:47:03 | ERROR | stderr | module._apply(fn)
2023-08-17 17:47:03 | ERROR | stderr | [Previous line repeated 1 more time]
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 820, in _apply
2023-08-17 17:47:03 | ERROR | stderr | param_applied = fn(param)
2023-08-17 17:47:03 | ERROR | stderr | File "/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1143, in convert
2023-08-17 17:47:03 | ERROR | stderr | return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
2023-08-17 17:47:03 | ERROR | stderr | NotImplementedError: Cannot copy out of meta tensor; no data!
[1] 528909 killed python pilot/server/dbgpt_server.py
(dbgpt_env)
My env config: LLM_MODEL=vicuna-13b-v1.5
And the git-lfs check:
# server @ server-ai in ~/codes/DB-GPT/models/vicuna-13b-v1.5 on git:main x [17:05:56]
$ git lfs checkout
Checking out LFS objects: 100% (4/4), 26 GB | 1 B/s, done.
(dbgpt_env)
What version of transformers are you using? Please try with transformers>=4.31.0.
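To check what is actually installed in the environment (a trivial sketch; the pip command in the comment is just the usual upgrade path):

```python
# Print the versions that matter for this error; the suggestion above is
# transformers>=4.31.0 (upgrade with: pip install -U "transformers>=4.31.0").
import torch
import transformers
import sentence_transformers

print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
print("sentence-transformers:", sentence_transformers.__version__)
```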
What version of transformers are you using? Please try with transformers>=4.31.0.

Yes, I'm using the latest main branch, with the requirements as pinned there. My OS is Ubuntu 20.04.
@Aries-ckt
It looks like the imported `texts` is empty in the code.
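If that is the case, a defensive guard in front of the vector-store call would confirm it without touching the embedding model at all. A hypothetical sketch (the helper name is mine; it only assumes the langchain `add_texts(texts=..., metadatas=...)` interface shown in the traceback):

```python
def load_document_safely(vector_store_client, texts, metadatas=None):
    """Skip the embedding call entirely when there is nothing to embed."""
    keep = [i for i, t in enumerate(texts) if t and t.strip()]
    if not keep:
        print("load_document_safely: no non-empty texts, skipping add_texts()")
        return []
    kept_texts = [texts[i] for i in keep]
    # Keep metadatas aligned with the texts that survive the filter.
    kept_metadatas = [metadatas[i] for i in keep] if metadatas else None
    return vector_store_client.add_texts(texts=kept_texts, metadatas=kept_metadatas)
```

This is only a diagnostic aid to rule out the empty-input path; it is not a fix for the underlying model-loading problem.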
$ python pilot/server/dbgpt_server.py
playsound is relying on another python subprocess. Please use `pip install pygobject` if you want playsound to run more efficiently.
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pymilvus/client/__init__.py:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import get_distribution, DistributionNotFound
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
2023-08-18 09:08:12 | ERROR | stderr | /home/server/codes/DB-GPT/pilot/server/knowledge/space_db.py:12: MovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
2023-08-18 09:08:12 | ERROR | stderr | Base = declarative_base()
2023-08-18 09:08:12 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-18 09:08:12 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-18 09:08:12 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
2023-08-18 09:08:12 | ERROR | stderr | warn("The installed version of bitsandbytes was compiled without GPU support. "
2023-08-18 09:08:12 | INFO | stdout | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cadam32bit_grad_fp32
2023-08-18 09:08:14 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-18 09:08:14 | INFO | stdout | Add file db, db_name: sqlite_default_sqlite, db_type: sqlite, db_path: data/default_sqlite.db
2023-08-18 09:08:14 | INFO | stdout | add db connect info error2!Constraint Error: Duplicate key "db_name: sqlite_default_sqlite" violates unique constraint. If this is an unexpected constraint violation please double check with the known index limitations section in our documentation (docs - sql - indexes).
2023-08-18 09:08:14 | INFO | stdout | /home/server/codes/DB-GPT/pilot
2023-08-18 09:08:14 | INFO | stdout | Model Unified Deployment Mode!
2023-08-18 09:08:14 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-18 09:08:14 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-18 09:08:14 | INFO | stdout | Loading text2vec-base LLM ModelServer in cpu from model path /home/server/codes/DB-GPT/models/text2vec-base-chinese! Please Wait......
2023-08-18 09:08:14 | ERROR | stderr | Traceback (most recent call last):
2023-08-18 09:08:14 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/server/dbgpt_server.py", line 111, in <module>
2023-08-18 09:08:14 | ERROR | stderr | from pilot.server.llmserver import worker
2023-08-18 09:08:14 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/server/llmserver.py", line 118, in <module>
2023-08-18 09:08:14 | ERROR | stderr | worker = ModelWorker(model_path=model_path, model_name=CFG.LLM_MODEL, device=DEVICE)
2023-08-18 09:08:14 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/server/llmserver.py", line 44, in __init__
2023-08-18 09:08:14 | ERROR | stderr | self.model, self.tokenizer = self.ml.loader(
2023-08-18 09:08:14 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/model/loader.py", line 92, in loader
2023-08-18 09:08:14 | ERROR | stderr | llm_adapter = get_llm_model_adapter(self.model_name, self.model_path)
2023-08-18 09:08:14 | ERROR | stderr | File "/home/server/codes/DB-GPT/pilot/model/adapter.py", line 92, in get_llm_model_adapter
2023-08-18 09:08:14 | ERROR | stderr | raise ValueError(
2023-08-18 09:08:14 | ERROR | stderr | ValueError: Invalid model adapter for model name text2vec-base and model path /home/server/codes/DB-GPT/models/text2vec-base-chinese
2023-08-18 09:08:16 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-18 09:08:16 | INFO | chromadb | Running Chroma using direct local API.
2023-08-18 09:08:16 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:08:16 | INFO | clickhouse_connect.driver.ctypes | Successfully imported ClickHouse Connect C data optimizations
2023-08-18 09:08:16 | INFO | clickhouse_connect.json_impl | Using orjson library for writing JSON byte strings
2023-08-18 09:08:16 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-18 09:08:16 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-18 09:08:17 | INFO | chromadb | Running Chroma using direct local API.
2023-08-18 09:08:17 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb, skipping load
2023-08-18 09:08:17 | INFO | stdout | ChromaStore load document
2023-08-18 09:08:17 | INFO | LOGGER | ChromaStore load document
Batches: 0%| | 0/1 [00:00<?, ?it/s]
Batches: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 4.93it/s]
Batches: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 4.93it/s]
2023-08-18 09:08:17 | ERROR | stderr |
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | Persisting DB to disk, putting it in the save folder: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:08:17 | INFO | langchain.embeddings | init db profile success...
2023-08-18 09:08:17 | INFO | chromadb | Running Chroma using direct local API.
2023-08-18 09:08:17 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_summary.vectordb
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_summary.vectordb, skipping load
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | No existing DB found in /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_summary.vectordb, skipping load
2023-08-18 09:08:17 | INFO | langchain.embeddings | db summary embedding success
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | Persisting DB to disk, putting it in the save folder: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_summary.vectordb
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | Persisting DB to disk, putting it in the save folder: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:08:17 | INFO | chromadb.db.duckdb | Persisting DB to disk, putting it in the save folder: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:08:17 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpypqb_4ni'>
2023-08-18 09:08:17 | ERROR | stderr | _warnings.warn(warn_message, ResourceWarning)
2023-08-18 09:08:17 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpkzjk7m99wandb'>
2023-08-18 09:08:17 | ERROR | stderr | _warnings.warn(warn_message, ResourceWarning)
2023-08-18 09:08:17 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpl4q3vue3wandb-artifacts'>
2023-08-18 09:08:17 | ERROR | stderr | _warnings.warn(warn_message, ResourceWarning)
2023-08-18 09:08:17 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpby6_h5_0wandb-media'>
2023-08-18 09:08:17 | ERROR | stderr | _warnings.warn(warn_message, ResourceWarning)
2023-08-18 09:08:17 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/tempfile.py:860: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/tmp/tmpput7fs6ywandb-media'>
2023-08-18 09:08:17 | ERROR | stderr | _warnings.warn(warn_message, ResourceWarning)
(dbgpt_env)
# server @ server-ai in ~/codes/DB-GPT on git:main o [9:08:18] C:1
The above run is with text2vec-large-chinese.
$ python pilot/server/dbgpt_server.py
playsound is relying on another python subprocess. Please use `pip install pygobject` if you want playsound to run more efficiently.
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pymilvus/client/__init__.py:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import get_distribution, DistributionNotFound
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
/home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
2023-08-18 09:12:21 | ERROR | stderr | /home/server/codes/DB-GPT/pilot/server/knowledge/space_db.py:12: MovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
2023-08-18 09:12:21 | ERROR | stderr | Base = declarative_base()
2023-08-18 09:12:21 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-18 09:12:21 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-18 09:12:21 | ERROR | stderr | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
2023-08-18 09:12:21 | ERROR | stderr | warn("The installed version of bitsandbytes was compiled without GPU support. "
2023-08-18 09:12:21 | INFO | stdout | /home/server/miniconda3/envs/dbgpt_env/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cadam32bit_grad_fp32
2023-08-18 09:12:25 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-18 09:12:25 | INFO | stdout | Add file db, db_name: sqlite_default_sqlite, db_type: sqlite, db_path: data/default_sqlite.db
2023-08-18 09:12:25 | INFO | stdout | add db connect info error2!Constraint Error: Duplicate key "db_name: sqlite_default_sqlite" violates unique constraint. If this is an unexpected constraint violation please double check with the known index limitations section in our documentation (docs - sql - indexes).
2023-08-18 09:12:25 | INFO | stdout | /home/server/codes/DB-GPT/pilot
2023-08-18 09:12:25 | INFO | stdout | Model Unified Deployment Mode!
2023-08-18 09:12:25 | INFO | sentence_transformers.SentenceTransformer | Load pretrained SentenceTransformer: /home/server/codes/DB-GPT/models/text2vec-large-chinese
2023-08-18 09:12:25 | WARNING | sentence_transformers.SentenceTransformer | No sentence-transformers model found with name /home/server/codes/DB-GPT/models/text2vec-large-chinese. Creating a new one with MEAN pooling.
2023-08-18 09:12:25 | INFO | stdout | Loading vicuna-13b LLM ModelServer in cpu from model path /home/server/codes/DB-GPT/models/vicuna-13b! Please Wait......
2023-08-18 09:12:25 | INFO | stdout | Found llm model adapter with model name: vicuna-13b, <pilot.model.adapter.VicunaLLMAdapater object at 0x7fa43d51abf0>
2023-08-18 09:12:25 | INFO | LOGGER | Found llm model adapter with model name: vicuna-13b, <pilot.model.adapter.VicunaLLMAdapater object at 0x7fa43d51abf0>
2023-08-18 09:12:25 | INFO | stdout | model_params:
2023-08-18 09:12:25 | INFO | stdout | ModelParameters(device='cpu', model_name='vicuna-13b', model_path='/home/server/codes/DB-GPT/models/vicuna-13b', model_type='huggingface', prompt_template=None, max_context_size=4096, num_gpus=None, max_gpu_memory=None, cpu_offloading=False, load_8bit=True, load_4bit=False, quant_type='nf4', use_double_quant=True, compute_dtype=None, trust_remote_code=True, verbose=False)
2023-08-18 09:12:25 | INFO | LOGGER | model_params:
ModelParameters(device='cpu', model_name='vicuna-13b', model_path='/home/server/codes/DB-GPT/models/vicuna-13b', model_type='huggingface', prompt_template=None, max_context_size=4096, num_gpus=None, max_gpu_memory=None, cpu_offloading=False, load_8bit=True, load_4bit=False, quant_type='nf4', use_double_quant=True, compute_dtype=None, trust_remote_code=True, verbose=False)
2023-08-18 09:12:25 | INFO | stdout | 8-bit quantization and 4-bit quantization just supported by cuda
2023-08-18 09:12:25 | WARNING | LOGGER | 8-bit quantization and 4-bit quantization just supported by cuda
2023-08-18 09:12:25 | ERROR | stderr | You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at https://github.com/huggingface/transformers/pull/24565
2023-08-18 09:12:25 | INFO | sentence_transformers.SentenceTransformer | Use pytorch device: cpu
2023-08-18 09:12:25 | INFO | chromadb | Running Chroma using direct local API.
Loading checkpoint shards: 0%| | 0/3 [00:00<?, ?it/s]
2023-08-18 09:12:25 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_profile.vectordb
2023-08-18 09:12:25 | INFO | clickhouse_connect.driver.ctypes | Successfully imported ClickHouse Connect C data optimizations
2023-08-18 09:12:25 | INFO | clickhouse_connect.json_impl | Using orjson library for writing JSON byte strings
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | loaded in 0 embeddings
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | loaded in 1 collections
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | collection with name langchain already exists, returning existing collection
2023-08-18 09:12:25 | INFO | langchain.embeddings | init db profile success...
2023-08-18 09:12:25 | INFO | chromadb | Running Chroma using direct local API.
2023-08-18 09:12:25 | WARNING | chromadb | Using embedded DuckDB with persistence: data will be stored in: /home/server/codes/DB-GPT/pilot/data/sqlite_default_sqlite_summary.vectordb
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | loaded in 0 embeddings
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | loaded in 1 collections
2023-08-18 09:12:25 | INFO | chromadb.db.duckdb | collection with name langchain already exists, returning existing collection
2023-08-18 09:12:25 | INFO | langchain.embeddings | db summary embedding success
[1] 115300 killed python pilot/server/dbgpt_server.py
(dbgpt_env)
And the above run is with vicuna-13b.
@yh3088 @lehug Has your issue been resolved? We've implemented numerous updates recently. I'd recommend trying out our latest release, version v0.4.3. You can find the related documentation for further assistance.
Certainly! We would greatly appreciate it if you could provide a tutorial for deployment on Colab. 😊