TsinghuaDatabaseGroup / DB-GPT

An LLM Based Diagnosis System (https://arxiv.org/pdf/2312.01454.pdf)
http://dbgpt.dbmind.cn/
Apache License 2.0

File open error #65

Closed: yangyongguang closed this issue 11 months ago

yangyongguang commented 11 months ago

import sys; print('Python %s on %s' % (sys.version, sys.platform))
/root/miniconda3/envs/D-Bot/bin/python /root/.pycharm_helpers/pydev/pydevd.py --multiprocess --qt-support=auto --client localhost --port 37745 --file /home/workspace/YYG/FromS/DB-GPT/main.py
Connected to pydev debugger (build 232.8660.197)
The Java JVM environment started successfully
12/14/2023 19:25:09 - ERROR - root - obtain_historical_queries_statistics Fails!
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
12/14/2023 19:25:09 - WARNING - root - Unused arguments: {'model': 'diag-llama'}
Report Initialization!
  0%|                                                                      | 0/1
====================== Initialization ======================
rank       : 0
local_rank : 0
world_size : 1
local_size : 1
master     : localhost:10010
device     : 0
cpus       : [0, 1, 2, ..., 175]
/root/miniconda3/envs/D-Bot/lib/python3.10/site-packages/bmtrain/synchronize.py:15: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  nccl.allReduce(barrier.storage(), barrier.storage(), 'sum', config['comm'])
args.load is not None, start to load checkpoints /home/workspace/YYG/YYG/D-Bot/DiagLlama/DiagLlama.pt
[INFO][2023-12-14 19:25:39][jeeves-hpc-gpu00][inference.py:33:105510] - load model in 21.73s
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the legacy (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set legacy=False. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
[INFO][2023-12-14 19:25:40][jeeves-hpc-gpu00][inference.py:38:105510] - load tokenizer in 1.27s
finish loading
100%|██████████████████████████████████████████████████████████████████████| 1/1
Role Assignment!
100%|██████████████████████████████████████████████████████████████████████| 1/1
12/14/2023 19:26:07 - INFO - sentence_transformers.SentenceTransformer - Load pretrained SentenceTransformer: ./localized_llms/sentence_embedding/sentence-transformer/
12/14/2023 19:26:10 - INFO - sentence_transformers.SentenceTransformer - Use pytorch device: cuda
Batches: 100%|████████████████████████████████████| 1/1 [00:00<00:00, 1.46it/s]
CpuExpert Diagnosis!

[screenshot attached]
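Side note on the two warnings visible in the log above: both appear benign and unrelated to the reported error. The sketch below shows how they could be silenced in user code if desired; it assumes a Hugging Face LlamaTokenizer load similar to the one in inference.py, and the checkpoint path is hypothetical.

```python
# Sketch only (not the project's actual code): addressing the two benign
# warnings seen in the log above. The checkpoint path is hypothetical.
import torch
from transformers import LlamaTokenizer

# 1) The tokenizer message goes away once the legacy behaviour is chosen explicitly
#    (see https://github.com/huggingface/transformers/pull/24565 for background).
tokenizer = LlamaTokenizer.from_pretrained(
    "/path/to/DiagLlama",  # hypothetical; point at the real checkpoint directory
    legacy=False,          # opt in to the corrected tokenization behaviour
)

# 2) The TypedStorage deprecation comes from bmtrain calling tensor.storage();
#    on PyTorch >= 2.0 the non-deprecated equivalent is tensor.untyped_storage().
t = torch.zeros(4)
storage = t.untyped_storage()  # replaces the deprecated t.storage()
```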

zhouxh19 commented 11 months ago

Thanks for your feedback. We have fixed the problem.

https://github.com/TsinghuaDatabaseGroup/DB-GPT/commit/1547d02a18debc4d3f3ac4d00ca8cf1d963f5cdb
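For anyone hitting this before updating, the actual fix is in the commit above. A common defensive pattern for this class of file-open error is to resolve paths relative to the repository root and fail with a descriptive message when the file is missing; the sketch below is a hypothetical illustration of that pattern, not the code from the commit, and load_text and its path layout are invented for the example.

```python
# Hypothetical illustration only; the real fix is in the commit linked above.
from pathlib import Path

def load_text(relative_path: str) -> str:
    """Read a data file relative to the repository root, raising a
    descriptive error instead of an opaque file-open failure."""
    root = Path(__file__).resolve().parent
    path = root / relative_path
    if not path.is_file():
        raise FileNotFoundError(
            f"Expected file at {path}; check the directory you launched "
            "main.py from, or the configured data path."
        )
    return path.read_text(encoding="utf-8")
```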