Closed fangconquerord closed 2 months ago
Could you provide a sample PDF?
It seems the modified toolbox.py still has a problem:
Traceback (most recent call last):
File "D:\ai\OneKeyInstallerForWindowsAndMacOS\gpt_academic\toolbox.py", line 666, in read_single_conf_with_lru_cache
r = getattr(importlib.import_module('config_private'), arg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.1008.0_x64__qbz5n2kfra8p0\Lib\importlib\__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ai\OneKeyInstallerForWindowsAndMacOS\gpt_academic\main.py", line 221, in
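For context, the failing line reads the config entry via an import of `config_private` and is expected to fall back to the default `config.py` when that file is absent. A minimal sketch of that lookup pattern, simplified from what `read_single_conf_with_lru_cache` in toolbox.py appears to do (the real function adds extra validation, so treat this as an illustration only):

```python
import importlib
from functools import lru_cache

@lru_cache(maxsize=128)
def read_single_conf(arg):
    """Read one config entry, preferring config_private.py and falling
    back to config.py when the private file is missing or does not
    define the entry. Simplified sketch, not the repo's exact code."""
    try:
        # This is the call shown in the traceback above.
        return getattr(importlib.import_module('config_private'), arg)
    except (ModuleNotFoundError, AttributeError):
        # Fall back to the shipped defaults in config.py.
        return getattr(importlib.import_module('config'), arg)
```

If `config_private.py` exists but raises during import (e.g. a syntax error introduced while editing it), neither branch succeeds, which matches the "During handling of the above exception, another exception occurred" chain in the traceback.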
Update the config file
Tested OK now, thanks!
Testing batch summarization of multiple PDFs again throws an error:
Starting final summary [Local Message] experimental function call failed:
Traceback (most recent call last):
  File ".\toolbox.py", line 122, in decorated
    yield from f(txt, top_p, temperature, chatbot, history, systemPromptTxt, WEB_PORT)
  File ".\crazy_functions\批量总结PDF文档.py", line 149, in 批量总结PDF文档
    yield from 解析PDF(file_manifest, project_folder, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt)
  File ".\crazy_functions\批量总结PDF文档.py", line 16, in 解析PDF
    file_content, page_one = read_and_clean_pdf_text(file_name)  # (try to) split the PDF by section
  File ".\crazy_functions\crazy_utils.py", line 441, in read_and_clean_pdf_text
    with fitz.open(fp) as doc:
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\fitz\fitz.py", line 3982, in __init__
    raise FileNotFoundError(msg)
fitz.fitz.FileNotFoundError: no such file: 'private_upload/2023-07-10-12-11-45\116NwULRev.pdf'
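The path in that `FileNotFoundError` mixes `/` and `\` separators (`private_upload/2023-07-10-12-11-45\116NwULRev.pdf`), which suggests it was built by string concatenation rather than `os.path.join`. A hedged sketch of a guard that normalizes the path and fails early with a clear message before handing it to `fitz.open` (this is a hypothetical helper, not a function from the repo):

```python
import os

def resolve_upload_path(raw_path):
    """Normalize a path that may mix '/' and '\\' separators and verify
    the file actually exists before passing it to a PDF reader.
    Hypothetical helper illustrating the check, not gpt_academic code."""
    path = os.path.normpath(raw_path)  # collapse '.' segments, unify separators
    if not os.path.exists(path):
        raise FileNotFoundError(f"upload not found: {path}")
    return path
```

Even with normalization, the error can simply mean the upload never landed on disk, so checking that the timestamped `private_upload` folder really contains the PDF is worth doing first.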
Current proxy availability:
Proxy config http://127.0.0.1:10809, proxy location: Japan
Installation Method | 安装方法与平台
Pip Install (I ignored requirements.txt)
Version | 版本
Latest | 最新版
OS | 操作系统
Windows
Describe the bug | 简述
[Test feature] 批量总结PDF文档 (batch PDF summarization) fails. PDF translation and chat work; only this feature fails. The error is similar to the one reported earlier in https://github.com/binary-husky/gpt_academic/issues/920. Version: 3.43.
Traceback (most recent call last):
  File ".\request_llm\bridge_azure_test.py", line 150, in predict_no_ui_long_connection
    response = openai.ChatCompletion.create(timeout=TIMEOUT_SECONDS, **payload); break
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\api_resources\chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\api_requestor.py", line 226, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\api_requestor.py", line 619, in _interpret_response
    self._interpret_response_line(
  File "C:\Users\fangc\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\api_requestor.py", line 682, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, your messages resulted in 25066 tokens. Please reduce the length of the messages.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File ".\crazy_functions\crazy_utils.py", line 79, in _req_gpt
    result = predict_no_ui_long_connection(
  File ".\request_llm\bridge_all.py", line 312, in predict_no_ui_long_connection
    return method(inputs, llm_kwargs, history, sys_prompt, observe_window, console_slience)
  File ".\request_llm\bridge_azure_test.py", line 154, in predict_no_ui_long_connection
    if retry > MAX_RETRY: raise TimeoutError
TimeoutError
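The root cause here is the `InvalidRequestError`: the assembled messages came to 25066 tokens against the model's 8192-token limit, and the retry loop then surfaced it as a `TimeoutError`. The standard fix is to split the input into token-bounded chunks before each request. A rough sketch of that idea using a characters-per-token heuristic (gpt_academic's `crazy_utils` uses a tiktoken-based breakdown instead, so this is only an approximation of the technique):

```python
def split_by_token_budget(text, max_tokens=8192, chars_per_token=3):
    """Split text into pieces that each stay under a rough token budget,
    approximating one token as ~3 characters (crude heuristic; a real
    implementation should count tokens with tiktoken). Cuts at the last
    newline or space before the limit when possible."""
    budget = max_tokens * chars_per_token
    chunks = []
    while len(text) > budget:
        # Prefer breaking at a newline or space so words stay intact.
        cut = max(text.rfind('\n', 0, budget), text.rfind(' ', 0, budget))
        if cut <= 0:
            cut = budget  # no separator found: hard cut
        chunks.append(text[:cut])
        text = text[cut:].lstrip()
    if text:
        chunks.append(text)
    return chunks
```

Each chunk is summarized separately and the partial summaries are then combined in a final pass, which is why the failure shows up at the "final summary" stage: the concatenated partial summaries can themselves exceed the context window.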
Screen Shot | 有帮助的截图
Terminal Traceback & Material to Help Reproduce Bugs | 终端traceback(如有) + 帮助我们复现的测试材料样本(如有)
No response