Open lzy83925 opened 7 months ago
I ran into this problem too while testing. After deploying locally, try using netsh to set up port forwarding.
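For reference, a minimal netsh port-forwarding sketch for a Windows host. This assumes the service listens only on 127.0.0.1:5052 (port 5052 is taken from the URL later in this thread; adjust to your deployment). Run in an elevated PowerShell:

```shell
# Forward incoming LAN traffic on port 5052 to the local-only service
netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=5052 connectaddress=127.0.0.1 connectport=5052

# Windows Firewall may also need to allow the inbound port
netsh advfirewall firewall add rule name="QAnything 5052" dir=in action=allow protocol=TCP localport=5052

# Verify the forwarding rule
netsh interface portproxy show v4tov4

# Remove the rule later if needed
netsh interface portproxy delete v4tov4 listenaddress=0.0.0.0 listenport=5052
```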
It hasn't finished starting yet, has it? The log says the -t folder does not exist under QAnything/assets/custom_models/. Please check your model files.
Oh, you haven't downloaded the model. You need to download a model and put it in that folder.
With your GPU, a 3B model is a good fit.
[Super Member V3] File shared via Baidu Netdisk: MiniChat....7z Link: https://pan.baidu.com/s/1hlPNADZCSUbmnMI8PzaStA?pwd=W773 Extraction code: W773 Copy this message and open the Baidu Netdisk app to get the file.
Has this problem been solved? I have the same issue. I set up the service on a server; it can be accessed from the server's own browser, but not from a PC, even though both are on the same network.
Start it on localhost, then configure port forwarding to localhost and it will work.
Has this problem been solved? I have the same issue. I set up the service on a server; it can be accessed from the server's own browser, but not from a PC, even though both are on the same network.
It seems the Ubuntu firewall hasn't opened the port, or the IP you entered isn't the Ubuntu machine's IP.
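If it is the firewall, a minimal check on the Ubuntu server might look like this (port 5052 is taken from the URL in this thread; adjust if your deployment uses a different port):

```shell
# Check whether ufw is active and what it currently allows
sudo ufw status verbose

# Allow inbound TCP on the QAnything web port
sudo ufw allow 5052/tcp

# Confirm something is listening on that port on all interfaces
# (a 127.0.0.1-only bind is unreachable from other machines)
sudo ss -tlnp | grep 5052
```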
Thanks, I changed the IP and now it can be accessed.
Hi, how did you do that? I'm also deploying on a server and hoping it can be accessed from outside.
Use the Ubuntu machine's IP.
是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
当前行为 | Current Behavior
We have deployed QAnything on a GPU server on our company LAN.
The server itself can reach the internet, but it only has a LAN IP address.
When launching run.sh I chose remote, and when prompted for the public IP I entered the LAN address 192.168... Startup succeeded without problems, but from other computers on the company LAN, http://192.168..:5052/qanything/ cannot be reached at all. What is going on? docker ps shows everything is up and the ports are mapped correctly.
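Some generic checks for this kind of LAN-access problem (port 5052 comes from the URL above; `<server-lan-ip>` is a placeholder for the server's 192.168.x address, which is elided in the report — nothing here is QAnything-specific):

```shell
# On the server: confirm the web port is bound to 0.0.0.0 (all interfaces),
# not only 127.0.0.1 -- a loopback-only bind is invisible to other machines
sudo ss -tlnp | grep 5052

# Confirm the Docker port mapping (look for 0.0.0.0:5052->... under PORTS)
docker ps --format '{{.Names}}\t{{.Ports}}'

# From another machine on the LAN:
curl -v http://<server-lan-ip>:5052/qanything/
```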
期望行为 | Expected Behavior
Other machines on the LAN should be able to access the QAnything service normally.
运行环境 | Environment
QAnything日志 | QAnything logs
qanything-container-local |
qanything-container-local | =============================
qanything-container-local | == Triton Inference Server ==
qanything-container-local | =============================
qanything-container-local |
qanything-container-local | NVIDIA Release 23.05 (build 61161506)
qanything-container-local | Triton Server Version 2.34.0
qanything-container-local |
qanything-container-local | Copyright (c) 2018-2023, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
qanything-container-local |
qanything-container-local | Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.
qanything-container-local |
qanything-container-local | This container image and its contents are governed by the NVIDIA Deep Learning Container License.
qanything-container-local | By pulling and using the container, you accept the terms and conditions of this license:
qanything-container-local | https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license
qanything-container-local |
qanything-container-local | llm_api is set to [local]
qanything-container-local | device_id is set to [0,1]
qanything-container-local | runtime_backend is set to [default]
qanything-container-local | model_name is set to [-t]
qanything-container-local | conv_template is set to []
qanything-container-local | tensor_parallel is set to [1]
qanything-container-local | gpu_memory_utilization is set to [0.81]
qanything-container-local | checksum 77275c133c7dfcf1553a7b5ef043168d
qanything-container-local | default_checksum 77275c133c7dfcf1553a7b5ef043168d
qanything-container-local |
qanything-container-local | [notice] A new release of pip is available: 23.3.2 -> 24.0
qanything-container-local | [notice] To update, run: python3 -m pip install --upgrade pip
qanything-container-local | GPU ID: 0, 1
qanything-container-local | GPU1 Model: Tesla P100-PCIE-16GB
qanything-container-local | Compute Capability: null
qanything-container-local | OCR_USE_GPU=False because null < 7.5
qanything-container-local | ====================================================
qanything-container-local | **** IMPORTANT ****
qanything-container-local | ====================================================
qanything-container-local |
qanything-container-local | The default backend is FasterTransformer, which only supports Nvidia RTX 30-series or 40-series GPUs. Your GPU model is Tesla P100-PCIE-16GB, which is not in the supported list, so the backend will be switched automatically:
qanything-container-local | Based on the matching algorithm, the backend has been switched to huggingface for you
qanything-container-local | Your current GPU memory is 16384 MiB; deploying a model of 7B or smaller is recommended
qanything-container-local | To prevent GPU memory overflow, the token limit defaults to 2300
qanything-container-local | The triton server for embedding and reranker will start on 1 GPUs
qanything-container-local | The -t folder does not exist under QAnything/assets/custom_models/. Please check your setup.
qanything-container-local | The -t folder does not exist under QAnything/assets/custom_models/. Please check your model files.
复现方法 | Steps To Reproduce
bash ./run.sh -c local -i 0,1 -b default
Mode selected: remote. Public IP entered: 192.168..
备注 | Anything else?
No response