Closed kbwzy closed 6 months ago
LLMServer Generate Error, Please CheckErrorInfo.: HTTP code 403 from API (
Please enable cookies. Sorry, you have been blocked You are unable to access api.openai.com
This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.
You can email the site owner to let them know you were blocked. Please include what you were doing when this page came up and the Cloudflare Ray ID found at the bottom of this page.
This issue has been marked as stale because it has been over 30 days without any activity.
This issue has been closed because it has been marked as stale and there has been no activity for over 7 days.
Search before asking
Operating system information
Linux
Python version information
3.10
DB-GPT version
main
Related scenes
Installation Information
[X] Installation From Source
[ ] Docker Installation
[ ] Docker Compose Installation
[ ] Cluster Installation
[ ] AutoDL Image
[ ] Other
Device information
CPU, CentOS 7.6
Models information
LLM: chatgpt_proxyllm, Embedding: m3e-base
What happened
When using the ChatGPT API, asking a question produces an error in the web UI: LLMServer Generate Error, Please CheckErrorInfo.: HTTP code 403 from API (Please enable cookies. Sorry, you have been blocked)
What you expected to happen
Get the ChatGPT response.
How to reproduce
1. Run `python pilot/server/dbgpt_server.py`.
2. Open port 5000 and ask a question such as "讲个笑话" ("Tell me a joke").
3. The web UI responds with the error.
Additional context
log: INFO [pilot.model.proxy.llms.chatgpt] Model: <pilot.model.proxy.llms.proxy_model.ProxyModel object at 0x7f77060be230>, model_params:
=========================== ProxyModelParameters ===========================
model_name: chatgpt_proxyllm
model_path: chatgpt_proxyllm
proxy_server_url: https://api.openai.com/v1/chat/completions
proxy_api_key: s**d
proxy_api_base: None
proxy_api_app_id: None
proxy_api_type: None
proxy_api_version: None
http_proxy: http://X.X.X.X:50054
proxyllm_backend: None
model_type: proxy
device: cuda
prompt_template: None
max_context_size: 4096
======================================================================
INFO [pilot.model.proxy.llms.chatgpt] Send request to real model gpt-3.5-turbo, openai_params: {'api_type': 'open_ai', 'api_base': 'https://api.openai.com/v1', 'api_version': None, 'proxy': 'http://xxxxx:xxxx'} payloads: {'temperature': 0.6, 'max_tokens': 1024, 'stream': True, 'model': 'gpt-3.5-turbo'} headers: {'Content-Type': 'application/json', 'Authorization': 'Bearer sk-XXXXXXXXXXXXXXXXXXXXXXXX'}
INFO: 10.10.188.193:57146 - "GET /cdn-cgi/styles/cf.errors.css HTTP/1.1" 404 Not Found
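The log shows the request is routed through the configured http_proxy, and the 403 body (together with the follow-up fetch of /cdn-cgi/styles/cf.errors.css) is Cloudflare's block page in front of api.openai.com. That usually means the proxy's egress IP is blocked, not that the API key or DB-GPT itself is broken. A minimal stdlib sketch to check this outside DB-GPT (the proxy URL, API key, and helper function names here are placeholders for illustration, not part of the project):

```python
import json
import urllib.error
import urllib.request


def is_cloudflare_block(status_code: int, body: str) -> bool:
    """Heuristic: Cloudflare's block page returns HTTP 403 with
    'Sorry, you have been blocked' in the HTML body."""
    text = body.lower()
    return status_code == 403 and (
        "cloudflare" in text or "you have been blocked" in text
    )


def check_openai_via_proxy(proxy_url: str, api_key: str) -> bool:
    """Send one minimal chat completion through the given proxy and
    report whether the response looks like a Cloudflare block.
    proxy_url and api_key are placeholders to fill in."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    )
    payload = json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "hi"}],
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    try:
        with opener.open(req, timeout=30) as resp:
            return is_cloudflare_block(
                resp.status, resp.read().decode("utf-8", "replace")
            )
    except urllib.error.HTTPError as e:
        # HTTPError carries the error response; read its body to inspect it.
        return is_cloudflare_block(e.code, e.read().decode("utf-8", "replace"))
```

If `check_openai_via_proxy("http://X.X.X.X:50054", "sk-...")` returns True, the fix is on the proxy side (use a different egress IP or region that Cloudflare does not block), not in DB-GPT.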
Are you willing to submit PR?