flynnoct / chatgpt-telegram-bot

Telegram bot built on the official OpenAI ChatGPT API (gpt-3.5-turbo, released 2023-03-01)
MIT License

Bot does not respond. #48

Closed Ndh474 closed 1 year ago

Ndh474 commented 1 year ago

After running the command `bash ./bin/start_bot.sh`, my terminal displays the message 'Bot has been started successfully', but when I try to chat with the bot on Telegram, it doesn't respond. How can I fix this issue? [screenshot]

RayCxggg commented 1 year ago

Please refer to the previous issues #13, #26 and #27, and don't forget to update to the latest released version, v1.2.2. If that doesn't solve the problem, please leave more information here for us.

Ndh474 commented 1 year ago

After running the command `python3 telegram_message_parser.py`, my command prompt looks like this. What should I do next? [screenshot]

RayCxggg commented 1 year ago

This looks more like an environment problem. Try removing the python-telegram-bot and telegram packages with pip, then reinstall the dependencies with `pip install -r requirements.txt`.
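
For reference, a quick way to check which `telegram` module Python is actually picking up (a small diagnostic sketch, not part of the repo; the standalone `telegram` PyPI package can shadow the module that python-telegram-bot provides):

```python
# Diagnostic sketch: show where the imported "telegram" module comes from.
# python-telegram-bot installs a module named "telegram"; a different PyPI
# package also called "telegram" can shadow it and break the bot.
import telegram

print(telegram.__file__)                                    # path of the module actually imported
print(getattr(telegram, "__version__", "no __version__"))   # python-telegram-bot exposes __version__
```

If the printed path points at a package other than python-telegram-bot, the uninstall/reinstall step above should fix it.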

Ndh474 commented 1 year ago

After I moved the config.json file into the src directory and ran the Python command, the bot worked properly. However, when I ran `bash ./bin/start_bot.sh`, an error occurred and the bot didn't work. Why is that? [screenshot]

RayCxggg commented 1 year ago

Can you set up your bot in a Linux environment? Our code is tested under Ubuntu and is not guaranteed to work on Windows.
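
For what it's worth, the "works when run directly from src but fails via start_bot.sh" symptom is often just a working-directory issue: a bare relative path like config.json only resolves if the process is started from the directory that contains it. Below is a minimal sketch of loading the config relative to the script itself; this is an illustration, not the repo's actual loading code.

```python
import json
import os

# Resolve config.json next to this script instead of relying on the current
# working directory, so the file is found no matter where the bot is launched from.
CONFIG_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "config.json")

with open(CONFIG_PATH, "r", encoding="utf-8") as f:
    config = json.load(f)
```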

Ndh474 commented 1 year ago

Seems quite okay, but yesterday after chatting with the bot for a while, it said, "This model's maximum context length is 4097 tokens. However, your messages resulted in 4287 tokens. Please reduce the length of the messages. Sorry, I am not feeling well. Please try again," even though my messages were quite short. However, I chatted again today and it worked fine.

flynnoct commented 1 year ago

> Seems quite okay, but yesterday after chatting with the bot for a while, it said, "This model's maximum context length is 4097 tokens. However, your messages resulted in 4287 tokens. Please reduce the length of the messages. Sorry, I am not feeling well. Please try again," even though my messages were quite short. However, I chatted again today and it worked fine.

That's because the ChatGPT API doesn't "remember" previous messages on its own; the bot temporarily stores the conversation and sends it back to the API as context with each request. That accumulated context is the main reason the token limit is reached.

However, in some cases I've noticed that the API returns this exception even when I haven't chatted with it much. So this may happen occasionally; I believe it's an issue on the API's side, not the bot's.
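
As an illustration of why the limit gets hit, here is a minimal sketch of the accumulate-and-resend pattern described above. It is simplified and not the bot's actual code; it assumes the legacy 0.x `openai` Python client that matched the gpt-3.5-turbo API of early 2023, and the trimming step at the end is just one common mitigation, not necessarily what this bot does.

```python
import openai  # legacy 0.x client for the 2023 ChatCompletion endpoint

# The bot keeps the whole conversation and sends it with every request, so
# tokens accumulate even when each individual message is short.
messages = [{"role": "system", "content": "You are a helpful assistant."}]

def chat(user_text):
    messages.append({"role": "user", "content": user_text})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,  # the full history goes out every time
    )
    reply = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})

    # Possible mitigation: once the history grows long, drop the oldest
    # user/assistant pair while keeping the system prompt in place.
    if len(messages) > 21:
        del messages[1:3]
    return reply
```

Once the stored history plus the new reply no longer fits in the model's 4097-token window, the API raises exactly the "maximum context length" error quoted above.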

Ndh474 commented 1 year ago

Thank you everyone for the help; I will close this issue.