The AI can sometimes generate messages longer than 4096 characters, but the Telegram Bot API rejects messages above that limit. To work around this, the generated text can be split into chunks of at most 4096 characters, each sent as a separate message.
One possible solution:
# In the TelegramChat class:
async def __send_message_directly__(self, context, chat_id, text, connect_timeout=60):
    # Send a single message that is already within the 4096-character limit.
    await context.bot.send_message(chat_id=chat_id, text=text, connect_timeout=connect_timeout)

def __split_message__(self, long_message, max_length=4096):
    # Break the text into chunks of at most max_length characters.
    return [long_message[i:i + max_length] for i in range(0, len(long_message), max_length)]

async def send_message(self, context, chat_id, text, connect_timeout=60):
    # Split the text and send each chunk as its own Telegram message.
    for msg in self.__split_message__(text):
        await self.__send_message_directly__(context, chat_id, msg, connect_timeout)
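The splitting step can be checked in isolation, without a bot or a Telegram connection. A minimal sketch of the same chunking logic (the standalone name split_message here is illustrative, not part of the class above):

```python
def split_message(long_message: str, max_length: int = 4096) -> list[str]:
    """Break a long string into chunks of at most max_length characters."""
    return [long_message[i:i + max_length] for i in range(0, len(long_message), max_length)]

# A 10,000-character message yields three Telegram-sized parts.
parts = split_message("x" * 10_000)
print([len(p) for p in parts])  # [4096, 4096, 1808]
```

Since slicing past the end of a string is safe in Python, the last chunk simply holds the remainder, and joining the parts back together reproduces the original text exactly.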