utmstack / UTMStack

Customizable SIEM and XDR powered by Real-Time correlation and Threat Intelligence
https://utmstack.com
GNU Affero General Public License v3.0

[BUG] SOCAI error 400 #745

Open kazhuyo opened 3 months ago

kazhuyo commented 3 months ago

When trying to process an alert with SOC-AI, the Docker logs return an error 400.

Here's the relevant part of the error from the socai Docker container:

request to GPT: status code '400' received '{
  "error": {
    "message": "This model's maximum context length is 16385 tokens. However, your messages resulted in 19181 tokens. Please reduce the length of the messages.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}'

I think the default model for socai is gpt-3.5-turbo-16k, which would need to be changed to gpt-4-turbo (or another model with a larger context window) to support prompts above 16K tokens.
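Alternatively, as a workaround that doesn't require switching models, the alert payload could be trimmed before it is sent to the API. Below is a minimal, hypothetical Python sketch of that idea — `trim_messages`, `estimate_tokens`, and the 4-characters-per-token ratio are all my own assumptions for illustration, not part of UTMStack's code; an exact count would need the model's real tokenizer.

```python
# Hypothetical workaround sketch: drop the oldest non-system messages
# until a rough token estimate fits within the model's context window.
# The 16385 limit comes from the error message above; the 4-chars-per-token
# ratio is a crude approximation, NOT an exact tokenizer.

MAX_CONTEXT_TOKENS = 16385  # gpt-3.5-turbo-16k limit quoted in the 400 error


def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)


def trim_messages(messages, limit=MAX_CONTEXT_TOKENS, reserve=1024):
    """Return a copy of `messages` trimmed to fit the context window.

    Keeps a leading system prompt if present, drops the oldest other
    messages first, and reserves `reserve` tokens for the model's reply.
    """
    budget = limit - reserve
    kept = list(messages)
    while kept and sum(estimate_tokens(m["content"]) for m in kept) > budget:
        # Preserve a leading system prompt; otherwise drop the oldest entry.
        drop_at = 1 if kept[0].get("role") == "system" and len(kept) > 1 else 0
        kept.pop(drop_at)
    return kept
```

For the 19,181-token request in the error above, a trim like this would shrink the alert context enough for the request to succeed, at the cost of the model seeing less of the alert history.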