opensearch-project / logstash-output-opensearch

A Logstash plugin that sends event data to an OpenSearch cluster and stores it as an index.
https://opensearch.org/docs/latest/clients/logstash/index/
Apache License 2.0

[BUG] Getting code 400 in logstash with http_compression enabled #232

Closed: tgm4883 closed this issue 10 months ago

tgm4883 commented 10 months ago

Describe the bug: Logstash starts getting HTTP 400 responses from OpenSearch when http_compression is enabled.

To Reproduce: Steps to reproduce the behavior:

  1. Enable http_compression in logstash opensearch output plugin
  2. Logs flow for a minute or two, then requests start failing with code 400
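
For reference, the relevant output configuration is a minimal sketch along these lines (hosts, index, and credentials are placeholders, not the reporter's actual values; http_compression is the setting in question):

```
output {
  opensearch {
    hosts            => ["https://opensearch.example.com:9200"]  # placeholder
    index            => "logstash-%{+YYYY.MM.dd}"                # placeholder
    user             => "logstash"                               # placeholder
    password         => "${OPENSEARCH_PASSWORD}"                 # placeholder
    ssl              => true
    http_compression => true   # enabling this triggers the 400 responses
  }
}
```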

Expected behavior: Logs should continue to be delivered to OpenSearch.

Plugins: Basically the default list that is installed with Logstash, plus the opensearch output plugin:

logstash-codec-avro
logstash-codec-cef
logstash-codec-collectd
logstash-codec-dots
logstash-codec-edn
logstash-codec-edn_lines
logstash-codec-es_bulk
logstash-codec-fluent
logstash-codec-graphite
logstash-codec-json
logstash-codec-json_lines
logstash-codec-line
logstash-codec-msgpack
logstash-codec-multiline
logstash-codec-netflow
logstash-codec-plain
logstash-codec-rubydebug
logstash-filter-aggregate
logstash-filter-anonymize
logstash-filter-cidr
logstash-filter-clone
logstash-filter-csv
logstash-filter-date
logstash-filter-de_dot
logstash-filter-dissect
logstash-filter-dns
logstash-filter-drop
logstash-filter-elasticsearch
logstash-filter-fingerprint
logstash-filter-geoip
logstash-filter-grok
logstash-filter-http
logstash-filter-json
logstash-filter-kv
logstash-filter-memcached
logstash-filter-metrics
logstash-filter-mutate
logstash-filter-prune
logstash-filter-ruby
logstash-filter-sleep
logstash-filter-split
logstash-filter-syslog_pri
logstash-filter-throttle
logstash-filter-translate
logstash-filter-truncate
logstash-filter-urldecode
logstash-filter-useragent
logstash-filter-uuid
logstash-filter-xml
logstash-input-azure_event_hubs
logstash-input-beats
└── logstash-input-elastic_agent (alias)
logstash-input-couchdb_changes
logstash-input-dead_letter_queue
logstash-input-elastic_serverless_forwarder
logstash-input-elasticsearch
logstash-input-exec
logstash-input-file
logstash-input-ganglia
logstash-input-gelf
logstash-input-generator
logstash-input-graphite
logstash-input-heartbeat
logstash-input-http
logstash-input-http_poller
logstash-input-imap
logstash-input-jms
logstash-input-pipe
logstash-input-redis
logstash-input-snmp
logstash-input-snmptrap
logstash-input-stdin
logstash-input-syslog
logstash-input-tcp
logstash-input-twitter
logstash-input-udp
logstash-input-unix
logstash-integration-aws
├── logstash-codec-cloudfront
├── logstash-codec-cloudtrail
├── logstash-input-cloudwatch
├── logstash-input-s3
├── logstash-input-sqs
├── logstash-output-cloudwatch
├── logstash-output-s3
├── logstash-output-sns
└── logstash-output-sqs
logstash-integration-elastic_enterprise_search
├── logstash-output-elastic_app_search
└── logstash-output-elastic_workplace_search
logstash-integration-jdbc
├── logstash-input-jdbc
├── logstash-filter-jdbc_streaming
└── logstash-filter-jdbc_static
logstash-integration-kafka
├── logstash-input-kafka
└── logstash-output-kafka
logstash-integration-rabbitmq
├── logstash-input-rabbitmq
└── logstash-output-rabbitmq
logstash-output-csv
logstash-output-elasticsearch
logstash-output-email
logstash-output-file
logstash-output-graphite
logstash-output-http
logstash-output-lumberjack
logstash-output-nagios
logstash-output-null
logstash-output-opensearch
logstash-output-pipe
logstash-output-redis
logstash-output-stdout
logstash-output-tcp
logstash-output-udp
logstash-output-webhdfs
logstash-patterns-core

Additional context: This was working with http_compression enabled going from Logstash 7.17 to opensearch 1.13, but failed after upgrading OpenSearch to 2.11.0.

Log entry showing the issue

{"level":"ERROR","loggerName":"logstash.outputs.opensearch","timeMillis":1698270194068,"thread":"[main]>worker9","logEvent":{"message":"Encountered a retryable error (will retry with exponential backoff)","code":400,"url":"https://10.64.111.190:9200/_bulk","content_length":745,"body":"{\"error\":{\"root_cause\":[{\"type\":\"json_parse_exception\",\"reason\":\"Illegal character ((CTRL-CHAR, code 31)): only regular white space (\\r, \\n, \\t) is allowed between tokens\n at [Source: (byte[])\\"\\u001F�\\u0008\\u0000\\u0000\\u0000\\u0000\\u0000\\u0000��TMo�@\\u0010��+,z�-�qjǧ�B�ЖB�*A�5^��U�]��\\u000E���wfݖV�\\u0002\\u0002!!���웯73���ĭ�]�9/�L\\u000Cm;���F�����O��Ur0\\�׈��G��\\u0011n�%̡����p)\\u0008w\\u0002\\u001B�>\\u0015\\u001Bz���\r���}�\\u001A���\\u001A���\\u000E\\u0009t����E�=;��\\u0006AH�\rm��a�\\u0007d����\\u0014��\\u0002�\\u0015\\u0009K�\\u0018\\u0018�\\u0000��+���gEȊ��梷q+�b��r��\\\\u0005�\\\\u0006;TЎ=�\\\\u0005�Q�a�U�\\\"; line: 1, column: 2]\"}],\"type\":\"json_parse_exception\",\"reason\":\"Illegal character ((CTRL-CHAR, code 31)): only regular white space (\\\\r, \\\\n, \\\\t) is allowed between tokens\\n at [Source: (byte[])\\\"\\\\u001F�\\\\u0008\\\\u0000\\\\u0000\\\\u0000\\\\u0000\\\\u0000\\\\u0000��TMo�@\\\\u0010��+,_z�-�qjǧ�B�ЖB�*A�5^��U�]��\\\\u000E���wfݖV�\\\\u0002\\\\u0002!!���웯73���ĭ�]�9/�L\\\\u000Cm;���F�����O��Ur0\\\\�׈��G��\\\\u0011n�%̡����p)\\\\u0008w\\\\u0002\\\\u001B�>\\\\u0015\\\\u001Bz���\\r���}�\\\\u001A���\\\\u001A���\\\\u000E\\\\u0009t����E�=;_��\\\\u0006AH�\\r*m��a�\\\\u0007d����\\\\u0014��\\\\u0002�\\\\u0015\\\\u0009K�\\\\u0018\\\\u0018�\\\\u0000��+��*�gEȊ��梷q+�b��r��\\u0005�\\u0006;TЎ=�\\u0005�Q�a�U�\\"; line: 1, column: 2]\"},\"status\":400}"}}
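
The "CTRL-CHAR, code 31" in that error is the giveaway: 0x1F (decimal 31) is the first byte of the gzip magic number (0x1F 0x8B), so the server appears to be handing the still-compressed request body to its JSON parser instead of decompressing it first. A quick Python sketch (with a made-up bulk payload, not the reporter's actual events) shows why the parser chokes on the very first byte:

```python
import gzip
import json

# A made-up _bulk-style NDJSON payload (illustrative only).
payload = b'{"index":{"_index":"logs"}}\n{"message":"hello"}\n'

compressed = gzip.compress(payload)

# gzip output always starts with the magic number 0x1F 0x8B;
# 0x1F is decimal 31, the "CTRL-CHAR, code 31" from the error above.
print(compressed[:2].hex())  # -> 1f8b

# Feeding the compressed bytes straight to a JSON parser fails immediately,
# matching the server's json_parse_exception at line 1, column 2.
try:
    json.loads(compressed)
except ValueError as exc:
    print("parse failed:", exc)
```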

dblock commented 10 months ago

This is https://github.com/opensearch-project/OpenSearch/issues/10802, a server-side problem with 2.11. Closing it here; it will get fixed in the next minor (and it looks like there's some discussion about patching it earlier).
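
For anyone who wants to confirm whether a given cluster is affected before the fix lands, a compressed bulk request can be sent by hand. A minimal Python sketch (the endpoint is a placeholder, and the actual request is left commented out since it needs a live cluster):

```python
import gzip
import urllib.request

# Placeholder endpoint -- substitute your cluster's address and credentials.
url = "https://localhost:9200/_bulk"

# Compress an NDJSON bulk body the same way http_compression would.
body = gzip.compress(b'{"index":{"_index":"test"}}\n{"message":"hello"}\n')

req = urllib.request.Request(
    url,
    data=body,
    headers={
        "Content-Encoding": "gzip",
        "Content-Type": "application/x-ndjson",
    },
    method="POST",
)

# On an affected 2.11.0 node this is expected to come back as a 400
# json_parse_exception; on a fixed version, a normal bulk response.
# urllib.request.urlopen(req)  # uncomment to run against a live cluster
```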