AbsaOSS / spot

Aggregate and analyze Spark history, export to elasticsearch, visualize and monitor with Kibana.
Apache License 2.0

Automatically handle the 'limit of fields' error #52

Closed DzMakatun closed 3 years ago

DzMakatun commented 3 years ago

Common error when storing docs to Elasticsearch:
RequestError(400, 'illegal_argument_exception', 'Limit of total fields [] in index [] has been exceeded')

Solution (currently done manually): the limit can be increased with an Elasticsearch settings update: PUT //_settings {"index.mapping.total_fields.limit": }
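For reference, the manual fix looks roughly like this in Kibana Dev Tools (the index name and limit value below are placeholders, since the actual values are elided above):

```
PUT my-index/_settings
{
  "index.mapping.total_fields.limit": 2000
}
```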

This error can be handled automatically within the Elasticsearch component.

  1. Check for returned errors
  2. Identify the error and parse the current limit.
  3. Identify the number of fields in the current doc (?)
  4. Calculate the new limit: e.g. current doc fields + margin OR 2 * current limit
  5. Update the index settings
  6. Retry the request
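The parsing and calculation steps (2-4) could be sketched like this. This is a minimal sketch, not spot's actual implementation: the function names, the regex (based on the standard Elasticsearch error wording), and the margin value are all illustrative. The actual retry (steps 5-6) would then call something like `es.indices.put_settings` with the new limit before re-issuing the indexing request.

```python
import re

# Matches the standard Elasticsearch error reason, capturing the
# current limit and the index name (assumed wording, step 2).
FIELDS_LIMIT_RE = re.compile(
    r"Limit of total fields \[(\d+)\] in index \[([^\]]+)\] has been exceeded"
)


def parse_limit_error(reason):
    """Return (current_limit, index_name) parsed from the error reason, or None."""
    m = FIELDS_LIMIT_RE.search(reason)
    if m:
        return int(m.group(1)), m.group(2)
    return None


def count_fields(doc):
    """Recursively count leaf fields in a (possibly nested) document (step 3)."""
    n = 0
    for value in doc.values():
        if isinstance(value, dict):
            n += count_fields(value)
        else:
            n += 1
    return n


def new_limit(current_limit, doc_fields, margin=100):
    """Step 4: take the larger of (doc fields + margin) and (2 * current limit)."""
    return max(doc_fields + margin, 2 * current_limit)
```

Taking the max of both candidates guards against a document whose field count already exceeds twice the current limit, so a single settings update suffices before the retry.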