Open venkachw opened 3 months ago
@venkachw , As maps are generally unordered, how would you propose choosing the order for "first N" values?
@dlvenable, in our case, order is not our priority.
@venkachw , So then it would pick an arbitrary N values?
One option would be to update the `select_entries` processor to support selecting specific keys within an object. Perhaps it would have a new `select_from` value that determines where to select entries from.
```yaml
- select_entries:
    select_from: file
    include_keys: [read, write, delete]
- select_entries:
    select_from: scan
    include_keys: [8080, 450]
```
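To make the proposal concrete, the intended effect of `select_from` plus `include_keys` could be sketched in Python. The function below is a hypothetical illustration of the proposed semantics, not Data Prepper's actual implementation:

```python
def select_entries(event: dict, select_from: str, include_keys: list) -> dict:
    """Keep only include_keys within the nested map event[select_from].

    Hypothetical sketch of the proposed processor behavior; the function
    name and signature are assumptions, not Data Prepper code.
    """
    nested = event.get(select_from, {})
    event[select_from] = {k: v for k, v in nested.items() if k in include_keys}
    return event

# Example event resembling the issue's "file" field
event = {"file": {"read": [1, 2], "write": [3], "delete": [4], "other": [5]}}
select_entries(event, "file", ["read", "write", "delete"])
# "other" is dropped; read, write, and delete are kept
```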
Would this meet your use-case? Or do you just want an arbitrary limit on certain maps and arrays?
My use case is to limit the values of certain maps and arrays to an arbitrary count. The `scan` field has dynamic keys, so even in that case I need to limit the number of values.
Is your feature request related to a problem? Please describe. I have a JSON object in S3 with two fields like the below, and I want to limit the entries of those fields while uploading it to OpenSearch using an ingestion pipeline.
I want to limit the entries of the attributes (file.read, file.write, file.delete, scan.key1, scan.key2) to the first 10 elements, since Elasticsearch has latency issues when querying large arrays.
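A minimal sketch of the desired "first 10" truncation, assuming Python's insertion-ordered dicts stand in for the JSON maps (this is an illustration of the requested behavior, not an existing Data Prepper feature):

```python
def truncate_entries(value, n=10):
    """Keep only the first n entries of a map or array.

    For maps this relies on insertion order, which matches the caveat
    raised in the thread: without an ordering, "first N" is arbitrary.
    """
    if isinstance(value, dict):
        return dict(list(value.items())[:n])
    if isinstance(value, list):
        return value[:n]
    return value

doc = {
    "file": {"read": list(range(25))},          # array with 25 elements
    "scan": {str(k): k for k in range(15)},     # map with 15 dynamic keys
}
doc["file"]["read"] = truncate_entries(doc["file"]["read"])
doc["scan"] = truncate_entries(doc["scan"])
# both fields are now limited to their first 10 entries
```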
Describe the solution you'd like The data should look like the below after processing with a Data Prepper plugin.