Closed mattklein closed 7 years ago
For anybody else who might read this, I don't think there's a way to configure the plugin to have the behavior I wanted. One could modify the plugin of course. But it seems like a better solution is to write a Lambda function hooked into the DynamoDB stream for this table, and have that Lambda function put the items into ElasticSearch via its REST API. There's no real need for Logstash as far as I can tell. Perhaps that's one reason why this repository seems to be dead.
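To make the suggested alternative concrete, here is a minimal sketch of such a Lambda handler. It deserializes DynamoDB stream records, converts an epoch-seconds Number attribute into an ISO-8601 string suitable for an Elasticsearch `date` field, and combines two Number attributes into an object suitable for a `geo_point` field. The attribute names (`server_timestamp_gmt`, `lat`, `lon`, `location`) are assumptions taken from or modeled on this thread; real code would also need the HTTP call to the Elasticsearch endpoint, which is elided here.

```python
from datetime import datetime, timezone

def deserialize(attr):
    """Convert a DynamoDB attribute value ({"S": ...} / {"N": ...}) to a Python value."""
    if "S" in attr:
        return attr["S"]
    if "N" in attr:
        n = attr["N"]
        return float(n) if "." in n else int(n)
    raise ValueError("unsupported attribute type: %r" % attr)

def transform(new_image):
    """Flatten a stream NewImage and transform fields for Elasticsearch."""
    doc = {k: deserialize(v) for k, v in new_image.items()}
    # Epoch seconds -> ISO-8601 string, indexable as a `date` type.
    if "server_timestamp_gmt" in doc:
        doc["server_timestamp_gmt_parsed"] = datetime.fromtimestamp(
            doc["server_timestamp_gmt"], tz=timezone.utc
        ).isoformat()
    # Two Number fields -> lat/lon object, indexable as a `geo_point` type.
    if "lat" in doc and "lon" in doc:
        doc["location"] = {"lat": doc.pop("lat"), "lon": doc.pop("lon")}
    return doc

def lambda_handler(event, context):
    """Entry point for the DynamoDB stream trigger."""
    docs = [
        transform(record["dynamodb"]["NewImage"])
        for record in event["Records"]
        if record["eventName"] in ("INSERT", "MODIFY")
    ]
    # A real handler would PUT/bulk-index `docs` into Elasticsearch here
    # (e.g. via urllib.request or the elasticsearch client).
    return docs
```

The transform step is where you get the flexibility the Logstash plugin lacks: any reshaping of the item can happen before indexing.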
Is it possible to have this plug-in transform a value read from DynamoDB into a different value to be stored in ElasticSearch? E.g., if I've got a Number field in DynamoDB that stores a timestamp (# of secs since epoch), can I transform that value so that it's stored as a Date in ElasticSearch?
This would be easy with Logstash's built-in date filter, with something like this:

```
filter {
  date {
    match  => ["server_timestamp_gmt", "UNIX"]
    target => "server_timestamp_gmt_parsed"
  }
}
```
I may be wrong, but it doesn't appear that we can do anything like that with this plug-in. It'd also be super useful (for my purposes anyway) if it were possible to transform numbers from DynamoDB into geo_point or geo_shape types in ElasticSearch.
Am I missing something? Is it in fact possible to have this plug-in transform and populate date, geo_point, or geo_shape types in ElasticSearch?
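For reference, whichever route does the transformation, the target index needs a mapping that declares the fields as `date` and `geo_point`; a hypothetical example (field names assumed; older Elasticsearch versions nest `properties` under a type name):

```json
{
  "mappings": {
    "properties": {
      "server_timestamp_gmt_parsed": { "type": "date" },
      "location": { "type": "geo_point" }
    }
  }
}
```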