Open Private-SO opened 7 years ago
@Yaswanth-C have you been able to resolve this one? I'm looking into a similar thing: blob storage to Logstash, or Data Lake to Logstash.
Sorry, I haven't resolved this one yet; instead I am sending data directly from an ES snapshot to Azure Blob Storage. BTW, if you get this one resolved, please share your info.
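For anyone following the snapshot route: snapshots go to Azure through Elasticsearch's Azure repository plugin rather than through Logstash. A rough sketch of the two API calls involved — the repository name `azure_backup`, container `es-snapshots`, and base path are placeholders, not from this thread, and the storage account credentials must already be configured in elasticsearch.yml for the Azure plugin:

```
# Register an Azure snapshot repository (requires the Azure
# repository plugin installed on every node)
PUT _snapshot/azure_backup
{
  "type": "azure",
  "settings": {
    "container": "es-snapshots",
    "base_path": "backups"
  }
}

# Snapshot one index into that repository
PUT _snapshot/azure_backup/snapshot_1
{
  "indices": "content-apr-2017"
}
```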
Looking into it right now, will do if/when I get to a solution or a workaround.
What can I do to solve it?
Version: Logstash 2.4.0
Operating System: WIN 10
Config File (if you have sensitive info, please remove it):
Sample Data:
[2017-04-26 20:39:54,633][TRACE][index.search.slowlog.query] [data-0] [content-apr-2017][0] took[82.1micros], took_millis[0], types[publishedarticle], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{"query":{"bool":{"must":[{"terms":{"List.Id":[316271282]}}]}}}], extra_source[],
[2017-04-26 20:39:54,633][TRACE][index.search.slowlog.query] [data-0] [content-apr-2017][0] took[82.1micros], took_millis[0], types[publishedarticle], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{"query":{"bool":{"must":[{"terms":{"List.Id":[316271292]}}]}}}], extra_source[],
[2017-04-26 20:39:54,633][TRACE][index.search.slowlog.query] [data-0] [content-apr-2017][0] took[82.1micros], took_millis[0], types[publishedarticle], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{"query":{"bool":{"must":[{"terms":{"List.Id":[316271382]}}]}}}], extra_source[],
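In case it helps anyone indexing these slowlog entries: the bracketed layout is regular enough to split into fields with a single pattern. Here is a sketch in Python — the field names are my own, not from this thread, and in a Logstash pipeline the same pattern would normally live in a grok filter instead:

```python
import re

# One [...]-delimited field per capture group; source[...] is not
# captured here because its JSON payload contains nested brackets.
SLOWLOG_RE = re.compile(
    r"\[(?P<timestamp>[^\]]+)\]"
    r"\[(?P<level>[^\]]+)\]"
    r"\[(?P<logger>[^\]]+)\]\s*"
    r"\[(?P<node>[^\]]+)\]\s*"
    r"\[(?P<index>[^\]]+)\]\[(?P<shard>\d+)\]\s*"
    r"took\[(?P<took>[^\]]*)\],\s*took_millis\[(?P<took_millis>\d+)\]"
)

line = (
    "[2017-04-26 20:39:54,633][TRACE][index.search.slowlog.query] "
    "[data-0] [content-apr-2017][0] took[82.1micros], took_millis[0], "
    "types[publishedarticle], stats[], search_type[QUERY_THEN_FETCH], "
    "total_shards[5], "
    'source[{"query":{"bool":{"must":[{"terms":{"List.Id":[316271282]}}]}}}], '
    "extra_source[],"
)

fields = SLOWLOG_RE.match(line).groupdict()
print(fields["index"], fields["took_millis"])  # content-apr-2017 0
```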
Steps to Reproduce:
1) Create a storage account in Azure and copy a raw-data .txt file into a container in that account.
2) With the Logstash azureblob plugin, send it to a local ES cluster using the above Logstash config file.
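Since the config file itself was omitted above, a minimal pipeline for this setup would look roughly like the following. This is a sketch, not the reporter's actual config: the option names follow the logstash-input-azureblob plugin, and the account name, access key, container, and index name are all placeholders:

```
input {
  azureblob {
    storage_account_name => "mystorageaccount"   # placeholder
    storage_access_key   => "REDACTED"           # placeholder
    container            => "slowlogs"           # placeholder
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "slowlog-%{+YYYY.MM.dd}"            # placeholder
  }
  stdout { codec => rubydebug }                  # keep while debugging
}
```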
I am getting an error like this in Logstash:
But when I changed the storage account type from Blob storage to a general-purpose storage account, the first time I ran it I could see the output sent to ES. When I ran the same thing again, it did not send any output to ES, and Logstash showed no error either (even though I kept
stdout { codec => rubydebug }
in the output). How can I resolve this and send data to ES?