liualexiang opened this issue 3 years ago
Test VM: Ubuntu 18.04; the issue can be reproduced.
Hi liualexiang, I think there is something wrong with your log file; it may not be valid JSON. Could you share your log file so I can test it on my server?
Hi @pinochioze,
Sorry for the late response. You can try this log file: https://raw.githubusercontent.com/liualexiang/images/master/Azure_NSG_Logs.json
BTW: I did more tests these days. I installed Logstash 5.2.0 and 7.8.0 on the same VM (CentOS 7.5 with OpenJDK 1.8.0) and used the same Logstash configuration file. Logstash 5.2.0 works well with the logstash-input-azureblob plugin, but the JSON parse fails on Logstash 7.8.0.
The test configuration I used:
input {
  azureblob {
    storage_account_name => "STORAGE_ACCOUNT_NAME"
    storage_access_key => "STORAGE_ACCESS_KEY"
    container => "insights-logs-networksecuritygroupflowevent"
    codec => "json"
    # skip the blob header/tail: the {"records":[ prefix is 12 bytes, the ]} suffix is 2 bytes
    file_head_bytes => 12
    file_tail_bytes => 2
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
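For debugging, here is a minimal filter sketch (assuming the json codec's standard behavior of tagging events it cannot parse with "_jsonparsefailure" and leaving the raw text in the "message" field; the "raw_head" field name is just illustrative) that surfaces the start of the broken payload in the rubydebug output:

filter {
  # Inspect events the json codec could not parse.
  if "_jsonparsefailure" in [tags] {
    ruby {
      # Keep only the first 512 characters of the raw payload so the
      # rubydebug output stays readable ("raw_head" is an assumed name).
      code => 'event.set("raw_head", event.get("message").to_s[0, 512])'
    }
  }
}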
Any insights?
Same issue for me; my Logstash version is 7.8.0. The plugin always prints a comma right after the opening bracket and sometimes loads multiple blobs at once. So I was replacing "[," with "[" and "}{" with "}^&&^{", and then splitting on "^&&^".
Can you share how you replace the comma and the open bracket? In the Logstash input field?
You should do the replace job in the filter field, not the input. Anyway, my trick doesn't work completely; I think the plugin is fetching blobs incompletely.
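For reference, here is a rough sketch of that workaround as a Logstash filter. It is only an approximation of the trick described above (the "message" field and the gsub patterns are assumptions), not a real fix for the plugin:

filter {
  mutate {
    # drop the stray comma after the opening bracket ("[," -> "[") and put a
    # marker between concatenated JSON objects ("}{" -> "}^&&^{")
    gsub => [
      "message", "\[,", "[",
      "message", "\}\{", "}^&&^{"
    ]
  }
  mutate {
    # turn the string into an array, splitting on the marker
    split => { "message" => "^&&^" }
  }
  split {
    # emit one event per array element
    field => "message"
  }
  json {
    # parse each piece as JSON (assumes each piece is a complete object)
    source => "message"
  }
}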
The original data is :
{"records":[{"time":"2020-08-12T07:00:07.1267218Z","systemId":"5c9979d8-bb89-486f-adea-060bfe479aa2","macAddress":"0022480EDEF2","category":"NetworkSecurityGroupFlowEvent","resourceId":"/SUBSCRIPTIONS/58DBBD07-EB52-47CC-88B3-BBAEA99036A4/RESOURCEGROUPS/RG-CUSTOMERMONITORING/PROVIDERS/MICROSOFT.NETWORK/NETWORKSECURITYGROUPS/VM-CSMONITORING-NSG","operationName":"NetworkSecurityGroupFlowEvents","properties":{"Version":2,"flows":[{"rule":"DefaultRule_AllowInternetOutBound","flows":[{"mac":"0022480EDEF2","flowTuples":["1597215544,10.1.2.13,20.150.4.4,48330,443,T,O,A,E,8,1535,11,9612","1597215545,10.1.2.13,20.150.4.4,48362,443,T,O,A,B,,,,","1597215550,10.1.2.13,20.150.4.4,48342,443,T,O,A,E,8,1535,11,9612","1597215551,10.1.2.13,20.150.4.4,48370,443,T,O,A,B,,,,","1597215552,10.1.2.13,52.231.32.42,49518,
But the fetched data is :
>"{\"records\":[,\",\"1597854824,168.63.129.16,10.240.0.35,61213,32744,T,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.46,50747,53,U,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.41,42021,53,U,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.46,40744,53,U,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.41,46588,53,U,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.46,44871,53,U,I,A,B,,,,\",\"1597854824,10.240.0.20,10.240.0.45,55026,3001,T,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.46,42116,53,U,I,A,B,,,,\",\"1597854824,10.240.0.7,10.240.0.41,34235,53,U,I,A,B,,,,\",
Look at the data right after "records": something is missing.
Hi Azure team,
I tested the conf below; it works well on ES/Logstash 5.2.0, but it doesn't work on ES/Logstash 7.8.0.
See the error.
Sometimes the error is:
Sometimes it is: