Public doc: https://github.com/ElManchacho/FilebeatToCloudInfos
An easy way to add a Filebeat configuration to your system
This app currently runs with the following Filebeat versions:
:warning: For now, you can only use the first Menu option, as the other ones are not ready for deployment yet (soon TM)
:information_source: This page helps you prepare the filebeat.yml file. There are some base field values that you can modify to your liking.
(more info here: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-reference-yml.html)
:information_source: You can add multiple log input folder paths.
:information_source: If the index doesn't exist in your Cloud instance, it will be automatically created.
:information_source: You can edit the file extension and the attribute separator according to your log format.
:information_source: Copy/paste a sample of your logs to identify its headers and/or how many attributes you need to distinguish in your logs.
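To give a rough idea of the settings these options map to, here is a minimal filebeat.yml sketch; the paths, index name and credentials below are placeholder assumptions, the app fills in the real values for you:

```yaml
# Minimal sketch only, with placeholder values (the app generates the real file)
filebeat.inputs:
  - type: log
    enabled: true
    paths:                              # several input folders can be listed
      - C:\logs\folder1\*.txt
      - C:\logs\folder2\*.txt

# Required when overriding the default index name
setup.template.name: "mycustomlogs"
setup.template.pattern: "mycustomlogs-*"
setup.ilm.enabled: false

output.elasticsearch:
  hosts: ["<your cloud host>:<port>"]
  username: "elastic"
  password: "<password>"
  index: "mycustomlogs-%{+yyyy.MM.dd}"  # created automatically if it doesn't exist
```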
:information_source: There is a local '.env' file at the root of the project in which you can store your personal information.
:warning: For now, you need to create it yourself and write the following properties:
HOSTS=<kibana url>
USRNAME=<username usually 'elastic'>
PASSWORD=<password>
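For example (the host below is a made-up Elastic Cloud endpoint, use your own deployment's values):

```
HOSTS=https://my-deployment.kb.europe-west1.gcp.cloud.es.io:9243
USRNAME=elastic
PASSWORD=mySecretPassword
```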
With it, you won't have to rewrite the same information every time.
Then, press the validation button.
Now you can see and modify the attribute names that have been generated from your sample (they will be empty if no sample was provided):
header1;header2;header3
value1.1;value2.1;value3.1
Now let's say that our logs don't respect that exact format and that some column contents could be missing, like the following:
header1;header2;header3
value1.1;value2.1;value3.1
value3.2
value1.3;value2.3;value3.3
value1.4;value3.4
value3.5
value3.6
value3.7
value1.8;value3.8
value1.9;value3.9
value1.10;value2.10;value3.10
We can see 3 different log formats here: not every field is filled on every line.
That would be a problem with a simple Filebeat mapping configuration: lines without the exact number of mapped fields would be ignored.
value1.3;value2.3;value3.3
--> OK
value1.4;value3.4
--> 1 field missing (2nd column)
value3.5
--> 2 fields missing (1st and 2nd columns)
So let's set up 3 log field configurations:
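As an illustration of what these three configurations boil down to, here is a sketch in plain Filebeat terms, using a chain of dissect processors tried from the richest format to the simplest; the tokenizer patterns and the 'parsed' prefix are assumptions based on the sample above, the app generates its own parser for you:

```yaml
# Sketch only: one dissect processor per log format, richest pattern first.
processors:
  - dissect:                                    # 3 fields: header1;header2;header3
      field: "message"
      tokenizer: "%{header1};%{header2};%{header3}"
      target_prefix: "parsed"
      ignore_failure: true
  - dissect:                                    # 2 fields: header1;header3
      when:
        not:
          has_fields: ["parsed.header1"]        # only if the 3-field pattern didn't match
      field: "message"
      tokenizer: "%{header1};%{header3}"
      target_prefix: "parsed"
      ignore_failure: true
  - dissect:                                    # 1 field: header3 only
      when:
        not:
          has_fields: ["parsed.header3"]        # only if nothing matched yet
      field: "message"
      tokenizer: "%{header3}"
      target_prefix: "parsed"
      ignore_failure: true
```

With something like this, each line keeps whichever 'parsed.*' fields it actually contains instead of being dropped.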
And with that, setup is done!
Go into the 'FilebeatToCloud\filebeat-8.3.1-windows-x86_64\' folder and start your Filebeat instance in a PowerShell terminal with the `.\filebeat.exe -e` command:
```...\FilebeatToCloud\filebeat-8.3.1-windows-x86_64> .\filebeat.exe -e```
Before sending any logs, let's check our Elastic Cloud indexes (host:port/app/management/data/index_management/indices):
We can see that our index hasn't been created yet.
Now let's send the logs we used as a sample into our 2 input folders:
In folder 1 (as a '.txt' file):
value1.1;value2.1;value3.1
value3.2
value1.3;value2.3;value3.3
value1.4;value3.4
value3.5
In folder 2 (as a '.txt' file):
value3.6
value3.7
value1.8;value3.8
value1.9;value3.9
value1.10;value2.10;value3.10
Now let's check if our index has been created:
Yes!
Let's now create a Data View to explore our log data (host:port/app/management/kibana/dataViews):
Now, we can visualize our data in the Kibana Discover section:
We received our 10 rows, split across the 2 '.txt' files in the 2 previously set up folders!
Let's now observe our data fields to check that our formats were properly set up:
(We can now select our headers in the field search input)
:information_source: All the received data fields are prefixed with 'parsed.'
Now let's properly display our data:
Wonderful! Our data fields are perfectly classified!
You can now use the FilebeatToCloud application to send logs in your very own formats to your cloud instance!
Add an upper-level menu to choose between adding a new Filebeat configuration and reusing an existing one
Be able to export/import a configuration --> generated/parserscript.js + generated/headersConfigs/config_..js + filebeat.yml
Be able to run multiple Filebeat instances/services + create a management panel
I don't advise modifying any file/folder content or name, or the app could stop working.