BroadSoft-Xtended / BroadWorks-Dashboards-and-Discovery

This repository contains the BroadWorks Dashboards and Discovery components to extend BroadWorks data mining, reporting, and analysis capabilities.

bwlogsender not sending Tomcat access logs #93

Open davemcbridegma opened 4 years ago

davemcbridegma commented 4 years ago

Hi

I'm not seeing any logs leave the servers for the Tomcat access_log channel.

Example: R21sp9 XSP, manually re-configured the XSP_channel.props to include:

channelfile:Tomcat:/var/broadworks/logs/tomcat:access_log:*:dailyallserver

Restarted bwlogsender and the application log talks about starting a new tailer:

2019-12-11_13:16:43.034 [TailRunner => access_log] INFO  com.broadsoft.zipsender.TailerRunner - STATE_IDLING: Tomcat
2019-12-11_13:16:43.045 [TailRunner => access_log] INFO  com.broadsoft.zipsender.TailerRunner - TailerRunner: Found persisted file - starting from here -> access_log.2019.12.11-00.04.14.txt Log number 0
2019-12-11_13:16:43.045 [TailRunner => access_log] INFO  com.broadsoft.zipsender.TailerRunner - Starting file => access_log.2019.12.11-00.04.14.txt
2019-12-11_13:16:43.046 [TailRunner => access_log] INFO  com.broadsoft.zipsender.TailerRunner - Creating new Tailer for access_log.2019.12.11-00.04.14.txt

However, I'm not seeing the access_logs leave the server towards Kafka when capturing traffic with tcpdump on the XSP.

coreyt429 commented 4 years ago

I had success with the following: channelfile:Apache:/var/broadworks/logs/tomcat:access_log:*:dailyallserver

Essentially this relies on the Tomcat logs being formatted similarly to the Apache logs, so the Apache log sender can be used for them (see the before/after below).
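
For reference, the only difference from the channel line in the original report is the application name field (both lines are quoted verbatim from this thread):

channelfile:Tomcat:/var/broadworks/logs/tomcat:access_log:*:dailyallserver   (not sending)
channelfile:Apache:/var/broadworks/logs/tomcat:access_log:*:dailyallserver   (working)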

gokulbsft commented 4 years ago

Hi @davemcbridegma

Try leaving the application name unchanged, i.e. try with channelfile:Apache:/var/broadworks/logs/tomcat:access_log:*:dailyallserver

Thanks @coreyt429 for pointing it out.

abonela commented 4 years ago

Hi @coreyt429

Quick question: we currently have XSPs that are separated out based on the applications running on them, such as DMS, NPS, XSI, etc.

Our plan is to upgrade only the XSI servers to R21 SP9 and leave the others on R21 SP1 for now.

By using the same application name as you advised, the access (Apache) logs from the SP1 XSP servers get mixed up with the access (Tomcat) logs from the SP9 servers.

Can you advise what we can do to separate them if we want to use the same application name in the channel file?

Regards,

Arun Bonela

coreyt429 commented 4 years ago

@abonela In general, mixing the logs shouldn't be a problem as far as storage goes, so I'm assuming the issue is with search. With multiple XSPs supporting different applications, we have similar requirements. I usually work around this with searches that specify which servers to use. For example, for a DMS search I would include only the servers that provide DMS services (hostnames and MAC address obfuscated, of course):

(server:xsp1 OR server:xsp25 OR server:xsp31) AND 0004f283beef
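
Applied to the SP1 vs. SP9 split described above, the same pattern scoped to only the upgraded XSI servers might look like the following (hostnames are placeholders, not real servers):

(server:xsp2 OR server:xsp26) AND <search term>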

kdotson3263k commented 4 years ago

FYI, it appears that the Tomcat logs are formatted slightly differently from the Apache logs, and the apacheuseragent field is not getting parsed properly.

abonela commented 4 years ago

@kdotson3263k Yes, we are having issues for the same reason. It comes down to the Apache logs writing response times in milliseconds while the Tomcat logs use microseconds, which makes parsing the logs difficult.
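
If the time-unit mismatch described above is the root cause, one workaround is to normalize the Tomcat access_log lines before they are shipped. The sketch below is a minimal, hypothetical example (not part of bwlogsender or this repository) that assumes the response time is the final whitespace-separated numeric field on each line and converts it from microseconds to milliseconds:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TomcatAccessLogNormalizer {

    // Assumption: the elapsed time is the last numeric field on each access_log line.
    private static final Pattern TRAILING_NUMBER = Pattern.compile("(\\d+)\\s*$");

    // Converts a trailing microsecond value to milliseconds; other lines pass through unchanged.
    public static String normalize(String line) {
        Matcher m = TRAILING_NUMBER.matcher(line);
        if (!m.find()) {
            return line;
        }
        long micros = Long.parseLong(m.group(1));
        long millis = micros / 1000;
        return line.substring(0, m.start(1)) + millis;
    }

    public static void main(String[] args) {
        String sample = "10.0.0.1 - - [11/Dec/2019:13:16:43 +0000] \"GET /com.broadsoft.xsi-actions HTTP/1.1\" 200 512 15342";
        System.out.println(normalize(sample)); // the trailing 15342 (us) becomes 15 (ms)
    }
}

The class name, field position, and sample line are illustrative only; the regex would need to be adjusted to match the actual Tomcat AccessLogValve pattern in use.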