dmuelle closed this issue 11 months ago
@donbourne can you review the draft at https://draft-openlibertyio.mybluemix.net/docs/ref/general/#analyzing-logs-elk.html ?
I tried to explain in the About this task section when a user would follow this procedure and when they would want to use Logstash collector. Not sure if I hit the mark.
There are some discrepancies in the step for creating an index in Kibana 5.6 between what was in the KC for this task and what was there for Logstash collector (step 7 here, step 6 in the Logstash topic). Also, this procedure recommends ibm_datetime as the time filter field name, while the Logstash topic recommends datetime. Just thought it was worth verifying that they are in fact different from a user perspective.
If possible, can you get back to me by Monday 7/20? I'll be out the rest of this week so no rush there. Thanks!
comment from @donbourne in #1684 : #1903 needs a little work -- it's confusing a completely standalone environment, where you would set up your own ELK stack, with a Kubernetes environment, where all of that typically gets installed with Cluster Logging.
@donbourne can you advise on how best to make that distinction clear? Should this task be recommended for users that have cluster logging enabled? Currently, the "About this task" section is more general about when the procedure applies, mainly because I wasn't sure of the range of use cases we want to allow for. And if so, should the prereq be focused on having cluster logging enabled instead of setting up ELK?
Analyzing logs with Elastic Stack -> Analyzing JSON logs with Elastic Stack
You can use the Elastic Stack ... -> You can collect logs and events from your Open Liberty server and display them in a dashboard for analysis and troubleshooting purposes. Use JSON logging to emit logs from Open Liberty and send them to the Elastic Stack, where they can be managed and visualized.
The following procedure assumes that your Open Liberty server is in an environment, such as a Kubernetes cluster, where logs can be made directly available to the Elastic Stack. However, some server configurations cannot make logging data directly available for an external agent to manage. -> The following procedure assumes that your Open Liberty server is in an environment where Filebeat has access to the Open Liberty server logs. However, some server configurations cannot make log data directly available for an external agent to manage.
Set up the Elastic Stack by following the instructions from Elastic. -> Set up Filebeat, Logstash, Elasticsearch and Kibana by following the instructions from Elastic.
Start the server. -> Start the Open Liberty server.
I'm not certain which of the logstashCollector doc or WL doc was right about the steps, so I'd suggest going with step 6 from the logstashCollector doc, modified for the ibm_datetime field as follows:
For Kibana 7, 6, or 5.6, click Management > Index Patterns.
Enter logstash-* as the Index Pattern.
Click Advanced Options and enter logstash-* as the Index Pattern ID.
Select ibm_datetime as the Time filter field name, and click Create.
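As background for the JSON logging step discussed above, Open Liberty typically emits JSON-formatted logs when the message format is set in bootstrap.properties, roughly like this (a sketch only; the exact property names and source values should be verified against the Open Liberty logging documentation for your release):

```properties
# Emit messages.log output in JSON format
com.ibm.ws.logging.message.format=json
# Source types to include in the JSON output
com.ibm.ws.logging.message.source=message,trace,accessLog,ffdc
```

With JSON records in messages.log, an agent such as Filebeat can pick them up and forward them to Logstash for the procedure described here.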
Thanks for the review @donbourne. I believe I've addressed all your comments. Is this doc ready to be tested?
updated draft: https://draft-openlibertyio.mybluemix.net/docs/20.0.0.9/analyzing-logs-elk.html
done Analyzing logs with Elastic Stack -> Analyzing JSON logs with Elastic Stack
done You can use the Elastic Stack ... -> You can collect logs and events from your Open Liberty server and display them in a dashboard for analysis and troubleshooting purposes. Use JSON logging to emit logs from Open Liberty and send them to the Elastic Stack, where they can be managed and visualized.
**used your suggestion from #1684, slightly modified, here instead** The following procedure assumes that your Open Liberty server is in an environment, such as a Kubernetes cluster, where logs can be made directly available to the Elastic Stack. However, some server configurations cannot make logging data directly available for an external agent to manage. -> The following procedure assumes that your Open Liberty server is in an environment where Filebeat has access to the Open Liberty server logs. However, some server configurations cannot make log data directly available for an external agent to manage.
done Set up the Elastic Stack by following the instructions from Elastic. -> Set up Filebeat, Logstash, Elasticsearch and Kibana by following the instructions from Elastic.
done Start the server. -> Start the Open Liberty server.
done I'm not certain which of the logstashCollector doc or WL doc was right about the steps, so I'd suggest going with step 6 from the logstashCollector doc, modified for the ibm_datetime field as follows:
For Kibana 7, 6, or 5.6, click Management > Index Patterns.
Enter logstash-* as the Index Pattern.
Click Advanced Options and enter logstash-* as the Index Pattern ID.
Select ibm_datetime as the Time filter field name, and click Create.
Looks like this doc is ready to be tested. I'll open a separate issue for that.
A couple more minor comments:
to The Elastic Stack -> to the Elastic Stack
"For more information, see Analyzing JSON logs with the Elastic stack." - seems odd to say that here since we're in that topic.
Thanks for the catch- I've fixed those and will stand by for doc testing on this one.
The dashboards have been separated for kibana6_7
so step 8 should be changed from:
kibana6_7
to:
kibana6
kibana7
Thanks @Yushan-Lin - I've made this change and rebuilt the draft. Did you test the doc? If so, and that was the only change needed, can you add the doc tested
label to this issue? If other changes are needed, just let me know. Here's the updated draft
Thanks!
@dmuelle For step 4, the liberty_filebeat6_7.yml has also been separated into liberty_filebeat6.yml and liberty_filebeat7.yml for Filebeat 6.x and Filebeat 7.x. I think that's all!
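For readers following along, a minimal Filebeat configuration for shipping Liberty's JSON logs to Logstash looks roughly like this (a sketch only; the liberty_filebeat6.yml and liberty_filebeat7.yml files mentioned above are the authoritative versions, and the path and host below are placeholders):

```yaml
filebeat.inputs:
- type: log
  paths:
    # placeholder path to the Liberty server's JSON-formatted messages.log
    - /opt/ol/wlp/usr/servers/defaultServer/logs/messages.log
output.logstash:
  # placeholder Logstash endpoint
  hosts: ["localhost:5044"]
```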
@donbourne I've made the changes per @Yushan-Lin's test comments- is this doc ready for sign-off?
Here's the updated draft
looks good!
@lauracowen this is ready for your review- should be reviewed in tandem with #1684 as they are closely related
Both look good - I think the task topic structure looks nice and clean now.
Thanks- I've updated this draft with all the relevant fixes from #1684
requesting peer review from @Charlotte-Holt
@dmuelle Also looking good! Just a few comments:
Thanks @Charlotte-Holt
New edits for clarity and concreteness in #2995 & #2998
Re-checked against the KC source topic to ensure nothing was unintentionally omitted
Doc is now on vNext branch, will publish with 20.0.0.12 on 11.20
reviewed 10/6, no further edits needed, closing as completed
Create a task topic based on https://www.ibm.com/support/knowledgecenter/en/SSEQTP_liberty/com.ibm.websphere.wlp.doc/ae/twlp_elk_stack.html
We already have #1684 that describes this process, but that is for specialized cases where the user "can't easily put an agent next to their liberty server to collect the data, ie. so they need to be able to make a direct connection from Liberty to their remote Logstash server without having some agent process handle transferring that data."
This topic covers the more normal cases with JSON logging in a managed environment. Need to work with Don to disambiguate these two topics and make sure users find the one they're looking for- or possibly combine them into one.
SME Don Bourne