wilkovandevelde closed this issue 1 year ago
This issue references the Lab 2 repo, not the Lab 1 repo.
I believe the problem is related to the web-ui timezone setting or something else. Will confirm.
I am not able to reproduce the issue. Even if I toggle the timezone setting in the result card of the web-ui, the answer sheet matches the correct multiple-choice option for quiz 2 # 6 (7.1). Please verify that your answer is 16. If that's not what you have in the answer sheet we provided, please let me know.
//C7, T1
let LogType = 'Warning';
let TimeBucket = 1m;
ingestionLogs
| where Level == LogType
| summarize count() by bin(Timestamp, TimeBucket)
| where Timestamp == datetime(2014-03-08 00:00:00.0000)
This issue references Challenge 7, Task 3, but the previous comment is referencing C7 T1. I can confirm I also get an answer for Task 3 that isn't on the answer sheet. I can share the query if needed.
I am not able to reproduce the issue for 7.3 either. The answer sheet and the quiz answers are correct. The 7.3 question is: "What is the count of json format?" You should get 492. If you get a different answer, please check your query, or clear and re-ingest logsRaw.
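If you need to clear the table before re-ingesting, a minimal sketch (the re-ingest step itself is lab-specific and depends on where your source data lives, so it is not shown here):

```kusto
// Removes all data from logsRaw without dropping the table or its schema.
// Re-ingest afterwards using the lab's ingestion instructions.
.clear table logsRaw data
```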
//C7, T3
logsRaw
| where Component == "INGESTOR_GATEWAY"
| parse-kv Message as (table: string, format:string) with (pair_delimiter=" ", kv_delimiter="=")
| summarize count() by format
@hfleitas , in the beginning of the task it states:
"Use ingestionLogs table for all the challenge 7 tasks."
The query should be
ingestionLogs
| where Component == "INGESTOR_GATEWAY"
| parse-kv Message as (table: string, format:string) with (pair_delimiter=" ", kv_delimiter="=")
| summarize count() by format
Using this query, the results don't match the answer possibilities.
For 7.3, the query needs to read from logsRaw, not ingestionLogs, to get the answer we're looking for. We will clarify this on 7.3, thx!
The main objective of this question isn't the count in particular; it's how you can natively parse the raw data without having to do additional tasks to stage the data. For example, in a real-world scenario the query would be used in a materialized view over logsRaw itself, to easily present the aggregate for other applications.
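A rough sketch of what that materialized view could look like (the view name `FormatCounts` is an assumption for illustration, not part of the lab, and this assumes the cluster's materialized-view feature accepts `parse-kv` before the `summarize`):

```kusto
// Hypothetical materialized view maintaining format counts over logsRaw.
// The single summarize at the end is required by materialized-view rules.
.create materialized-view FormatCounts on table logsRaw
{
    logsRaw
    | where Component == "INGESTOR_GATEWAY"
    | parse-kv Message as (table: string, format: string) with (pair_delimiter=" ", kv_delimiter="=")
    | summarize count() by format
}
```

Other applications could then query `FormatCounts` directly instead of re-parsing the raw messages on every read.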
@Lah123 , I reviewed the lab answers txt from the instructor guide and fixed the quiz as you pointed out; this will keep it aligned to the docx file, which is not so simple to update. I wish we would have published the instructor guide as markdown.
The options in the answer sheet for Challenge 7, Task 3 are not correct. We got an answer which is not an option in the answer sheet.
FYI: we are an ADX partner and received the answers. The query in this file produces the same answer that we got.