Azure-Samples / modern-data-warehouse-dataops

DataOps for Microsoft Data Platform technologies. https://aka.ms/dataops-repo
MIT License

Add "Application Monitoring" Framework #715

Open promisinganuj opened 5 months ago

promisinganuj commented 5 months ago

Part of the broader "Data Observability" story, this task focuses specifically on "application monitoring" capabilities.

Currently, if customers have scheduled notebooks/pipelines/jobs, they need to go to the "Monitor" menu in the Fabric UI to see the run details (start time, end time, status, etc.). They might also want to do some high-level, application-specific logging, such as the number of records processed, failed records, custom metrics, and so on.

**What**

To facilitate this, we want to create a custom logging framework with the features outlined below.

**How**

- Conceptualize and implement a simple, generic Python-based logging package (a minimal sketch follows below).
- Identify a processing step, e.g. the process that loads RAW data into the SILVER layer.
- Identify the information to log, e.g. success records, failure records, start time, end time, total time taken, status, etc.
- Upload the logging package to the Spark pool and use it in the notebook to log this information.
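A minimal sketch of what such a logging helper could look like. The module name `app_monitoring`, the `RunLog` class, and the plain-JSON sink are hypothetical placeholders, not part of this repo; a real implementation would likely write to a Delta table or a monitoring backend instead:

```python
# app_monitoring.py -- hypothetical sketch of a notebook logging helper
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class RunLog:
    """Captures high-level metrics for one processing step (e.g. a RAW -> SILVER load)."""
    step_name: str
    start_time: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    end_time: str = ""
    status: str = "RUNNING"
    success_records: int = 0
    failed_records: int = 0
    custom_metrics: dict = field(default_factory=dict)

    def complete(self, status: str, success_records: int, failed_records: int, **metrics):
        """Mark the step as finished and record the counters and any custom metrics."""
        self.end_time = datetime.now(timezone.utc).isoformat()
        self.status = status
        self.success_records = success_records
        self.failed_records = failed_records
        self.custom_metrics.update(metrics)

    def emit(self):
        """Emit the log record; here it just prints JSON as a placeholder sink."""
        print(json.dumps(asdict(self)))


# Example notebook usage
if __name__ == "__main__":
    log = RunLog(step_name="raw_to_silver_customers")
    # ... run the actual transformation here ...
    log.complete(status="SUCCEEDED", success_records=10_000, failed_records=12, duration_seconds=42.5)
    log.emit()
```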

maniSbindra commented 4 months ago

Thanks @promisinganuj. I have started work on:

  1. Considerations on the need to secure OTel Collector ingestion endpoints when they are used for Fabric notebook telemetry
  2. A notebook sample and collector configuration for basic, bearer, and OIDC authentication with the OTel Collector (a rough sketch of the bearer-token case follows below)
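As a rough, hedged sketch (not the PR itself): this is roughly how a notebook could export spans to an OTel Collector protected by bearer-token authentication, using the standard `opentelemetry-python` packages. The endpoint URL, token source, and service name below are placeholders, not values from this repo:

```python
# Hypothetical sketch: exporting notebook spans to a secured OTel Collector over OTLP/HTTP.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://otel-collector.example.com/v1/traces",        # placeholder collector endpoint
    headers={"Authorization": "Bearer <token-from-key-vault>"},      # bearer auth; basic/OIDC would differ
)

provider = TracerProvider(resource=Resource.create({"service.name": "fabric-notebook"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("raw_to_silver_load"):
    # ... notebook processing logic ...
    pass
```

The matching collector-side configuration (requiring the bearer token, or validating OIDC tokens, on the OTLP receiver) is what the upcoming PR is expected to cover.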

Will submit the PR for review soon. cc: @sreedhar-guda