Custom Node-RED node to ingest data into Kusto (Azure Data Explorer)
Azure Data Explorer is a high-performance time-series database, query engine, and dashboarding tool.
The Node-RED Kusto (Azure Data Explorer) integration allows you to send messages directly as events to Kusto, also known as Azure Data Explorer, for analytics and dashboarding.
For Node-RED to authenticate with Azure Data Explorer, it needs a service principal. Note its application (client) ID and tenant ID; they are used below when granting database rights.
There are two ways of creating an Azure Data Explorer cluster: pay-as-you-go (PAYG) or free. To create a paid cluster, follow the instructions in the Microsoft quickstart. However, Microsoft has released a free offer, and this guide describes how to set up a free Azure Data Explorer cluster and database.
There are a few differences between the PAYG and Free versions:

| Feature | PAYG Cluster | Free Cluster |
|---|---|---|
| Ingestion | Streaming and queueing | Queueing only (for now) |
| Cluster size | Fully scalable | 4 vCPU, 8 GB memory, ~100 GB data |
Within about 30 seconds, you will have an Azure Data Explorer cluster ready.
After creation, copy the Data ingestion URI from the top of the page.
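The ingestion URI typically has the form `https://ingest-<ClusterName>.<Region>.kusto.windows.net`.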
Run the following script against the new database to grant the service principal rights, set the ingestion batching policy, and create the table and JSON mapping:

```kusto
.execute database script <|
// Add service principal ingestor rights
.add database ['<DatabaseName>'] ingestors ('aadapp=b5253d02-c8f4-4a79-a0f0-81xxxxxxx2a1f;72f988bf-xxxxx-41af-xxxxx-2d7cd011db47')
// Add service principal viewer rights
.add database ['<DatabaseName>'] viewers ('aadapp=b5253d02-c8f4-4a79-a0f0-81xxxxxxx2a1f;72f988bf-xxxxx-41af-xxxxx-2d7cd011db47')
// Alter the database ingestion batching policy
.alter database ['<DatabaseName>'] policy ingestionbatching @'{"MaximumBatchingTimeSpan":"00:00:05", "MaximumNumberOfItems": 500, "MaximumRawDataSizeMB": 1024}'
// Drop the table if it exists
.drop table ['<TableName>'] ifexists
// Create the table
.create table ['<TableName>'] (timestamp: datetime, topic: dynamic, payload: dynamic)
// Create the JSON mapping
.create table ['<TableName>'] ingestion json mapping 'nodered_json_mapping' '[{"column":"timestamp","path":"$.timestamp"},{"column":"topic","path":"$.topic"},{"column":"payload","path":"$.payload"}]'
```
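For reference, a JSON document matching this mapping might look like the following (hypothetical topic and values, assuming the node sends the message's timestamp, topic, and payload as JSON):

```json
{
  "timestamp": "2024-01-01T12:00:00Z",
  "topic": "home/livingroom/temperature",
  "payload": { "value": 21.5 }
}
```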
If only some of the payload needs to be stored, you can extract a part of the JSON payload and put it in a separate column. A mapping entry like this extracts only the needed part as it reaches Azure Data Explorer:

```json
{"column":"part_of_payload","path":"$.payload.key_in_payload"}
```
Example (KNX messages arriving via MQTT, stored in ADX via Node-RED):

```kusto
// Create the table
.create table ['knx'] (timestamp: datetime, topic: dynamic, GA: dynamic, val: decimal, knx_textual: dynamic, knx_src_addr: dynamic, knx_dpt: dynamic, payload: dynamic)
// Create the JSON mapping
.create table ['knx'] ingestion json mapping 'nodered_json_mapping' '[{"column":"timestamp","path":"$.timestamp"},{"column":"topic","path":"$.topic"},{"column":"GA","path":"$.payload.GA"},{"column":"val","path":"$.payload.val"},{"column":"knx_textual","path":"$.payload.knx_textual"},{"column":"knx_src_addr","path":"$.payload.knx_src_addr"},{"column":"knx_dpt","path":"$.payload.knx_dpt"},{"column":"payload","path":"$.payload"}]'
```
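Once KNX messages are flowing in, a query such as the following (a sketch; adjust names and time ranges to your setup) could chart average values per group address:

```kusto
knx
| where timestamp > ago(1h)
| summarize avg_val = avg(val) by GA = tostring(GA), bin(timestamp, 5m)
```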
Insert the values for the cluster name and table name from the previous steps, and the values from the Azure service principal created earlier. Also set a value for the buffer time (how long Node-RED buffers messages before sending them to Kusto).
If using a free cluster, uncheck the Use Streaming API option in the form, as streaming is not supported there.
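Conversely, on a paid cluster where you want to use the streaming API, streaming ingestion must also be enabled on the table (or database), for example:

```kusto
.alter table ['<TableName>'] policy streamingingestion enable
```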
After completing the flow, Node-RED sends data to Kusto (Azure Data Explorer).
By default, Node-RED buffers for 5 seconds before sending, and the batching policy in Azure Data Explorer batches further, so expect a short delay before data appears in Kusto.
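Once that delay has passed, you can confirm rows are arriving with a simple query (a sketch; substitute your own table name):

```kusto
['<TableName>']
| where timestamp > ago(1h)
| take 100
```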
With the setup complete, data is being sent to Azure Data Explorer, and you can start exploring it. Here are some resources for learning to use Azure Data Explorer: