Hi,
I am using the same approach and need some information.
Can we consume the Event Hub messages directly in Azure Databricks and run the stream with foreachBatch to parse the body? Is my understanding correct?
I need a cost-effective solution: landing the data in a storage account first adds both overhead and storage charges, so I would like to avoid it. With the recent Databricks releases in mind, could you please advise whether reading and writing the stream directly in Azure Databricks with foreachBatch is a good approach?
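To make the question concrete, here is a minimal sketch of the pattern I mean, reading the Event Hub through its Kafka-compatible endpoint so no extra connector library is needed. The namespace, hub name, connection string, checkpoint path, and the sink inside `process_batch` are all placeholders, not my real setup:

```python
import json


def parse_body(raw: bytes) -> dict:
    """Decode one message body; assumes the payload is UTF-8 encoded JSON."""
    return json.loads(raw.decode("utf-8"))


def process_batch(batch_df, batch_id):
    """foreachBatch sink: parse each body in the micro-batch and hand the
    records on, with no intermediate copy in a storage account."""
    for row in batch_df.collect():  # collect() is only fine for small batches
        record = parse_body(row.value)  # Kafka endpoint exposes the body as `value`
        print(batch_id, record)  # replace with the real sink, e.g. a Delta MERGE


def start_stream(spark, namespace, hub_name, connection_string):
    """Read the Event Hub via its Kafka endpoint and process with foreachBatch.
    `namespace`, `hub_name` and `connection_string` are placeholders."""
    df = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers",
                f"{namespace}.servicebus.windows.net:9093")
        .option("subscribe", hub_name)
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option(
            "kafka.sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="$ConnectionString" password="{connection_string}";',
        )
        .load()
    )
    return (
        df.writeStream
        .foreachBatch(process_batch)
        .option("checkpointLocation", "/tmp/checkpoints/eventhub-parse")
        .start()
    )
```

Is this roughly the shape you would recommend, or is there a newer Databricks feature that handles this better?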
Thanks,
Anuj Gupta