Traditionally in the world of reporting, an event is reported long after it actually occurs. What is the purpose of learning about it only at the end of the day? We want to read and analyze data as it flows into the system. To achieve this, we have Azure Stream Analytics, which can process streaming data and provide real-time reporting. In this blog, we will demonstrate a simple use case for Azure Stream Analytics.
Before we get into the implementation, a brief description of each of the components used in this blog:
Azure Stream Analytics
It’s a high-performance, pay-per-use event processing engine that can analyze high volumes of data arriving from sensors, APIs, applications, etc. It can take Azure IoT Hub, Azure Event Hubs, and Blob storage as inputs. It can process the data and output to Azure Service Bus, Azure Functions, real-time Power BI dashboards, and other data stores. The transformation query uses a SQL-like syntax to filter, aggregate, and join streaming data over specific time intervals.
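To make the time-interval idea concrete, here is a small Python sketch that simulates what a tumbling-window aggregate does: it groups timestamped readings into fixed, non-overlapping windows and averages them. The field layout, sample values, and 60-second window size are illustrative assumptions, not the actual Stream Analytics query.

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds=60):
    """Group (timestamp, value) events into fixed, non-overlapping windows
    and average the values per window -- the same idea as a tumbling-window
    aggregate in a streaming transformation query."""
    windows = defaultdict(list)
    for ts, value in events:
        # Each event falls into exactly one window, keyed by the window's start time.
        windows[ts - (ts % window_seconds)].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(windows.items())}

# Sample (timestamp-in-seconds, occupancy) readings spanning two windows.
readings = [(0, 10), (30, 20), (60, 40), (90, 60)]
print(tumbling_window_avg(readings))  # {0: 15.0, 60: 50.0}
```

In the real service this grouping is expressed declaratively in the query rather than in application code; the sketch only shows the windowing behavior.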
Azure Event Hubs
It’s a service that sits between event producers and event consumers to decouple the production of an event stream from the consumption of those events. It’s capable of ingesting millions of events per second and retains them for a configurable period. Data sent to Event Hubs can be transformed by a real-time stream processor like Azure Stream Analytics or written to data stores.
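The decoupling-with-retention idea can be sketched with a toy in-memory buffer: producers append events, consumers read at their own pace, and events expire after a retention period. This is only an assumed, simplified model of the behavior, not the Event Hubs implementation or SDK.

```python
import time
from collections import deque

class MiniEventBuffer:
    """Toy in-memory stand-in for an event hub: producers send events,
    consumers read them independently, and events expire after a
    retention period."""

    def __init__(self, retention_seconds=3600):
        self.retention = retention_seconds
        self._events = deque()  # (arrival_time, payload)

    def send(self, payload, now=None):
        """Producer side: append an event with its arrival time."""
        self._events.append((now if now is not None else time.time(), payload))

    def read(self, now=None):
        """Consumer side: drop expired events, return the rest in order."""
        now = now if now is not None else time.time()
        while self._events and now - self._events[0][0] > self.retention:
            self._events.popleft()
        return [payload for _, payload in self._events]

hub = MiniEventBuffer(retention_seconds=10)
hub.send({"car_park": "A", "occupancy": 42}, now=0)
hub.send({"car_park": "B", "occupancy": 17}, now=5)
print(hub.read(now=12))  # the first event has expired; only car park B remains
```

The real service adds partitioning, checkpointing, and durable storage on top of this basic produce/retain/consume contract.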
Azure Logic Apps
It’s a serverless application integration service that binds many Microsoft and third-party applications together. In this blog, we use a Logic App to read from an API endpoint on a recurrence schedule and send events to Azure Event Hubs.
What are we doing here?

We are using the Banes Car Park occupancy API endpoint as the event generator. We have set up a Logic App to GET from this API every 15 minutes and push the JSON response into Azure Event Hubs using the Send event action of Logic Apps.
Once the JSON is pushed into Azure Event Hubs, it forms a data stream that serves as the input for Azure Stream Analytics. Similarly, we configured a Power BI sink as the output, which is published as a real-time dataset in a Power BI workspace. The steps to configure these are available in Microsoft blogs. We have written transformation logic to derive the number of available parking slots at run time.
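The derivation itself is simple arithmetic; a minimal Python sketch of the same logic is shown below. The `capacity` and `occupancy` field names (and the sample record) are assumptions about the API payload, not the actual schema.

```python
def available_slots(record):
    """Derive the number of free spaces from one car park reading.
    The 'capacity' and 'occupancy' field names are assumed here;
    the actual API payload may differ."""
    # Clamp at zero in case occupancy momentarily exceeds reported capacity.
    return max(record["capacity"] - record["occupancy"], 0)

sample = {"car_park": "Charlotte Street", "capacity": 1056, "occupancy": 612}
print(available_slots(sample))  # 444
```

In our pipeline the equivalent subtraction is expressed in the Stream Analytics transformation query, so the derived value is computed on the stream before it ever reaches Power BI.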
With this, we can create a simple dashboard in Power BI to explore its real-time features. The dashboard tiles refresh automatically when the backend data changes. Below is a GIF showing how the real-time dashboard works.

Interestingly, we can also report over specific time intervals using Stream Analytics. Also, a lot of new visualization tiles are coming to Power BI. There is so much to try with these tools. We will write more on these with some interesting use cases. Stay tuned!
Learn more about Visual BI’s Microsoft Azure offerings here.