Finer Points – Azure Stream Analytics & PowerBI: Duplicates

Power BI is evolving rapidly and market competition is really hotting up, especially with Tableau. Arguably Power BI’s real-time capability puts it slightly ahead of the game in one respect. I managed to get some detailed hands-on experience to see what the capability is really like. There are 2 ways to stream data into Power BI:

  1. APIs
  2. Azure Stream Analytics

You can get up and running with stream analytics pretty quickly; however, sometimes it’s good to break stuff just to see what you’re getting yourself into.

I’m not going to go into the details of how the architecture is set up since there are loads of blogs on that already. Basically, I:

  1. Set up a stream analytics job pointing at a storage blob container to read CSVs
  2. Added a query to pull apart the date and add in some system columns
  3. Fed the data into my Power BI subscription

[Image: architecture overview]

You can stream data from IoT Hub, Event Hub or storage events. I’ve used storage events here just because they’re easy to set up. Using storage for real-time is a way to batch events up into an event payload which you can stream through and then archive into a historic database or data lake.

The query is as follows:

SELECT
    BlobName,
    EventProcessedUtcTime,
    BlobLastModifiedUtcTime,
    PartitionId,
    CAST(DATEPART(year, Date) AS nvarchar(max)) AS Year,
    CAST(DATEPART(month, Date) AS nvarchar(max)) AS Month,
    CAST(DATEPART(day, Date) AS nvarchar(max)) AS Day,
    Date,
    Region,
    CAST(Sales AS bigint) AS Sales
INTO
    [TestReport]
FROM
    [Test]

In this scenario I’m just going to use the storage account write events to push a few simple rows of data into Power BI. I will load 2 CSV files:

  1. Sales for UK & Wales for 1 Jan 2017 09:00:01 – 09:30:00
  2. Sales for UK & Wales for 1 Jan 2017 09:30:01 – 10:00:00

Here is an example showing the start and end of the 2nd file. Note that I’ve deliberately included data that will cause an error in stream analytics – “wrong” cannot be cast as a bigint:

Date,Region,Sales
2017-01-01 09:30:01,UK,5
2017-01-01 09:31:00,UK,1
2017-01-01 09:32:00,UK,2
....
2017-01-01 09:58:00,WALES,1
2017-01-01 09:59:00,WALES,4
2017-01-01 10:00:00,WALES,wrong

Step 1 – I copy file 1 up into the storage account; within a few seconds my test data set appears and I can see data from the 1st file. Note that I’ve included BlobLastModifiedUtcTime.

[Image: data from file 1 in the Power BI data set]

Step 2 – I copy file 2 up into the storage account. At this point the job stops and is in a failed state, as expected. Looking in the logs we can clearly see the error:

[Image: exception message from the job log]

Interestingly, however, when we look at Power BI, some data (1 row) made it through before the failure ended the stream:

[Image: the data set after the failed file – 1 row from file 2 made it through]

Step 3 – so I’ve picked up my failure and I correct the file, changing the value of “wrong” to 50. Note I can only do this because I have a file to change.

Date,Region,Sales
2017-01-01 09:30:01,UK,5
2017-01-01 09:31:00,UK,1
2017-01-01 09:32:00,UK,2
....
2017-01-01 09:58:00,WALES,1
2017-01-01 09:59:00,WALES,4
2017-01-01 10:00:00,WALES,50

I copy file 2 back up into my storage account, overwriting the previous copy that caused the failure, and then restart the job from when it failed. Note I haven’t used TIMESTAMP BY to set the event time from a source column.
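For reference, a minimal sketch of what that would have looked like, assuming the Date column in the CSV should drive event time (not something I actually did in this test):

-- Sketch only: TIMESTAMP BY makes the Date column drive event time,
-- rather than the blob's arrival time.
SELECT
    Date,
    Region,
    CAST(Sales AS bigint) AS Sales
INTO
    [TestReport]
FROM
    [Test] TIMESTAMP BY Date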

[Image: job restart options – when last finished]

The job starts successfully and picks up the corrected file. However, on checking Power BI, I now have a situation where duplicates have made it into my Power BI data set. Looking at the BlobLastModifiedUtcTime I can see that they originate from different blob events: one that caused the failure and one that did not. The numbers are different since there are 2 rows for the UK in the data set with the same time, which Power BI has aggregated. This shows that only 1 row made it through on the failure:

[Image: duplicates in the data set, shown by BlobLastModifiedUtcTime]

Duplicates

On perusing the documentation I came across the following:

“….When a Stream Analytics job is running, duplicate records may occasionally be noticed in the output data. These duplicate records are expected because Azure Stream Analytics output adapters don’t write the output events transactionally. This ‘duplicate record’ scenario can result if one of the following conditions occurs:

  • The Azure instance is upgraded while the job is running
  • The Stream Analytics job is upgraded or an issue occurs with connectivity or reliability to the job output
  • The Azure instance running the job has an outage

The downstream consumer of the output events needs to dedupe the events using the logical identity of the events. For example, if you are aggregating events by groups in a tumbling window, the logical identity of the event is the groups and the tumbling window’s end time. If you are running a pass-through query, you may need to carry a unique id on the event in order to dedupe…”

Fixing the Dupes

A simple observation is to make sure the queries and source data don’t invoke errors, although the docs imply that Azure instance interruptions can cause the same issue… and outright state that there is no transactional protection on outputs.
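For example, a minimal defensive tweak to the query above (assuming NULL sales values are acceptable downstream) would be to swap CAST for TRY_CAST, so a bad value like “wrong” becomes NULL instead of failing the job:

-- Sketch: TRY_CAST returns NULL when the conversion fails,
-- so a value like "wrong" no longer stops the stream.
SELECT
    Date,
    Region,
    TRY_CAST(Sales AS bigint) AS Sales
INTO
    [TestReport]
FROM
    [Test]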

In our case we are partitioning data into files before dumping the data into an event window for potential aggregation, e.g. every 30 minutes. Because it’s a file on Azure storage I can get in to correct the data and restart the stream.

Windowing

So does windowing help us? Well, yes and no, since it depends on the requirements. It fundamentally comes back to whether we are using the right tool for the job. With stream analytics we’d want to be doing some sort of windowed aggregate. If we partitioned our files at the same grain as the aggregate window, then no file would commit output events unless the operation succeeded for all the data in the file.
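As a sketch, assuming a 30-minute file grain and event time driven by the Date column, the windowed aggregate might look like this; note that the group keys plus the window end time also give you the “logical identity” the docs suggest using for dedupe:

-- Sketch: total sales per region over a 30-minute tumbling window.
-- Region plus the window end time (System.Timestamp) identify each output event.
SELECT
    Region,
    System.Timestamp AS WindowEnd,
    SUM(TRY_CAST(Sales AS bigint)) AS TotalSales
INTO
    [TestReport]
FROM
    [Test] TIMESTAMP BY Date
GROUP BY
    Region,
    TumblingWindow(minute, 30)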

Dedupe

This is pretty much out of the window since we’ve plugged the output straight into Power BI; there is no way to insert custom dedupe logic. If you know which files have failed you can filter them out, which is an ugly workaround.

Reload

All my data is in archived files, so I can delete the data set, correct the data and reload everything.

API

I can use the Power BI API to delete rows of data from my data set using the BlobLastModifiedUtcTime.

https://msdn.microsoft.com/en-us/library/dn877544.aspx

https://github.com/Microsoft/PowerBI-CSharp/tree/master/samples/webforms/get-started-web-app-asp.net

Do we care?

Considering the use case, do we really care? If we’re running an event stream we should only really care about the now, particularly if we’re using stream analytics to push data through prediction functions or statistical aggregations, since they are approximate anyway, being based on sample populations of data.

If, on the other hand, it’s for a BI function that just wants to see its data in a BI model as soon as possible, then we are going to care that the numbers are correct, especially in Finance. As well as real-time, we’d be archiving this data off to load a historical data model. This is where the real-time feature can get a bit misconstrued. Real-time BI is a different beast to real-time event processing. Currently, using this method, there is no way to blend your real-time feed into a richer BI model with related data sets and calculations. Even with event stream processing it would be good to visualize the event stream against the backdrop of historical aggregates.

In order to build real-time BI models we need to be able to partition data sets and point them at different sources – e.g. a real-time source and a retrospective source. It’s a tried and tested method with SSAS multidimensional. Partitioning is on the way for PaaS SSAS Tabular, and perhaps the API gives me more options for achieving real-time BI. Or perhaps a richer architecture, with stream analytics putting something in between its output and Power BI. The only other way to push data into data sets in real time currently is using APIs.

Right… onto the API.
