The Basics – Azure Stream Analytics : Use GetArrayElements to Flatten Json

In this blog I’m detailing how to flatten complex json in Azure Stream Analytics. It’s not massively hard, but it took a bit of fiddling since the online docs stop a little short. This is mostly a reference for myself for when I need to do it again later and will probably have forgotten.

There is this blog which gives some insight but stops a little short on dealing with arrays. Basically I’m just going to cover how GetArrayElements works in a bit more detail.

Consider the following json event payload we might receive through a stream… No guesses for how I waste most of my time… and… it’s close but that’s not my real gamer tag 🙂

{  
   "id":"123e4567-e89b-12d3-a456-426655440000",
   "device":"34543FERRE55",
   "player":{  
      "firstname":"Shaun",
      "surname":"Ryan",
      "gamertag":"worlord"
   },
   "gamestats":[  
      {  
         "statid":1,
         "gametime":"2017-02-17T11:59:12.000Z",
         "event":"Kill",
         "item":"BAR M1918",
         "gamertag":"millipead"
      },
      {  
         "statid":2,
         "gametime":"2017-02-17T11:59:15.000Z",
         "event":"Suppression",
         "item":"BAR M1918",
         "gamertag":"duckwarrior"
      },
      {  
         "statid":3,
         "gametime":"2017-02-17T11:59:09.000Z",
         "event":"Kill Assist",
         "item":"Rifle Grenade",
         "gamertag":"captaincardboard"
      },
      {  
         "statid":4,
         "gametime":"2017-02-17T11:59:34.000Z",
         "event":"Heal",
         "item":"medipack",
         "gamertag":"rhubarb"
      }
   ],
   "EventProcessedUtcTime":"2017-02-17T12:00:00.384Z",
   "PartitionId":0,
   "EventEnqueuedUtcTime":"2017-02-17T12:00:00.146Z"
}

Let us imagine our game sends a payload every minute that contains all the game stats that occurred in that minute. This is completely fabricated. I’m sure that’s not how such a thing would be done, but it’s a fun example and provides an array.

So our json above is semi-structured; we have:

  • Attributes in the root
  • Child entities with attributes
  • A child entity with an array

Our end game is to flatten this into a denormalized data set, to insert into a SQL table for example, or to aggregate our stats, which is a more likely use case for Stream Analytics.

We want a data set that looks like this:

[Screenshot: the flattened result set]

If we were aggregating we might thin out a few columns not needed in the grouping, but for an example it’s fine.

Solution

So here’s the solution straight off the bat. There’s some discussion that follows if you’re interested.

SELECT
 [Stat].[id] AS [ID],
 [Stat].[device] AS [Device],
 
 [Stat].[player].[firstname] AS [Firstname],
 [Stat].[player].[surname] AS [Surname],
 [Stat].[player].[gamertag] AS [Playergamertag],
 
 [Stat].[GameStat].[statid] AS [Statid],
 [Stat].[GameStat].[gametime] AS [Gametime],
 [Stat].[GameStat].[event] AS [Event],
 [Stat].[GameStat].[item] AS [Item],
 [Stat].[GameStat].[gamertag] AS [Gamertag],
 
 [Stat].[EventProcessedUtcTime],
 [Stat].[PartitionId],
 [Stat].[EventEnqueuedUtcTime]
FROM
(
 SELECT
 [EventAlias].*,
 [GameStatsAlias].[ArrayValue] AS [GameStat]
 FROM [InEventHub] AS [EventAlias]
 CROSS APPLY GetArrayElements([EventAlias].gamestats) AS [GameStatsAlias]
) AS Stat
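
And since aggregation is the more likely use case mentioned above, here’s a rough sketch of how the same CROSS APPLY sub query could feed a windowed aggregate. The one minute tumbling window and the choice of grouping columns are just my assumptions for illustration:

SELECT
 [Stat].[GameStat].[gamertag] AS [Gamertag],
 [Stat].[GameStat].[event] AS [Event],
 COUNT(*) AS [Events]
FROM
(
 SELECT
 [EventAlias].*,
 [GameStatsAlias].[ArrayValue] AS [GameStat]
 FROM [InEventHub] AS [EventAlias]
 CROSS APPLY GetArrayElements([EventAlias].gamestats) AS [GameStatsAlias]
) AS Stat
GROUP BY
 [Stat].[GameStat].[gamertag],
 [Stat].[GameStat].[event],
 TumblingWindow(minute, 1)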

 

Finding the Solution

The best way I found to fiddle around with this stuff is to write out to a json file on blob storage, since you get to see the json of how close you’re getting and go from there. There is a Stream Analytics add-in for Visual Studio now that offers local debugging, but I’ve had some issues with it, namely the add-in breaking my Data Lake Tools add-in.

GetArrayElements

The GetArrayElements function pivots out the array items, but it has to be applied using a CROSS APPLY so that it runs across each item in the array. Any WHERE clause should follow the CROSS APPLY (there’s a small example after the list below). I’ve used a sub query since it allows me to:

  1. Have intuitively readable code
  2. Break out the columns from the ArrayValue into their own aliases above; it might be possible to do that in one hit but I like the subquery (see 1)
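
For example, a filter on something inside the array would sit after the CROSS APPLY like this (filtering on the event field is just an illustration, not part of the original requirement):

SELECT
 [EventAlias].*,
 [GameStatsAlias].[ArrayValue] AS [GameStat]
FROM [InEventHub] AS [EventAlias]
CROSS APPLY GetArrayElements([EventAlias].gamestats) AS [GameStatsAlias]
WHERE [GameStatsAlias].[ArrayValue].[event] = 'Kill'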

If you take the sub query alone it creates the following json, so it’s not hard to see how to get to the final result by looking at the intermediate result.

Query:

SELECT * FROM
( 
 SELECT
 [EventAlias].*,
 [GameStatsAlias].[ArrayValue] AS [GameStat]
 FROM [InEventHub] AS [EventAlias]
 CROSS APPLY GetArrayElements([EventAlias].gamestats) AS [GameStatsAlias]
) AS Stat

Creates:

[  
   {  
      "stat":{  
         "id":"123e4567-e89b-12d3-a456-426655440000",
         "device":"34543FERRE55",
         "player":{  
            "firstname":"Shaun",
            "surname":"Ryan",
            "gamertag":"worlord"
         },
         "GameStat":{  
            "statid":1,
            "gametime":"2017-02-17T11:59:12.000Z",
            "event":"Kill",
            "item":"BAR M1918",
            "gamertag":"millipead"
         },
         "EventProcessedUtcTime":"2017-02-17T12:00:00.384Z",
         "PartitionId":0,
         "EventEnqueuedUtcTime":"2017-02-17T12:00:00.146Z"
      }
   },
   {  
      "stat":{  
         "id":"123e4567-e89b-12d3-a456-426655440000",
         "device":"34543FERRE55",
         "player":{  
            "firstname":"Shaun",
            "surname":"Ryan",
            "gamertag":"worlord"
         },
         "GameStat":{  
            "statid":2,
            "gametime":"2017-02-17T11:59:15.000Z",
            "event":"Suppression",
            "item":"BAR M1918",
            "gamertag":"duckwarrior"
         },
         "EventProcessedUtcTime":"2017-02-17T12:00:00.384Z",
         "PartitionId":0,
         "EventEnqueuedUtcTime":"2017-02-17T12:00:00.146Z"
      }
   },
   {  
      "stat":{  
         "id":"123e4567-e89b-12d3-a456-426655440000",
         "device":"34543FERRE55",
         "player":{  
            "firstname":"Shaun",
            "surname":"Ryan",
            "gamertag":"worlord"
         },
         "GameStat":{  
            "statid":3,
            "gametime":"2017-02-17T11:59:09.000Z",
            "event":"Kill Assist",
            "item":"Rifle Grenade",
            "gamertag":"captaincardboard"
         },
         "EventProcessedUtcTime":"2017-02-17T12:00:00.384Z",
         "PartitionId":0,
         "EventEnqueuedUtcTime":"2017-02-17T12:00:00.146Z"
      }
   },
   {  
      "stat":{  
         "id":"123e4567-e89b-12d3-a456-426655440000",
         "device":"34543FERRE55",
         "player":{  
            "firstname":"Shaun",
            "surname":"Ryan",
            "gamertag":"worlord"
         },
         "GameStat":{  
            "statid":4,
            "gametime":"2017-02-17T11:59:34.000Z",
            "event":"Heal",
            "item":"medipack",
            "gamertag":"rhubarb"
         },
         "EventProcessedUtcTime":"2017-02-17T12:00:00.384Z",
         "PartitionId":0,
         "EventEnqueuedUtcTime":"2017-02-17T12:00:00.146Z"
      }
   }
]

This has blown out the array across the higher-level json data; from here we can just pick out what we need in our rows using the outer query. When you dump this out to a file it won’t create an array as a single json doc; it basically writes a json document per file line, so 4 json documents in this case separated by a line feed.

Incidentally, if you are using the Stream Analytics Visual Studio add-in with local debug and a sample json file, you have to encase your json document events in an array or it errors.
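
So a sample file for local debugging would need to look something like this, i.e. the events wrapped in an array (I’ve trimmed the events right down here just to show the shape):

[
   { "id":"123e4567-e89b-12d3-a456-426655440000", "device":"34543FERRE55" },
   { "id":"123e4567-e89b-12d3-a456-426655440000", "device":"34543FERRE55" }
]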

Hope this makes sense / helps. No doubt I’ll have forgotten how to do this in a couple of days.

Finer Details – Stream Analytics & PowerBI: Feb 1st Update

So folks might have noticed the Feb 1st update to Stream Analytics. There’s a fair bit of stuff in this release. What I am going to focus on though is how Stream Analytics integrates with Power BI now.

Power BI gained streaming datasets some time after the Azure Stream Analytics (ASA) integration. Folks who have worked with it might be aware that when you hook up ASA to Power BI it just creates a dataset and doesn’t create a streaming dataset. When I first did this it jumped out as something that probably needed to be consolidated… and now it has.

[screenshot]

So what does the implementation experience look like now… Well, from the Stream Analytics side nothing much appears to have changed. I have a test query that’s not doing much; it just fires through some data. I’m actually firing json in through the event hub but will focus on the ASA to Power BI bit. The query is as follows:

WITH customers AS (
 SELECT
 [customer].customerCode,
 [customer].firstname,
 [customer].lastname,
 [customer].addressline1,
 [customer].addressline2,
 [customer].addressline3,
 [customer].postcode,
 [customer].town,
 [customer].county,
 [customer].country,
 [customer].customerEmail,
 [transaction].transactionCode,
 [transaction].status,
 [transaction].[transactionTime],
 [transaction].[transactionAmount]
 FROM
 [evehub]
)

SELECT * INTO [TestReport] FROM customers

When creating the output everything seems as it was before. My assumption was that something would’ve changed here and that I would have to create my streaming dataset end point definition first in Power BI and choose my data set here. But that’s not the case…

[screenshot]

The subtle impact here is that it’s not like an API where I can just fire json at the secure end point. I still have to authorize with an account! This means you need a Power BI account for your Stream Analytics service, and in a production environment you probably won’t want that to be a named user account. Also, if the workspace needs to be shared then it will need to be a Pro account.

So far we’ve hooked up ASA to Power BI but we haven’t done anything in Power BI. Well, it turns out we don’t have to manually define the end point like we do with API integration. When data starts streaming through, the data set appears automatically in our Streaming Datasets within Power BI. So when I turn on my data stream using a C# app I get the following in Power BI.

[screenshot]

When I click the pencil to edit the data set I can see that it’s created all the attributes, typed them and defaulted history to on.

[screenshot]

Now it seems we have some control over how these data sets are shaped in Power BI and we can flip the history on or off. We also have an end point that we can use with an API, cURL or PowerShell…

[screenshot]
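
Just as a sketch of what that might look like with cURL: the push URL and key below are placeholders (take the real URL from the dataset’s API info dialog), the row fields are the ones from my test query, and the values are made up. I haven’t wired this up end to end, so treat it as illustrative rather than gospel.

curl -X POST "https://api.powerbi.com/beta/<workspace>/datasets/<dataset id>/rows?key=<key>" \
 -H "Content-Type: application/json" \
 -d '[{"customerCode":"CUST001","transactionCode":"TX001","transactionAmount":9.99}]'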

This means we might be able to stream data into the data set from another API source, not just Stream Analytics. Also it might seem I can do the following:

  • add an attribute that’s not in my ASA stream
  • remove an attribute that is in my ASA stream

Let’s see…

Add an Attribute

I created a streaming tile just totting up transactions in real time.

[screenshot]

When I add a new column the streaming tile still displays without issue. However, when I go to create a new report, or edit the existing report I pinned to the dashboard from my dataset, I start hitting issues. To see the new column in the dataset on the report editor I have to hit refresh. After doing this my data set and report are broken.

[screenshot]

[screenshot]

Editing the data set again, removing the column and hitting refresh doesn’t fix the issue. It seems I have to trash and recreate the data set.

How about ASA? Well, ASA also had a problem. It didn’t stop the job but failed to send the transactions, meaning the seemingly unaffected tile won’t update with any new data. Note the lack of column name in both the Power BI and ASA exceptions.

[screenshot]

 

Remove an Attribute

What happens when we remove an attribute?

Well… the same thing as we might expect from trying to add a column.

Conclusion

This is a good change and consolidates the features of streaming data sets in Power BI for APIs and Stream Analytics. We gain some further control over datasets, but it seems ASA is very tightly bound to the data set definition and in reality we can’t do much without breaking it.

It might mean that we can stream data in from another source, not just ASA, since we now have an endpoint we can use… I haven’t tried this yet.

All in all I’m still left a bit wanting by the mashing features for real-time data in Power BI. I’m sure it will get there… Microsoft are doing a great job… We are a tough bunch to please these days 🙂

 

 

The Basics – Azure SSAS: Authentication

Azure SSAS has hit preview, and if you’re familiar with SSAS you’ll know it only works with Active Directory integrated authentication. If you’re not an Active Directory spod this might throw you a little, it being an Azure PaaS service. Effectively it works in a similar way and requires you to create the service declaring an Azure Active Directory administration account. If you’ve created your own subscription, what you use to log in is not an Azure AD account, and choosing that account won’t be an option.

So… a little on Azure AD. Azure AD wasn’t really intended to be a standalone Active Directory, well not yet anyway. Essentially, for cloud to work there had to be an authentication method that works well over the internet, and Microsoft went for OAuth 2, which is basically what Azure AD uses to create secure tokens. Azure AD is effectively a cloud extension to the traditional AD we are familiar with, at least on some level, and is the organizational account we use every day to log on to our work machines.

The good news is you get basic Azure Active Directory for free. See the screenshot below.

[screenshot]

In the screenshot there are 2 accounts:

  • Microsoft Account
  • Azure Active Directory Account

The Microsoft Account will be there already. That’s what was possibly used to create and use the subscription up till now. This is the personal account you sign in with when you go to sign into Office 365 or the Azure portal. The personal account username will be whatever email address you used… e.g.

joe.blog@hotmail.com

When you look at the profile of the account, in the next blade on the above image, you’ll see the source of this account is declared as Microsoft Account.

The other account is an Azure Active Directory account. This can be created in the blade above using “Add User”. It will have the following username convention:

username@joebloghotmail.onmicrosoft.com

If you want to assign your own domain to get rid of the one given to you by Azure you can, but that isn’t free. I’ll leave that beyond the scope of this blog. There are brilliant courses on Pluralsight for Azure AD… I thoroughly recommend Pluralsight… it’s awesome.

When we look at the profile of this account its source is declared as Azure Active Directory. So to create an SSAS service we need an Azure Active Directory account to be an administrator, so go ahead and create one. This account is what we know and sign in with as an organizational account, and you can indeed give it further privileges and use it to sign into the portal to provision and use other services.

[screenshot]

Now we have an Azure AD account we’re ready to go…

When creating the SSAS service you’ll be asked for this type of account for an administrator. It won’t allow you to select a Microsoft Account (personal account). Below we see the Microsoft Account is grayed out; I can only select an Azure AD account (organizational account).

[screenshot]

Once the service is provisioned we can deploy to the service using SQL Server Data Tools (SSDT) and connect to it using SQL Server Management Studio (SSMS). The docs go into more detail.

When connecting using SSMS, normally with SSAS we just go straight in using our on-premises AD account with integrated authentication. Because we’re signing in with our Azure AD account we need to provide the account details that we have created in Azure. In order to connect we need a new option, Active Directory Password Authentication, which requires a registry entry to enable it (Connect using SSMS). If you’re building up advanced security models in SSAS then we can obviously create a number of users and use different sign-ins to test the results.
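
For reference, the connect dialog in SSMS ends up with something along these lines (the region and server name below are made-up placeholders; the username follows the Azure AD convention covered earlier):

Server name:    asazure://westeurope.asazure.windows.net/myssasserver
Authentication: Active Directory Password Authentication
User name:      username@joebloghotmail.onmicrosoft.com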

[screenshot]

Unless of course your on-premises AD and Azure AD are hooked up via federation, in which case you can go straight in with Active Directory Integrated Authentication without having to sign in.

In a working environment we obviously wouldn’t want to assign individual users. It would be better to create groups, e.g. SSAS Administrators, assign the groups, and manage user access by assigning users to groups in Azure Active Directory.

I hope this helps. Not all BI developers have a handle on Active Directory, especially the ins and outs with Azure. I only know what I know from app programming many moons ago and some hours spent on Pluralsight learning enough to get by and design good security models.