Auto DW – Metaprogramming

This is a high-level consideration of using metaprogramming to build Automated Data Provisioning frameworks.

Once you’ve mastered writing code and applying solution patterns to solve real-world problems, a natural next step is to write code that writes code! A lot of implementation technologies dip into this without you even being aware of it, or you may just drift into it as a natural, innovative way to solve a certain problem. The topic is part of an advanced knowledge domain called Metaprogramming; the wiki post discusses its pros and challenges. Kathleen Dollard has a great course on Pluralsight called Understanding Metaprogramming.

My own experience, and perhaps the most common one, is that you’ll start metaprogramming before you’ve given the topic its full attention. I don’t remember getting out of bed and thinking… “today I will do some metaprogramming”. What happened is that chasing the benefits, as a result of experiencing pain, provided the motivation. The next thing to say about code writing code is that it can go fantastically well or horrifically badly. Without giving the knowledge domain the respect it deserves, chances are it will be the latter.

Another fundamental software engineering trap I learnt the hard way is this: don’t program generic solutions to very specific problems; you’ll pay for it in complexity and performance. The temptation can be quite strong because we’re taught to abstract and conquer, particularly if the problem looks the same – but is it? This is particularly relevant for data provisioning platforms because (not exhaustive):

  1. Performance is high on the agenda. We attempt to routinely and frequently move and change tons of data; performance is crucial for success and it’s directly related to costs
  2. The repetition is obvious and can appear to constitute a large proportion of man hours; to an economist it seems to be the same solution over and over again e.g. stage data, build snapshot fact, build type 1 dimension, build type 2 dimension, etc…
  3. The content (the data) is fluid and dynamic over a long period of time. Beyond its schema definition it has only a low-level and often incomplete persistence of semantics; those semantics can be elusive because they are a product of real-world economic and human behaviour
  4. The expectations and requirements are also fluid and dynamic; they seek to recover semantic meaning or information from data using reports, interactive data visualisation tools, semantic layers or other system-to-system interfaces.

So bringing this all into context:

  • Design patterns are common but the solutions are never the same. A type 2 dimension is a design pattern, not a solution. This isn’t helped by teams running bad agile delivery: a type 2 dimension is not a backlog story, and neither is a fact table.
  • The solution is to provide specific information to meet a business requirement. Not only is it different on every single project, it’s different in the same project over time. The business has to respond to its market, which in turn motivates expectation and influences human behaviour, which in turn churns the raw solution content: the data. A static solution is not a solution.
  • The solution content is the data, which is also different on every single implementation and within an implementation over time. It has features, and they can all change, either explicitly or implicitly.
  • Performance in data platforms relies on issuing the correct amount of physical computing resource at exactly the right time. What this means is that a physical implementation needs to know about the features of the data very explicitly in order to allocate the correct amount of resource. Get it wrong in an on-premises architecture and a job hogs limited resources, causing other processes to suffer. Get it wrong on a cloud MPP architecture and you’ll pay too much money. This is not going away. Why? Because information has entropy and you can’t cheat the laws of physics.

In Conclusion

Building a generic solution to solve the problem of repetition in Data Platform delivery isn’t the answer. The data is specific, the requirements are specific, and if you take this approach the solution is abstract, leading to overly complicated and poorly performing technical architectures. At their very worst they try to shoehorn the specifics into an architecture that hinders the goal and completely misses the requirements. I’d stick my neck out based on my own experience and state that two solutions are never the same, even in the same industry using the same operational systems.

Be very wary of magical, all-singing, all-dancing products that claim to be a generic solution to data provisioning. AI is a long way off being able to derive specific semantics about the real world from data. It’s just not possible right now… a lot of AI is approximation based on population statistics; the features of data and information are very specific.

Metaprogramming solves the problem of repetition but delivers specific solution artefacts that don’t sacrifice what data platform implementations need in order to succeed, which is to:

  • Perform within their budget
  • Meet the business requirements

We aim to solve the repetition problem (and a whole host of secondary problems) during the development process, and we recognise the following (there’s a sketch of the first item after this list):

  • Specific metadata about the technical features of the raw data
  • Specific metadata about the technical features of the deliverables
  • Generic implementation patterns
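
As a minimal sketch of the first item, assuming SQL Server: the technical features of the raw data can be harvested from the catalog views into a metadata store the framework works from. The meta schema, the meta.SourceColumn table and the src source schema are hypothetical names used for illustration, not features of any product.

    -- A minimal sketch: persist the technical features of the raw data as metadata.
    -- The "meta" schema and table are hypothetical and assumed to exist already.
    CREATE TABLE meta.SourceColumn
    (
        TableSchema     sysname       NOT NULL,
        TableName       sysname       NOT NULL,
        ColumnName      sysname       NOT NULL,
        DataType        nvarchar(128) NOT NULL,
        MaxLength       int           NULL,
        IsNullable      bit           NOT NULL,
        OrdinalPosition int           NOT NULL
    );

    -- Harvest column-level features from the catalog views.
    INSERT INTO meta.SourceColumn
        (TableSchema, TableName, ColumnName, DataType, MaxLength, IsNullable, OrdinalPosition)
    SELECT
        c.TABLE_SCHEMA,
        c.TABLE_NAME,
        c.COLUMN_NAME,
        c.DATA_TYPE,
        c.CHARACTER_MAXIMUM_LENGTH,
        CASE c.IS_NULLABLE WHEN 'YES' THEN 1 ELSE 0 END,
        c.ORDINAL_POSITION
    FROM INFORMATION_SCHEMA.COLUMNS AS c
    WHERE c.TABLE_SCHEMA = N'src';  -- the schema holding the raw source tables (an assumption)

Metadata about the deliverables (target tables, keys, load patterns) can be captured in exactly the same way.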

Development frameworks can collect the metadata specifics and combine them with generic implementation patterns to automatically generate the code for our specific solution artefacts (sketched after the list below). No product or framework, however, can do the following:

  • Semantically analyse the data to determine the code needed to perform the transformations that meet the information requirement. This requires real intelligence, i.e. a human! It can also be extremely hard if the data and requirements are particularly challenging – this is where the real business value sits in your solution
  • Decide which design patterns to use and how to construct them into a solution that meets the requirements. This requires knowledge and experience – an experienced solution architect
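
What a framework can do is the mechanical part. As a minimal sketch of “specific metadata plus generic pattern equals specific artefact”, here is the idea in pure T-SQL, using the hypothetical meta.SourceColumn table from earlier to generate one staging load procedure per source table. It assumes SQL Server 2017 or later (for STRING_AGG and CREATE OR ALTER) and existing src and stg schemas; a real framework would use far richer patterns and a proper templating engine, but the principle is the same.

    -- A generic implementation pattern with placeholders.
    DECLARE @pattern nvarchar(max) = N'
    CREATE OR ALTER PROCEDURE stg.Load_{{Table}}
    AS
    BEGIN
        TRUNCATE TABLE stg.{{Table}};
        INSERT INTO stg.{{Table}} ({{ColumnList}})
        SELECT {{ColumnList}}
        FROM src.{{Table}};
    END';

    DECLARE @table sysname, @columns nvarchar(max), @sql nvarchar(max);

    -- One specific artefact per source table, driven by the collected metadata.
    DECLARE table_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT TableName,
               STRING_AGG(CONVERT(nvarchar(max), QUOTENAME(ColumnName)), ', ')
                   WITHIN GROUP (ORDER BY OrdinalPosition)
        FROM meta.SourceColumn
        GROUP BY TableName;

    OPEN table_cursor;
    FETCH NEXT FROM table_cursor INTO @table, @columns;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Swap the specifics into the generic pattern, then execute the generated code.
        SET @sql = REPLACE(REPLACE(@pattern, N'{{Table}}', @table), N'{{ColumnList}}', @columns);
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM table_cursor INTO @table, @columns;
    END;
    CLOSE table_cursor;
    DEALLOCATE table_cursor;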

There are a number of technical ways to achieve metaprogramming. I generally work in the Microsoft data platform space. Here are some I’ve used, from before I knew about metaprogramming:

  • XML/XSLT creating JavaScript!! Not data platform work, and a long time ago. I wouldn’t recommend it
  • SQL creating SQL (Dynamic SQL) – there’s a short sketch of this after the list
  • C# creating SSIS and SQL
  • T4 Templates
  • BIML (a Varigence creation)
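
Dynamic SQL is the gentlest way in, so here’s the short sketch promised above: a catalog query that emits a maintenance script, one generated statement per table. The stg schema name is an assumption; the point is simply that SQL is perfectly capable of writing SQL.

    -- SQL creating SQL: emit one generated statement per table in a schema.
    -- Review (or spool and execute) the generated script; nothing runs automatically here.
    SELECT N'UPDATE STATISTICS ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';'
           AS GeneratedStatement
    FROM sys.tables  AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE s.name = N'stg';

Everything else on the list is a more disciplined version of the same idea: a host language (XSLT, C#, T4, BIML) assembling code in a target language.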

I’ve built a few automated development frameworks using the above. Some of them were awful. I found myself neck-deep in some crazy code maintenance and debugging hell, which motivated me to learn more about the ins and outs of metaprogramming. I strongly recommend Kathleen’s course Understanding Metaprogramming if you’re heading down this road, since it goes into detail about the approaches and the best classes of solutions for code generating code. Now I only use the following:

The way that BIML works is actually very similar to T4 Templates; it’s just that BIML brings a really useful mark-up language and development IDE to the party for scripting the creation of SSIS packages and database objects. They have also just released their automated development framework, BimlFlex, if you don’t have the bandwidth or time to build your own.

As it turns out, tackling metadata as a technical requirement during the development cycle lends itself to solving other common, difficult problems in the data provisioning space, namely integrating the following:

  • Data Catalog
  • Data Lineage
  • Operational Logging
  • Operational Auditing

Because the metadata collection is required and the assets are created from it, integrating these data platform features becomes a by-product of the development process itself. It’s a very proactive and effective solution. Retrospective solutions in this space can never keep up with the pace of change, or are too pervasive, requiring constant maintenance and support over and above the solution itself.
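
As a sketch of what that by-product looks like, sticking with the earlier T-SQL example: because the generator already knows the source, the target and the pattern, it can stamp identical operational logging calls into every generated procedure and write lineage records from the same metadata it generated the code from. The ops.LogStart, ops.LogEnd and meta.Lineage objects below are hypothetical, but any logging and lineage store would slot in the same way.

    -- The generic pattern carries the logging calls, so every generated artefact
    -- logs start, end and row counts in exactly the same way.
    DECLARE @pattern nvarchar(max) = N'
    CREATE OR ALTER PROCEDURE stg.Load_{{Table}}
    AS
    BEGIN
        DECLARE @run_id int, @rows int;
        EXEC ops.LogStart @process = N''stg.Load_{{Table}}'', @run_id = @run_id OUTPUT;

        TRUNCATE TABLE stg.{{Table}};
        INSERT INTO stg.{{Table}} ({{ColumnList}})
        SELECT {{ColumnList}} FROM src.{{Table}};
        SET @rows = @@ROWCOUNT;

        EXEC ops.LogEnd @run_id = @run_id, @rows = @rows;
    END';

    -- Lineage is recorded at generation time, from the same metadata the code came from.
    INSERT INTO meta.Lineage (SourceObject, TargetObject, GeneratedOn)
    SELECT DISTINCT N'src.' + TableName, N'stg.' + TableName, SYSUTCDATETIME()
    FROM meta.SourceColumn;

Feed this pattern through the same substitution loop as before and every load procedure comes out logged and lineage-tracked without anyone having to remember to add it.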
