Business Intelligence: state of the art

Usually, when we need to understand our business' numbers, we open Microsoft Excel and start applying filters, pivot tables and charts to try to gain insight into the story our data are telling us.
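That filter-and-aggregate workflow is easy to sketch outside Excel too. Here is a minimal pivot-table equivalent in plain Python, using invented sales figures purely for illustration:

```python
from collections import defaultdict

# Hypothetical sales records, standing in for rows of an Excel sheet.
sales = [
    {"region": "North", "product": "A", "revenue": 100},
    {"region": "North", "product": "B", "revenue": 150},
    {"region": "South", "product": "A", "revenue": 200},
    {"region": "South", "product": "B", "revenue": 50},
]

# Pivot: sum revenue per (region, product) pair, as a pivot table would.
pivot = defaultdict(int)
for row in sales:
    pivot[(row["region"], row["product"])] += row["revenue"]

for (region, product), total in sorted(pivot.items()):
    print(f"{region:<6} {product}  {total}")
```

The point is not the ten lines of code, but that the grouping and aggregation logic is trivial; the hard part, as the rest of this post argues, is everything around it.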

It’s 2017. Isn’t there a better way to do this?

I think it must have been around 2014 when I first heard about Tableau. I immediately liked the tool’s proposition: an easy way to analyse data in real time without needing to be an expert in SQL, OLAP cubes, or any particular technology. It could take any number of Excel sheets, CSV files, or database sources as input and would immediately let the user move the data around, establish relationships between sets of data, and start visualising. I’m a visual person, I need to see things in graphical form to understand them, so I really like this concept. In the Gartner Magic Quadrant for February 2017, the three top players in the BI space by completeness of vision and ability to execute are Tableau, Microsoft, and Qlik, who offer similar solutions for front-end analytics.

One of the weak points I see in Tableau’s proposition is that it only addresses the front end, the data analysis, and leaves the back end, the data model and the origin of the data, for you to deal with. In the past I’ve worked on a few data warehousing projects where the immense complexity lay in creating a data model which made sense from a business perspective and which could grow with the business’ future needs. Getting the data model wrong was like shooting yourself in the foot: new needs could not be accommodated, changes required refactoring the whole data model and every downstream system which read from it, and maintaining it was expensive and complex. In every case I experienced, creating a data model was like creating a monster which would grow in unforeseen and inelegant ways, getting more rigid and patchy as time went by.
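To make the back-end problem concrete, here is a toy star schema, the classic warehouse shape with one fact table and its dimension tables, sketched with Python's built-in sqlite3. The table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical star schema: dimension tables describe the "who" and
# "when"; the fact table holds the measures plus foreign keys.
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_id     INTEGER PRIMARY KEY, day  TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
cur.execute("INSERT INTO dim_date VALUES (1, '2017-06-01')")
cur.execute("INSERT INTO fact_sales VALUES (1, 1, 1, 99.5)")

# A typical BI query: join the fact table to a dimension and aggregate.
cur.execute("""
SELECT c.name, SUM(f.amount)
FROM fact_sales f JOIN dim_customer c USING (customer_id)
GROUP BY c.name
""")
result = cur.fetchall()
print(result)
```

Even in this toy version, the rigidity the paragraph above describes is visible: a new business question that needs a dimension you didn’t anticipate means altering the fact table and backfilling history, and every query and downstream system built on the old shape has to follow.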

A good example of a world-class attempt at creating a data model for the capital markets world is ISDA’s (International Swaps and Derivatives Association) FpML (Financial products Markup Language). The data model they have birthed is immensely complex, as it could hardly be otherwise when the goal is to be able to map any financial instrument. Any attempt at using FpML as inspiration for your own data model promises a long and complex project.

Three weeks ago, I attended MoneyConf. There I saw a company called GoodData which offers an interesting model: they take care of your Business Intelligence needs as a service. You send them your data, they organise it and offer you a Tableau-style front-end tool to visualise it. This would take care of all the data warehousing troubles. However, I have some concerns:

  1. How do they know all the ways you need to structure your data model?
  2. Does this scale well with large volumes of data?
  3. How do they manage confidential data? Imagine I have medical records to manage…
  4. Is the front end tool as complete as Tableau?


The BI world looks very cool. It promises better insights and fancy-looking charts, but is the price tag worth it when we can always just open Excel and start applying filters, pivot tables and charts to understand the story our data are telling us?
