Ideas like entropy, Darwinian evolution, and energy are part of our everyday language, and they are often invoked in our discussions as images or metaphors. Images involving complexity as an idea are somewhat rarer, and "emerging complexity" is a term that is hardly used at all.
The term "emerging complexity" refers to complex structures arising from simple components and simple rules. The awesome development that led life from simple prokaryotic cells to the unique biological architectures of humans and other large animals is the first and most famous example we could give.
Since I trained as an aerospace engineer, I also like to mention the jet that leaves a nozzle underwater as a perfectly laminar flow, only to develop waves and vortices downstream with nothing apparently provoking them. In the end the motion becomes extremely complex and, ultimately, the flow dissolves into the larger body of fluid.
Even simple dynamic systems, governed by simple equations, may give rise to extremely complex patterns, often called chaotic.
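The classic textbook illustration of this is the logistic map: a one-line update rule whose orbits become chaotic for suitable parameter values. A minimal sketch (the starting points and parameter are chosen only for illustration):

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n).
# A single simple equation, yet for r near 4 the behavior is chaotic:
# two nearly identical starting points end up on completely different orbits.

def orbit(x0, r, steps):
    """Iterate the logistic map from x0 for `steps` steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = orbit(0.200000, 3.9, 50)
b = orbit(0.200001, 3.9, 50)  # perturbed by one part in a million
print(max(abs(x - y) for x, y in zip(a, b)))  # the tiny gap grows to order 1
```

The point is not the formula itself but the pattern: deterministic, simple rules, unpredictable complex outcome.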
I really wish that a BI system, an MIS considered in all its components, could be described by a system of equations. It cannot. However, we often see behaviors that ought to be "linear" turn "chaotic" and then give rise to a new complexity of their own.
As usual, let us look at an example.
Consider the familiar sales space, in particular an Internet generalist retailer selling several product sectors to a wide range of customers.
The first reporting to be produced will likely be the usual better/worse by product and by product group, over various time windows (year vs. year, YTD, last week vs. current week, this week last year, this Christmas vs. last year's Christmas, etc.). The measures will be quantities plus gross, discount, and net values.
This setting alone may produce a lot of complexity, which may be (roughly and not so rigorously) measured as:
C = Nbr. of Product Classifications × Nbr. of Time Windows
Not all the combinations will be relevant to management; some will be mere dummy-element combinations, and so on. So:
Cr = C × R, where R < 1
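To make the measure concrete, here is a minimal sketch of C and Cr. The counts are invented for illustration, not taken from any real system:

```python
# Rough complexity measures as defined above, with hypothetical counts.

product_classifications = 12   # e.g. sector, category, brand... (assumed)
time_windows = 8               # year vs. year, YTD, week vs. week... (assumed)

C = product_classifications * time_windows   # raw combination count
R = 0.25                                     # fraction actually relevant, R < 1
Cr = C * R                                   # relevant complexity

print(C, Cr)  # 96 24.0
```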
Initially, only some of the combinations will be relevant to users but, slowly, as the business goes on, each combination comes under scrutiny: the hunger for knowledge increases and the overall complexity goes up as a result.
R → 1 as time → infinity (or, more simply, R grows)
When R approaches 1, C typically increases as well, because other factors are added:
C = Nbr. of Product Classifications × Nbr. of Time Windows × Nbr. of Customer Classifications
So C and R tend to increase because, naturally, there is an ever-growing need for information and analysis to keep improving the business.
The maximum value of R is 1, and the maximum value of C is determined by the number of attributes that can be attached to the sales records and used to analyze them.
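This growth pattern can be sketched as a toy model. Everything here is an assumption made for illustration: the dimension sizes, the idea that dimensions are added one at a time, and the saturating curve chosen for R:

```python
# Toy model of how C and R might evolve over time.
# Assumptions (all hypothetical): analysis dimensions are added one by one,
# and the relevant fraction R saturates toward 1 but never reaches it.

import math

dimension_sizes = [12, 8, 5, 4]  # product, time, customer, channel... (assumed)

def C(num_dimensions):
    """Raw combinatorial complexity with the first n dimensions in play."""
    c = 1
    for size in dimension_sizes[:num_dimensions]:
        c *= size
    return c

def R(t, tau=12.0):
    """Relevant fraction of combinations: grows toward 1 as t grows."""
    return 1 - math.exp(-t / tau)

for t, n in [(0, 2), (12, 3), (36, 4)]:  # months elapsed, dimensions in use
    print(f"t={t:3d} months  C={C(n):5d}  R={R(t):.2f}  Cr={C(n) * R(t):.1f}")
```

Note how Cr keeps climbing even as R flattens out, because each new dimension multiplies C. That is the "linear" regime the next paragraph leaves behind.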
Near these maximum levels, something new happens. So far we have expressed complexity as a linear combination of elements, but when predictive algorithms or data science enter the arena, we start adding strongly non-linear complexities to our system. At that point it is difficult to attach a number to the complexity, and I prefer to think of it as a phase transition, where new elements enter the game and rewrite the rules.
And, obviously, we do not stop there, since we may use those results to build a suggestion engine or another of those extremely ingenious artifacts that shape our lives online, raising the complexity we are managing to new heights.
If you consider that all of this derives from the simplest raw material, an order or an invoice, we can see how growing complexity is the inevitable bedfellow of the BI/DW discipline.
Today, two kinds of reactions to complexity are emerging.
In some organizations complexity is simply ignored. Management practices like lean advocate this line of thought. It is intellectually easy and reassuring to think that "we are focusing on fundamentals", but unfortunately there is probably a "long tail" of advantages being overlooked just because the effort to master complexity seems too great to tackle thoroughly.
The other reaction is to let complexity permeate the organization and leave individual information workers to deal with it on their own or within their small groups, thinking that "people know" what to do if they are close enough to the issues. Unfortunately, this just adds a new, extremely complex, and rather uncontrollable factor to the mix: human judgment. Which is fine, as long as it is not used to duplicate analyses, produce slightly different versions of the same thing, set local standards, confuse the numbers, and so on.
I do not think we should be surprised: these are human reactions and, as such, are likely to take hold. There is little point in merely saying that a more coherent approach to complexity means adopting the right tools and policies to implement and govern it.
This is what I wanted to say, and I would be delighted if someone elaborated on it, for example by finding practical ways of calculating C and R. In any case, let me know your opinion!