A critical view on our widespread and beloved visual display

Dashboards can be found wherever there is a lot of data - it's almost dashboard mania. Yet the quality of most of these dashboards is consistently poor. Here is why.

Usually, displaying dynamic data with dashboards has little impact.

The reason: the setup for design & development is flawed, and that leads to flat outcomes (dashboards) and, in consequence, immense inefficiencies.

As data consumers, we are left in the dark and unable to take any meaningful action based on these dashboards.

There is a simple underlying factor: the BI industry - and nowadays the AI industry - is a multi-billion dollar market that tells us data is easy if you have the right tool. Everyone (with too few exceptions) believes this and works accordingly. I'm sorry, that's too short-sighted.

If you just add data and tools to your work, the results - the dashboards - will remain flat and have little to no impact. Without a reflective and interactive design and analysis approach, failure is inevitable. Experienced data analysts know this by heart, but they are ignored by C-level managers.

My advice: we need to rethink dashboards.

Here are the main false beliefs that I think we need to fight for the sake of getting better data products.

1. Believing in the wrong process

Often you find the data visualization process described like this: 1. Collect and Prepare Data, 2. Data Exploration and Governance, 3. Data Analysis and Modeling, 4. Data/AI/Machine Learning Enhanced Decision-Making and Planning, 5. Monitoring and Evaluating. (Link)

While this is true from a mere data management perspective, these steps leave out the most important stakeholder of any analytic task: the user. This type of process, which is ubiquitous, will never lead to significant insight or value for decision making. The result: superficial charts.

2. Believing a dashboard is just a collection of charts

Judging by the many examples shared on social media (especially LinkedIn), one might indeed come to this conclusion. You cannot detect any structured approach or even a common business question in these examples. Things are put together without any deeper relation, and there is no way to guide users through their analytic journey. The result: flat displays.

3. Believing the data determines the outcome

This means that analysts rely only on what is already in the data. This way of thinking severely limits the possible scope. Experienced analysts work the other way round: the required outcome, derived from the business problem, determines the required data, and they try to enrich their database with new information. The result of data-first thinking: limited insights.

4. Believing the tool (AI) can fix it

Tools can only fix simple things (labels, colour encodings, data formats). Tools will not know (at least for years to come) what questions a user or decision maker has in mind, how experienced she is, or what the specific requirements of the domain are. Building an analytic tree will remain the work of people who understand both the domain and the data world.

5. Leaving the user out of the equation

It's a symptom of point 4. You can't be a successful analyst if you don't put the user at the centre of your work, understand their business question, and derive the data questions from that. Every dashboard should be vetted by users in a proper process.