Classical business administration teaches us to view the world in terms of causal relationships. Every event triggers a subsequent event; thus there is a clear triggering event for everything. A is followed by B. Of course, A can also trigger B and C, and E in turn can be triggered by C and D, but there is always a clear chain of causes behind the events of the world. And if events are linked in this way, then they can also be measured, controlled and – most importantly – predicted. Indeed, in this linear way of looking at the world, the future is just a variation of the past. Linear steam-engine systems follow an inner causality that can be observed, described and predicted, and that is what makes them supposedly so safe.

Of course, sometimes a valve breaks, but the durability of valves can be measured, so it is possible to calculate a statistical risk assessment for a valve breaking. It’s complicated, but it’s perfectly linear and therefore entirely feasible. It is a matter of risk assessment: when do I have to replace the valve? It should be in operation for as long as possible and be replaced shortly before it would break. A risk assessment looks at alternative courses of action, their possible consequences and their probabilities of occurrence. The more data I collect and the more detailed my model of the case, the more precisely I can estimate the chances of success of an undertaking. When I compare the results of an action with the alternative scenarios, I can continuously check whether I am on the right track. Risk assessment and risk analysis are professions in their own right in many industries and are often underpinned by complicated statistical prediction models. So nothing can go wrong with sophisticated risk management, can it?
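The valve question above is a classic expected-cost trade-off, and it can be sketched in a few lines of code. Everything in this sketch is invented for illustration – the hazard rates, the cost figures and the function names are assumptions, not data from the text – but it shows the linear logic the author describes: known probabilities in, an "optimal" replacement date out.

```python
def cost_per_month(replace_at, hazard, planned_cost, failure_cost):
    """Expected cost per month of service when we plan to replace
    the valve after `replace_at` months.

    hazard[m] is the probability the valve fails in month m, given
    that it has survived so far. All numbers here are invented.
    """
    survival, cost, months = 1.0, 0.0, 0.0
    for month in range(replace_at):
        months += survival                                # expected months in service
        cost += survival * hazard[month] * failure_cost   # unplanned breakdown
        survival *= 1.0 - hazard[month]
    cost += survival * planned_cost                       # planned replacement
    return cost / months

# Invented hazard rates: wear makes failure more likely each month.
hazard = [0.01, 0.02, 0.04, 0.08, 0.16, 0.32]

costs = {m: cost_per_month(m, hazard, planned_cost=100, failure_cost=1000)
         for m in range(1, len(hazard) + 1)}
best = min(costs, key=costs.get)  # the replacement interval to plan for
```

With these made-up numbers, replacing too early wastes valve lifetime and replacing too late courts expensive breakdowns, so the expected cost per month of service is lowest somewhere in between – exactly the kind of calculation the author means by "perfectly linear and therefore feasible".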

Well, this way of looking at things assumes that the future basically behaves like the past – that it is an extrapolation of it. In other words, all influencing factors are assumed to be known and to have known effects. Risk management only looks at known risks whose probabilities can be measured empirically – and that is precisely where the risk lies.

Risks whose probabilities cannot be measured – for example because they have never occurred and are therefore simply not considered – are not risks at all, but uncertainties. Where do uncertainties occur? In complex systems. What is a complex system? A system in which no clear cause-and-effect relationships can be recognised because there is constant interference. A plate of spaghetti is a complex system. By pure observation, we cannot predict what will happen when we pull on a particular noodle. And if we try it, we cannot repeat the experiment, because pulling has changed the system in such a way that the next attempt will produce a completely different result. We simply cannot predict the system. The economy, society, education and politics are complex systems. Our world – and this is often misunderstood – is not a world of increased risk that can be dealt with by risk management, risk analysts and risk registers. Our world is a world of increased uncertainty, in which there are no clear connections and causal chains. Guessing an event well may still work, but forecasting cause-and-effect relationships – that is, predicting the future – fails miserably.

“Deciding under risk means monitoring the deviation from the plan and taking measures to minimise and compensate for the deviations. Deciding under uncertainty means monitoring the plan itself and taking actions that adjust the plan to reality. Plan-driven work ignores the facts that make compliance with the plan impossible, because compliance with the plan itself is the goal. (..) Decision-making under uncertainty means questioning the basis for decision-making and thus also one’s own attitude. This is not steam engine management.”

Excerpt from: Andreas Rein. “Steampunk Economy.” YessYess Publishing, 2021.