Common fallacies


Common fallacies are errors people often make in their reasoning while attempting to solve a problem in a system. Even though the arguments building up to a solution might sound logical, the fundamental reasoning error means the solution is unlikely to have the desired effect. Even if the desired effect does occur, it cannot be scientifically attributed to the solution that was applied. Common fallacies include, but are not limited to:

  • Related to the Cynefin framework as a whole
    • Assuming a system is one domain
  • Domain-specific
    • Forcing homogeneity, instead of favoring coherent heterogeneity in a complex system
    • Confounding simulation and prediction in a non-ordered system
    • Failing to realise that chaos is a temporary state
    • Making a target out of a measure in a complex system
    • Making agent-based models of human systems
  • General
    • Confounding correlation and causation
    • Saying it's "all about perception"


Confounding correlation and causation

This can happen when people make observations based on their past experience or on case studies. They observe that (almost) every time action X was taken, result Y occurred (correlation). The reasoning error is in concluding from this observation that taking action X will (almost) always cause result Y to occur (causation). For example, there is a correlation between a country's per-capita consumption of dark chocolate and the number of Nobel prizes won by that country. That does not prove that making people in a country eat more chocolate will make the country win more Nobel prizes.
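As an illustration (not part of the original article), here is a minimal Python sketch of how a hidden confounder can produce exactly this kind of correlation without any causal link. The variable names and coefficients are purely hypothetical.

```python
# Minimal sketch: a hidden confounder creates a strong correlation
# between two variables that have no causal relationship.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical confounder: national wealth drives both quantities.
wealth = rng.normal(size=n)
chocolate_consumption = 2.0 * wealth + rng.normal(scale=0.5, size=n)
nobel_prizes = 1.5 * wealth + rng.normal(scale=0.5, size=n)

# A strong correlation appears even though neither variable causes the other.
r = np.corrcoef(chocolate_consumption, nobel_prizes)[0, 1]
print(f"correlation: {r:.2f}")  # typically around 0.9

# Intervening on chocolate consumption would leave the prize count untouched,
# because the data-generating process has no causal arrow from chocolate to prizes.
```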

Confounding simulation and prediction

This is when the results of a simulation are treated as a logical prediction of the future behavior of a non-ordered system. The reasoning error is in simulating a non-ordered system and deriving predictions from that simulation. In the complex domain patterns can be perceived, but not predicted; in the chaotic domain patterns can be neither perceived nor predicted. More specifically, in human organizations there are several elements that cannot be simulated: humans are not limited to one identity, humans are not limited to acting according to predetermined rules, and humans are influenced by more than the simulation context (see "The new dynamics of strategy: Sense-making in a complex and complicated world" by Kurtz and Snowden). For example, consider running a simulation of how students and professors will move around a new campus and using the result to predict the optimal layout of the pathways between buildings. Students might not always follow the shortest path between classrooms if the weather is nice and an ice cream truck is passing by (breaking a predetermined simulation rule). Professors who, after a global pandemic, are worried about being infected with a virus by students might look for routes between buildings that maximize the odds of keeping sufficient physical distance from the students (influence from outside the simulation context).
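To make the point concrete, here is a minimal, purely hypothetical Python sketch of the campus example; the function name, agent counts, and probabilities are invented for illustration. It shows how the simulated usage counts that a layout decision would be based on shift as soon as agents stop following the assumed rule.

```python
# Minimal sketch: a simulation that assumes rule-following agents is not a
# prediction, because real agents may break the rule (ice cream truck, nice weather).
import random

def simulate_path_usage(n_students=1000, p_detour=0.0, seed=None):
    """Count how many simulated students use the 'short' path vs the 'scenic' path."""
    rng = random.Random(seed)
    counts = {"short": 0, "scenic": 0}
    for _ in range(n_students):
        # The simulation rule assumes everyone takes the shortest path;
        # p_detour models students deviating from that rule.
        path = "scenic" if rng.random() < p_detour else "short"
        counts[path] += 1
    return counts

print(simulate_path_usage(p_detour=0.0, seed=1))  # the idealised rule-following run
print(simulate_path_usage(p_detour=0.3, seed=1))  # the same system once the rule is broken
```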

Forcing homogeneity, instead of favoring coherent heterogeneity

This happens when a complex system is constrained in a way that reduces the variety in the system, which is harmful to the system and might even threaten its existence. For example, an IT organization might mandate that all updates to IT systems happen in two-week timeboxes. Fixes for major outages then have to wait for the next timebox; outages that are only resolved after one week on average might cause major customer and financial impact on the company as a whole. Upgrades of IT systems that take longer than two weeks to complete can no longer take place, which can also cause major outages.

Making a target out of a measure

This applies to complex systems and was articulated by the anthropologist Marilyn Strathern: "When a measure becomes a target, it ceases to be a good measure." The reasoning error is assuming that a complex system can be improved or optimized by acting on a single metric, while in a complex system cause-and-effect relations cannot be reliably predicted. For example, evaluating the performance of commercial airline pilots based on the fuel consumed during each flight might cause pilots to take on only the bare minimum of fuel required and to abuse the emergency landing procedures at their destination airport to avoid having to wait for other airplanes to land. This increases the risk of crashes.
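As a hedged illustration of Strathern's point, the toy Python model below (all numbers and the risk curve are invented, not taken from the article) shows how ranking pilots on the fuel metric rewards precisely the behavior that undermines the real goal of the system.

```python
# Minimal sketch: once "fuel loaded" becomes the target, the behavior that
# optimizes the metric also increases risk, so the metric stops being a good measure.
def flight_outcome(fuel_loaded_kg):
    """Toy model: less fuel looks better on the metric but raises diversion risk."""
    required_kg = 8000                                 # fuel needed for a routine flight (invented)
    reserve_kg = max(fuel_loaded_kg - required_kg, 0)
    fuel_metric = fuel_loaded_kg                       # what the pilot is evaluated on (lower = "better")
    diversion_risk = 1.0 / (1.0 + reserve_kg / 500)   # toy risk curve: shrinking reserves raise risk
    return fuel_metric, diversion_risk

for fuel in (8000, 9000, 10000):
    metric, risk = flight_outcome(fuel)
    print(f"loaded {fuel} kg -> metric {metric}, diversion risk {risk:.2f}")
# Ranking pilots by the metric rewards minimal reserves, i.e. exactly the
# behavior that degrades the real goal of safe, reliable flights.
```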