It Takes Complexity to Handle Complexity

A software project is a network of people, each interacting for their own purposes. We can see one part of that network as the system of team members producing value, and another part as the environment of stakeholders consuming that value.

W. Ross Ashby, one of the fathers of General Systems Theory, came up with one of the basic principles for systems, called Ashby’s Law, or the Law of Requisite Variety:

“If a system is to be stable, the number of states of its control mechanism must be greater than or equal to the number of states in the system being controlled.”

In other words, in order to survive, a system must have an internal model that reflects the variety it encounters in the world outside.
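Ashby’s law can be made concrete with a small thought experiment in code. The sketch below is my own illustration, not from Ashby: the outcome rule `(d + r) % n_d` and the function name are invented for this example. An environment throws one of `n_d` disturbances each round, a regulator answers with one of its `n_r` responses, and we brute-force every strategy to see how steady the regulator can possibly keep the outcome.

```python
from itertools import product

def min_outcome_variety(n_d, n_r):
    """Smallest number of distinct outcomes a regulator can force in a
    toy regulation game: the environment picks a disturbance d, the
    regulator answers with a response r, and the outcome is
    (d + r) % n_d. A strategy fixes one response per disturbance;
    we try them all and keep the one with the fewest outcomes."""
    best = n_d
    for strategy in product(range(n_r), repeat=n_d):
        outcomes = {(d + strategy[d]) % n_d for d in range(n_d)}
        best = min(best, len(outcomes))
    return best

# Against six kinds of disturbance:
print(min_outcome_variety(6, 2))  # 3: two responses leave three outcomes
print(min_outcome_variety(6, 3))  # 2
print(min_outcome_variety(6, 6))  # 1: matching variety gives full control
```

No strategy can beat the pigeonhole bound the law describes: with fewer responses than disturbances, some outcomes must differ, and only when the regulator’s variety matches the environment’s can it hold the outcome to a single state.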

Nobel Prize winner Murray Gell-Mann, co-founder of the famous complexity institute in Santa Fe, wrote it like this in The Quark and the Jaguar:

“The genotype of each organism, or else the cluster of genotypes that characterizes each species, can be regarded as a schema that includes a description of many of the other species and how they are likely to react to different forms of behavior. An ecological community consists, then, of a great many species all evolving models of other species’ habits and how to cope with them.”

Not being a founder of any scientific field or institute, I have my own simplistic version of the same idea:

“It takes complexity to handle complexity.”

Software teams have three options when dealing with a troublesome environment:

  1. Reduce the complexity in the environment;
  2. Increase the complexity inside the system;
  3. Ignore complexity (survival is not mandatory).

The first is rarely possible. The third is rarely smart.

That leaves software teams with the second option: try to match the ever-changing complexity of the environment with social complexity (people & interactions) and continuous improvement (embracing change).

This is, I believe, the core of Agile and Lean thinking…

  • Michael Bolton

    The Law of Requisite Variety is very important for teams and, I would argue, managers and testers. I’m surprised that it doesn’t get mentioned more often; I’m a little less surprised that YOU mention it. 🙂 Thank you for that.
    There’s another general systems pattern that I think is closely related and worth knowing about: The Fundamental Regulator Paradox.
    Jerry Weinberg summarized me summarizing Jerry:
    “We can’t predict the unpredictable. There is a viable alternative, though: we can expect the unpredictable, anticipate it to some degree, manage it as best we can, and learn from the experience. Embracing the unpredictable reminds me of the The Fundamental Regulator Paradox, from Jerry and Dani Weinberg’s General Principles of System Design which I’ve referred to before:
    The task of a regulator is to eliminate variation, but this variation is the ultimate source of information about the quality of its work. Therefore, the better job a regulator does, the less information it gets about how to improve.
    “This suggests to me that, at least to a certain degree, we shouldn’t make our estimates too precise, our commitments too rigid, our processes too strict, our roles too closed, and our vision of the future too clear. When we do that, we reduce the flow of information coming in from outside the system, and that means that the system doesn’t develop an important quality: adaptability.”
    I think that adaptability is the hallmark of what it means to be agile. The Law of Requisite Variety is one of the things that makes adaptability possible. The Fundamental Regulator Paradox makes adaptability necessary.
    Again, great stuff, Jurgen.
    —Michael B.

  • shrini kulkarni

    I don’t understand how you can beat (or deal with) complexity by increasing complexity elsewhere. Maybe you need to elaborate. When you say match system complexity with social complexity, do you mean make the social structure of the system complex? How do you propose to do that? Having people with different cultures (that happens anyway in today’s global environment)? Complicating the social system with contradicting rules? Giving people confusing goals?
    How do you plan to increase social complexity? Probably you wanted to say “deal with social complexity” by recognizing that it is a network of people?

  • Jurgen Appelo

    What I mean (in upcoming blog posts) is not to rely too much on models, but to have people solve problems.
    This is the first post of a few about the same topic…

  • Jurgen Appelo

    Thanks for the references to Jerry Weinberg. Great stuff indeed!


    MORE complexity is not the solution it IS the problem! Every “system” has a sustainable level of complexity beyond which it will lose functionality i.e. become unfit for the purpose for which it was intended.
    You may find this of interest: “Complexity of IT Systems Will Be Our Undoing”
    “I wouldn’t give a nickel for the simplicity on this side of complexity, but I would give my life for the simplicity on the other side of complexity” – Einstein

  • Jurgen Appelo

    You are talking about complicatedness, not complexity. They are different things.
    I agree complicatedness is bad. But that’s another discussion.


    I beg to differ. I know nothing of your underpants(!) but can say that a watch is, indeed, complicated: linear; deterministic. It works or it doesn’t.
    What is Complexity?
    Although a widely accepted definition of complexity doesn’t exist, many of the definitions refer to complexity as “a twilight zone between chaos and order”. It is often maintained that in this zone Nature is most prolific and that only this region can sustain life. Others claim that the phenomena of self-emergence are manifestations of complexity. But such definitions are not practical, since they don’t define anything measurable. At Ontonix we maintain that the fitness of a system is equal to its complexity. The evolution of living organisms, societies or economies constantly tends to states of higher complexity precisely because an increase in functionality and fitness allows these systems to better face (or adapt to) the uncertainties of their respective environments, to be more robust, in other words, to survive better.
    Our extensive work in complexity – from healthcare to aerospace design and other critical processes – is 100% quantitative and, as the nature of our work indicates, rigorously tested. It has enabled us to compile a list of verifiable “Complexity Facts”.
    Complexity, in our view, is not a phenomenon on the edge of chaos; it is an attribute of any system, just like energy, or momentum. Therefore, it can be managed.
    According to our philosophy, a comprehensive complexity metric should be a function of the following fundamental ingredients: structure, entropy and coarse-graining. Structure describes the way information flows within a given system. This may be represented via maps (graphs) such as those determined by OntoSpace™. Entropy represents uncertainty and the level of organization. Coarse-graining is essentially equivalent to granularity or resolution with which we interrogate data relative to the system. Very often, we can only express fuzzy statements about a system’s state (e.g. hot, very hot, extremely hot, etc.) or about given risk levels (very low, low, medium, high or very high). Complexity measures based exclusively on graph structure or entropy tell only part of the story.
    Perhaps this piece will shed further light –
    Ontonix: Model-free methods – a new frontier of science
    This short video from ecologist Eric Berlow is also very useful:
    I welcome the opportunity to exchange views on this topic and love the illustrations too.

  • David G Wilson

    Judgements you make a very good point, and that is why the ability to measure is such a significant development (breakthrough).
    If you really think about it, it makes so much sense that system data at micro or nano level (autonomous information agents) would carry the characteristics of the system itself!? Like a cent, € or £ iterated.
    Fractal scaling enables examination of the system at the most appropriate scale.
    Although I am not scientifically trained and came to the subject from financial risk, the Ontonix proposition is a rigorously tested solution applicable across domains. Model-free, 100% quantitative, applied and tested; like Heineken, reaching the parts that others cannot!
    I love your presentation on the subject (apart from the obvious!) and would be happy to supply some further insight, as I believe yours to be the most effective means of communicating an, unsurprisingly, complex message!
    If this has appeal let’s connect on Linkedin and develop the conversation.
    With respect,
    David G Wilson

  • Jurgen Appelo

    Thanks for your message. It is interesting. But it doesn’t seem to match the general message in the books I read on this topic.
    For example, in Complexity: A Guided Tour, Melanie Mitchell clearly summarizes the many attempts at measuring complexity, pointing out that all of them failed.
    Therefore, I still don’t know if we’re really talking about the same thing.


    Apologies for the spelling mistakes! I would love to blame mobile technology but to arrive at “judgements” when it should have read Jurgen suggests I was more caught up by Champions League football than I thought.

  • Glen B. Alleman

    Care is needed here. Jurgen’s definitions don’t always match those found in the systems domain. They are closer to the organizational development domain. This is not always clear when a conversation is started. As well, reference materials usually come from unrefereed sources and popular texts and magazine articles.
    It takes a bit of time to discover there are gaps in simple definitions used by those of us working in the “complex systems” domain. I’m still struggling with that in his book and posts. There are good things there; it just takes time to determine which ones are fact-based and which ones are opinion.

  • Jurgen Appelo

    Hi David,
    Thanks for the feedback. I saved your links to sources for later reading. I will definitely look into it.

  • Jurgen Appelo

    Some of those “unrefereed sources and popular texts” I use and refer to are from Nobel prize winners.


    The quality of sources is not up for debate and, as I said earlier, a common definition is missing. It looks to me like much of that is down to “academic vanity”, each intent upon pursuing their own particular brand rather than greater collaboration bringing “simplification”. Oh the irony! Perhaps, as has happened so often in history, meetings of minds will be the basis for future Nobel prizewinners…more so than entrenched qualitative views: facts v opinion.
    Although I haven’t read her book I understand that it is a retrospective look at the subject(s). Science does get serious when you can measure. That is what is so significant about what we have established and MOST IMPORTANTLY ARE APPLYING in an increasing number and variety of critical environments.
    I am happy to debate the finer points in a more appropriate forum but, rather, hope to sow some new thoughts that, I certainly believe, make more sense than much of what I have read on this and related subjects.
    Anyone curious to read more I would recommend “A New Theory of Risk and Rating” Dr J Marczyk available on Amazon
    @Glen: “Chicken or egg: system or organization?” – If it is possible to identify, map, measure, monitor the interactions amongst ALL system data [autonomous information agents] at nano/micro level, to determine and manage the level of complexity at system (or macro) level – as well as to track causality through links/hubs – then, the data already contains the (organizational) characteristics! Think about biology and the complexity that exists within organs, cells, etc.
    Zoom in or out to look at and address system issues at the most appropriate scale.
    I hope this is helpful and provides the further food for thought that I intend!


    Jurgen: I wanted to share this with you and other readers and hope it helps understanding.
    I would also recommend this excellent article from Mark Buchanan as it deals with fractals and Power Laws in a “more business” context very well:
    In order to develop, to operate, and to survive, there is interdependence and feedback between connected patterns. This interdependence is described by E. O. Wilson and Bert Hölldobler in “The Superorganism”:
    “Life is a self-replicating hierarchy of levels. Biology is the study of the levels that compose the hierarchy. No phenomenon at any level can be wholly characterized without incorporating other phenomena that arise at all levels. Genes prescribe proteins, proteins self-assemble into cells, cells multiply and aggregate to form organs, organs arise as parts of organisms, and organisms gather sequentially into societies, populations and ecosystems. Natural selection that targets a trait at any of these levels ripples in effect across all the others.”

  • Morgan Ahlström

    I’m not sure if I remember correctly, but I think Don Reinertsen wrote something along the lines of “only variability can swallow variability”. I think that was a good way to express the implications of this concept. My personal interpretation of Ashby’s Law is that it’s the argument needed for selling self-organization. I wrote a post about that a while ago:

  • Glen B. Alleman

    Care is needed with terms here. The semi-philosophical “law of requisite variety” – which in rough terms asserts that variety in a system’s output can be modified only by sufficient variety in its input – seems to be a statement about system complexity.
    But this does not say how many internal states are needed to deal with the number of external states. The relative strength of the interactions – the coupling and cohesion paradigm of good object-oriented development – is one “externality” that must be defined.
    A simple formula from “Connectivity, Complexity, and Catastrophe in Large-Scale Systems,” John L. Casti, International Institute for Applied Systems Analysis, International Series on Applied Systems Analysis:
    Total Variety in Behavior <= (disturbance variety) / (control variety)
    What this says is “a system is never universally complex.” It may be complex in some aspects but not others. It may be complex only if used in certain ways. But the Law of Requisite Variety is applicable to static systems.
    One final thought: Murray Gell-Mann is associated with the Santa Fe Institute, and the interesting book with the Quark title. But like many physicists, his contributions to topics outside his “Nobel” work have much less rigor. I say this from being in the presence of Gell-Mann and Feynman as a grad student at the University of California, when the quark theories were being tested on the accelerator I was working on under Fred Reines, another Laureate, for the neutrino. So care is needed as well in using past performance as a basis in other domains. Pauling is another Caltech example: from contributions to quantum mechanics to vitamin C.
    But you’re on to something in the agile world when you move from static systems to dynamic systems. Those are much better models of social systems.

  • Glen B. Alleman

    System or organization? Yes, that is the question. My systems engineering background starts with the model of the system. For static models it’s straightforward. For dynamic models, with systems in place and operating, the model maps to the system.
    In both cases though – system <> organization – there must be reciprocity between the two. It’s a closed topology at that point.

