How to Deal with Unknown Unknowns

Getting people together to share information and make joint decisions is the best way to deal with the unknown.

Software projects always suffer from limited predictability. And that’s not only because of the many feedback loops inside the system. Another reason is that every team has to face the unknown.

The Incompressibility Principle
"There is no accurate (or rather, perfect) representation of the system which is simpler than the system itself. In building representations of open systems, we are forced to leave things out, and since the effects of these omissions are nonlinear, we cannot predict their magnitude." – Cilliers, Paul. "Knowing Complex Systems." In Richardson, K.A. (ed.), Managing Organizational Complexity: Philosophy, Theory and Application.

From the Incompressibility Principle we can infer there will always be unknowns. There are things unknown about the system itself, and things unknown about the environment.

Known Unknowns

We can distinguish two kinds of unknowns. The known unknowns are the things we know that we don’t know. For example, we know that the customer will sometimes change her mind. But we don’t know when and in which cases. And we know there’s a chance that team members will get sick. We just don’t know who and when. I sometimes refer to these kinds of risks as jokers. We know the jokers are in the deck of cards. We just don’t know which hands will get them.

Unknown Unknowns

The unknown unknowns are the things we don’t know that we don’t know. For example, I was once confronted with a team member who had, quite unexpectedly, developed a psychological disorder. And I once unexpectedly became chairman of an institution, when the other two board members quit one month after I joined the board. I had never dealt with such situations before, and I had not taken these eventualities into account in my decisions. Such events are sometimes called black swans.

Not long ago people could imagine only white swans, because white swans were all they had ever seen. And so people predicted that every next swan they would see would be white. The discovery of black swans shattered this prediction. The black swan is a metaphor for the uselessness of predictions that are based on earlier experiences, in the presence of unknown unknowns. – Also see: Taleb, Nassim. The Black Swan.

Limited Knowledge

It is important to understand that the unknown always depends on the observer. Some people already knew about black swans, but that didn’t make them any less surprising to those who had never seen them. My fellow board members already knew they wanted to leave their positions a good time before they shocked me with their announcements. And if one of your team members turns out to be a criminal on the FBI’s Top 10 Most Wanted list, he probably knew that long before you found out.


The known unknowns and unknown unknowns in your organization differ from person to person. Getting people together to share information and make joint decisions is the best way to deal with the unknown.

This text is an excerpt from the Agile Management course, available from March 2011 in various countries.

Comments
  • Vasco Duarte

    That was a “cheap” get-out-of-jail card ;). You did describe the unknown unknowns, but not the circumstances where they matter or what you can do with that knowledge (e.g. do not bet your project on anything that can explode – like the stock exchange or swans being _always_ white).
    The stock exchange is an interesting realm for black swans, because we all know it goes down at some point. We just don’t have any idea when. But what we cannot know is whether, the next time it goes down, it will wipe out most of the wealth in the world (as it did in the 1920s and ’30s). If you take into account that around 90-99% of the world’s wealth is in some market at this very moment, then you know that we are betting the whole world’s economy on the expectation that the stock market will never crash so badly that it wipes out the “important” part of the economy.
    I am still struggling to find a proper analogy on the software world. I will have to blog about it soon…

  • Alexandru Bolboaca

    We have managed to find a model for software development that revolves around information. I think it’s an interesting step forward from your post:
    What we found is that if we look at software development as small operations applied to information with the purpose of refining it until it becomes executable, we might be able to explain why certain practices work and why others don’t. This idea is still in its infancy, but our hunch is that it may grow into something very useful.

  • Jurgen Appelo

    If you know that the stock market is going to crash someday, then it’s not a black swan. It’s a grey swan (according to Taleb).
    Looking forward to your contribution to this fascinating topic! 🙂

  • Glen B. Alleman

    Yes, unknown unknowns are observer dependent. Knowability is also observer dependent. Seeking the unknowns is also project dependent. Seeking risks on manned space flight is different from seeking risks on an internal IT project.
    Just a comment: when you use the term “system” you implicitly assume an organizational system. This is not clear for those of us working on “product” systems – machines, production facilities. In this “machines” world, much of the variability and uncertainty around “people” is boxed in by the staff selection process. Performance of individuals is “ruthlessly” controlled through the selection and management process on high-risk programs in our domain.
    The focus moves from people to mechanical systems.

  • Scott F

    My understanding of Taleb’s black swans is that they are not unknowable. In fact, every one of them is predictable but judged so remote that they are excluded from our decision making. Hence, flying planes into buildings has always been well within the realm of the imaginable and even the possible. The trouble is that our organizations are not equipped to deal with the swarm of ultra-low-probability events whose sheer volume raises the likelihood of ONE of them occurring to near unity.
    My takeaway from Taleb’s book is that trying to identify all the black swans by communicating more will not prepare you for the one that strikes – there are just too many of them. His description of his investing strategy indicates that he assumes that sh*t happens and makes sure that he can weather “freak” events no matter which direction they come from. Even better, from his perspective, is to be in a position to benefit from the occasional beneficial black swan (hence his small exposure in biotech).
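The arithmetic behind that “near unity” observation can be sketched in a few lines of Python (a generic illustration with made-up numbers, not part of the original comment): for n independent events, each with probability p, the chance that at least one occurs is 1 − (1 − p)^n, which climbs toward 1 as n grows.

```python
# Hypothetical illustration of the "swarm of ultra-low-probability events"
# argument: risks that are individually negligible become a near-certainty
# when there are enough of them.

def prob_at_least_one(p: float, n: int) -> float:
    """Probability that at least one of n independent events,
    each with probability p, occurs."""
    return 1 - (1 - p) ** n

# One specific freak event: a 1-in-100,000 chance. Easy to dismiss.
print(prob_at_least_one(1e-5, 1))          # ≈ 0.00001

# A million distinct freak events of the same size: near unity.
print(prob_at_least_one(1e-5, 1_000_000))  # ≈ 0.99995
```

This is why planning against each individual black swan fails while building general resilience does not: the identity of the event that strikes is unpredictable, but the fact that *some* event will strike is almost certain.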

  • Glen B. Alleman

    The “Black Swan” paradigm was that there were no black swans in England, so in typical fashion, “if we can’t see them they don’t exist.” Of course they did exist in Australia.
    The “what I can’t see can’t hurt me” attitude is common in many fields.
    The notion of investing in the presence of black swans is mixed with bad investment processes driven by greed. Read Liar’s Poker and The Big Short to see how all the rational approaches in the world were ignored. Taleb then makes a post facto argument that “no one saw it coming,” which of course is a big fat lie.
    Lots of people saw it coming and made billions (literally) shorting the market, knowing it was going to crater. The University of Chicago School of Economics has a great selection of papers forecasting the demise of the sub-prime market. Only the academics and the short players read those papers. Everyone else ignored the warnings and assumed it was nothing but up, up, and away.
    It was NOT a freak event; it was inevitable, and those who knew it bet against the “crowd intelligence” and won.
    Now, if that is called a black swan, then Taleb is using a different dictionary. He’s a well-known self-promoter and post hoc analyst, charging lots of money to tell people “I saw this coming.” If he had really seen it coming, he would have shorted the market too and netted billions.
    We tend to accept arguments from people like Taleb without actually understanding the underlying processes – CDOs and sub-prime markets, for example. Or the inherent risks in manned space flight, nuclear power, or even deploying ERP systems.
    The next book after those is All the Devils Are Here. Everyone who knew anything intentionally transferred the risk to those without knowledge or skill and took profits on both the long and short sides of the deal.
    Ain’t capitalism great?

  • Roberto Bera

    There is a problem with “getting people together to share information”: hierarchy!
    Because information is “power,” and if a company has a pyramid structure, nobody wants to lose his power.
    I don’t agree with this “competitive” approach, so I’m a low-level employee.

  • Vasco Duarte

    Very good point, Roberto. Hierarchy has an impact on all information sharing, which is why it is so improbable that information sharing works as a strategy on its own. The “info sharing” strategy needs to be complemented with other strategies, and in my view they must be obvious to anyone involved.

  • Scott F

    Yes. Simply “sharing information” is entangled with so many other factors that it falls flat if implemented by itself. I am currently working on an agile team in which the fault lines and incompatible goals are becoming more and more apparent. The forces driving this are coming from all levels, right up to the VP level. Some of the blame can go to hierarchy, but it can also be spread around to personality, organizational history, and conditioned outlook. Which thread do you pull first? Do you have to torch it and start over? (Seen this one and don’t recommend it!) No easy answers.

  • Roberto Bera

    I’ve linked to this post in the latest post on my blog.

  • Blaine Mazurek

    Very informative blog post. Really, thank you! Fantastic.
