Human Complexity and the Strategic Games of Uncertainty
By Rick Bookstaber
“Sun Pin was in battle against the Wei general, P’ang Chuan. Sun Pin said: ‘We have a reputation for cowardice, and our adversary despises us. Let us turn this to our advantage.’ Accordingly, when the army had crossed the border into Wei territory, Sun Pin gave orders to show 100,000 fires on the first night, 50,000 on the next, and the night after only 20,000.
P’ang Chuan pursued them, saying to himself: ‘I knew these men were cowards; their numbers have already fallen by more than half.’ In his retreat, Sun Pin came to a narrow pass, which he calculated that his pursuers would reach after dark. Here he had a tree stripped of its bark, and inscribed the words: ‘Under this tree shall P’ang Chuan die.’
When night fell, he placed archers in ambush, with orders to shoot where they saw a light. Later P’ang Chuan arrived and noticing the tree, struck a light to read what was written on it. His body was immediately riddled by a volley of arrows, and his army thrown into confusion.”
In a recent post I discuss the limitations of neoclassical economics and the strain of behavioral economics that remains tethered to it. I argue that they fail because the market is inhabited by people, heterogeneous and context-sensitive, who do not live up to the lofty assumptions of mathematical optimization and Aristotelian logic that underlie these approaches – and do not do so for good reasons.
The nature of complexity in the economic realm also differs from that in physical systems, because it can stem from people gaming the system, from changing its rules and assumptions. Ironically, "game theory" is not suited to addressing this source of complexity. But military theory is.
What is Complexity?
Complexity can be either an annoyance or a boon, depending on one’s enthusiasm for tricky problems. We all know intuitively that complexity makes accidents both more likely and more severe. After all, any machine with many parts has more risk of having something go wrong, and with more interconnected mechanisms there is more risk that a single failure will propagate to cause the entire machine to fail. For markets, the accidents are market crises. I pointed to complexity and tight coupling as key components in the origin of market crises in my book.
But there remains the task of defining the type of complexity that matters for financial markets. A number of different concepts shelter under the umbrella of complexity, which is not surprising, given that complexity is an issue in many different fields, ranging from physics and engineering to biology, sociology, and economics. I want to lay out my view of what sort of complexity matters in economics and finance, and contrast it with some of the notions of complexity used in these other fields.
The measurement of complexity in physics, engineering, and computer science falls into one of three camps: The amount of information content, the effect of non-linearity, and the connectedness of components.
Information theory takes the concept of “entropy” as a starting point: essentially, the minimal amount of information required to describe a system. Related to this is a measure called thermodynamic depth, which looks at the energy or informational resources required to construct the system. The idea is that a more complex system will be harder to describe or to reconstruct, though this is problematic because such measures treat random processes as complex; by these yardsticks, a shattered glass is complex.
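To make the shattered-glass objection concrete, here is a toy sketch (purely illustrative, not from the original post) showing how a Shannon-entropy measure of description length rates a random jumble as highly "complex" while an orderly pattern scores low:

```python
import math
import random
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Average bits per symbol needed to describe the string."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly ordered string needs little information to describe...
ordered = "ab" * 32

# ...while a random jumble (the "shattered glass") scores near the
# maximum, even though we would hesitate to call it meaningfully complex.
random.seed(0)
scattered = "".join(random.choice("abcdefgh") for _ in range(64))

print(shannon_entropy(ordered))    # exactly 1.0 bit per symbol
print(shannon_entropy(scattered))  # close to the 3-bit maximum
```

The measure captures description cost, but it cannot distinguish randomness from the organized, surprise-generating structure that matters for markets.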
The use of information content has been extended to define the complexity of a system as the amount of energy needed to maintain it. For example, it has been employed to explain the collapse of societies, arguing that as societies become more complex, they require more of their energy (loosely defined) to maintain the status quo, and can no longer defend themselves against the many inroads toward decline. In biology, one measure of complexity related to information content is the gene count, though that alone might understate complexity because what matters is not just the number of genes, but how they are used in an organism — the epigenetics of how they are turned on and off, and even how all of these interact in different environments and during periods of stress.
Non-linear systems are complex because a change in one component can propagate through the system to produce surprising and apparently disproportionate effects elsewhere, e.g. the famous “butterfly effect”. Indeed, as we first learned from Poincaré’s analysis of the three-body problem, which later developed into the field of chaos theory, even simple non-linear systems can lead to intractably complex results.
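A minimal illustration of this sensitivity, the standard logistic-map toy example from chaos theory (not from the post itself), shows two starting points that differ by one part in a million diverging to a macroscopic gap within a few dozen iterations:

```python
def logistic(x: float, r: float = 4.0) -> float:
    # The logistic map: a one-line non-linear system, chaotic at r = 4
    return r * x * (1.0 - x)

# Two starting points differing by one part in a million
x, y = 0.400000, 0.400001

gap = 0.0
for step in range(30):
    x, y = logistic(x), logistic(y)
    gap = max(gap, abs(x - y))

# The microscopic initial difference grows to macroscopic size:
# on average the gap roughly doubles each iteration
print(gap)
```

Nothing in the rule itself hints at this behavior; the surprise is generated by iteration, not by complicated structure.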
Connectedness measures how one action can affect other elements of a system. A simple example of connectedness is the effect of a failure of one critical node on the hub-and-spoke network of airlines. Dynamic systems also emerge from the actions and feedback of interacting components. Herbert Simon proposed a fractal-like measure of system complexity based on the layering of hierarchy: the depth to which a system is composed of subsystems, which in turn are composed of yet deeper subsystems.
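The airline example can be sketched in a few lines (an illustrative toy with made-up node names): in a hub-and-spoke network, removing the single hub severs every spoke-to-spoke route at once.

```python
# Hub-and-spoke network: every spoke connects only through the hub.
edges = {("HUB", s) for s in ["A", "B", "C", "D", "E"]}

def connected(a, b, edges):
    """Breadth-first search over an undirected edge set."""
    frontier, seen = {a}, {a}
    while frontier:
        nxt = set()
        for u, v in edges:
            for x, y in ((u, v), (v, u)):
                if x in frontier and y not in seen:
                    nxt.add(y)
                    seen.add(y)
        frontier = nxt
    return b in seen

print(connected("A", "E", edges))   # True: reachable via the hub

failed = {e for e in edges if "HUB" not in e}   # the hub fails
print(connected("A", "E", failed))  # False: one node takes down every route
```

A more distributed network would degrade gracefully; the hub-and-spoke design concentrates the consequences of a single failure.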
A common approach to complexity in biology is to use some variant of size, such as the number of part types in an organism or the number of nodes in a food web. The use of size might seem intuitive at first, because things that are bigger — that have more parts — generally are more complex. But building a road that is twice as long, piling leaves twice as high, pushing twice as many letters through a postal system all increase some dimension of size without making things more complex.
The definition you use depends on the purpose to which you want to apply complexity. For finance, several of these measures of complexity come into play. There are non-linearities due to derivatives. Connectedness comes from at least two sources: the web of counterparties and common exposures. Exacerbating all of these is the speed with which decisions must be made.
Also, because economics and finance deal with human-based rather than machine-based systems, our tendency to operate based on context means the conventional tools used to solve complex physical systems will invariably miss the mark. Another point that sets complexity in finance apart from its physical counterparts is that in finance complexity is often created for its own sake rather than arising as a side-effect of engineering or societal progress. It is created because it confers a competitive advantage. I will discuss this more below.
An Epistemological Definition of Complexity
A complex system is one that is difficult to understand and model; as complexity increases, so do the odds of something unanticipated going wrong. This is the driving characteristic of complexity that matters most for finance and economics: complexity generates surprises, unanticipated risk. “Unanticipated” is the key word: it is not simply that more complexity means more risk — we can create risk by walking on a high wire or playing roulette. Rather, it is that complexity increases risk of the “unknown unknowns” variety. And the risks that really hurt us are these risks, the ones that catch us unaware, the ones we cannot anticipate, monitor, or arm ourselves against. Simply put, a system is complex if you cannot delineate all of its states. You may think you have the system figured out, and you might have it figured out most of the time, but every now and then something happens that leaves you scratching your head. This is an epistemological interpretation of complexity: it defines complexity as creating limits to our knowledge. Neoclassical economics does not admit such complexity.
If the states of a system can be determined in a sufficiently short time frame, the system is not complex, even though doing this may require substantial analytics and computing power. So complexity is measured by the increased risk of surprising modes of failure and propagation. This means a complex system can be defined as one that cannot be solved, one whose effects under stress cannot be anticipated.
The limits of knowledge arising from this definition of complexity are not matters of randomness or prediction error. A system whose states are known, but where it is uncertain which state will be realized, is not complex. This is the distinction Keynes and Knight drew between risk and uncertainty, embodied in the concept of Knightian uncertainty.
If we take this route of defining complexity as creating limits to our knowledge, then those limits constrain us further: although we can know the characteristics of a process or system that tends to be complex, we often cannot know definitively whether something is complex ex ante, because to do so we would have to know that there are states we do not know.
Complexity Depends on Timeframe
We cannot think about complexity without reference to time frame. A problem might be complex if we only have a few seconds to respond, but not complex if our time frame is one or two months. If we have enough time to solve a problem and understand and anticipate all of its possible outcomes, then it is no longer complex even though, to restate the point, it might be costly to solve and monitor, it might have random results (random but where we can know all of the possible states and assign probabilities to each one), but it no longer can lead to surprises.
This importance of time frame is the reason we have to look at complexity and tight coupling jointly. Tight coupling means that a process moves forward more quickly than we can analyze and react. For example, the college matriculation process is complex in one sense – there are many requirements, and those have prerequisites layered on them; there are courses that are dropped because a key professor is taking leave – but there also are appeals that can allow for adjustments within the time frame to deal with these complications so that students wend their way through the maze.
A second characteristic for complexity in economics, and finance in particular, is that it is not exogenous, simply sitting out there as part of the world. We create it ourselves, indeed often create it deliberately, and create it expressly to harvest the attendant unanticipated risks.
What does it mean to create risk, and why would we want to make our life more complicated by doing so? Let me address this by first taking a detour into the well-trod ground of game theory, and then into an area that is more applicable: military strategy.
This is Not a Game
When arguments related to strategy, to cognition and social interaction enter into the discussion, the first arrow pulled out of the quiver is game theory. And, at least in the view of some game theorists, it will carry the day. Aumann and Hart state that “Game Theory may be viewed as a sort of umbrella or ‘unified field’ theory for the rational side of social science”. It is certainly true that game theory takes a stab at the human component, the “I know that you know that I know that…” sort of interaction, and in doing so adds a level of model complexity that reflects social interaction. But it turns out that game theory is not as hard as it may initially sound. For example, the interactions among lower species of birds and insects employ the same sort of recursive games, and do so with ultimately predictable and stable results.
Game theory began with the insights of John von Neumann and Oskar Morgenstern in their book, The Theory of Games and Economic Behavior. They defined a game as an interaction between agents governed by a set of rules that specified possible moves for each participant and the set of outcomes for each possible set of moves. The theory of games rests on defined rules and outcomes. It also assumes rationality. “Rational” means “logical”, and I have already argued in the posts linked above why that might not be the best assumption to use. But the fact that games are predefined with bounds and states that correspond to the actions of the players also limits their realism and skirts a key source of the complexity in human interaction, be it in markets, the economy, or society generally.
Finance is often regarded as a game, but by the conventional von Neumann definition it is not. Granted it is ostensibly circumscribed by the rules of law, but so is war circumscribed by the rules of the Geneva Convention (at least conventional war). Adversaries at war do not have to play by the same rules or even play the same game. Indeed, what more is the strategy of war than playing the game that works to your advantage, and doing so while keeping your adversary unaware of that game? What more is the charge of “asymmetric warfare” than having an adversary that is not playing by your rules or your game – and is winning the war in the process?
“The Strategic Game of ? and ?”
The military theorist John Boyd used this phrase to convey the essential point that warfare is not a game. Or, if you want to think of it as a game, it is a game that is ill-defined, with rules that are, put gently, subject to interpretation. Thus, he said, for any strategy, “if it works, it is obsolete. Yesterday’s rules won’t work today”. This point was also made by the great German Field Marshal Helmuth von Moltke: “In war as in art there exist no general rules; in neither can talent be replaced by precept.” That is, plans – and models – don’t work because the enemy does not cooperate with the assumptions on which they are based. In fact, the enemy tries to discover and actively undermine any assumptions of his opponent.
Boyd viewed the key to tactical dominance as creating confusion for your adversary: “The warrior’s object is to create pandemonium, chaos, disorder – and you sweep out the debris”. This philosophy found its first success in air-to-air combat, where Boyd’s theory allowed even those with technically inferior aircraft to dominate the skies. Rather than simply operate efficiently in the given environment, he taught pilots to “generate a rapidly changing environment” and to suppress or distort the opponent’s observations so that the opponent could not adjust to these environmental changes, reducing him to “confusion and disorder,” so that he would act with accumulating errors “because of activity that appears uncertain, ambiguous or chaotic.”
If the time frame is long enough to allow reaction within the “rules” of the existing system, we have gaming. If it allows for the rules to be changed, we have something more akin to warfare. In war itself, changes of this sort can occur very quickly, because the key to victory is creating both change and the tight coupling that prevents adjustment to that change. In finance, changes also can occur quickly, because finance has no physical plant to alter, is designed for innovations, and largely works in the close to instantaneous realm of information flows and trading. When we move to the macro sphere, the changes of this type take longer, because they require time for institutional and political shifts. But the objective remains the same: move to unanticipated new environments, thereby creating endogenous uncertainty.
The Informational Battlefield
After its acquisition by the hedge fund manager Steve Cohen, Damien Hirst’s shark-in-a-tank sculpture, The Physical Impossibility of Death in the Mind of Someone Living, became a metaphor for the sophisticated traders preying in the waters of the financial markets. Like most metaphors, it only goes so far, mainly because sharks take their environment as given, whereas those operating in the market – particularly those at the top of the food chain – can alter the market to their advantage.
If we are going to use the analogy with war in economics and finance, the battlefield where Boyd’s dictum will be applied will be in the realm of information. One tactic in this battlefield is to create informational asymmetries. If the market is becoming efficient, if information is accessible to everyone at the same time, then either create new private information or else speed up your access to the public information. Derivatives play a role in the first approach, with banks creating information asymmetries by constructing financial instruments that they understand better than the buyers. For the second approach, consider the news feeds that are fed to high frequency traders with millisecond response times.
Another tactic is to destroy information. One way this is done is through what Steve Wunsch has called algorithmic shredding: algorithmic trading breaks trades down into confetti-like pieces that obscure the information that might otherwise be broadcast to the market. Yet at the same time, those doing the shredding employ more sophisticated methods that track the pattern of trading as it moves from one venue to the next in order to reconstruct vital information about the pre-shredded trade.
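A stylized sketch of the shredding idea (hypothetical quantities and sizes, not a description of any real trading system): a parent order is broken into many irregular child orders whose individual sizes reveal little about the size of the parent.

```python
import random

def shred(parent_qty: int, max_child: int = 200, seed: int = 1):
    """Split a parent order into small, irregularly sized child orders."""
    rng = random.Random(seed)
    children = []
    remaining = parent_qty
    while remaining > 0:
        # Each child order is a small, randomly sized slice of what remains
        child = min(remaining, rng.randint(1, max_child))
        children.append(child)
        remaining -= child
    return children

children = shred(100_000)
print(len(children))   # on the order of a thousand confetti-like pieces
print(max(children))   # no single piece larger than max_child shares
```

Each individual child order looks unremarkable; only someone who can stitch the pieces back together across venues recovers the information the parent order would have broadcast.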
Conclusion
The interaction between the market participants, and for that matter between the market participants and the regulators, is not a game, but a war. Complexity in the information battlefield, the willful creation of complexity – complexity that is peculiarly human in origin – and the resulting endogenous uncertainty, is particularly confounding for the neoclassicists. The battle spells trouble for the foolhardy armed only with the neoclassical methods.
(This post by Rick Bookstaber originally appeared at rick.bookstaber.com.)