Guest Post: "Is the Securitization Crisis Driven by Nonlinear Systemic Processes?"


Reader Richard Kline, who provides regular, sophisticated comments, was keen to continue the discussion provoked by a post last week, “Hoisted From Comments: Greater Liquidity Produces Instability.” An anonymous reader offered a complex systems theory view of our modern financial system. The opening paragraphs:

Perhaps a lesson to be learned here is that liquidity acts as an efficient conductor of risk. It doesn’t make risk go away, but moves it more quickly from one investment sector to another.

From a complex systems theory standpoint, this is exactly what you would do if you wanted to take a stable system and destabilize it.

One of the things that helps to enable non-linear behavior in a complex system is promiscuity of information (i.e., feedback loops but in a more generalized sense) across a wide scope of the system.

One way you can attempt to stabilize a complex system through suppressing its non-linear behavior is to divide it up into little boxes and use them to compartmentalize information so signals cannot easily propagate quickly across the entire system.

I hope I am not oversimplifying what either the anonymous reader or Richard intends to convey, but the non-linear issue is not trivial. Processes described by non-linear equations can be exquisitely sensitive to conditions and hence unpredictable in practice. That is why, per the above, inducing or enabling non-linear behavior is Not A Good Idea.

Worse, non-linear math is really hard, so while lots of mere mortals can model linear processes, it takes high-powered skills to do non-linear modeling. And you therefore get a second problem: for computational convenience, most practitioners will try to describe a system using linear models, and if they work well enough in most cases, they get a go. To illustrate: pretty much every mainstream financial model (Black-Scholes, for instance) assumes continuous markets and normally distributed returns, which simplifies the math. This, for instance, is the origin of the classic fat tails problem. Pretty much everyone knows that models that use a normal distribution underestimate tail risk (the odds of outliers, which in this case means dramatic price rises or falls). Yet the flawed models are still consulted out of convenience (note I am not saying other models aren’t used, but the reliance on models known to have fundamental shortcomings is considerable).
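To see how large that underestimate can be, here is a minimal sketch (an editorial illustration, not anything from the models themselves) comparing tail probabilities under a thin-tailed normal distribution and a fat-tailed Student-t; the choice of three degrees of freedom is purely an assumption for illustration. It assumes Python with SciPy installed:

```python
# Sketch: how much a normal distribution understates the odds of extreme
# moves relative to a fat-tailed alternative. Illustrative only.
from scipy import stats

for k in (3, 5, 10):  # size of the move in standard-deviation units
    p_normal = stats.norm.sf(k)   # thin-tailed benchmark
    p_fat = stats.t.sf(k, df=3)   # fat-tailed alternative (assumed df)
    print(f"{k}-sigma move: normal={p_normal:.2e}  t(df=3)={p_fat:.2e}  "
          f"ratio={p_fat / p_normal:,.0f}x")
```

Even at five sigma, the fat-tailed distribution assigns the event odds several orders of magnitude higher than the normal does, which is exactly the gap the flawed models paper over.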

Richard has provided a thorough, thoughtful exploration of some of the issues. After a general discussion, he sets forth five questions and works through the first one, on innovation (note the discussion ranges far beyond the financial markets). Recall that one of the defenses of our current financial mess is that the products were innovative and hasty regulation will curtail other useful advances (this argument is that the products weren’t the problem, it was the practitioners, or in popular terms, “guns don’t kill people, people kill people”). But as Richard illustrates, that level of discussion is simplistic; there are ways to parse the problem that can lead to better thinking about possible remedies.

His ideas on issues 2-5 will come in later posts in this series.

Your comments very much appreciated. I’ve edited his piece slightly to make it a bit less formal.

Now to Richard Kline:

To what extent have nonlinear processes promoted the Securitization Bubble, precipitated its collapse, or prolonged the resulting instabilities in the financial system? I’ll keep the discussion non-technical, i.e. non-mathematical. While I have an informed opinion, I don’t pretend to expertise, and hope to elicit further comment and debate.

While there is evidence for most of my contentions, it isn’t conclusive; I raise ideas more than offer conclusions. Some general, but valuable, further reading is suggested for those interested. Comment by those with technical background in nonlinear complex systems, especially economic systems, is welcome—but I’m not holding my breath. Though nonlinear dynamics in financial markets received no little research attention ten years ago and more, many of the specialists involved have since been hired into the hedge fund industry, where their work has presumably become proprietary. Not only do we not know what they are doing, we don’t even know what they know now; there has been little recent publication of consequence.

To delve into this issue, then, let us first briefly consider financial markets as systemic phenomena. Given their inherent complexity and diversity of inputs, modern financial markets are inherently complex systems with numerous nonlinear phenomena embedded within their actions, that is, phenomena whose transformations are not smooth, not continuous, or both. Such overlapping dynamical phase spaces appear less complex than they are because salient stable equilibria within them are defined by firm, cohesive, and above all observable parameters such as priced units of exchange, transaction terms, regulatory limits, and the like. Such firm parameters typically, though not invariably, have the virtue of precluding overtly chaotic behaviors in their respective financial event-spaces, and to a degree in the larger interaction systems which contain them. Indeed, while complex systems will often self-organize, with emergent properties developing within them in consequence, the intervention of human participants in these markets tends to limit or swiftly capture observable systemic properties—or at least that is the idea.

Since these defined and manipulated parameters are of lower dimension than the market processes to which they map, they give the observer the illusion that markets themselves are more solid and of lower dimension than is really the case, like skin on hot chocolate. This illusion is compounded by the fact that the very large volume of quantitative data regarding finance and markets, including the trend analyses beloved by academically trained economists, is presented in linear analytic terms; ants crawling on that skin, if you will. Such linear models tell us something regarding ‘what dwells below,’ but less than we often lead ourselves to believe.

Bear in mind, though, that such linear models only map to the nonlinear trajectories and higher dimensions of the underlying event-spaces, if with fair reliability, rather than fully describe them. These are fuzzy, noisy spaces in that they largely describe human behavior, which is intrinsically inexact; information, which can be imperfect and/or corrupt(ed); and rule-parameters, which are not always followed and which do not capture all relevant processes. Phase spaces and their properties are best described as geometric structures with a time dimension which describe relationships, whereas our analyses in a modern educated context are overdefined by linear mathematical methods which abstract fixed values. This conceptual mismatch of methods to phenomena further leads to an insufficient cognitive engagement with systemic and nonlinear processes on their own terms, in economic behavior and elsewhere:

Our tools are as yet poorly matched to the natural phenomena we wish to understand. I will pose it as a truism that processes which appear disjointed or broadly nonlinear do so when they are viewed from perspectives which are of lower dimensionality than the structures observed; Flatland views of Squareland trajectories. Tensor analysis may prove sufficient to effectively analyze some complex processes; perhaps. Since most of us cannot execute it competently, nor are the guidelines clear by which to operationalize available data into tensor matrices, we will have to sharpen our ‘complex reasoning’ to make heuristic judgments better suited to the data-events instead. This exercise is valuable in and of itself. It is even more true in considering complex systems than otherwise that as you define your questions you describe the parameter space of your possible answers. So, let’s build some better questions.

From that position, here are five questions recently and variously posed which I find personally interesting:

Does innovation require untrammeled information flow across social/economic event spaces?

Is the crisis in securitized debt the result of a ‘black swan’ event?

Was the creation of the Securitization Bubble the result of nonlinear processes in the financial markets?

Is a financial event-space optimized for propagation desirable?

If not, what structure or process parameters might improve process outcomes?

Innovation: Does innovation require untrammeled information flow notwithstanding any potential costs to an economy or society of undampened interactive trajectories? Not . . . quite.

The stated assumption that innovation requires untrammeled flows of any kind embeds two misconceptions. First, there is an implied confusion between discovery and innovation. Discovery is just that, finding something not previously understood to exist. Exposure to large bodies of information may raise the probabilities of discovery, but so may improved observation of putatively well-known information. Either way, discoveries are comparatively rare; significant discoveries rarer still.

Second and more fundamentally, the stated assumption conflates innovative design and innovation diffusion. The belief is widespread that innovative design results from ‘throwing many ideas up against the wall and seeing what sticks;’ that no one really knows what they are doing, so innovative ideas and designs are both essentially fortuitous and random. And certainly fortuitous and random innovations do occur. What is required, then, from this perspective is the largest possible supply of things to throw up against the wall. In fact, much the opposite is the case. To cite Edison’s well-known dictum as a benchmark, “Genius is 1% inspiration and 99% perspiration.” This overstates the case, but innovative design tends to happen in small environments which can be effectively modeled, to the point where changes from shifts in composite parameters can be approximated hypothetically, additional variables or inputs can be added to the context in a controlled fashion, or both.

Engagement with those environments—i.e. knowledge and skill—tends to improve the frequency and coherence of designs, with which quality of outcome correlates. Fortuitous manipulations do happen, yes; information putatively extraneous to context can provide valuable guidance or comparison, again yes. But innovative design does not necessarily flourish in noisy environments maximally in flux. There, relationships can be hard to grasp, and innovations may soon be suboptimal in ever-changing contexts; indeed, conservative but stable designs may better reward success. In brief, innovative design occurs best in the enriched niche, not in the middle of a crossroads.

Innovation diffusion, by contrast, occurs best where information and adaptation are minimally constrained across a context. Consider the adoption of mobile phones in Europe or Korea, where a single technical standard was publicly designated, adoption of mobiles was rapid and deep, and use-driven development burgeoned. In the US, by contrast, competing technical standards and incompatible service provider networks slowed adoption and have left services fragmented. Diffusion is a process which implies autonomous point use of what is adopted or put to use. Propagation, in contrast, is more nearly a spread whose nodes remain linked.

Consider Linux an example of diffusion and Windows an example of propagation. Linux point sources can transform or adapt independently, while Windows point sources are under heavy systemic pressures (incompatibility drift) to transform in relation to nodal (i.e. Redmond-based) changes. It isn’t commonly understood that many innovative designs are effected well before they diffuse (or are propagated), perhaps because salient fads can diffuse with great rapidity in modern societies. A typical example is the Internet, which was functionally extant well before software refinements turned it into a mass medium, a medium whose greater scales drove product and organizational developments thereafter. The adoption trajectories of innovations are most typically logistic functions in form, but with longer low-adoption, under-the-radar initial tails than generally considered, even much longer. Whether relatively rapid diffusion is a social virtue is debatable, but it is certainly an economic gain, if only for the implementation investment.
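Since the logistic form comes up here, a minimal sketch of such an adoption trajectory may help; the growth rate and midpoint below are illustrative assumptions, not estimates from any real diffusion data:

```python
# Sketch of a logistic adoption trajectory: a long, low "under-the-radar"
# initial tail, a rapid middle, then saturation. Parameters are hypothetical.
import math

def adoption(t, ceiling=1.0, rate=0.6, midpoint=12.0):
    """Fraction adopted at time t: ceiling / (1 + exp(-rate*(t - midpoint)))."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

for t in range(0, 25, 2):
    bar = "#" * int(40 * adoption(t))
    print(f"t={t:2d}  {adoption(t):6.1%}  {bar}")
```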

There are two points to making this distinction between innovative design and innovation diffusion (or propagation). First, the two processes can be facilitated or inhibited separately. For example, a society with low barriers to diffusion may still be the one to capitalize upon innovations, regardless of source, because it scales the markets and formalizes product parameters. Second, large profits come to those investing in innovations which diffuse, due to market scale-ups, while huge profits come to those investing in innovations which propagate, since they remain substantially intermediated in subsequent capital flows. The arguments one typically hears for lowering barriers to innovation diffusion and damn the consequences come from those hoping that their innovations, or the industries tied to them, will get the market scale-up opportunities. ‘Pro-adoptionists,’ to give them a name, typically have a stake in the outcome, so their perspective is not disinterested (presuming that anyone else’s could be, either). To get innovative designs we need enriched niches, whether or not we have low barriers to innovation adoption. We can have rapid adoption without being particularly innovative. Societies can, in fact, deliberately choose whether or not to have rapid adoption.

Moreover, and more importantly, societies can deliberately choose which innovations to rapidly adopt (within limits); consider China in this latter regard of selective adoption. Choices about which innovations to permit rapid adoption are choices about who will get very rich, however. Much of the shouting about innovation is, at its base, concerned with that last proposition.

Further reading:

Nebojsa Nakicenovic and Arnulf Grübler. 1991. Diffusion of Technologies and Social Behavior.
Jacob Getzels and Mihaly Csikszentmihalyi. 1976. The Creative Vision.

[Respectively the best texts on technological innovation and the creative process I have ever found. Of course they are the least known.]


28 comments

  1. A curious Aussie

    I don’t see how comparisons to non-linear systems improve our understanding.

    After all, it was the mathematicization of economics that has caused many of these problems.

    A glass of sand is a complex non-linear system, but it is stable. Shake it a bit and it will quickly return to stability.

    In my view (and in theory) most aspects of free markets are stability-improving and shock-dampening. However, human psychology is not. Herd mentality is anti-stable. Combine that with incompetence and greed and we get what we have now.

  2. Marcf

    Curious Aussie,

    Nature abounds with bi-stable systems: switches in electronic design, cell functions, stochastic gene expression (the state change to persistence in bacteria, color pigmentation in the eye, systems biology models of gene regulation), phase transitions in physics.

    Any system that has feedback loops with reinforcement will exhibit non-linearity. A system that sees a variation and dampens it will be stable (a pendulum hanging down); a system that sees a variation and amplifies it will deviate from its previous equilibrium (a pin balanced on its head).

    Bi-stable systems are good, not evil (regulation in cell expression, for example, is a known hedging strategy). NOT ALL non-linear systems are RANDOM; they can be bi-stable, tri-stable, n-stable, bla bla bla. One constant is that their dynamics are explosive: they move QUICKLY from one state to the other. Think about the current banking crisis as such a state change, with us in the transition scratching our heads (a toy simulation of such a flip follows this comment). I am coming to the conviction that what we call “cycles” in the economy may just be the by-product of these positive feedbacks. The dynamics of the movement may be beneficial: fast adaptation as opposed to slow motion. Think of Japanese bank adaptation: better to move quickly to another equilibrium than drag on.

    Removing non-linear behavior is naive (it is just the nature of the beast) and could be counterproductive.

    Finally, modularity of a system doesn’t mean “no information flow.” Modularity in the financial system is as much an illusion as it is in biology: it is a way to comprehend what is going on, but by and large most systems are tangled. Aspect-oriented design in software detangles the modules, but it is a construct of the mind that natural systems realize only imperfectly (the persistent state of cells, horizontal gene transfer of RNA, etc.).

    Bottom line: risk modeling assumes equilibrium when in fact a system is always in “meta-equilibrium,” meaning a temporary equilibrium that is just waiting to be thrown off-kilter by context. Rather than trying to legislate feedback loops, which is naive, I am personally trying to understand the channels of transmission and identify the loops; they are complex, as in biology, and mostly intractable. Systems biology advances by computing power alone, and even then its achievements, while admirable, are limited. A dynamic understanding of the coupling can lead to stochastic modeling and attaching probabilities to the configuration space. While individual events appear “chaotic,” the generating process can be modeled, and this will lead to better risk models. Stop spitting on mathematics, it is mostly an observation of state. Only assuming eternal equilibrium with dampening feedback ignores reality.

    And then of course this is all tempered by the mass of liquidity out there that washes over every asset class and leaves it a victim of movement and momentum, again a reinforcing feedback loop, and this one may be the dominant one. Too much money chasing too few assets will lead to volatility.

    The statement “liquidity leads to volatility” may just be the expression of M2 moving around with feedback (momentum investment).

    http://www.thedelphicfuture.org
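A toy simulation of the bi-stable flip Marcf describes, for readers who want to see one: the system dx/dt = ax − x³ is the textbook double-well example (not Marcf’s own model), and every parameter below is an illustrative assumption. In Python:

```python
# Double-well (bi-stable) system dx/dt = a*x - x**3 with small noise:
# the state lingers near one equilibrium (+/- sqrt(a)), then crosses
# quickly to the other -- a fast transition between meta-equilibria.
import random

random.seed(1)
a, dt, noise = 1.0, 0.01, 0.9
x, prev_sign, flips = 1.0, 1, 0   # start near the +sqrt(a) equilibrium

for _ in range(200_000):
    # Euler-Maruyama step: deterministic pull plus random shocks
    x += (a * x - x**3) * dt + noise * random.gauss(0, 1) * dt**0.5
    sign = 1 if x > 0 else -1
    if sign != prev_sign:          # crossed the barrier between wells
        flips += 1
        prev_sign = sign

print(f"state flips over {200_000 * dt:.0f} time units: {flips}")
```

The state spends most of its time near one well or the other; the crossings themselves are quick, which is the ‘explosive dynamics’ point.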

  3. Scott Finch

    In response to the second question posted above regarding ‘black swan’ events…

    As I am sure others would agree, the ‘crisis’ in securitized debt was not a black swan event. Merely asking the above question displays a lack of understanding of the more fundamental and subjective mechanism by which the ‘crisis’ came to light.

    I am a student in the finance program at a Pac-10 school. All too often I cross paths with fellow undergraduates touting strictly quantitative approaches to investing in capital markets and managing risk. I partially blame these misaligned efforts on the business school, where complex behavior is held in high regard.

    Risk, in my opinion, is not any set of summary or descriptive statistics.

    The value and additional understanding generated by using linear applications to describe [perhaps] non-linear systems may be leveraged tremendously when taking into view the fundamental and subjective mechanism to which the system is exposed.

  4. James

    Financial markets are obviously “complex systems”. But what sort of complex system? Are they the kind that can be described by the mathematics of chaos, like Lorenz’s famous three-equation “butterfly attractor” system? It would be nice if this were the case, because the nonlinear mathematics of chaos and state (phase) space analysis can reveal relatively simple underlying dynamics in apparently complex systems. Unfortunately, I don’t think this is so. I would expect any dynamics revealed by nonlinear analysis to be just as transitory as those revealed by linear analysis because financial markets are complex _adaptive_ systems in which each new model or analytical approach becomes part of the system as soon as it is applied. This makes the system permanently nonstationary, violating a key assumption of all traditional linear and nonlinear analytical approaches. How do you analyze a nonstationary adaptive system? I wish I had a good answer to that, but as far as I know nobody does.
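For readers who have not met it, a minimal sketch of the Lorenz system James mentions, in Python; it runs two trajectories started a billionth apart to show the sensitive dependence that makes even a fully deterministic nonlinear system unpredictable in practice:

```python
# Lorenz system (classic parameters), integrated with a simple Euler step.
# Two nearly identical starting points diverge to macroscopic separation.
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)          # differs in the ninth decimal place
for step in range(1, 10_001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 2_000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t={step * 0.005:5.1f}  separation={gap:.3e}")
```

Note this illustrates only part of James’s point: the Lorenz equations are stationary, whereas his argument is precisely that markets are not.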

  5. Carlosjii

    “Continuity is a fundamental assumption of conventional finance.” (p. 237)

    “Human nature yearns to see order and hierarchy in the world. It will invent it if it cannot find it.” (p. 189)
    From The (Mis)Behavior of Markets: A Fractal View of Risk, Ruin and Reward, by Benoit Mandelbrot

  6. russell1200

    I will try to find the reference later (I hope I have it printed out at home somewhere). I have a reference on attacks on network systems showing that less connected systems have much more frequent collapses, but those collapses generally do not affect the greater whole. Highly connected systems are very stable in general terms, but when they do have an “event” they are more likely to suffer a complete system collapse.

    I work in commercial construction. It has catastrophic financial collapses within its component parts (building projects) all the time. When you have relatively small contractors, the effect of the collapse tends to end with the bankruptcy of the firm and possibly chain reaction bankruptcy through a few other related subcontractors.

    The very large, multinational contractors do not go bankrupt with the collapse of one project; instead they ride it out, pay their bills, and generally the building owner views it as a much safer process.

    However, when you get a really big collapse these firms can (and do) go under. In this case their size and interconnection to so many projects is likely to cause a greater overall problem.

    A few years ago, the construction corporate rollup was a popular Wall Street model. A corporate entity would buy up a number of midsize contracting outfits and put them under (at least in theory) a unified systematic governance structure with deep pockets.
    What actually happened? A disastrous project in California (or somewhere) would suck money out of the corporate entity. The successful (previously separate) parts of the company would start having problems as the company became slow-paying. It generally turned into a slow, self-reinforcing downward spiral, and the rollup would go bankrupt.

    In effect, the fluidity of the rollup allowed the mega-big hit to be transmitted much further across the industry. Fortunately, the industry is still rather fragmented. Industry-wide collapses within commercial construction tend to come from outside the system (the general economy), not within it.

  7. Daniel Newby

    As an engineer who designs control systems, I put nonlinearity last on my list of Things To Worry About. If something is moderately (but not wildly) nonlinear, you just design for the steepest part of the curve.

    No, what burns my bacon is delay. That takes a little explanation. Active stability is provided by negative feedback. When the system is above the desired operating condition (a positive error), a restoring force is applied in the negative direction. When it is below the desired point (a negative error), the restoring force is applied in the positive direction. It’s called negative feedback because the restoring force is proportional to the mathematical negative of the error. Force = (negative number) * (current condition – desired condition).

    The problem is that most systems in the real world have “momentum”. There is a delay between applying the restoring force and the resulting change in the condition. Simple negative feedback will always overshoot the mark. If the force was too large, or the system has little “drag”, it will then overshoot it in the other direction, oscillating back and forth before gradually converging on the desired condition. If the force is way too large, it will never converge, but instead oscillate back and forth forever.

    What is happening is that the controller is trying to manage this cycle’s positive overshoot but is looking so far into the past that it sees the previous (negative) half of the cycle. When that happens, the negative feedback number effectively turns positive and the system runs away with itself.

    Engineers deal with this effect by taking other data into consideration. The two most common are the rate of change and the cumulative error. In other words, if something has momentum, you push less hard in proportion to how fast it is moving towards the goal (you may even pull backwards to reduce overshoot). Using cumulative error ignores momentary disturbances. Most control problems can be solved using cumulative error, instantaneous error, and rate of change.

    It seems to me that the financiers have been ignoring those other factors, and also turning the feedback knob up to 11. Early on, it seems to work. The salutary effect on equity markets is instantaneous, with the systemic effects taking years to manifest. So they turn the feedback knob up to 20. Still no problem. 50? A little smoke but I think we can take it.
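A minimal sketch of the delay effect Daniel Newby describes, in Python: pure proportional negative feedback acting on a stale reading of the system. At low gain it converges; turn the knob up and it oscillates, then runs away. All values are illustrative assumptions.

```python
# Proportional negative feedback with a measurement delay. The controller
# pushes against an error it observed delay_steps steps ago.
from collections import deque

def simulate(gain, delay_steps=20, steps=400, dt=0.05):
    x, target = 1.0, 0.0
    history = deque([x] * delay_steps, maxlen=delay_steps)
    worst = abs(x)
    for _ in range(steps):
        observed = history[0]                  # stale reading of the state
        x += -gain * (observed - target) * dt  # restoring force, applied late
        history.append(x)
        worst = max(worst, abs(x))
    return x, worst

for gain in (0.5, 2.0, 8.0):                   # "turning the knob up"
    final, worst = simulate(gain)
    print(f"gain={gain:4.1f}  final error={final:+.3e}  worst excursion={worst:.2e}")
```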

  8. James

    Russell1200, I think this is the paper you referred to:
    Albert, R. et al. 2000. Error and attack tolerance of complex networks. Nature 406:378-382.

    I think the principle you discuss applies to all manner of business. In retail, for instance, we’ve seen a huge trend towards big chains taking over markets from small retailers. The deep pockets of the big retailers may provide some stability until, say, Target goes out of business. Then suddenly we have a countryside full of empty boxes, even in markets that might otherwise have sustained them.
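A minimal sketch of the Albert et al. result, in Python with the networkx package; the graph size and the 5% removal fraction are illustrative assumptions:

```python
# Error vs. attack tolerance: a scale-free network shrugs off random node
# failures but fragments when its most-connected hubs are removed.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(n=2000, m=2, seed=0)   # scale-free graph

def giant_fraction_after(graph, removed):
    """Share of nodes left in the largest connected component."""
    H = graph.copy()
    H.remove_nodes_from(removed)
    return len(max(nx.connected_components(H), key=len)) / graph.number_of_nodes()

n_remove = 100                                      # 5% of nodes
random_nodes = random.sample(list(G.nodes), n_remove)
hubs = sorted(G.nodes, key=G.degree, reverse=True)[:n_remove]

print(f"giant component after random failures: {giant_fraction_after(G, random_nodes):.1%}")
print(f"giant component after targeted attack:  {giant_fraction_after(G, hubs):.1%}")
```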

  9. Daniel Newby

    Also: An important stability factor is the long-term capital gains tax exemption, which induces people to ignore short-term fluctuations and consider cumulative corporate productivity. I predict that if it were abolished, the business cycle would tend to speed up and the peak-to-trough swings would grow.

  10. Yves Smith

    These are good comments, and I wonder if any of you can address the topic that concerns me: how do you tell (aside from waiting for it to fall over) when a system, whether due to too many feedback loops increasing non-linear behavior or other factors, has too great a propensity for collapse? How do you think about tolerances and metrics?

    There are some risks that will overwhelm any system because the external shock is too great (nuclear war, say). But there are other shocks that a robust system might survive (albeit perhaps at a reduced level of activity) while a fragile one would fail. Are there any proxies or methodologies in engineering or other fields for telling one from another?

    There is likely no way to get these considerations included in regulatory design and other “what do we do about the financial mess” discussions, yet they seem a vital part of the equation (no pun intended).

  11. Richard Kline

    To A Curious Aussie, Russell 1200, and Yves:

    The issues you are raising involve connectivity and propagation questions. Whether and how a system is stable depends upon its specific parameter set: there is no sui generis case, and this point is crucial to translating other studies to financial market conditions. There are some general principles regarding connectivity which tie together things I see in your remarks, however. I will specifically raise some issues on this when we get to that discussion per question four. It is to ‘clear the underbrush’ for that, as it were, that we need to consider whether or not the present financial crisis is principally driven by a bifurcation catastrophe—Black Swan variant—by specific nonlinear instabilities, or by embedded destabilizing factors such as propagation amplification. To me, we need to do some ‘context description’ to see what prior experience fits, and so what remedies if any may best apply.

    And BTW, CAussie, I do not see markets as inherently self-stabilizing and dampening, no. They have been improved to make them more so, but small (and not so small) changes at the margins, as with heightened monetization of formerly illiquid assets, can make for massive changes in overall system stability. Consider closely the probable, and empirically determinable, case that the financial markets now only maintain stability (or its appearance) due to EXTERNAL INTERVENTION, i.e. massive liquidity inputs and regulation forgiveness, in both cases by public authorities exogenous—external—to normal ‘market-stable’ processes. The markets were not in fact stable or self-dampening in the long run even if they are often so in the short run: in the long run, we would all be dead if we left it to the markets. That is the history on the matter, current ‘theory’ notwithstanding. I will talk a bit about what I mean here in the third question I will raise by and by. Laying the instability off on ‘irrational humans’ is a kind of logical out-factoring which makes the discussion meaningless. I’m not saying that as a criticism per se, only that one has to conceptualize the problem somewhat differently, in my view.

    Can’t say more today, regrettably. I will take a look at the papers mentioned, though, and thanks to all.

  12. James

    Yves, that’s an interesting question, and I don’t have the answer. When thinking about this stuff, I think it’s useful to differentiate between extrinsic shocks (things coming from outside the system, like nuclear war) and intrinsic shocks generated by the system itself. I’d say the credit crisis qualifies as intrinsic: it’s the result of a positive feedback loop between housing price changes, interest rates, and credit standards (and other things, I’m sure). Once something like that really gets rolling it has to unwind eventually, and it creates its own crisis whether or not there’s an extrinsic shock.

    Russell1200, thanks for that link. I hadn’t seen that paper.

  13. A curious Aussie

    marcf: “Stop spitting on mathematics, it is mostly an observation of state. Only assuming eternal equilibrium with dampening feedback ignores reality.”

    I dislike the common abuse of mathematical models in economics and finance. In economics, so much ‘research’ is published that is simply some new, useless way of using math to tell us what we already know. In finance, we abuse statistics in horrible ways to come up with the conclusions we want, all backed up by fancy mathematics. (This is despite my having completed a degree in mathematics.)

    I really liked the “Blame the Models” blog post.

    Richard Kline:
    I find it surprising that you state that markets are not stable. By stable I mean equilibrium-returning, not equilibrium-achieving! Markets aren’t a human invention, and they aren’t restricted to the financial world.

    E.g., the market for sexual partners has been functioning for millennia. It is certainly not a homogeneous product, and there is various market segmentation, but it certainly does a decent job of market clearing.

  14. russell1200

    Financial markets seem to be so prone to the feedback loop that it probably is accurate to say that they are not inherently stable.

    Some of the boom and bust cycles (particularly the 18-year real property cycle in the 19th century) are so metronomic as to almost appear like clockwork.

    Arguably these various cycles, if understood properly, could be used to give a sense of order to the financial system, but the external environment changes so frequently that you cannot even be sure that the underlying principles are the same.

    The whole argument that the service economy causes shallower business cycles is a case in point. Does it change the game? Or have we simply not been through enough permutations to see what it is capable of delivering?

  15. russell1200

    Richard:

    To me your use of sui generis is a little confusing. A sui generis case would be a unique standalone case: no? In which case you couldn’t use any models for anything. It would all be platypuses out there.

    However, the argument of network topology studies (or whatever you want to call them) is that the form of a network itself can have a major impact on how events affect a system. Liquidity as a transmitting device is arguably edging toward a subset of this argument. It doesn’t seem that liquidity as the conveyor of problems is a particularly helpful idea unless you know what type of interactive system you are dealing with.

    Per Taleb, I stand to be corrected, but it appears that you are using his Extremistan argument about reality, not his black swan one. Extremistan is the land of expected fat tails. Black swans are singular, off-model events. If you don’t know that you are living in Extremistan, you could take a fat-tail event (like a Katrina) as a black swan, but I take the black swan to be more singularly unique and unexpected. The discovery of penicillin is an example he used that comes to mind.

    My point is that a bad event caused by excessive leverage is not really a black swan, even if it led to the collapse of the US financial market. It is territory we have trodden pretty close to before.

  16. Ingolf

    “Consider closely the probable, and empirically determinable, case that the financial markets now only maintain stability (or its appearance) due to EXTERNAL INTERVENTION, i.e. massive liquidity inputs and regulation forgiveness, in both cases by public authorities exogenous—external—to normal ‘market-stable’ processes.” (Richard Kline in comment above)

    This is true (to some extent at least) but fails to consider whether the systemic instability we’re trying to better understand is itself in large part the consequence of repeated earlier external interventions.

    Markets, left to their own devices, have a very long record of muddling through. The intermittent dramas they put on, impressive though they may be at times, shouldn’t be allowed to distract us from that simple fact.

    In attempting to alter reality through artificially stimulating growth and lessening the consequences of risk taking, external interventions pollute the feedback loops that are critical to the system’s wellbeing. Smaller crises are indeed averted, but only at the cost of saving them up (with compound interest) for some far larger catastrophe down the track. That day, it would seem, may finally have arrived. Anyone soberly observing the extent and nature of credit growth of late will hardly be surprised. The surprise, to many, was instead that the whole contraption could carry on for as long as it did.

    Which brings me to a second point.

    “Laying the instability off on ‘irrational humans’ is a kind of logical out-factoring which makes the discussion meaningless.” (Richard Kline again in comment above)

    Like “a curious aussie”, and James if I’ve read him correctly, I think this matter of human involvement (whether judged “irrational” or not), far from being an unwanted and pointless guest at the feast, in fact goes to the very heart of the matter. Markets, as I see them, cycle around constantly changing underlying fundamentals according to the psychological ebbs and flows of their participants, a group whose makeup is also of course constantly changing. The fundamentals, in turn, are often affected by the animal spirits (or lack thereof) of the various players. And so on and so on in a freeform dance that has no end or plan beyond the often inchoate desires and ambitions of all of us.

    Can all of this really be sensibly modelled? As James said: “. . . each new model or analytical approach becomes part of the system as soon as it is applied.” And so on ad infinitum.

    I suspect all one can say with any real confidence is that leverage makes the system far more fragile, quite likely on an exponential basis, and that distorting the feedback mechanisms will have unintended consequences, few if any of them (in the long run) happy ones.

  17. Anonymous

    Sy Krass said

    Ingolf hits the nail on the head: too much debt has completely distorted the system. Just look at the charts: the Dow is forming a giant head-and-shoulders top, and M3 has exploded and is steepening toward either deflation or hyperinflation, depending on how this plays out.

  18. Anonymous

    A single bug in a single line of the source code of a computer program (such as Microsoft Word) can easily cause the entire memory space of the program to become corrupted, resulting in corrupted data, non-responsiveness of the program, or a crash. The bug could be a logic error on the part of the programmer or even just a typo, but no matter the origin of the error, the results can be similarly lethal. This may not be the type of “complex system” being discussed here, but it is a complex system with millions of lines of code for a typical large computer program.

    Similarly at the file system level, corruption of a single file belonging to the operating system (like a rogue trader at Societe Generale) can bring the whole operating system down in an instant. These human creations are fragile complex systems.

  19. Charles Butler

    Discussions of this sort, interesting as they might be, merely provide fodder for those who claim that the financial system does not need to be regulated, or cannot be, due to its complexity. But the fact is that the current situation is a direct result of decision makers within the system not calculating the personal risks inherent in their own actions in any way remotely similar to how the institutions they manage would calculate them for themselves, had those institutions any intention of remaining solvent over the long run. For anybody operating under a yearly bonus system, and whose economic stake in the long-term survival of the enterprise pales in comparison, the risk at any moment is reduced to lost opportunity. But when that opportunity calculates out to huge amounts of money, truly risk-averse behaviour requires taking on risk for the institution. The global financial system is run by commission agents.

    Apologies to the economists on staff if this is not directly reducible to a number.

  20. Daniel Newby

    In the engineering of simple control systems, stability is measured by applying a sudden disturbance and observing the response. In general, the more rapid the response, the closer the system is to instability. The standard graphs show systems that respond so fast they overshoot and are closer to instability (underdamped), as fast as possible without overshoot (critically damped), and slow-responding (overdamped); a small simulation of these three regimes follows this comment. You can also let the system simply run and observe how it responds to spontaneous stimuli. If you have the luxury of letting it run for a long time, you can make a spectrogram of the data and look for peaks that would indicate risky resonances at particular frequencies.

    I’m not sure how to apply this to the economy. There are plenty of good, easily-measured markers in the economy: money supply; sector volume; sector price per transaction; proportion of debt, savings, and income used to pay expenditures; debt service to income; and many others. Plot charts of Amazon.com stock price and volume, and San Diego house prices and sales volume: you see the same rapid build up that far exceeds the growth of enterprises known to have long-term profitability, followed by the same zig zag decline. The euphoria/panic cycle is unmistakable and easily measured.

    The difficulty is that planners at all levels believed that an economy-wide euphoria/panic cycle was impossible. They confused the absence of one in their lifetime with its impossibility. Never having been burned by fire, they believed it a tame servant.

    How can you fight that? If Bill Clinton had loaded the Fed with a battalion of Volckers, they would have eventually been vilified and destroyed, and the disaster would have been postponed by around one generation at most.

    The problem seems immune even to national law. If the decentralized American republic were restored, or even partitioned into several new nations, international trade means we would still have had this meltdown. If England, Spain, and China can all get caught in the same euphoria trap, there is probably no stopping it.
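A small simulation of the three damping regimes Daniel Newby describes at the top of this comment, in Python; the second-order system and its parameter values are the standard textbook case, chosen here purely for illustration:

```python
# Step response of a second-order system x'' = w^2*(target - x) - 2*zeta*w*x'.
# zeta < 1: underdamped (overshoots); zeta = 1: critically damped;
# zeta > 1: overdamped (slow, no overshoot).
def step_response(zeta, omega=1.0, dt=0.01, steps=1500):
    x, v, peak = 0.0, 0.0, 0.0        # position, velocity, highest point hit
    for _ in range(steps):
        accel = omega**2 * (1.0 - x) - 2.0 * zeta * omega * v  # target = 1
        v += accel * dt
        x += v * dt
        peak = max(peak, x)
    return x, peak

for zeta in (0.3, 1.0, 2.0):
    final, peak = step_response(zeta)
    print(f"zeta={zeta:3.1f}  settles at {final:.3f}  "
          f"overshoot={max(0.0, peak - 1.0):.1%}")
```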

  21. Richard Kline

    russell1200 of 11:47, you are right that I misused sui generis; remind me to post only when I’m awake. Try this: ‘There is no generic case,’ which was the comment I wished to make. All systems are not created equivalent; specific parameters must be identified for each and every one.

    Ingolf: It is easy to conflate near-term stability in financial markets with their long-term prospects. In the near term, markets _can be_ fairly stable. In the long term—say on the order of a generation, call it 25-30 years—they have crashed with such frequency that it is difficult to avoid the conclusion that markets are inherently fragile, if not intrinsically unstable. Oh, when they crash folks put them back together again so it looks like they ‘endure,’ but really the situation is more discontinuous than appearances would have it. I don’t mean that markets crash _totally_, but enough to wipe out many players, with major macroeconomic impacts. There are multiple reasons why this appears to be the case; to posit just a few: a) cumulative concentration, b) over-adaptation to current but impermanent contexts, and c) cumulative connectivity, or call it correlation if you prefer. Something I will try to discuss toward the end of this series is that markets which optimize for short-term outcomes may be inherently exposed to mid-term divergences and so optimize themselves _away_ from long-term stability.

  22. Ingolf

    Richard, it seems we have very different expectations of markets. Periodic swoons of the sort you describe as crashes are, as far as I’m concerned, simply part of the natural ebb and flow. Painful, certainly, but necessary when the markets have stretched their rubbery connection with reality a little too far. Given our innate tendency to get caught up in enthusiasms (and despairs) from time to time, I don’t really see how this will ever change.

    A reasonable argument can certainly be made, in theory at least, that official leaning against these periods of excess would provide a useful dampening function. The problem, as we’ve seen again and again in the past century, is that those who we would have carry out this mandarin function are themselves only too human.

    Not an easy problem to get around.

  23. Peter

    Are there any nonlinear models for markets? If there are, I would love to see some references… There’s the issue of self-induced stochastic resonance, where noise can drive a nonlinear system into behaviors not allowed by the noise-free parameters that are supposed to govern it. So the amplitude of the volatility of liquidity could be a bifurcation parameter (!).

  24. Juan

    One or more of these written by a pretty sharp mathematical economist, Barkley Rosser Jr., may apply.
    See: http://cob.jmu.edu/rosserjb/

    I would add my own thoughts but they would have more to do with dialectics such as quantity v quality and internal relations, so not sufficiently tight for this discussion.

  25. Richard Kline

    So, Ingolf: Consider just the crashes we have seen in Turkey, Argentina, SE Asia, or the Sunbelt property crash in the US in ’91. All of these were local financial system killers. Several of them led to ‘rebuilds’ by external authorities—Sunbelt, Turkey—while others were _exacerbated_ by bungled ‘rebuilds’ by external authorities—Argentina, SE Asia. I wouldn’t call these normal cycles of mania and depression, and considering the scale of the consequences, we need a ‘new normal,’ it seems to me. Part of the issue is that if we expand our view of ‘financial markets’ to include ALL comparables over a phase space of 300 years or more, market phenomena look much less stable long-term than if we restrict our view to selected sets of optimal durations. I’m not accusing you of selective thinking, BTW; it’s just that the term ‘markets’ is ambiguous, and discussions like this often fail to add up because participants use different definitions, so I’m seeking to be clear here. As to whether the problem of market instability, or the rather different issue of systemic oscillations, can be better managed, not to say solved, it will make some difference if we actually describe _what happens_ with closer approximations. This is my interest in pursuing the stated questions at this time.

    So, Juan: I have read some of Rosser’s papers in the past, and yes, he seems to know whereof he speaks. I’ll look in on what he has been doing lately. Thanks for the reference.

  26. Richard Kline

    To Daniel Newby: Your remarks on rate-of-change issues in feedback loops parallel some of my own thoughts regarding the questions on potential remedies I’ll get to last in this series. We do _not_ have effective ‘governor’ modulation of capital flows, for instance, which tracks appropriately to changing conditions. For example, risk premiums will go up as counterparties perceive increased risk, but their view tends to be local-context and short-term, so the matchups may fit poorly to mid- and long-term conditions, to specific or to general market conditions, or all of the above. There are other considerations here also, such as power-scaling issues for different levels of capital flows, which are not well accounted for in present thinking, it seems to me.

    If we’re going to have that discussion, guess I’d better do less blabbing and more writing . . . .

  27. ingolf

    Richard, I think our interest in this topic is probably sufficiently different to make attempts at an ongoing conversation unrewarding for us both. It’s your post, and clearly the more technical aspects are of great interest to many here so, for now at least, I’m going to retire to the sidelines.
