By Philip Pilkington, a writer and research assistant at Kingston University in London. You can follow him on Twitter @pilkingtonphil
Three stages in the development of a Menger sponge
If I were to put it to you that we can learn as much, if not more, about human cognition, epistemology and methodology from the clinically insane as from our philosophers, would you think me to be engaging in hyperbole, pseudo-intellectual claptrap and pretentious avant-garde nonsense? Perhaps you would, but I maintain that this is perfectly true and in what follows I hope to show it. I hope to highlight certain issues of aggregation which plague the social sciences and which, in the sphere of economics, have resulted in a strong regression to backward and primitive metaphysical mechanisms of thought in the post-war era.
When the world falls apart some things stay in place
Levi Stubbs’ tears run down his face
– Billy Bragg, ‘Levi Stubbs’ Tears’
The word “schizophrenia” has been completely bastardised outside the spheres of psychology and psychiatry today. Otherwise educated people generally use the term to refer to any manifestation of thought that involves holding two contradictory notions on a single issue at any one point in time. Thus schizophrenia is often portrayed as something of a Dr. Jekyll and Mr. Hyde phenomenon – at one moment the person engaged in such thinking is Jekyll, at the next he has switched to Hyde without even realising. The reality of the disorder, however, could not be further from this. (A closer psychopathological metaphor might be “bipolar”, but even this is off the mark.)
In reality schizophrenia is characterised by the complete breakdown of a person’s reality. Almost all cognitive functions are affected by this horrific disorder: perceptions of space and time fragment; linguistic expression becomes stuttered and aimless (this is usually referred to as ‘word salad’); and emotions become misdirected and emerge seemingly at random. At one extreme of the disorder, in paranoid schizophrenia, the affected person may manage to reconstitute their world in some shape or form, making sense of the breakdown by postulating some grandiose series of delusions. At the other extreme, in disorganised schizophrenia or hebephrenia, the breakdown becomes progressively worse and the affected person typically loses the battle against the fragmentation of their reality. This usually results in institutionalisation until death.
The following short video might give some insight into this disorder.
Note that the patient suffering from schizophrenia tends not to answer the questions directed at him but rather responds with complete non-sequiturs. This is characteristic of the breakdown of language that we mentioned earlier – although it should be said that the breakdown can be far more extreme than in this case, and language can degrade to the point where sentences themselves stop being coherent. The psychiatrist who comes on to discuss the case afterwards characterises the disorder nicely when he says that it is “a total involvement of a person in an illness, a very global impairment of what we consider the higher psychological function”. Clearly then, schizophrenia is a very different phenomenon from the one people appear to have in mind when they use the term in colloquial expression.
Schizophrenia, then, as the philosophers and psychoanalysts Gilles Deleuze and Felix Guattari recognised, can perhaps be given a topological expression in the Menger sponge, a picture of which can be found at the top of this piece. The symptoms of disorganisation and fragmentation that affect spatial perception, language and hearing resemble the breaking down of the sponge, while the reorganisation or re-aggregation that takes place in the recovery process of, for example, paranoid schizophrenia resembles the reconstitution of the cube that we might see if we moved backwards through the series of images. The further the sponge breaks down, the more dysfunctional the sufferer becomes; the more successful the reconstitution of the cube through delusional explanations of the bizarre events taking place, the more functional the sufferer becomes. The Menger sponge is, of course, a metaphor, but it seems a particularly powerful one for helping non-sufferers understand what schizophrenia is really about, so that we can perhaps begin to move beyond the Dr. Jekyll and Mr. Hyde nonsense that currently prevails in popular discourse.
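As an aside, the sponge metaphor can be made quantitative, since the construction itself is simple arithmetic: at each stage every surviving cube is cut into 27 sub-cubes and the 7 centre pieces (the six face centres and the core) are removed, leaving 20. A short illustrative sketch (the function name is mine, purely for illustration):

```python
# Menger sponge construction: each iteration splits every surviving cube
# into 27 sub-cubes and removes 7 of them, keeping 20.
def menger_stats(iterations):
    """Return a list of (stage, cube_count, volume_fraction) tuples."""
    stats = []
    for n in range(iterations + 1):
        cubes = 20 ** n           # number of surviving sub-cubes
        volume = (20 / 27) ** n   # fraction of the original volume left
        stats.append((n, cubes, volume))
    return stats

for n, cubes, volume in menger_stats(3):
    print(f"stage {n}: {cubes} cubes, {volume:.3f} of the volume remains")
```

The volume fraction shrinks towards zero as the stages progress – the sponge, like the fragmenting world of the sufferer, becomes ever more hole than substance.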
Schizophrenia is quite obviously a pathological phenomenon. While schizophrenia’s organic roots have yet to be established beyond doubt – if, indeed, they are ever truly established beyond doubt – it is clear that we can still call it a “disease” in some sense because it leads the human organism to become severely dysfunctional. However, we also know that some of its symptoms can be simulated in “healthy” individuals by dosing them with ketamine, which suggests a more nuanced approach to the status of the illness. While most of the medical reports on this take the tired line that they might actually be discovering something about the nature of schizophrenia, the more reflective among us can see that what such experiments actually show is that there is a continuum of human cognitive states which can be induced by certain chemicals or events, and that these states simply represent the extreme end of a continuum that most of us never experience. Indeed, it is most fruitful to draw from this experiment the conclusion that Deleuze and Guattari drew from their observations of schizophrenia: namely, that in its extremity it tells us something interesting about the nature of human cognition itself.
Deleuze and Guattari derived two basic principles from their analysis of schizophrenia and its importance for philosophy: they called these the “molar” and the “molecular”. The molar principle is an aggregative principle in which smaller entities are subordinated to larger entities. So, for example, a group of individuals are referred to as a named group – say, a football team – rather than by the names of each individual. The football team becomes, in a sense, a new cognitive entity from the individuals that make it up. The molecular principle, on the other hand, is the process of breaking down molar structures in order to create smaller cognitive entities. Thus, we can break down our football team into individual entities and from there we can break down each individual into a collection of emotions and thoughts and from there we might break down these emotions and thoughts as being generated by a collection of memories and fantasies and so on and so on; the depth of molecular breakdown that we wish to engage in being left up to our own judgment.
Deleuze and Guattari, being anarchists, believed that the molecular process of breakdown – which was, by its nature, potentially and logically interminable – was a better model by which to live one’s life than the molar processes of aggregation. They applied this judgment to everything from the structures of social organisation to the way in which an individual views their own self. We are here not so much interested in value judgments as in the processes themselves and what they can tell us about the issues at hand.
Needless to say, these two principles are not really existing entities in any so-called material sense but are instead a manner in which we organise our cognition – our cognition being a feature through which we “filter” so-called reality. In this they broadly, if not completely, correspond to what economists and other social scientists usually refer to as “macro” and “micro” phenomena. Macro phenomena are those that take place at a highly aggregated level, while micro phenomena take place at a disaggregated level. What these social scientists generally miss, however, but which Deleuze and Guattari certainly did not, is that attempts to disaggregate are logically interminable and have no non-arbitrary end-point. It is the cognising subject – the social scientist, in our case – who brings this infinite regress to a halt by imposing some new macro or molar level of aggregation.
And with that discussion we now move on to how these considerations have affected the discipline of economics since roughly the 1970s.
The Microfoundations Delusion
The Post-Keynesian economist and historian of economic ideas John King has recently published a book entitled The Microfoundations Delusion (my review will appear in the next edition of the Review of Keynesian Economics). I would encourage any reader who is interested in these issues to read this work, as it appears to me an important contribution to the macro-micro debate not simply in economics, but also in biology (the title is an ironic nod to Richard Dawkins’ The God Delusion) and across the social sciences.
In his book, King lays out how economists have tried to establish supposedly disaggregated “microfoundations” upon which to rest their macroeconomics. The idea here is that Keynesian macroeconomics generally deals with large aggregates of individuals – usually entire national economies – and draws conclusions from these while largely ignoring the actions of individual agents. As King shows in the book, however, the idea that a macro-level analysis requires such microfoundations is itself entirely without foundation. Unfortunately, though, since mainstream economists are committed to methodological individualism – that is, they try to explain the world with reference to what they think to be the rules of individual behaviour – they tend to pursue this quest across the board, and those who proclaim scepticism about the need for microfoundations can rarely articulate this scepticism as they too are generally wedded to the notion that aggregative behaviour can only be explained with reference to supposedly disaggregated behaviour.
The modern question of microfoundations is generally traced back to the Lucas critique – named after the economist Robert Lucas, who in the 1970s first raised in a serious way the crude criticism that macroeconomics lacked so-called microfoundations. The argument was a thinly veiled attack on Keynesianism, which relied and continues to rely almost wholly on macro-level analysis. The Lucas critique itself is similar in many ways to a small child asking the question “why?” over and over again until told to shut up by the adult. If properly and coherently formulated the critique would be bottomless. First the critic would ask why the analyst has not taken into account the “behaviour” of “individuals”. Then he would ask why the analyst has not taken into account the “psychology” of these “individuals”. Then perhaps he would ask why the analyst has not taken account of the “biological” and “genetic” features behind this “psychology”, and so on ad infinitum. (The sharp reader will note that the words placed in quotation marks in the last three sentences all imply some sort of aggregation, whether of something called an “individual” or of some sort of “average” of behaviours of this “individual” and so on.)
This process of looking deeper and deeper into the micro or the molecular is, as we have seen in our discussion of schizophrenia and human cognition, by its very nature interminable. In order to bring the nattering questioning to an end, the analyst must come to a halt at some given level of abstraction. And this is precisely what Lucas and his followers – who pretty much constitute the entirety of the mainstream economics profession – did.
The level of abstraction at which the microfoundations advocates came to a halt was the so-called Representative Agent with Rational Expectations (RARE). The idea here is that since all actors in the economy have rational expectations and access to perfect information – which means, for all intents and purposes, that they can tell the future – the mainstream economist simply aggregates all these individuals together and gets a single “representative” agent. This, in their infinite ignorance of methodology and philosophy, they consider some sort of solution.
“But wait…” the astute reader will say, “this is just another level of macro theorising. Isn’t this representative agent really a sort of average or mean of all individuals in the economy? And doesn’t this mean that this is, in fact, an aggregated or macro theory?” Yes, of course. But because your average economist has no shame in talking about things they have no idea about, they just ignore this. The problem is that the microfoundations crowd simply do not understand what they mean by “microfoundations”. They seem to equate it with some sort of methodological individualism, not recognising that this simply results in another aggregation – that is, our RARE agent who is not just an aggregation but a remarkably unconvincing aggregation.
In fact, we can go further still. If we examine carefully what this RARE agent actually is we will see that it is actually the economist himself, or at least the economist’s own ego (i.e. his representation of himself to himself).
In order to see this clearly we must run through what is going on here. Neoclassical economists believe that individuals act in a certain way and for certain reasons. We know that they have no evidence for this outside their own minds, since they cannot literally get inside the heads of others to find out if it is true. So, we can only assume that they postulate this RARE agent based on their own immediate lived experience. Whether the economists themselves are actually in any way similar to this omniscient RARE agent is a different question which we shall not broach here (note that I consider this to be a simple power fantasy). What we should be clear about is that as the economists made their way down the dark tunnel of theoretical disaggregation and disintegration, what they ultimately met at the end was but a mirror image of their own ego. This is truly surreal and, it should be mentioned, not unlike the instances of heautoscopy that occur in schizophrenia, causing sufferers to encounter doppelgangers – albeit in our case we can clearly see that the economists encounter their doppelgangers in their intellectual constructions rather than in their lived reality, thus ensuring that they remain on the right side of the abyss**.
Even on Their Own Terms…
Microfoundations, however, do not even make sense on their own terms – i.e. even if the RARE framework were realistic and not just another aggregate. This is because large groups of people have more influence on the individual than individuals have on large groups of people. Consider, for example, a football stadium filled with one hundred spectators lined up ten-by-ten. They are all sitting down. We know that every individual can stand up to get a better view except those in the front row (whose view is not obstructed). Let us also assume that if the individual in front of another individual stands up, the individual whose view is blocked will also stand up to compensate.
Now, what does this mean? Well, it means that any given individual can stand up to get a better view, but by doing so they will cause other individuals to stand up. If the most influential individual – one in the front row – stands, he will cause the nine people behind him to stand, while if the least influential individual – one in the back row – stands, he will have no influence on anyone else at all. So, the maximum influence any single individual can exert is to force nine others to their feet. However, if the whole group stands up at the same time then no one gets a better view.
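The cascade just described can be sketched as a toy calculation (the grid layout follows the example above; the function name is mine):

```python
# Toy model of the stadium example: ten rows of ten seated spectators.
# If the person directly in front of you stands, you stand as well, so
# one person standing sends the whole column behind them to its feet.
ROWS = 10

def forced_to_stand(stander_row):
    """Rows run from 0 (front) to ROWS - 1 (back). Return how many
    OTHER spectators a single stander forces to their feet."""
    return (ROWS - 1) - stander_row

print(forced_to_stand(0))  # front-row spectator: the nine behind must stand
print(forced_to_stand(9))  # back-row spectator: no one else is affected
```

The asymmetry is the whole point: a single individual can move at most nine others, while the group acting together constrains all one hundred at once.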
This is simply a numbers game with no metaphysical overtones and it applies to many problems in economics and other sciences. For example, the so-called paradox of thrift states that if the whole community tries to save at one time, no one will increase their savings (because each person relies on the spending of others for the income out of which they save). The group thus has an enormous constraining influence on the individual. However, if a single individual tries to save he will lower the group’s income to some very small extent (to put it in economic terms: income will fall, via the multiplier, by a small amount) but this will not have any very significant effect on the group.
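The paradox of thrift can be shown with the simplest textbook Keynesian arithmetic. In the sketch below the numbers are illustrative and mine, not drawn from the text: equilibrium income is Y = I / (1 − mpc), where I is investment spending and mpc is the marginal propensity to consume, and all autonomous spending is treated as investment.

```python
# Minimal paradox-of-thrift arithmetic (illustrative numbers only).
# In the simplest Keynesian model, equilibrium income is
#   Y = I / (1 - mpc)
# where I is investment and mpc is the marginal propensity to consume.
def income(investment, mpc):
    return investment / (1 - mpc)

# Suppose the whole community tries to save a larger share of its income,
# i.e. the mpc falls from 0.8 to 0.75 while investment stays at 100.
# Income falls, and total saving S = Y * (1 - mpc) ends up exactly where
# it started: equal to investment.
for mpc in (0.8, 0.75):
    y = income(100, mpc)
    saving = y * (1 - mpc)
    print(f"mpc={mpc}: income={y:.0f}, saving={saving:.0f}")
```

The community saves a larger fraction of a smaller income, so aggregate saving does not rise at all – the group constrains the individual, just as the text says.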
This highlights quite clearly the importance of aggregation and macro phenomena. Not only are these absolutely necessary for us to think at all, but there is also an inherent tendency for larger phenomena to exert greater influence over smaller phenomena than smaller phenomena can exert over larger ones. Again, no metaphysics here, no value judgements; it is just a simple numbers game.
Trapped in the Mirror
As already mentioned, some economists within the mainstream have come to question the need for microfoundations. Two notable examples on the internet are the New Keynesian economists Paul Krugman and Noah Smith. However, they do not appear to have a firm grasp of the methodological issues involved (not surprising given their training) and so they can only mount vague and watery defences of their position, the most convincing of which is that fully aggregated models have better predictive capacity than models with so-called microfoundations.
This is somewhat vexing because at least one of these economists, namely Noah Smith, really should know better. I say this because I recently came across a piece that he wrote not on economics, but on psychopathology. Smith published a piece on his blog entitled ‘A Few Thoughts on Depression’ in January of this year. Although I will freely admit that I do not regularly follow his blog, I think that I am nevertheless safe in saying that this piece was something of a departure from his other writings, which are entirely within the mainstream New Keynesian paradigm.
While Smith’s piece is peppered with cognitive-behaviourist language which I find rather sterile and lacking the expressive power needed to describe human psychology, the piece is nevertheless quite good. It is appreciative of the fact that mental constructions and behaviours of so-called individuals are, at the end of the day, rather arbitrary and not open to rational explanation in the sense in which that term is normally used. At one point Smith writes something that oversteps not only his economics but even, to a large degree, the psychological framework he seems to be familiar with:
Human beings are not consistent, we are not simple, and we don’t make sense. The narratives that we construct for ourselves are mostly bullshit. We construct them out of a need to make sense of the world, not as rational scientific theories that best fit the available data.
This is a very powerful statement and one that I would consider completely true. But it is certainly not something that a neoclassical would generally admit. Nor is it something that a cognitive-behaviourist would be inclined to say. This reads more like something out of David Hume, Friedrich Nietzsche, Sigmund Freud, Jacques Lacan or Deleuze and Guattari. It is, when boiled right down, a powerful expression of what we tried to deal with in the first part of this piece: namely, the fact that when you really start to try to focus in on the micro-level of something you quickly come to realise, if you are theoretically honest, that this is an interminable process. Eventually one must come up with a new aggregation – a “narrative”, as Smith calls it – and it is these aggregates that we place upon disparate phenomena in order to make sense of and structure our worlds.
To what extent Smith realises the implications of this for economics, I simply do not know. I have seen him make somewhat nihilistic statements about the impossibility of knowing anything which I think, frankly, go too far. (Although, it should be said, in his confusion Smith almost articulates an important truth). It would be far more realistic if he recognised that certain phenomena can be explained quite cogently at a macro-level and that some narratives are superior to others – usually ones that incorporate a high degree of realism, dismiss arguments that derive their authority from their logical form, are flexible enough to bend with the data and do not get entangled in the interminable microfoundations delusion.
Mainstream economics has a long way to go before it can recognise and integrate these ideas and I for one, thinking as I do that it is a theology or ethical system rather than a science, am rather pessimistic that it can make this leap. But King’s book is an excellent sketch of where it needs to go and Smith shows that some neoclassicals, despite a degree of what I would consider indoctrination and brainwashing, are indeed able to think outside the box. So, perhaps there is some hope. Perhaps when the world falls apart something does indeed stay in place.
** Note that a more recent approach to this problem of finding “solid” microfoundations has led some behavioural economists to study people’s actual behaviour in a lab in the hope that they can come up with some sort of multi-agent model. The problems with this approach are completely insurmountable and it will only produce dross. Anthropologists have known for years, for example, that people act differently under conditions of observation than they do in their day-to-day lives. This leads to the conclusion, if we push our micro or molecular theorising to the limit, that there are probably as many potential actions as there are different individuals and different situations. Again, the search for the micrological leads only to interminable regress.