Arjun Jayadev at Triple Crisis provides a quote from Thomas Philippon that somehow never sees the light of day in the financial press:
…the unit cost of intermediation is higher today than it was a century ago, and it has increased over the past 30 years. One interpretation is that improvements in information technology may have been cancelled out by increases in other financial activities whose social value is difficult to assess.
This of course is a very understated way of suggesting that the bankers have found new ways to sell or bundle other products or services along with the ones made cheaper by information technology, or to create new ones of dubious additional value, so as to fatten their total pricing.
This is a big and important topic, so let me take just an initial slice at it; I hope to come back to it in future posts. We can certainly see the net effect, which is the financialization of the economy, and which suggests that IT (and other developments) have allowed the banks to move into an oligopoly position and extract economic rents. Simon Johnson, in his important 2009 article, The Quiet Coup, described how the financial sector accomplished the surprising feat of increasing both average worker pay and its share of GDP. Financial sector wages were roughly comparable to average private sector wages from just after World War II through 1982; right before the crisis, they had risen to 181% of average private sector wages. From 1973 to 1982, the financial sector never garnered more than 16% of corporate profits. By the 2000s, it hit 41%.
On the client side, lower transaction costs (which are attributable not solely to IT but also to the deregulation of commissions on the equity side and to the end of the requirement to make physical delivery of securities) have led to higher transaction volumes, raising the question of social utility. It’s conventional to see lower-cost trading as a benefit, but is it really? Traders benefit from trading. Bona fide investors (are there any left?) might actually benefit from a required sense of commitment before buying: costs of trading high enough that you actually needed to think before jumping into a particular instrument. The reason women are found to be better investors is that they are less inclined to overtrade. Higher transaction costs similarly discourage overtrading.
Again on the client side, a host of new products have come into being, and again, I’m skeptical that the net result is value added for customers. Of course, we have to define whom we mean by “customer,” since we have a huge agency problem. For instance, the rise of complex, customized derivatives is utterly dependent on the rise of more robust IT platforms. The early leaders in the OTC derivatives business, Chicago Research & Trading, O’Connor, and Bankers Trust, all had to have state-of-the-art capabilities and highly competent IT professionals, because modeling large derivatives books and their related hedges on a real-time basis was bleeding edge at the time. But as anyone who has read Frank Partnoy’s Fiasco or Satyajit Das’ Traders, Guns & Money knows, a very high proportion of complex derivatives trades are for tax or regulatory arbitrage, for playing accounting games, or for simply ripping off customers (as in talking them into something more complicated with hidden margin or hidden risks loaded in). So complicated derivatives are perfect for predation.
So why do customers buy them? The customers are seldom the real customers; they are usually agents, most often fund managers or employees. Fund managers are victims of benchmark-driven herd behavior: if their colleagues are using derivatives to boost returns, they have to as well, even if the result is to eke out a few extra basis points now for more downside later. Other agents have similarly bad incentives. If you are a county commissioner, you can’t say: “I know Wall Street is a con, we have a ten-year project, we can finance it with ten-year bonds, the math works, let’s go ahead.” No, you will be accused of being lazy and having left money on the table for not having investigated your options. So you hire a consultant. The consultant’s incentive is to find as many complex structures as possible to review; that makes his job look more difficult and justifies a big fee. And he can’t recommend a simple ten-year bond after all that work; that would call his existence into question. So even if he is not affirmatively corrupt (as in steering business to buddies for kickbacks), he’ll recommend something complicated that he may not really understand, and it is certain his client won’t understand. And his client will be hung up on the bottom line. If something offers apparent cost savings, he as a public official can’t buck that. Cost savings are easy to understand. Complex, hard-to-characterize risks are not. And that is how municipality after municipality (heck, Harvard’s own supposedly sophisticated management company) gets fleeced. But the more complicated all these people’s jobs appear to be, the consultant’s, the fund manager’s, the county commissioner’s, the more they can claim they deserve higher pay. So they win from the complexity game, even as the people they represent wind up losers.
By Arjun Jayadev, an Assistant Professor of Economics at the University of Massachusetts, Boston. Cross posted from Triple Crisis
One of my favorite lines from recent economics papers is the following one from this paper by Thomas Philippon, who, in discussing the performance of the financial sector, suggests that “the unit cost of intermediation is higher today than it was a century ago, and it has increased over the past 30 years. One interpretation is that improvements in information technology may have been cancelled out by increases in other financial activities whose social value is difficult to assess.”
The claim that the financial sector has been ‘functionally inefficient’ was made 30 years ago by James Tobin, and it’s great to have a quantitative basis for making this sort of judgment. Another way to get some handle on the degree to which intermediation has become more expensive is to look at the spread between funding costs and lending rates.
To revisit a theme expressed before on this blog: if one takes a long-term perspective, the idea that inflation-adjusted interest rates were especially low in the 2000s is simply mistaken, and there were long stretches of the postwar period through 1980 in which end borrowers (consumers and non-financial corporations) faced very low rates. The fact that short policy rates (the federal funds rate) were indeed at historic lows in the early 2000s suggests that spreads have been rising. The figure below, from the latest draft of a paper by JW Mason and me, shows the 10-year Treasury and Baa corporate bond rates relative to the 10-year-ahead average federal funds rate. It asks how much interest a financial intermediary could make by borrowing at the fed funds rate over ten years and lending to the Treasury and to corporates. I suppose we could have done some sort of structural break analysis, but really, it’s all there in the graph. After 1980, in the Brave New World of Neoliberal Finance, the 30 years that Philippon writes about saw a sharp increase in spreads compared to the period before.
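For readers who want the mechanics, here is a minimal sketch of the spread construction described above: the long rate minus the average funds rate over the same future horizon. The function names and the annual series are made up for illustration; the actual paper uses real monthly data and a 10-year horizon.

```python
def forward_avg(rates, horizon):
    """Average of rates[i:i+horizon] for each i with a full window ahead."""
    return [sum(rates[i:i + horizon]) / horizon
            for i in range(len(rates) - horizon + 1)]

def intermediation_spread(long_rate, funds_rate, horizon):
    """Long rate minus the average funds rate over the same future horizon:
    roughly what an intermediary earns borrowing short and lending long."""
    fwd = forward_avg(funds_rate, horizon)
    return [lr - f for lr, f in zip(long_rate, fwd)]

# Illustrative (made-up) annual series and a 5-year horizon, purely for shape
fed_funds = [5.0, 5.5, 7.9, 11.2, 13.4, 16.4, 12.3, 9.1, 10.2, 8.1]
treasury_10y = [7.6, 7.4, 8.4, 9.4, 11.5, 13.9, 13.0, 11.1, 12.4, 10.6]

spreads = intermediation_spread(treasury_10y, fed_funds, horizon=5)
```

Note that the series is necessarily truncated at the end: the last observations lack a full forward window, which is one reason this kind of measure is usually shown only up to several years before the present.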
This in itself is of course not enough to show that intermediation has become less efficient. It is certainly possible that these rising spreads reflect increased perceived or actual default risk. Well, maybe, but the following figure gives some reason to doubt that story.
The graph shows the spread between corporate bonds carrying AAA ratings from Moody’s at the time of issuance and the 10-year Treasury bond rate. That spread also began to rise around the same time (i.e., around 1980). Since 1980, however, the annual default rate on bonds of corporations rated AAA at issuance has been approximately 0.05 percent. Given an average recovery rate of around 50 percent, default losses have been about half that, roughly 0.025 percent a year. Yet the premium of AAA bonds over 10-year Treasuries has been 1.2 percent, more than 40 times the expected annual default loss.
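The back-of-the-envelope here is simple enough to write out, using only the figures quoted above:

```python
default_rate = 0.0005   # 0.05% annual default rate on AAA-at-issuance bonds
recovery_rate = 0.50    # ~50% average recovery on defaulted bonds
premium = 0.012         # 1.2% spread of AAA yields over 10-year Treasuries

# Expected annual loss = default probability times loss given default
expected_loss = default_rate * (1 - recovery_rate)   # 0.025% a year

# How many times expected losses does the observed premium cover?
ratio = premium / expected_loss
```

With these inputs the ratio works out to 48, consistent with the “more than 40 times” in the text: the observed premium dwarfs any plausible compensation for default risk alone.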
This is an example of the “credit spread puzzle.” We chose AAA bonds as a comparator, but the same pattern exists across other classes of bonds: the spread on triple-B bonds (as the link suggests) has been 8 times the expected default loss on those bonds.
So what is behind these spreads? Josh and I are agnostic. Several candidates spring to mind, but one in particular seems most interesting, especially from a Minskyian/Post-Keynesian point of view. From such a starting point, asset holdings are not driven by households maximizing their expected utility from consumption through the solution of an Euler equation. Rather, the impetus for any agent to hold an asset is to achieve positive returns while keeping the probability of being able to meet all current obligations above some threshold. In this sense, the importance of holding liquid assets is that they protect against bankruptcy if contracted cash payments cannot otherwise be made. The implication is that the demand for liquidity will depend strongly on how likely lenders believe they are to face the risk of insolvency. The broader implication is that the observed credit spread may depend critically on the probability that banks and other financial institutions assign to falling short of cash to meet obligations, and thus on the premium they will pay for liquid assets such as government bonds, which widens the spreads between those and other long rates.
This is not, of course, an explanation of why Treasuries themselves have had higher rates after 1980 than before; for that we will need additional explanations.
Do readers have other hypotheses/thoughts?