The FT’s Gillian Tett writes up a blow-by-blow of the credit crisis; and the spread chart is a good reminder of how different things still are:
On August 9 2007, the European Central Bank sent shock waves around world financial capitals when it injected €95bn ($150bn, £75bn) worth of funds into the money markets to prevent borrowing costs from spiralling sharply. The US Federal Reserve soon followed suit. But while the central banks had billed these moves as “pre-emptive” actions to quell incipient market tensions, they did not bring the panic to an end… A year later, there is still no sign of an end to these problems. Instead, the sense of pressure on western banks has risen so high that by some measures this is now the worst financial crisis seen in the west for 70 years.
There were a few people on the record as anticipating problems, and no easy way out – Hiroshi Nakaso, a senior official at the Bank of Japan; Jean-Claude Trichet, president of the ECB; Timothy Geithner, president of the New York Federal Reserve – but it’s easy to data-mine in retrospect. These were the drivers of the train, or at least riders in the engine cab, and they just watched it crash.
Yet most investors, bankers and even regulators did not change their behaviour to any significant degree, owing to a widespread adherence to three big assumptions – or articles of faith – that have stealthily underpinned 21st-century finance in recent years.
The first of these was a belief that modern capital markets had become so much more advanced than their predecessors that banks would always be able to trade debt securities. This encouraged banks to keep lowering lending standards, since they assumed they could sell the risk on…
Second, many investors assumed that the credit rating agencies offered an easy and cost-effective compass with which to navigate this ever more complex world. Thus many continued to purchase complex securities throughout the first half of 2007 – even though most investors barely understood these products.
But third, and perhaps most crucially, there was a widespread assumption that the process of “slicing and dicing” debt had made the financial system more stable. Policymakers thought that because the pain of any potential credit defaults was spread among millions of investors, rather than concentrated in particular banks, it would be much easier for the system to absorb shocks than in the past…
Because the risk was spread across the whole system, there was no risk? A big mistake.
As a result, when high rates of subprime default emerged in late 2006, there was initially a widespread assumption that the system would absorb the pain relatively smoothly. After all, the system had easily weathered shocks earlier in the decade, such as the attacks of September 11 2001 or the collapse of the Amaranth hedge fund in 2006. Moreover, the US government initially estimated that subprime losses would be just $50bn-$100bn – a tiny fraction of the total capital of western banks or assets held by global investment funds…

And as the surprise spread, the three pillars of faith that had supported the credit boom started to crumble… First, it became clear to investors that it was dangerous to use the ratings agencies as a guide for complex debt securities… [The end of that franchise.] Then, as bewildered investors lost faith in ratings, many stopped buying complex instruments altogether… As a result, western banks found themselves running out of capital in a way that no regulator or banker had ever foreseen… [Whocouldaknown?]

Banks started hoarding cash and stopped lending to each other as financiers lost faith in their ability to judge the health of other institutions – or even their own… Then a vicious deleveraging spiral got under way… The IIF calculates that in the year to June, banks made $476bn in credit writedowns, as debt prices plunged in the panic (although tangible credit losses so far are just $50bn). However, they have also raised $354bn in capital…
It all seems so familiar somehow… but I cannot remember how the story ends. (Paul at Technology Investment Dot Info)