
Wallace Turbeville: Report from the Frontlines – Mission Not Accomplished on Derivatives Reform


By Wallace Turbeville, the former CEO of VMAC LLC and a former Vice President of Goldman Sachs who writes for New Deal 2.0

It is now obvious that when President Bush made his victory speech on the aircraft carrier in front of the now-famous “Mission Accomplished” banner, he was a bit premature in his assessments.

We should not make the same miscalculation with financial reform. Dozens of fights over these reforms, large and small, continue in Washington, New York and elsewhere. The struggle has moved from the halls of Congress to the bureaucracies. At issue is the implementation of 850 pages of legislation concerning financial systems and practices that are difficult for even the most experienced financiers to understand, and that are far less appealing to the media. The SEC (Securities and Exchange Commission) and the CFTC (Commodity Futures Trading Commission), which are jointly responsible for regulation, understand that their task is enormous and that their resources are stretched.

My interest is primarily in the area of derivatives, and this piece is intended as a “report from the front” in that theatre of conflict. The early stages of the process involve: roundtables hosted by the regulatory agencies; private meetings with industry and public interest groups (in which the attendees and subjects addressed are disclosed, but not the content of the discussions); and comment letters filed with the regulatory agencies. The vast majority of input has come from the financial industry. Proponents of regulation are at an enormous disadvantage. Their resources are meager and their access to information and expertise is minimal compared to the institutions and the businesses that serve them. However, the proponents remain passionate, and they are bolstered by the obvious intent of the legislation.

Several of the issues under discussion get at the core components of the Dodd-Frank Act. One is the transfer of risk-laden derivatives portfolios from the poorly-managed and murky world of bi-lateral contracts into clearinghouses, where the management of risks is monitored and transparent. Another is public availability of trading data, which allows regulators and participants in the marketplace access on equal footing with the dominant trading houses.

Clearing

The banks and clearinghouses have asserted from day one that not every type of derivative contract can be cleared. Before the passage of Dodd-Frank, much of the discussion of clearing limitations concerned “bespoke” transactions, suggesting one-off, complicated arrangements with multiple terms, unsuitable to the standardized world of clearing. The discussion has since shifted to a separate and more troubling issue: standardized contracts that involve risks that are difficult or impossible for clearinghouses to measure. The principal role of clearing is the statistical measurement of predicted price moves and the management of the credit risk associated with those moves. Margin is required to collateralize that risk. If the risk cannot be measured, the adequacy of margin is uncertain, calling into question the integrity of the clearinghouse.

There is tension between Dodd-Frank’s intention to move positions into the clearing environment and the need for a secure clearing system. The debate is over how to determine the scope of clearable contracts and use available techniques to maximize the categories of contracts that will be cleared.

Proponents of reform are generally skeptical of the clearinghouses on this issue. Until relatively recently, clearinghouses were owned by the trading firms and operated for their benefit. Clearinghouse profitability is based on volumetric fees, and the banks represent the vast majority of volume. The clearinghouses and banks have freely admitted that financial institutions are both heavily involved and influential in the process of determining the types of contracts that are cleared. Bank involvement is essential, they say, because of their expertise and their ultimate exposure if things go wrong.

This assertion is deeply ironic and raises some concerns:

• If a transaction’s risk cannot be measured adequately to permit prudent collateralization, why should the system allow a financial institution to trade it?

• In the go-go era of derivatives trading, clearinghouses competed to clear products (and increase revenues and share prices) by pushing the envelope of statistical risk metrics. The new emphasis on prudence is startling.

• Clearinghouses are supposed to be experts in measuring derivatives risk. While banks might be a good source of information on a new contract, the clearinghouses must make the decisions. They exist to manage risk, not to further the interests of the clearing members. In reality, this distinction is still blurred for many senior managers of clearinghouses.

• Financial institutions are at risk if a clearinghouse fails; but, recalling the autumn of 2008, so is the public.

Much of the discussion has revolved around share ownership limitation and corporate governance. These are valid concerns. Perhaps a more important focus is the risk committees at clearinghouses. Clearinghouse managers and clearing members (that is to say, the banks) run the core business through these committees. They control the central issues, including the types of contracts that can be cleared. Regulatory or public interest participation in these committees would be an effective way to legitimize the process.

If it is accepted that the risk of certain categories of derivatives cannot be measured adequately to permit conventional clearing, the discussion moves on to techniques that depart from conventional clearing practices to enable more transactions. Three methods, which are not mutually exclusive, have been suggested by proponents:

• Statistical projections used in clearing are based on historic price movements. The idea is to cover some large percentage, often 99%, of historic price moves with margin. There is no reason that margin should be limited to historic movements. If margin collateral exceeded the largest historic movements, more transactions could be cleared.
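The percentile-based margin idea in the first bullet can be sketched in a few lines. This is an illustrative simplification, not any clearinghouse's actual model (real systems such as SPAN use richer scenario analysis); the function name, parameters, and price history below are all hypothetical.

```python
import numpy as np

def initial_margin(prices, coverage=0.99, buffer=1.0):
    """Per-unit initial margin estimated from historic one-day price moves.

    coverage: fraction of historic moves the margin should cover
              (the 99% figure mentioned in the post).
    buffer:   multiplier on top of the historic percentile; values above
              1.0 collateralize beyond the largest observed moves, which
              is the proposal for expanding the set of clearable contracts.
    """
    moves = np.abs(np.diff(np.asarray(prices, dtype=float)))
    return buffer * np.quantile(moves, coverage)

# Hypothetical daily settlement prices, for illustration only.
history = [50.0, 51.2, 49.8, 50.5, 52.0, 51.1, 50.7, 53.0, 52.4, 51.9]

margin = initial_margin(history)                 # covers 99% of historic moves
wider = initial_margin(history, buffer=1.5)      # exceeds the historic extremes
```

The trade-off is straightforward: a larger buffer makes clearing safer for a wider range of contracts, at the cost of tying up more collateral.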

• Reference prices in problem contracts can often be disaggregated into components: one that is easily cleared and one that is not. This is because of the real-world characteristics of the commodity or financial instrument that is the subject of the contract. Consider a difficult natural gas delivery point that is physically sourced from a readily clearable delivery point. The disaggregated price risk of the clearable point could be cleared, leaving only the stub price differential as a problem contract. Thereby, more risk is cleared.
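The disaggregation idea in the second bullet amounts to splitting one price into two legs. The sketch below uses hypothetical natural gas numbers; the function and variable names are mine, not drawn from the Act or any clearinghouse rulebook.

```python
def decompose(point_price, hub_price):
    """Split a hard-to-clear delivery-point price into a clearable hub
    component and a residual basis ("stub") differential."""
    basis = point_price - hub_price  # the piece left in the bilateral world
    return hub_price, basis

# Hypothetical prices: an illiquid delivery point sourced from a liquid hub.
hub_leg, basis_stub = decompose(point_price=4.35, hub_price=4.10)
# The hub leg carries the bulk of the price risk and can be cleared
# conventionally; only the small basis stub remains a problem contract.
```

Since the stub differential is typically far less volatile than the full delivery-point price, most of the risk ends up inside the clearinghouse.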

• The obstacle to clearing problem contracts is unacceptable risk for the clearinghouse. Requiring the clearinghouses to run separate clearing pools in which this risk is limited or eliminated allows the broadest scope of clearing. This is far better than the alternative — leaving these transactions in the bi-lateral world.

Market Information

A central tenet of Dodd-Frank is that the derivatives trading market should be transparent. All price data should be available to market participants and regulatory authorities. The Act establishes the superstructure, mandating data repositories and (for the most part) “real time” public disclosure.

The quantity of data poses challenges, but these are largely mechanical. The significant issues revolve around the form of data disclosure. The data must be useful to individual market participants in their daily activities if the system is to provide meaningful transparency. The presentation must be uniform and accessible in a way that can be integrated into the screen environments used by traders.

The Roundtable discussions concluded unanimously that a data aggregator was essential. This function is not contemplated by Dodd-Frank, but it is required to make the policy work. The discussion also recognized that the aggregator must be independent of influence and that industry ownership and volumetric fee revenue should be avoided. The concepts of a “public utility” and a “common carrier” were used to describe it.

The data assembled by the aggregator will offer a tremendous opportunity for regulatory authorities to meaningfully monitor the derivatives markets. As an example, consider compliance with the Volcker Rule. It is naïve to believe that there is a bright red line between proprietary trading and banks’ other market activities. The aggregator’s database can be analyzed using systems designed to detect activities that might not be in compliance. In addition, the data will represent the comprehensive portfolio of the reporting firms, allowing the regulators to monitor and analyze credit risk and appropriate position limits.

The optimal organizational structure for the aggregator would allow it to service the needs of the regulators, while providing the mandated market transparency in a useful way. This suggests an independent non-profit or limited profit organization, with substantial regulatory involvement, capable of developing analytical systems to benefit the public’s interest.


9 comments

  1. Opinionated Bloviator

Either Congress gets serious about fixing the perverse incentives and “rule of whim” that plague derivatives and financial reform in general, or we suffer TOTAL ECONOMIC collapse. The United States of Mexico or Argentina: that is our future on this current course.

  2. Razzz

    How do you fund a bet when you don’t know its value?

    Easy, you trade over the counter and only collect enough money to pay the commission fees based on a bogus number then worry about settlement later.

Even if they brought all derivatives under control from this point forward, the previously unknown, underfunded and undocumented derivatives (even BIS has no clue of outstanding bets because reporting of derivatives was voluntary) would bring down the banking system if they were valued mark-to-market and not under the current market-to-fantasy-kick-the-can-down-the-road-let-the-next-generations-of-taxpayers-worry-about-it accounting method.

    Remember, accounting rules have been changed to accommodate all the bad paper outstanding. In reality we’re worse than broke, we have insurmountable debt.

    1. Skippy

      Bravo! I have been at a loss, seeing this derivative refuse spoken about so eruditely, as if it would remove the vomitus smell issuing forth from its decaying carcase.

      Loosely speaking, a derivative can be thought of as how much one quantity is changing in response to changes in some other quantity; for example, the derivative of the position of a moving object with respect to time is the object’s instantaneous velocity. Now factor in this observation with the orders of magnitude supplied by computational power, size and mass, actors, quantum variables etc etc…

And it blew up…go figure…yet it’s talked about like the Voyager mission, helping us understand our solar system, not like the ELE asteroid that it is.

Skippy…top level psychopaths aided and abetted by higher mathematics, infecting the global markets with its make-believe valuations, to in fact become more *value* than all other values combined! A fraud from its very conception!

  3. jake chase

    This post may be the first time anyone on the inside has spoken truth about clearing derivatives. If more than one hundred people understood the implications, there would be a revolution tomorrow morning.

  4. CreativeGenerations

    Thanks for the informative report.

    Another thing on derivatives clearing to consider… [Ear to ground] What I hear is that those who will be using the clearing house[s] have a primary concern for their own protection: “Whose name will be on the account while in clearing?”

    If the owner of the ‘assets’ to be cleared loses title before the settlement of the clearing, the chances of the process being accepted by the end-users is like, maybe more than nil – not sure by how much.

    Let me spell it out. If the clearing creates a kind of escrow in which funds(/__________) return to the sender[s] named on the contract if the deal fails, that would be workable.

    If on the other hand, the funds get put into a numbered-only account that if it fails goes into a file ‘to be dealt with at some later time’. That will not fly.

Also, because of the previous debacles’ shenanigans, some people rightly fear that if their name gets taken off the documents before completion, and it fails, someone else’s name will get put on. Ooops, a billion [or so] dollar ‘clerical’ error. It’s happened.

    Just something to keep in mind. Food for thought. Mmm, pie.

  5. Cochise On Rye

In her book, Yves states that many of the more exotic derivatives would never have existed if there had been proper risk assessment of said derivatives. The cost of risk insurance and information discovery would exceed their fee/commission value.

    It seems more logical to focus on risk discovery procedures than to try and design a clearing house to trade more crap derivatives. I do not know to what extent if any the new legislation addresses honest risk assessment of contracts.

    Let the clearing houses get back to trading the more mundane commodities and interest rate swaps.

    Easy to value and easy to trade

  6. readerOfTeaLeaves

    The Roundtable discussions concluded unanimously that a data aggregator was essential. This function is not contemplated by Dodd-Frank, but it is required to make the policy work. The discussion also recognized that the aggregator must be independent of influence and that industry ownership and volumetric fee revenue should be avoided. The concepts of a “public utility” and a “common carrier” were used to describe it.

    The data assembled by the aggregator will offer a tremendous opportunity for regulatory authorities to meaningfully monitor the derivatives markets… The aggregator’s data base can be analyzed using systems designed to detect activities that might not be in compliance. In addition, the data will represent the comprehensive portfolio of the reporting firms, allowing the regulators to monitor and analyze credit risk and appropriate position limits.

    The optimal organizational structure for the aggregator would allow it to service the needs of the regulators, while providing the mandated market transparency in a useful way. This suggests an independent non-profit or limited profit organization, with substantial regulatory involvement, capable of developing analytical systems to benefit the public’s interest.

    Scratching my head only slightly because in my experience, an aggregator is not a single database in and of itself; rather, it aggregates data from multiple databases.

    Does anyone around here know, specifically, what types of data and/or databases are to be aggregated? Anyone have a link…?

    Would the non-profit, or ‘aggregator oversight group’ be tasked with monitoring the quality of the data?

    This post raises some really, really interesting issues.
    **Really** interesting.

    FWIW, should ‘jake chase’ return, please feel free to elaborate on your comment.

    1. Wallace Turbeville

Sorry for the delayed response, readerOfTeaLeaves.

      The aggregator would receive data from the Swap Data Repositories (DTCC and others) and potentially from Deal Capture systems at individual trading firms. The data will at a minimum include the basic terms of a swap: product, reference price, swap price, term, quantity, counterparty. One intriguing opportunity is to add data fields: collateral terms for uncleared swaps, FCM for cleared swaps, etc. Once assembled, in theory, the data can be used for a comprehensive analysis and continuous monitoring of derivatives market risk. Cool!

  7. readerOfTeaLeaves

    Hasty, dashed response — first, thanks for the info.
    This becomes only more interesting; definitely follow the logic behind the added data fields ;-)

    Cool, indeed!

Comments are closed.