Occupy the SEC to DoJ: Act on Congressional Mandate, Quit Rubber Stamping Bank Mergers

Occupy the SEC is back! The small group of lawyers, bank compliance experts, and financial services industry product specialists earned an outsized reputation with its highly praised, over-300-page comment letter on Dodd Frank. As Law.com wrote:

In detailing how they believe the Volcker Rule can be strengthened, Occupy the SEC has written what may be the longest and most talked-about letter of this comment period, which concluded February 13. From Slate and Mother Jones to Bloomberg, Reuters, and the Wall Street Journal, Occupy the SEC’s doorstopper dispatch to the Securities and Exchange Commission has garnered mega attention this week, helping to define the ongoing debate over the so-called Volcker Rule, a regulation to limit the bets that banks can make through proprietary trading….

But it has been Occupy the SEC’s letter that has stood out from the rest in the media sphere, being widely praised this week as smart, well-written, and detailed. The letter identifies risky proprietary trading with government-backed funds as the downfall that occasioned the 2008 financial meltdown, and argues that a stronger Volcker Rule must keep that from happening again—sentiments echoed by John Reed, the former chairman and CEO of Citigroup in a five-page comment letter he submitted.

Not to worry, the matter that Occupy the SEC has roused itself to address this time is more straightforward than the Volcker Rule,1 as you can see from the letter embedded below. Occupy the SEC has responded to a request for additional comments from the Department of Justice’s Antitrust Division over an initiative to update its 1995 rules for analyzing bank mergers. From the agency’s announcement:

The Department of Justice’s Antitrust Division announced today that it is seeking additional public comments until Feb. 15, 2022, on whether and how the division should revise the 1995 Bank Merger Competitive Review Guidelines (Banking Guidelines). The division will use additional comments to ensure that the Banking Guidelines reflect current economic realities and empirical learning, ensure Americans have choices among financial institutions, and guard against the accumulation of market power. The division’s continued focus on the Banking Guidelines is part of an ongoing effort by the federal agencies responsible for banking regulation and supervision.

As you will see in the crisp, concise letter below, Occupy the SEC contends that the Department of Justice’s bank guidelines are defective because they ignore Congressional directives in the Bank Merger Act and use only the sort of Clayton Act size/concentration merger analysis used in other industries. The letter also points out that the Herfindahl-Hirschman Index screen is problematic, since it depends on how markets are defined, which can be both gamed and a subject of legitimate debate. In banking and finance, where providers actively seek to bundle services and have considerable latitude to do so, it’s even harder to define markets properly.
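To make the market-definition problem concrete, here is a minimal sketch in Python of how the HHI screen works, using the 1,800-point level and 200-point change that are the standard description of the 1995 Banking Guidelines screen. The deposit shares are hypothetical and purely illustrative; the point is that the verdict on the same two merging banks flips depending on how broadly the “market” is drawn.

# A minimal sketch of the HHI screen in the 1995 Bank Merger Competitive Review
# Guidelines: a deal draws scrutiny when the post-merger HHI exceeds 1,800 and
# the merger raises the HHI by more than 200 points. All deposit shares below
# are hypothetical; the exercise shows how the result hinges on market definition.

def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    return sum(s ** 2 for s in shares)

def merger_screen(shares, a, b, threshold=1800, delta=200):
    """Apply the screen to a merger of the firms at indices a and b."""
    pre = hhi(shares)
    merged = [s for i, s in enumerate(shares) if i not in (a, b)]
    merged.append(shares[a] + shares[b])
    post = hhi(merged)
    return pre, post, post > threshold and (post - pre) > delta

# Narrow market definition: local deposits held by four traditional banks.
narrow = [35, 30, 20, 15]
print(merger_screen(narrow, 2, 3))   # (2750, 3350, True) -- merger flagged

# Broad market definition: the same two merging banks (11% and 8% shares), but
# the "market" now also counts credit unions, online banks, and other rivals.
broad = [20, 17, 11, 8, 12, 10, 9, 7, 6]
print(merger_screen(broad, 2, 3))    # (1284, 1460, False) -- sails through

The same two merging banks are flagged under the narrow market definition and sail through under the broad one, which is exactly why the letter argues the screen is only as strong as the market definition behind it.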

Occupy the SEC further points out that the near-total failure to reject any bank mergers and the speed with which approvals are given, even under Covid, are proof that the reviews are pro forma.

The problem with this type of process is that even when it seeks to be broad-ranging, it seldom allows basic issues to be debated. Consider this sentence from the Department of Justice request:

Building on the responses, the updated call for comment focuses on whether bank merger review is currently sufficient to prevent harmful mergers and whether it accounts for the full range of competitive factors appropriate under the laws.

The hidden assumption is that most bank mergers are helpful or at least not detrimental.

In fact, every study of bank mergers ever done (until they seemed to drop radically in popularity as a subject of inquiry around the mid-2000s) found that once a certain size threshold was reached, banking had a negative cost curve. In other words, bigger was not better. Bigger was more costly to operate. The size threshold varied by study, but the largest found back then was $25 billion in assets, which is not very large.

There is no reason to think anything fundamental has changed, save that the size threshold may be a bit higher. Large banks still run massive legacy systems on mainframes. Those systems are fragile, and they become even more fragile as a bank gets larger. And banks have no path for migrating off them. Conservative analyses say that well over half of large IT projects fail, and no project to migrate from a legacy system has ever succeeded. Aside from being massively risky, NC’s bank IT wags estimate such a migration would pencil out at a cost of at least 100% of bank profits for three years. Since big projects never get done on time or on budget, that means even if it were successful, it would more likely take at least six years and eat all of the bank’s profits during that period.

Some readers might object and point to all of the branch consolidations and firings that happen in bank mergers. But they miss two points. The first is that the rising cost curve (a slight rise in expenses per dollar of assets as banks get bigger) means some combination of:

The cost cuts could have been achieved by each bank separately, but mergers give an excuse for a lot of bloodletting

Any cost savings within lines of business were more than offset by diseconomies of scope, as in higher costs of running a more complicated, bigger operation

Cost savings were more than eaten up by pay increases at the top level. Executive pay in banking is a function of the size and complexity of the institution. So bank acquisitions are very attractive to the top ranks of the buying entity, since they can justify higher comp thanks to having supposedly bigger jobs

Needless to say, all other things being equal, larger and more complex financial firms represent more “too big to fail” risk. Dodd Frank required the Financial Stability Oversight Council to designate systemically important financial institutions and subject them to more stringent requirements, such as higher capital levels.

As of 2016, five of the biggest banks didn’t have “credible” living wills. There were also doubts about how Citigroup had gotten a green light. Donald Trump then came into office and weakened many of the Dodd Frank “too big to fail” provisions. Sadly, the Department of Justice no doubt considers this to be outside its authority, but I would start with the position that anything more than a trivial acquisition by one of the largest banks is harmful and put a much stronger onus on the buyer to prove otherwise.

But we should be glad to see progress on the antitrust front, even if getting real momentum will take more concerted effort.

______

1 A big reason the debate became so complicated, and thus amenable to lobbyist watering-down, was that Volcker wanted customer trading treated differently than proprietary trading, when the difference isn’t necessarily clear cut. Put another way, it’s easier to say “We want banks to stop gambling with public-backed money” than to figure out how to make that happen. We did discuss concrete ideas at the time.

Bank Merger Comment Letter (embedded document)

6 comments

  1. James E Keenan

    Yves writes:

    Conservative analyses say that well over half of large IT projects fail, and no project to migrate from a legacy system has ever succeeded.

    I certainly have heard this as “lore” in the IT world *in general*, but has this claim ever been authoritatively documented?

    Aside from being massively risky, NC’s bank IT wags estimate such a migration would pencil out at a cost of at least 100% of bank profits for three years. Since big projects never get done on time or on budget, that means even if it were successful, it would more likely take at least six years and eat all of the bank’s profits during that period.

    Again, knowing people who work in tech for big banks, I don’t find this claim surprising — but what specific studies support it?

    1. Yves Smith Post author

      No one admits to failure. Projects are quietly buried. So you can’t prove the negative. Some notably bad ones have been written up, but those are real outliers.

      I know of no bank IT project of over $500 million since the 1990s that has succeeded. Anyone who had succeeded would tout it far and wide.

      And I’ve had two top bank CIOs as clients (and the big boys watch each other, plus hire from the same consultant and expert pool), plus considerable intel from NC readers in the biz.

      The fact that absolutely no large financial institution has migrated off legacy architecture, even banks seen as very competent at IT (like Goldman), despite this being recognized as necessary since at least the mid-1990s, should tell you how many failed attempts there have been.

    2. Clive

      If anything, Yves is being too generous. IT project failure was pretty bad in her McKinsey days, but it just keeps getting worse and worse. CIO Magazine revisited the problem in March last year and wrote up a comprehensive report, which you can read here (I advise anyone interested in the topic to go through it; they ask for a registration for a small allowance of free papers): https://www.cio.com/article/230427/why-it-projects-still-fail.html.

      To give a summary, the failure rate of projects is estimated as being as high as 70+%:

      Research from Boston Consulting Group, for example, has found that 70% of digital transformations fall short of their objectives. Similarly, the 2020 Global Application Modernization Business Barometer Report found that 74% of organizations that had started a legacy system modernization project failed to complete it, consistent with the 70% rate of failure McKinsey & Co. reported several years ago.

      There are simply too many things waiting to go wrong. As CIO Magazine noted, problem number one is that strategic alignment might exist at the top, but not throughout the organization. Just because banks want to merge at the C-suite level, it doesn’t mean their legacy technology bases are in any way compatible. And even if a project is sponsored right at the very top, it still has to compete with many other priorities for funding and management time. Everyone wants a good news story, so there’s little, if any, incentive for the rank and file to ‘fess up the chain of command just how bad things are and how much worse they could get if the project is progressed.

      As a “here’s one we made earlier” example of how bad big bank IT integration can be, the Daddy of them all, certainly in recent years, is TSB’s disastrous migration to parent Sabadell. The whole wretched, gruesome disaster was the subject of a UK parliamentary enquiry (TSB’s own evidence was painful in its exposure of how it engineered its own train-wreck http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/treasury-committee/it-failures-in-the-financial-services-sector/written/95274.html). TSB thought it could integrate the two banks’ systems in 3 years and threw everything at trying to do this. When it did the migration in a state of complete unreadiness, customers were locked out of their accounts for weeks https://www.bbc.co.uk/news/business-43877667 and problems persisted for almost another year after that. The final set of fixes rumbled on for another two years. Even then, a lot of more complex accounts and processes had to be manually kludged, and the fixes won’t, finally, end until the legacy products they service are in run-off.

      Integration consultants and bank tech rah-rah’ers, who you’d have thought would like to have a chance to sell their services, are amongst the most wary of promising success in bank IT integration. Many have analysed the market e.g. https://www.linkedin.com/pulse/mega-project-banking-10-lessons-success-22-factors-8-frank-schwab and are generally very negative on the whole subject.

      Ultimately, if you throw enough money and time at a problem, you can probably get something vaguely declarable as a success. A 25% success rate was cited in the link there, although you have to be careful that “success” wasn’t achieved through ignoring the more complex integration tasks and leaving some stuff behind on the legacy systems after putting it in the Too Difficult box. But if that analysis is credible, and it certainly seems in the right ballpark and they do give some well-known examples to support it, the remaining 75% are forced — due to time or financial constraints — to abandon their attempts entirely.

      The SEC should consider the risk of bank IT failures as a result of rushed or botched systems integration projects in its analysis of whether to approve mergers. But as the piece points out, the regulator is happy to relegate its role to that of a nodding dog.

        1. Yves Smith Post author

          I do not understand your desire to have “data” when there is no good data and never will be. At the risk of sounding harsh, this is a critical thinking skills fail. For starters, WTF is a “digital transformation”? And more important, why would anyone be honest? None of these consultants have audit rights. They see only what their clients choose to share. They won’t see, or at best will only selectively see, IT expert presentations. And I guarantee they will not have seen the contracts and the deliverables v. what was actually done.

          There is no required reporting. Any IT hired guns will not admit to failure and CIOs who oversaw the costly building of bridges to nowhere will understate how bad things were. Surveys of this sort are not reliable. Yet you’d rather have clearly dodgy “data”?

          When I was at McKinsey, it published a major study on Wall Street IT (yes, as a booklet even, with the McKinsey name on its cover) that was obviously fabricated. Key “data” it presented did not exist and could never have been developed.

          Let me stress: the failure rate of migrating off legacy systems at banks is 100%. There is no example of success. None. The mere 70-74% failure rate comes from redefining success as something more modest.

      1. Code Name D

        Part of the problem is just a gross misunderstanding of how networks and software systems work. My uncle worked on the Obamacare network. He described the project as the equivalent of the Moon launch, with huge technical hurdles and challenges to overcome. These challenges brought out a wave of top talent wanting to be a part of the project, to get a piece of that “cyber NASA,” as they liked to call it.

        But problems emerged quickly. You may have heard of “feature creep,” where new features keep getting added at the last minute. My uncle described this as “feature flooding,” where whole pages of new features would descend almost on a daily basis, often with “top mandates.” But there was never any time or money dedicated to building these features. Project managers kept overpromising, claiming that new features could be added in weeks when they would in reality take years to perfect. Nothing was ever tested. You had a “patch-and-go” mentality which collapsed into endless waves of bug fixes.

        At the end, the notion of building a functional system was a joke, and all anyone could do was fake the front end to keep the managers and politicians happy. The whole thing turned into a disaster.

        And a tragedy. Some of the promised technological developments from Obamacare would have had direct application in merging banking networks. One tragic loss was a kind of learning AI translator that could learn how to translate between incompatible databases, eliminating the need to merge incompatible systems altogether. It’s still something of a holy grail for network managers, as there is an intense need for such technology.

        But what was developed was so buggy as to be non-functional. And it would take too much money and time to go back and revamp the system, so this non-functional AI software is the foundation of all current attempts at this important technology. But rather than investing in getting the system to actually work, the IP owners would rather monetize what they have and sell it as is. So it doesn’t work, never has worked, never will work, yet it is still the cornerstone of all current large-scale networking systems. But you had better believe they make a ton of money selling it over and over and over again.
