By Bob Goodwin, an investor and medical device entrepreneur who lives in Mercer Island, Washington
Several bellwether software initiatives have gone off the rails over the last five years. I am going to focus on one, because I learned about it on Naked Capitalism, and it is where I first saw the expression “Code is Law”. I hope that when history is written, this example will stand out as showing how the anarchist nerds we call software engineers inadvertently started to hijack public institutions. More cynically, we could believe that the software engineers were just useful idiots serving corruption. But conspiracy theories are not necessary for this story, and given the pace of events, I doubt there were conspirators smart enough anyway.
Before we dive into this story, we should keep perspective, as the history of early industrialization is also filled with anecdotes about the overreach of engineers and their subsequent failures. I doubt these stories were widely reported at the time, and I like how industrialization has turned out.
In this story, software engineers summarily rewrote policies that had been in place for centuries, arbitrarily replacing historical procedures run by clerks, land offices and judges with a buggy central database. This is the story of the software used for mortgage securitization leading up to the credit crisis. Property ownership is fundamental to money and credit, which in turn are fundamental to capitalism, and therefore should be treated with great care. There are many examples of societies tinkering with property ownership, only to see their economies damaged in predictable sequences of setbacks involving money, credit, trade and investment.
One of the legs of the credit crisis was how title to real estate, which had served as collateral for individual real estate loans, underwent significant and perplexing changes through the process of bundling large numbers of these loans into mortgage-backed securities. Only after the acute phase was over, as more and more homeowners faced foreclosure, did it become clear that the well-established processes for recording and transferring title had become corrupted. This collapse was not due to computer science, nor was it central to the larger crisis. In a normal era, this story would not have gone largely unnoticed. But even with the diligent efforts of lawyers and academics to bring attention to the issue and prevent it from becoming acceptable, official policy was to save the banks from possibly fatal liabilities, which meant papering over problems with mortgage title. Forgiveness was seen as a small price to pay to divert attention from the carnage. As a result, the role of MERS and other “innovations” was never front-page news.
But the point of view underlying MERS, and the other ways the traditional, slow, but extremely reliable process of recording title and transferring borrower notes was circumvented, is consistent with widely held beliefs in the software industry. Software engineers had long ago decided that paper and humans were inefficient, despite our long history of governments tracking ownership via paper, authority and convention, in a way a human judge could verify conclusively later.
I should rephrase this. Software engineers did not know how to control paper, city halls or judges. So they wrote them out of the software.
The consistency of the paper process tended to be a disincentive to bad actors, as they were at risk of being exposed later. Reduction in fraud benefited the regular guy, which is nice, but not the point. Consistency was essential to the capital markets. Moving money is not very profitable per dollar, but is workable purely on the basis of scale. Moving money is also about playing the odds. Some transactions go bad. But the entire waterworks collapses if every bad transaction is contested and inconsistent outcomes occur. Only a nearly frictionless pipeline of money stays open. Twitchy financial types are quick to close pipelines when risks are not understood.
Today if you get a mortgage, there is a lien on your house to secure the debt. Every lien is recorded in some government office using paper. Mortgages often change hands, because the companies that are best at writing loans are not always the best at managing payments. Other mortgage companies specialize in troubled loans, and so on. More recently, an industry has grown up around packaging groups of similar loans together to make them marketable as securities.
History may look back on securitization and decide its primary value was to lower transparency and allow the corrupt to sell junk as gold. But at the time it was invented, I am certain there were people who saw the mathematical potential to improve efficiency by creating a secondary market for risk. And just as the mortgage market needed inexpensive and consistent methods of managing a low-margin portfolio, securitization was even lower margin, and relied even more on very high volumes.
The cost of a 19th century paper trail was inconsistent with a high speed late 20th century risk pooling market.
I try to imagine myself in those software meetings that overrode hallowed laws. Triage is a big word in software, because Pareto was taught to us on our second day of work. The 80/20 rule says you get 80% of the benefit from 20% of the work. We only wanted to do 20% of the work. Software types are at our most gifted and careful when analyzing costs and benefits, lest we have to start all of our work over again later. The engineer quickly discovers that an individual mortgage has a failure rate of X. And X, which is low to begin with, becomes even smaller in practice, because only a very few foreclosures are contested, so the requirement for a paper trail is virtually zero. “Virtually zero” means that when problems need resolution, people can get on airplanes and negotiate a fix for each case individually. That cost is guaranteed to be far lower than the cost of building a complete replacement for the 19th century paper trail. Looking back, I can see the error in the engineer’s analysis: we now know the cost was not anywhere close to zero.
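The triage arithmetic in that imagined meeting can be sketched concretely. Every number below is invented for illustration; the point is only the shape of the comparison, and how it inverts once the contested rate stops being tiny.

```python
# Hypothetical back-of-envelope triage arithmetic, as an engineer in that
# meeting might have run it. All figures are invented for illustration.

loans = 10_000_000           # loans flowing through the system
default_rate = 0.02          # X: fraction of loans that fail
contested_rate = 0.01        # fraction of failures actually contested in court
per_case_fix_cost = 20_000   # fly someone out, negotiate each case by hand

full_system_cost = 500_000_000  # build a complete replacement for the paper trail

contested_cases = loans * default_rate * contested_rate
manual_cost = contested_cases * per_case_fix_cost

print(f"contested cases: {contested_cases:,.0f}")     # 2,000
print(f"manual fix cost: ${manual_cost:,.0f}")        # $40,000,000
print(f"full system cost: ${full_system_cost:,.0f}")  # $500,000,000
```

Under these assumed numbers the 20% solution looks more than ten times cheaper, and the meeting ends there. The analysis breaks when a crisis multiplies the contested rate: raise `contested_rate` a couple of orders of magnitude and the manual approach dwarfs the cost of having built the system properly.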
Now the supreme irony is that I’ve depicted the origins of MERS as software engineers gone wild. It’s certainly plausible as history, but what actually transpired is worse. Some people at Fannie Mae, Freddie Mac, Ginnie Mae and the Mortgage Bankers Association looked at the Depository Trust Corporation, a system which eliminated the physical delivery of securities on Wall Street by creating a central depository, and thought it would be great to create a version for the mortgage industry. But even though real estate is governed by state law, with important differences state by state, this group decided they could punt on reviewing the state law issues. They’d just go ahead with their new system and the courts would comply. And the worst of it is that the courts have largely complied, even though the MERS database was built without the protocols considered basic to database integrity, like audit trails and measures to ensure accuracy (such as dual keying and management oversight). So we got the worst of all possible worlds: management thinking like software engineers (conforming requirements to what looked most useful, rather than replicating an existing process) and whoever did the database part acting like MBAs.
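For readers outside the industry, an audit trail, one of the basics the MERS database reportedly lacked, is not exotic. The sketch below is a toy illustration of the idea: changes are appended to an immutable log rather than overwriting records in place, so any past state can be reconstructed and attributed. All class and field names here are my invention, not a description of how MERS actually worked.

```python
from datetime import datetime, timezone

class AuditedRegistry:
    """Toy loan registry with an append-only audit trail. Every change is
    logged with who made it, when, and the prior value, so past states can
    be reconstructed later (e.g. for a court). Purely illustrative."""

    def __init__(self):
        self._log = []       # append-only: (when, user, loan_id, field, old, new)
        self._current = {}   # loan_id -> {field: value}, the "live" view

    def record_change(self, user, loan_id, field, new_value):
        record = self._current.setdefault(loan_id, {})
        old_value = record.get(field)
        # Log the change before applying it; the log itself is never edited.
        self._log.append((datetime.now(timezone.utc), user, loan_id,
                          field, old_value, new_value))
        record[field] = new_value

    def history(self, loan_id):
        """Every change ever made to a loan, with actor and prior value."""
        return [e for e in self._log if e[2] == loan_id]

reg = AuditedRegistry()
reg.record_change("clerk_a", "loan-001", "servicer", "Bank A")
reg.record_change("clerk_b", "loan-001", "servicer", "Bank B")
for _, user, _, field, old, new in reg.history("loan-001"):
    print(user, field, old, "->", new)
```

An overwrite-in-place design would show only “Bank B” with no record of who made the change or what it replaced, which is precisely the evidentiary gap the foreclosure cases exposed.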
And once this was in place, there was no going back. We are so dependent on software, that noncompliance with software is not an option. As we read the rest of the story, we can see how it came to be that “Code is Law”.
This story is most instructive not on how it happened, but on how it was resolved. In the absence of a useful paper trail from mortgage companies, the mortgage companies were found to be retroactively creating one. A “retroactive paper trail” is a kind way of saying they forged documents for a court. Sometimes these forgeries were proven in a courtroom to be convenient fictions.
We would like to believe that the courts would have followed the letter of the law. But of course that is not exactly what either the courts or the laws are for, at least in America. The highest goal of property law is to keep money and products flowing in the economy, through efficient and consistent resolution of the occasional contract failure. A literal reading of the law, and a literal interpretation of the facts, would have gummed up the economy, achieving the exact opposite of that primary goal. Many judges saw this immediately. Rather than follow the written law, most courts effectively accepted that software was some form of common law. Even if the offending software were discarded or fixed, or the laws changed retroactively, we now know conclusively that software has the power to veto policy.
This is not the only story like this. We know that no customer can negotiate with anyone who accesses their company through a computer, unless that person has been granted authority to fix certain problems, and then only in specified ways. Computers don’t negotiate. Code is law. We know that companies can create excruciating telephone experiences for customers seeking redress, because code is law. But the mortgage story detailed above is a uniquely big deal, because the law that was overruled by code was important and well considered, and the courts unambiguously codified the primacy of software over law. In order to return the universe to its proper configuration, laws might be used to regulate software.
Over thousands of years, policies have correctly (or not) guided our use of land and water, built cities, and earned security for civilizations. Two policies in the last century have doubled our lifespans and increased our wealth astronomically: energy (drilling, processing, roads, industry and utilities) and public health (clean water and food, work environments, health education). However great the software industry may believe its power to be, it is small in this larger context. Law and policy must always win. Law and policy can of course be inconsistent when one is moving faster than the other. But policy is delegated to men and women who are given this authority by traditions of government forged from blood.
Loss of confidence in our institutions comes at great cost, and I see the failure of Healthcare.gov, and what it means for both software and law going forward, as having the potential to reduce already low levels of trust in our government even further.
Keep in mind loss in confidence in our institutions is not that rare. I am barely old enough to remember the campus riots during the Vietnam War, but plenty old enough to remember the malaise that Jimmy Carter spoke of. Three consecutive events occurred in the same decade that each spoke to the incompetence or impotence of the federal government: the war itself that ended in 1973, the oil shock, with America reduced to rationing, and inflation, which brought 19% mortgages and virtually zero improvements or repairs to any asset with a long life.
It is a stretch to draw a line from ObamaCare’s problems to a great malaise. However, every consequence of the 1970s flowed directly from the inability of our federal government to implement its stated policy. Failure begets failure. The 1970s object lesson speaks to why we should all be adamant about the competency of government to achieve stated policy.
The first casualty of cascading failure would be liberalism, but only temporarily. Even though I am not a liberal, the shining example of liberalism is its tradition of proactively building social institutions, from activism against slavery and, before that, institutions of education. In this century much of the growth in the federal government was a direct extension of that long history. It is certain that the hunger to expand institutions will be diminished when a straightforward policy is achieved at great political cost and then cannot be implemented. But while I hope the lesson will shift liberalism toward more competent institutions and away from pure resource diversion, liberalism will likely prosper so long as it adapts. There is no activist alternative to liberalism today (although I see candidates), so the only way liberalism falls is through repeated examples of social institutions that aren’t successful, and are mostly used as organs of corporatism.
The second casualty will be confidence in the public sector, which has long been under attack. I fear this outcome the most. We are justifiably angered at incompetence in the public sector, mostly because it is funded from taxes without recourse. We also get angered at all other incompetence, but somehow sort the anger into different compartments. And I don’t think viewing the public sector as incompetent is fair.
I definitely trust public institutions to collect taxes and fight wars, and prefer to buy commodities from private institutions; public and private models have different sweet spots. Public employees are my neighbors and friends. I talk to them. I hear about their challenges, struggles, and goals. They are passionate, hardworking and committed Americans. Education, crime, and war are managed imperfectly, but better than ever before, and amazingly well considering the obstacles and the lack of money-based incentives. Public employees have also seen great improvements in both efficiency and accountability without that mechanism. When there is confidence in an institution, good intentions are transmitted.
Most institutions, public and private, have exasperating and obvious failures, due to forces that work in opposition to good management. But netting out all these forces, we usually end up with functional public institutions that differ from their private counterparts mostly in their slower rate of adaptation, higher expectations of conformance, and greater interest in individual outcomes. Government employees did not mismanage a software project because they had poor skills. Software engineering has always been an obstacle to the top-down propagation of policy. It has been hard for every industry that has used it, but government has found it hardest to adapt.
ObamaCare may fail, and it is possible that failure will cause a chain of other bad outcomes. It seems more likely that any means necessary will be used to avoid that scenario. Placing that principle first, many possible outcomes can be dismissed as unlikely. A political fix that enables the ObamaCare law to achieve the minimal acceptable outcome (a large reduction in the uninsured, pre-existing conditions unable to prevent insurance) would ordinarily be the obvious remedy, but since the law was voted along party lines, that path may be closed.
Doctors, hospitals, and insurers will all resist disruption. Fixes to the law will impose new costs on all of them, since they’d already planned for Obamacare as first written. The Administration has gone from treating the insurers and Big Pharma as its real clients to being more willing to show some muscle as the possibility of failure looms.
The tough part is patients, or as the Administration likes to call them, consumers. Government cannot coerce the population as easily as it can institutions. ObamaCare is definitely losing ground in public perception. To oversimplify perception, all the opposition needs is for something fundamental to be going wrong that people can easily relate to. Once they have public attention, it becomes a lot easier to drive wedges such as “He lied that I could keep my coverage.” If we ever get to the point where virtually everyone knows someone who has a bad story to tell about the law, then public attention could last indefinitely, a health care version of the Iran hostage crisis, which sapped Carter’s waning credibility. There are plenty more wedges that opposition can drive, such as when a debt ceiling debate includes “He lied to us about taxpayer liabilities from Obamacare.”
My belief is that absent the public attention, Obama could have finessed all the challenges. The perception of incompetence has to be eliminated, preferably by eliminating the incompetence. The perception and the fact of incompetence is attributable to the software. It may be necessary to abandon the website completely, but this really solves nothing. Computers are still needed to determine eligibility, and insurance companies will now have to create the same services from scratch that the government has been working on for years. It seems possible that software is what is preventing the success of the law.
And this is where I get really uncomfortable, because even if we pull this one through somehow, there will be another. As public policy becomes increasingly dependent on information, large policy changes are becoming harder to implement. The headwinds are growing; it is only a matter of time before big policy can no longer outrun them.
Perversely, policy is similar to software. Both have never-ending dependencies, and tame chaos mostly by creating predictability rather than optimal outcomes. Both are prickly professions precisely because tinkering almost always makes things worse. Both are opaque, and both harden like concrete into odd shapes. Both are managed mostly through small actions at the periphery, and are structurally changed only with vast resources, with those changes motivated by strange alliances of interests.
The deeper observation behind this intuition is that policy has always been information-bound, even from the beginning of time. And the pressures on policy in an age of exploding information are very large. This is the really good news in this essay. Even if the transformation of our society leads us through a valley of despair, we can still expect the long-term competence of government policy to accelerate like other parts of the economy. The promise of the power of government policy can be achieved once new principles of information management become part of the institutions of government policy.
Perhaps software engineering was right on this point all along. With enough complexity, planning becomes impossible. Only by distributing accountability and constructing from the bottom up can there be any hope of rationalizing all the possible outcomes. Stabilizing pockets of complexity is a prerequisite to planning what to build on top. The problem of course is that the government contractors implemented bottom-up management (they had to use the resources they could get), and then performed a top-down cargo cult for the benefit of the government.
To fix this we will need to start implementing policy bottom-up within government, and then demand that the software industry supply building codes and transparency, so that the bottom-up policy makers are masters of their software rather than its victims. If policy really is like software, additional advantages will accrue to the builders of policy. Bottom-up policy making allows experimentation, and allows connected branches of policy to adapt to experimentally discovered optimizations. The organism becomes self-correcting in a way top-down systems cannot be. Some top-down features are unavoidable, but the interface between top-down and bottom-up needs to be reconsidered. We need funding, for example, and we need an understanding of how to measure success. This is a difficult management problem, but not an unsolvable one.
But it would represent radical change in the configuration of public and social institutions. Such changes require confidence, and so the loss of confidence at this juncture would be a particularly costly setback.
There are a few of us who have become the scientists and practitioners of the complexity that comes from the massive management of information. It is messy, it is imperfect, and it is far more art than science. I sit off to the side as a software engineer, and gently raise my hand to say “Policy always was software”. “Code always was law”. We just never realized that before there was software there still was code, executed by clerks rather than machines. Let’s adapt to our brave new world, and create an even higher platform for the human experience, by building more powerful institutions and more passionate professions. The consequences if we fail are costly indeed.