The Mother of All Demos, presented by Douglas Engelbart (1968)

And now for something completely different. In the Mother of All Demos (MoAD), Douglas Engelbart demonstrated the NLS (“oN-Line System”) to 1,000 computer professionals in San Francisco’s Convention Center. Arguably, Engelbart’s demo was more important than anything else going on in San Francisco in 1968. Here’s the video:

Now, the demo is one hour and forty minutes long, so unless you’re a geek or you love the history of technology, this probably isn’t the video for your morning coffee. But if you take a few minutes to skip through it, you will see the first demonstrations of:

  • The mouse
  • The cathode ray tube (CRT) monitor (what we all used before flat screens, if you can imagine such a thing)
  • The graphical user interface (GUI) with windows (as opposed to the previous state of the art, the teletype)
  • The word processor (with cut, copy, and paste)
  • The outliner 
  • Computer generated slides
  • Hypertext with clickable links
  • Video-conferencing
  • More abstractly, “the file,” with file name, creation date, and creator, with navigation through a hierarchy of files
  • And Herman Miller office furniture!

And much, much more. To me, the MoAD is an astonishing moment in time. These days, we revere Steve Jobs for his presentation skills and trade show demos, his aesthetic sense, and his executive skills, but compared to Engelbart, and his team with their Mission Control ties and haircuts, isn’t Jobs a rather trivial figure? After all, when the history is written, who will be found to have had the more powerful reality distortion field? Jobs or Engelbart?

However, this isn’t a history, even a potted one, of the work Engelbart and his team did at the Stanford Research Institute’s Augmentation Research Center (you could start here). Or a history of computers. Rather, I’d like to share a few reflections that seeing the MoAD conjured up for me.

* * *

The MoAD took place in 1968. Simplifying the family tree, the NLS begat Xerox PARC’s Alto in 1973. The Alto (more or less) begat the Apple Lisa (1983) and the Apple Macintosh (1984), where the GUI with a full-fledged WIMP (Windows, Icons, Menus, Pointing device) interface had its first mass-market commercial success. Microsoft Windows followed at a later date. Excel followed much later, and much later than that, we had finance types crashing the world economy using Excel spreadsheets neither they, nor anyone else, could understand. But I digress.

My point is that 1984 – 1968 = 16 years. That’s the time it took for the technology that Engelbart demonstrated to be adapted successfully for commercialization in the mass market. And 2013 – 1968 = 45 years. That’s the time that Engelbart’s vision — or, at least, a version of it — has, more or less unchanged, dominated the way that humans interact with computing machinery. We think of computers and the computer industry as evolving very rapidly, and being very dynamic, but are they not, in reality, remarkably static? To put this another way, I’ve been living in Doug Engelbart’s world for almost all my adult life. (I bought my first Macintosh, a 512KE, in 1986.) And so have many of you….

* * *

My Mac 512KE totally empowered me (“augmented” me, as Engelbart would say) as a human being, at least after I managed to install more memory (it wasn’t easy) and got a Bernoulli drive (300 megabytes of storage, as I recall). The Mac was absurdly easy to set up, far easier than, say, model trains: I lifted it out of the packaging, set it down on my desk, plugged it in, and there was the soon-to-be familiar Happy Mac face. Then the screen saver kicked in — some sort of dark, oscillating, sine-wave pattern — and it took me many hours to figure out that to make the screen saver go away and get to “the desktop” I had to move the mouse….

Anyhow, being “augmented.” I come from a literary background, and through most of my life up to that point I had been attempting to write… Something. Something worth reading. On several typewriters, then on an IBM Selectric Composer that actually justified type (with fonts, dammit)… But I was never able to complete anything. Not anything. I’d start strong, and get part of the way along, and then start polishing, and never find my way through the forking paths to a conclusion. (The thought of using index cards did not occur to me.)

Enter the outliner! One of the features of Engelbart’s NLS. For the first time, I could write down my thoughts as they came to me, and then drag-and-drop them into the order I wanted. There was no cost, no friction to writing: no paper, no platen, no crumpling and throwing away…. And because there was no cost, I didn’t have to self-censor; I could toss out an idea, develop it, or just throw it away. Learning to use that simple outliner — Acta; I’m using its descendant, Opal, right now — was one of the most freeing and joyous experiences of my life. (Okay, okay….) I wrote an article (for my computer user group) in a few hours. Then I wrote another one! (Of course, later Microsoft came in and killed the entire product category with the hideously clunky and ludicrously inferior outline mode in Word, but that’s another story.) To quote Ursula K. Le Guin’s Genly Ai in The Left Hand of Darkness, the goal of Engelbart’s vision was “the augmentation of the complexity and intensity of the field of intelligent life,” and that goal was achieved for me.

* * *

Finally, and however, this. (Amazingly, a transcript of the MoAD was put up on GitHub just this month.) Toward the end, Engelbart says:

ENGELBART: Anyway, one of the interesting things that NLS does, just an advantage of being online is that it keeps track of who you are and what you’re doing all the time. So on these statements, uh … on everything, every statement that you write it keeps track of who you are and when you did it.

Um. To be fair, “online” to Engelbart doesn’t mean “on the Internet,” as it would to you or me today; toward the end of MoAD, Engelbart mentions an “experimental network” that he hopes will be able to “transmit across the country” in the following year; that’s ARPANET, the predecessor of the Internet.

Nevertheless, it’s a bit saddening that the technology that has brought me such a sense of power and joy has also enabled a digital version of the Stasi, and that this possibility, too, was implicit in “augmentation” from the very beginning. I wonder what Engelbart thinks about that?

Readers? What do you think?



74 comments

  1. Black Smith

    You shouldn’t be talking about the “drive of clicking death” like it was a positive thing!

    1. from Mexico

      I sort of agree.

      When we look at these great leaps in the technology of human communications — writing, the printing press, the internet — what have they actually achieved?

      They have greatly increased the amount of information at our fingertips and have revolutionized social, economic and political organization, ushering in an era of huge territorial states with hundreds of millions of inhabitants, then empires that spanned the world, and ultimately, in our own era, globalization.

      But despite all the earth-shattering advances in information and technical capabilities, advances which are undeniable, what have the advances in values, ethics, politics, and life philosophies been? Modernism is predicated on the doctrine that as the former advanced, the latter would march along side-by-side.

      As we look about the world today, however, I believe reality places a very large question mark over the assumptions of Modernism, Positivism and the Enlightenment.

      1. sufferinsuccotash, moocher

        The takeaway from the 20th century is that even as Enlightenment evolves, so does Barbarism, which frequently makes use of means devised by the Enlightenment. Fascism, along with its seductive appeal to part of the shell-shocked post-WW1 intelligentsia, was a case in point.

    2. Ned Ludd

      I don’t think Bernoulli drives suffered from the click-of-death. The Bernoulli discs I had were 20MB, with the drive advertised as “infinite storage”. The discs would degrade and suffer data loss, but the disc-destroying click-of-death plagued Iomega’s successor to the Bernoulli drive, the Zip drive:

      Iomega Zip drives were prone to developing misaligned heads. Dust inside the Zip disks or dirty heads caused by oxide build-up could misalign the heads, but in newer devices it was due to poor quality control and manufacturing defects in the drive itself. Magnetic fields could also cause the drive heads to become misaligned, as the drives were not internally shielded from external magnetic fields. The heads caused the data on the cartridge to become misaligned, rendering it unreadable.

      How many college students lost how many papers to the click of death?

  2. LucyLulu

    That’s amazing. I believe he even pre-dates the introduction of the C programming language by Dennis Ritchie at Bell Labs that paved the way for smaller OS’s that could use the limited memory of a desktop computer. My understanding is that in the late 60’s, everything was done on mainframes large enough to take up entire rooms.

    You know, Apple didn’t introduce the first word processing/text editors, though they did introduce the mouse and a cool GUI. There was already a text editor like vi, still popular among Unix and Linux users (I use it when on one of these), though not as user friendly as word processors we know today, no mouse or menus, only keyboard commands to memorize. WordStar had come out for some archaic OS I don’t recall, and a couple years later was ported to DOS when it came out in the early 80’s. WordPerfect came along a few years later for DOS. Thinking back on DOS, and the development of the internet from a very rudimentary protocol with no WWW feature, available really only at universities and military installations, to the widely used commercial enterprise with its flashy media features today, it DOES seem like technology has grown by leaps and bounds.

    I agree about Steve Jobs. He was a commercial genius, knowing how to identify what people wanted and to bring it to them. But he wasn’t a genius/pioneer in the sense of inventions that would change the course of computing. Those people remain relatively unknown.

    1. alex

      “introduction of the C programming language by Dennis Ritchie at Bell Labs that paved the way for smaller OS’s that could use the limited memory of a desktop computer”

      C (developed starting in 1969), while a great language for many purposes, is not what made it possible to put an OS on a desktop. If anything the older practice of writing OS’s in assembly language has a bit of an advantage there (more so back then, when compilers weren’t as good as today). Also, there were many other “systems implementation languages” that could have done the job.

      Sorry to be pedantic, but a compulsion is a terrible thing to waste.

    2. ex-PFC Chuck

      WordStar had come out for some archaic OS I don’t recall

      I believe that OS was CP/M, for the Intel 8080 microprocessor chip. That’s the OS and CPU chip that was on the Osborne 1 that I bought in 1982, my first computer. Cost $1799, IIRC, and that included 64K bytes of main memory and two floppy disks, each with a 92K byte capacity! It came with WordStar, SuperCalc spreadsheet (a knockoff of VisiCalc as were Lotus 123 and Excel), and the dBase 2 data base management system. Each was on one of the floppies that came with the package.

    3. Stephen Nightingale

      > I believe he even pre-dates the introduction of the C programming language by Dennis Ritchie

      Well ‘C’ is just an arbitrary point in the evolution of programming languages. It developed from Martin Richards’ BCPL (1966) – which was type free – which was a cut-down, workable version of Chris Strachey’s CPL (1964) – which arose as a design exercise out of the work done on Algol (1960).

      Doug Engelbart was preceded also by Donald Davies’ ‘Data Communications Network’, the first working packet switch, that effectively powered the world’s first local area network, in the Computer Building at the National Physical Laboratory. We used to take afternoon tea on the First Floor Landing and Donald Davies would show up sometimes.

      Those were days when Giants walked the earth. On both sides of the Atlantic.

    4. diddywa

      Right, no one should think of Jobs as a great inventor.

      But come on (not you but the thread in general).

      Jobs was a visionary of world-historical stature, and changed technology forever, for good or ill. No one can say it would have taken the same path without him.

      Yes, about the outsourcing, yes, ok, if you’re right about the ruthlessness. But the way he came back to lead Apple back from the grave? And the determination to create, to produce.

      Real artists ship.

      1. DANNYBOY

        A Genius Who Discovered His Intuition in India

        When different cultures meet, a creative spark emerges, ready to be captured by those flexible enough to bring together the best of two worlds, geniuses like Steve Jobs, Apple’s founder who passed away in 2011. Though Jobs was loosely affiliated with Zen Buddhism later in life, Hinduism and India were fundamental in forging his views of the world.

        A nonconformist influenced by the romantic 1960s, wearing long hair, Jobs’ dream was to visit India, inspired by his friend Robert Friedland who had just studied with Neem Karoli Baba. Arriving in Delhi in April 1974, he fell sick for days. After recovering Jobs headed to Haridwar, where a Kumbha Mela took him by surprise: “There were holy men all around, people riding elephants, you name it.” But when Jobs arrived at Neem Karoli Baba’s ashram, the guru had just passed away. Jobs stayed in a room, sleeping on the floor, where he found a forgotten copy of Paramhansa Yogananda’s Autobiography of a Yogi — a book he would reread every year, the only one Jobs ever downloaded to his iPad 2.

        Having missed the chance to see Neem Karoli Baba, Steve wandered. A peculiar incident, perhaps an initiation, marked his trip: “I was walking around in the Himalayas and I stumbled onto this religious festival. There was a baba who was the holy man of this particular festival, with his large group of followers. I could smell good food. I hadn’t been fortunate enough to smell good food for a long time, so I wandered up to pay my respects and eat some lunch. For some reason, this baba, upon seeing me sitting there eating, immediately walked over to me, sat down and burst out laughing. He didn’t speak much English and I spoke a little Hindi, but he tried to carry on a conversation. Then he grabbed my arm and took me up this mountain trail. Here were hundreds of Indians who had traveled for thousands of miles to hang out with this guy for ten seconds and I stumble in for something to eat and he’s dragging me up this mountain path. He laughed and laughed. We get to the top of this mountain and there’s this little well and pond at the top of this mountain up in the Himalayas, and he dunks my head in the water and pulls out a razor from his pocket and starts to shave my head. I’m completely stunned. I’m still not sure why he did it.”

        Steve discovered intuition in India. “The most important thing that had struck me was that Western rational thought is not an innate human characteristic. The people in the Indian countryside don’t use their intellect like we do, they use their intuition instead, and their intuition is far more developed than in the rest of the world. Intuition is a very powerful thing, more powerful than intellect, in my opinion. That’s had a big impact on my work,” Jobs later recalled to his biographer. “If you just sit and observe, you will see how restless your mind is. But over time it does calm, and when it does, there’s room to hear more subtle things — that’s when your intuition starts to blossom,” he said.

        After returning from India, Jobs and his friend Steve Wozniak founded Apple Computer in his parents’ garage. From there on, he changed the world.

        Steve had two pictures of Neem Karoli Baba in his room when he died, 35 years after his India trip. His sister Mona Simpson wrote of his last conscious moments: “Before embarking, Steve looked at his sister Patty, then for a long time at his children, then at his life’s partner, Laurene, and then over their shoulders past them. Then he spoke his final words:

        “OH WOW. OH WOW. OH WOW.”

        (source: Hinduism Today, Apr/May/June 2012)

  3. David Lentini

    Engelbart’s work is still amazing. Seeing him work the “cave mouse” is a lot of fun. (I haven’t used a three-button mouse in years.) I wonder why the four-key device at his left hand never saw the light of day.

    And it’s depressing that an a’hole like Steve Jobs gets the credit for inventing this technology. In fact, thinking about Apple’s role in bringing this technology to the market after the giant Xerox corporation sat on it for so many years (ever see Xerox’s video on using e-mail in the office from the ’70s?) makes looking at Apple today sad and ironic.

    And yes, it now looks like that great technology will be used to control us in ways that I suspect Engelbart would never have dreamed. I imagine that, like many inventors, he was so filled with the greatness of the applications of his new technology, and so fortified by the very idealistic ethos of that time, that he couldn’t imagine Americans succumbing to the growing police state we have today.

    And isn’t it ironic that a young Mitt Romney was likely impersonating a police officer right outside Engelbart’s office?

  4. David Lewis

    Amazing — thank you!

    But what’s the gadget Engelbart is using with his left hand? Looks like it has four or five keys that he operates with his fingers to effect larger commands with single keystrokes rather than with words on the keyboard.

    Whatever it is, seems like it did ~not~ become part of the future that the other aspects of the demo so brilliantly presaged.

  5. Gil Gamesh

    The State relentlessly subverts all tools (i.e. technologies) to its ends. That realization caused Oppenheimer to kill himself. All scientists and engineers should understand this inevitability.

    Digital Stasi…that’s a good one. Should be the new name for the NSA.

    1. from Mexico

      That is an extremely dark, pessimistic, deterministic, and defeatist view. It is shared, nevertheless, by both extremes of the political spectrum: the neoliberals and right-wing libertarians on one side of the political divide, and the Marxists and New Left on the other. The end result is anti-politics. Here’s how Robert Hughes put it:

      The intellectual, under these circumstances, is thought to be as helpless against power and control as a salmon in a polluted stream, the only difference being that we, unlike the fish, know the water is poisoned.

      Thus, by the theory, we are not in control of our own history and never can be… It would be difficult to find a worse — or more authoritarian — dead end than this. John Diggins, in The Rise and Fall of the American Left, puts it in a nutshell: “Today the intellectual’s challenge is not the Enlightenment one of furthering knowledge to advance freedom: the challenge now is to spread suspicion. The influence French post-structuralism enjoys in American academic life…answers a deep need, if only the need to rationalize failure.”

      — ROBERT HUGHES, Culture of Complaint: A Passionate Look into the Ailing Heart of America

      1. Goat_farmers_of_the_CIA

        If all Marxists have become reactionaries, Señor de Mejico, then you would need a particularly paranoid, conspiracy-theorist mindset to explain away the good work of the Trotskyists at wsws.org.

        Once again you follow the diktats of your beloved Arendt. It has been kindly pointed out and explained to you, dozens upon dozens of times, why Arendt’s interpretation of Marxism was deeply flawed and simply wrong, but you go on quoting her on the subject as if it were Gospel truth. A month or so ago, in reference to another of your fits of Arendt-block-quoting, I replied with some quotes from an article by Corey Robin explaining why your sacred dogma, Totalitarianism, is in fact the weakest part of Arendt’s best known book, due in no small part to its psychoanalytic, ahistorical tendencies.

        If it became famous, as Robin shows, it was because it fit the reductionist mindset of the Cold War, still very much alive today, allowing warmongering pseudo-intellectuals such as Samantha Power (who wrote the introduction to the latest edition of your beloved book, btw) to paint the world in very profitable (to her career and the interests of the elite she works for) black and whites, goodies and baddies. Every time I come across your uncritical quoting of Arendt, I am reminded of this paragraph from Robin’s aforementioned essay:

        “Once a week, it seems, some pundit will trot out her theory of totalitarianism, dutifully extending it, as her followers did during the Cold War, to America’s enemies: al-Qaida, Saddam, Iran [and Marxism!]. Arendt’s academic chorus continues to swell, sounding the most elusive notes of her least political texts while ignoring her prescient remarks about Zionism and imperialism. Academic careers are built on interpretations of her work, and careerism, as Arendt noted in her book on Eichmann, is seldom conducive to thinking.”
        http://www.lrb.co.uk/v29/n01/corey-robin/dragon-slayers

        And by the way, talking about reactionaries, how come you are still quoting the late Fuentes, notorious now for writing an embarrassingly fawning introduction to a book on Venezuela’s Cisneros clan, the owners of a media empire comparable to that of your own Azcarragas?

        1. from Mexico

          Are personal attacks and character assassinations the only type of argumentation that you are capable of?

          What is it that you find so objectionable about responding to what is being said, instead of who said it?

          1. from Mexico

            And let me be very clear what I mean by Marxist anti-politics. It begins with the Marxist notion that political power merely or largely reflects economic power, and with Marx’s estimate of the state as invariably being an instrument of oppression in the hands of the ruling class.

            After the revolution and the transformation and apotheosis of man, Marxists imagine a politics-free world where the state has withered away. Engels described this state of idyllic harmony as being one with “no soldiers, no gendarmes, no policemen, prefects or judges, no prisons, laws or lawsuits.”

            Even though John Gray in Al Qaeda and What It Means to Be Modern explains how the Marxist vision came about, the best statement I’ve found of it is from the 1955 New International:

            Against that day when the state has withered away and the class struggle is a bad dream, we offer the hypothesis that poets will quarrel on a mass scale with nothing more damaging than verbal violence over the use of meters, and painters will rend the air in a dispute as to the use of solid colors and abstract art. There will be all kinds of struggles. All except the class struggle.

            If men are not chained to the machine and the tractor, what productive activity will engage the free energies and minds of men? Being a child of the great German philosophic tradition, Arendt must know that Schiller defined art as the realm of freedom. When men are free from the compulsion of labor, the creative transformation of society, nature and man himself stands on the order of the day. In Literature and Revolution, Leon Trotsky foresaw the day when men would level mountains in one area and raise them in others, turn arid deserts into singing gardens and reshape even their own bodies to their own desire. And in this day of atomic energy, Trotsky’s imaginative visions become real possibilities – if capitalism is replaced by a socialist order.

            http://www.marxists.org/history/etol/newspape/ni/vol21/no02/falk-stein.htm

          2. Goat_farmers_of_the_CIA

            What? Where have I attacked the messenger for what she is instead of for what she says? That is what Robin does in his essay, and that is what I do when addressing your second-hand (because you do it through, or merely because of, Arendt) attack on Marxism. That is why you can’t address these criticisms of a flawed intellectual you have turned into a saint, and merely block-quote her. If otherwise you are pointing to my attack on Fuentes, that is for a reason: you go on writing off other people for being reactionaries, but refuse to look at the reactionary thought of those you raise on a pedestal and hold to be beyond questioning. The childish quality (a logical fallacy: turn my message into something it isn’t, an ad hominem) and tone of your reply shows how thin-skinned you really are.

          3. Goat_farmers_of_the_CIA

            Your second reply, with the quote from the 1955 New International, addresses nothing. To quote from your original post:

            “That is an extremely dark, pessimistic, deterministic, and defeatist view. It is shared, nevertheless, by both extremes of the political spectrum: the neoliberals and right-wing libertarians on one side of the political divide, and the Marxists and New Left on the other. The end result is anti-politics.”

            Your own quote from the International is clear that the utopia (yes, I don’t agree it will come to that) it describes follows a struggle:

            “Against that day when the state has withered away and the class struggle is a bad dream…”

            I myself think the New Left have given up such a struggle, so lending themselves implicitly to reaction, but to throw other Marxist movements together with them and write them off as reactionary is plainly disingenuous. The New Left can no more claim to represent all Marxists than Obama all American liberals.

        2. skippy

          Samantha Power’s… ewwwwwww~

          Skippy… Utopian poster girl for the new american century is tooling up Arendt, um… the dead have no choice thingy… tacky…

          PS. DS would such opines be a tool of the deep state or not… witting or unwittingly

          1. JTfaraday

            “ewwwwwww~”

            That’s exactly what I was going to say! How did that happen? Exactly when did useful tool Samantha Power become some kind of Arendt scholar?

            Looking around today, I think St. Hannah’s analysis applies best to the US (and allies). Evidently, “someone” thought it would be best to prejudice reading of the text through preemptive interpretive framing.

            Will nothing escape these people?

          2. from Mexico

            When we recently saw the communists join forces with Italy’s liberals and right-wingers in a vicious, baseless and ungrounded attack on M5S and Beppe Grillo, it became obvious that there is something fundamentally wrong with communism.

            But this was not the first time we’ve seen the triumvirate of Marxists, Fascists and Liberals operate in concert. The same triumvirate coalesced following WWII to stomp out the popular democratic movements that had sprung up in Greece, Italy and Spain, as Noam Chomsky explains here:

            http://www.youtube.com/watch?feature=player_detailpage&v=2q0Wdk7ek7Q#t=4920s

            And as Chomsky points out, the communists were actually in the lead in stamping out these popular revolutionary movements.

            Marxism, like fascism and liberalism, is:

            1) Elitist
            2) Anti-democratic
            3) Hierarchical and authoritarian

          3. JTfaraday

            They’re an ideological sect that is convinced it’s Right, and Right about everything, like all the other ideological sects that are convinced they’re Right, and right about everything. Just like MMT and its gathering clique of irrepressibly nasty fanboyz.

            That’s why we saw the Marxist-Leninists at the Platypus Society set off to edjumacate le petite stooopid occupiers who were busy putting their butts on the line, getting arrested in the streets of Bloomberg’s NY.

            It’s like you’re telling us the earth is round.

            It’s not like there are any irrepressibly nasty Marxists on this blog. I think this is what is leading some people to think that the person with the problem is you.

    2. alex

      “That realization caused Oppenheimer to kill himself.”

      No, J. Robert Oppenheimer was a heavy smoker who died of throat cancer.

      He probably suspected that the work he is best known for would be subverted by the government from the get go, since he was working for the US government on an enormous weapons project under war time secrecy.

        1. different clue

          Nicotine is a powerfully addictive drug. Not every nicotine addict is/was suicidal from the start. Oppenheimer might just have been a terminal addict . . . as so many millions of others were/are.

          1. DANNYBOY

            you are correct about smoking. My dad was in the tobacco business for his whole adult life. I was addicted early and still remember the pain of withdrawal.

      1. Mark P.

        Boy, some historical ignorance about Oppenheimer here.

        [1] Einstein had the right of it: “Oppenheimer is in love with a woman who isn’t in love with him – the United States government.”

        [2] Most critically, Oppenheimer made the political mistake of getting in the middle of a turf war between the US Air Force and the US Army over the development of “the super,” the Ulam-Teller design for a fusion bomb.

        Seriously. It was about that simple. Go read the transcripts of his hearing —
        http://www.pbs.org/wgbh/americanexperience/features/transcript/oppenheimer-transcript/

        — and you’ll note that basically all the US Army witnesses testify that they’re absolutely convinced of Oppy’s absolute reliability and all the Air Force witnesses — along with Edward Teller and Lewis Strauss, by then at the AEC and a piece of work — come down against him.

        [3] Oppenheimer died of throat cancer, quite likely from chain-smoking. But the cancer could also have been from exposure to radiation while working on the bomb. Other heavy-hitters probably also went this way: definitely John von Neumann, from a pretty awful case of brain cancer in 1957; quite possibly, too, Richard Feynman, from his days working at Los Alamos.

  6. alex

    “We think of computers and the computer industry as evolving very rapidly, and being very dynamic, but are they not, in reality, remarkably static?”

    The reason it took 16 years is because the price of hardware had to drop to the point where things like GUI’s were cost effective to implement. A GUI is a nice user interface, but could you justify an extra $100,000 (or whatever it was in the 60’s) to have a GUI instead of a command line interface? It’s quite common in the history of technology for an idea to be thought of, and perhaps implemented, years before it becomes practical.

    So yes, the “pace of change” in software technology is overhyped. I’ve been in the electronics/software biz over 30 years (scares me to write that) and I’m not impressed by the rate of change of software. The hardware side has had vastly faster change. Most of the important principles of not just GUI’s, but even more fundamental things like algorithms, operating systems (including virtualization), compilers, networking, etc. were largely developed in the 50’s through the 70’s. Most of what you’ve seen since then is just software taking advantage of the dramatic improvements in hardware.

    There have, nevertheless, been serious improvements in some types of software, like voice recognition and natural language processing, machine vision, etc. But a smart phone with its apps is almost strictly a testament to the improvements in hardware.

    The hype can be used to justify age discrimination. The other day I read someone trying to justify why they didn’t want people with over 10 years of experience. They were in the smart phone app biz and said that the old fogeys didn’t understand how to work with a smart phone and its battery power restrictions and computing power, memory and mass storage limitations (as compared to a PC). I laughed. Sounds like what I and a lot of other people were doing in the 80’s. I have to wonder whether the people spouting this BS actually believe it. I suspect they do, which I find even scarier than the idea that they’re just cynically spewing it.

    A few weeks ago a coworker showed me a very clever smart phone app involving signal processing (can’t go into details due to its still proprietary nature). Originally he was designing dedicated hardware for it, but then had the idea of sticking it on a phone. Worked great. He is in his 50’s, primarily a hardware engineer though with some software on the side, and had never written a phone app in his life. His report was that it was pretty easy.

  7. McMike

    This notion that the true inventors of technology don’t get the credit they deserve is a very important reality to understand.

    That is because the great inventors are rarely commercially savvy, or at least sociopathically so. They are not usually great self-promoters or driven to make as much money as possible. They are interested in the science. And also, often, they are funded by the government.

    It is only after the more idealistic, or science-driven, inventors establish breakthroughs and show viability that the commercial titans ride in on their stallions and claim everything for their corporate kingdoms.

    I use the analogy of the relay race. Steve Jobs is the sprinter; other runners set the pace and established the lead, but the sprinter gets all the glory.

    To my mind, the most notable example is Tesla, who ran circles around Edison, but couldn’t be bothered with the cut-throat business side of the equation.

    1. from Mexico

      From Dan Agin’s Junk Science:

      If you want a stark illustration of how things have changed in science, consider this: In the early years of the twentieth century, despite their poverty, and the chance of obvious riches, Pierre and Marie Curie refused to patent their radium isolation process. Concerning the patenting of the process, Marie Curie stated:

      It would be impossible, it would be against scientific spirit… Physicists should always publish their researches completely. If our discovery has a commercial future, that is a circumstance from which we should not profit. If radium is to be used in the treatment of disease, it is impossible for us to take advantage of that.

      1. McMike

        RE Curie. Indeed. One of the “conspiracy theories” about Tesla is that his undoing came when he proposed a system for giving away electricity for free. That ranks up there with the oil/auto industry destroying any potential for better efficiency by buying up patents and stifling development.

        Whatever grains of truth there are to these notions, the point remains that true innovation is often incompatible with commercial exploitation. Commercial interests will in fact tend to slow true innovation and replace it with its simulacra (i.e. planned obsolescence, brand cannibalization, and phoney phased product “upgrades”), and, when threatened by disruptive innovation, commercial interests may in fact devote their investments towards eliminating the threat rather than nurturing it.

        If we had turned the internet over to AT&T and Comcast too soon, it would never have happened. And having turned it over eventually anyway, the evidence is overwhelming that these firms are doing their best to stifle and narrow it.

        1. Karl

          The government tried to get AT&T involved with building a packet switching network early on, but thankfully they were too stupid to recognize the value of it.

          http://www.wired.com/wired/archive/9.03/baran.html?pg=5

          The Air Force said to AT&T, “Look, we’ll give you the money. Just do it.” AT&T replied, “It’s not going to work. And furthermore, we’re not going into competition with ourselves.”

    2. McMike

      I should add that this is of course a sweeping generalization, but it holds pretty well if you follow many of the major inventions and transitions in technology and infrastructure – they occur under government funding and in academic settings, and the true pioneers rarely get credit or great riches.

      Not to say that some of the leaders don’t do well, particularly the second leg of the relay, but the infrastructure of parabolic wealth is not yet in place, and I argue cannot be in place, because they are incompatible: mutually exclusive in fact.

      Even when commercial leaders appear to be at the root of the innovation, it is inevitably being backed by massive government subsidies, incentives, guarantees, liability protection, forbearance, and promotion. In other words: outsourced innovation (at twice the price).

      What I am getting at here (thinking out loud as I try to develop this) is that the current environment of economic piracy, privatization, corruption of academia, unbridled greed, and anti-government rhetoric is threatening to kill the golden goose of invention.

      By failing to understand how our innovation works, and by trying to monetize everything everywhere immediately, and by eliminating the systems of innovation through greed and ideological deconstruction, we risk killing the flow of true invention.

      There is no one left who is willing, able, or interested in doing blue-sky basic research, in taking risks, in learning by trial and error, or in thinking big. There are no systems left that are not overtaken by the forces of monetization.

      Just as many people misread Ayn Rand’s Atlas Shrugged, seeing only its anti-union theme, and miss the role of the crony corporate pirates, many people have not availed themselves of the subtle wisdom in the tale of The Little Red Hen, which tells us that if no one is willing to take a chance on planting seeds, there will be no grain to eat later.

      In our system, the government and academia, operating without relentless draconian profit motives, are the ones who plant those seeds, while the commercial interests watch closely from the margins, waiting to see what will sprout.

        1. McMike

          I tend to agree with most of what Chomsky has said.

          I am not sure what your point is though with respect to the thread.

          See my reply to Glen below. In my view, much of defense spending is largely a dressed up version of outsourced government research. There is little in the way of competition, cost/benefit rigor, or risk for the contractors. It’s just government funded R&D with an extra fat layer of guaranteed profit for the contractor.

          The risk is, as I have laid out elsewhere here, that eventually, when spending is fully captured by the contractors, the innovation dries up and we get an imbalance of unneeded and non-working tech, and not much new tech. Or, worse yet, to juice up the revenues, the contractors induce the military to commit suicide (as in, say, Iraq).

          Come to think of it, the innovation has comparatively dried up. The internet and GIS were big breakthroughs. What since? Eavesdropping and drones, I suppose – big whoop. On reflection, it seems like the military has regressed, in fact; its big innovations are rediscovering torture and scorched earth.

          Maybe on the energy efficiency front. The military does not bother with partisan BS on fossil fuels; it is already preparing for a post-carbon future.

          1. Doug Terpstra

            Sounds like the former USSR, a totalitarian command-and-control economy sustained by unquestionable ideology that implodes of its own inefficiency and excess. The more things change …

          2. McMike

            Re Doug. Yes, under one system, man exploits man….

            You raise an interesting point about the role of ideology. Because in both cases, it seems to me, the ideology falls by the wayside, initially at the top and eventually throughout the system, until everyone is mouthing platitudes and going through the motions. (There’s Simulacra again.) Hell, that was true of the IRA, which had devolved into gun runners and car thieves.

            Clearly, many of our captains of industry are all about anti-competitive rent extraction in their pursuits. But then again, many of them fail to see themselves as the parasites they are, and often claim to believe in things they violate daily (gee, reminds me of a catholic girlfriend I once had..)

            I have zero faith that politicians in Washington (or Soviets for that matter) are driven even the slightest bit by any ideology whatsoever, except whatever suits power and self-enrichment.

            But in their craven cynicism, they provide the masses with fantasies to beat each other over the head with.

          3. from Mexico

            I’m just ruminating over the difference in trajectory of the Spanish Empire vs. the American Empire.

            The Spanish Empire had an economy based pretty much on conquest and plunder and never really developed much of a domestic productive economy.

            The United States, however, had both, at least until the 1970s, when it began taking a wrecking ball to its domestic industrial, productive economy (that part of the economy which lies outside the finance and military/police/criminal justice sectors). Beginning in the 1970s, the US economy has become more and more about finance and conquest and plunder.

            It seems like the Spanish Empire, just like the US, produced a great many technological breakthroughs, in sailing vessels, armaments, and also in the mining and refining of silver and gold.

            And wasn’t the internet an innovation of the defense department?

            I don’t know. I guess what I’m trying to say is that scientific and technological innovation is value neutral, both in its creation and its use.

      1. Goat_farmers_of_the_CIA

        That’s also basically the message of Coppola’s little-known “Tucker: The Man and His Dream” – how government and business impede or crush any attempt at revolutionary technological change, for one reason or another. During the final scene in the courtroom Tucker himself gives a passionate speech in support of risk taking in the mfg business, equating its spirit to that which guided the establishment of the US as an independent nation.

    3. DANNYBOY

      hippies and other counterculture folks invented the computer.

      and rediscovered everything else that’s good.

      not enough credit given

      by the Showboats

      1. McMike

        And that is the way it will always be. Same with urban gentrification – the artists and gays and bohemians pioneer ailing neighborhoods – because of the cheap rents and free spirits – and reinvest in the buildings and re-create the social fabric.

        Then the yuppies and developers come in and monetize it.

        The problem we face now, with respect to innovation, is that the corporatists and ideologues are trying to kill or coopt all the hippies. It’s out of balance, and innovation suffers.

        1. DANNYBOY

          Yes McMike,

          I am living the Gentrification Nightmare. When transvestites roamed free, the hood was GREAT!

          Now that the local previously-porn-theater is being turned into a tourist beer something-shitty place, I am lost.

          And it extends to the Arts BIG TIME. My son now protects his art from being exploited by KEEPING IT SECRET.

          I’m living a future that can’t really be happening.

          1. McMike

            Well exactly.

            The trannies and bohos swept the streets and replaced the gated window pawn shop with a true small grocer. Which is soon to be a Starbucks.

            The brew pub would never have happened if the porn theater had not invested in the abandoned crack house. And a few years from now, the brew pub will be replaced by an Applebees….

            And your son will someday complain to you that he is making a decent living, but doesn’t feel like he is doing art anymore, and that he is drinking too much, and mostly sucking up to a bunch of self-important phonies instead of developing his craft.

          2. DANNYBOY

            Dear McMike,

            When the trannies and others were swept away I feared that tolerance was gone. I was right. I’m a Jew, and have a sense about these ‘changes’. I know that if a Black Man or a Gay Man is mistreated, I can expect the same treatment to strike me.

            Your prediction for my son is off only by one generation. Because of the suffering I endured, I have done everything I can to prepare my son, if he is to live a creative (or any unconventional) life.

            The hatred hurled by The Privileged toward anyone who is not them is ugly. My first advice for my son was to live apart from Them. He has found his way to live in a ghetto. He is surrounded by Hassidic Jews, with whom he has little in common, other than the desire to be left alone. So far, this has defended him from the “bunch of self-important phonies” and from “sucking up” to survive.

          3. McMike

            Re DANNYBOY. I think I misread your previous post as sarcasm. Sorry.

            Yes, as gentrification proceeds to commodify and monetize everything, it consumes tolerance, beauty, spontaneity, and everything else that is interesting and authentic until all that is left are monetary units controlled by monetarily-obsessed minds.

            You might want to look up the Order of Simulacra, into which the cycles of gentrification and innovation fit quite neatly.

          4. DANNYBOY

            Dear McMike,

            No harm in misreading my intentions…provided me with another opportunity to warn others in this situation.

            Another piece of fatherly advice – always be prepared for violence.

            Have a nice day.

    4. Glen

      I think the other large factor in why so many of the original creators of these ideas are overlooked is the “government is bad” and “capitalism is good” meme that has been drilled into everybody since at least Reagan (and maybe before).

      It’s impossible for a large portion of America to admit that anything good comes out of government research or government-sponsored research, or even out of universities doing pure research. The truth is almost the exact opposite: almost all of the original research comes from some form of pure research, almost all government sponsored, and almost none as a result of capitalism. A typical example is fracking, the natural gas recovery technique now “solving” America’s energy problem. This was developed by the government and laughed at by the oil and gas industry. (And it will not solve America’s long-term energy problems, since the life span of a fracked source vs. a traditional source is much shorter, plus the knock-on effect of still generating greenhouse gases.)

      The long-term effect of this trend is disturbing. Americans are told to distrust any pure-research discoveries which conflict with large-business capitalism, such as global warming. Plus America is increasingly being left behind in the area of pure research, which almost all occurs through direct or indirect government funding (except military R&D, but the long-term benefits of better ways to kill people do not move societies forward, and they waste skilled people and money).

      Why this meme makes any sense to most people is confounding. Capitalism exists to take money from consumers with the least amount of benefits, especially with the financialization of whole industries. This has resulted in the concentration of wealth at the very top as the pinnacle achievement of Reaganism’s “greed is good”. Governments exist to provide benefits to all of society, which results in New Deals, Great Societies, and Medicare programs (or even at a more basic level – laws, justice, protection, education, health, etc). To say that one or the other approach is the only right one is incredibly short-sighted, but then to call what we have now capitalism is stupid too. The whole system is totally unbalanced, with government effect being used to the benefit of the same banks and corporations that dominate the private side (and recently wrecked the global economy).

      Sorry for the long rant but I’m tired of watching so many in our country using the power of science and technology (the internet, computers, etc.) to explain how nothing good comes out of science and technology, especially government-backed science and technology. For those who pontificate those views, please just go back to living in caves and let the rest of us get on with the future of the human race.

      1. McMike

        Indeed. It was Reagan who really accelerated poisoning the well. To the point that the right-wing zombies are in danger of sending us back to the stone age through economic and social regression, literally undoing centuries of human development as a cooperative species. But of course modernity has always been the threat to mouth breathers. Perhaps the neanderthals have left their mark after all (joking).

        However, I would add that military R&D has in fact spawned a lot of the technologies of civilian life (but yes, at great cost in human suffering), including the internet, communications, satellites, hi-res cameras, aircraft and vessels, highways, GIS, etc. etc.

        In fact, the military is pretty much the right-wing-palatable Keynesian program, and the last safe place for government-funded R&D that is not required to meet cost/benefit criteria.

  8. Merijn Knibbe

    Twenty-five years ago I used a laptop from Groningen University to manipulate economic time series at home (literally between drying diapers), using Lotus. I now own my own laptop – and still manipulate economic time series. An improvement: I now have a room with a view, and sometimes robins and other birds come within fifty centimeters to snatch something from the food which is hanging outside the window. Another improvement: larger computer memories. The real improvement: cyberspace, which gives me access to the databases of the OECD, Eurostat, the BLS, digitized eighteenth-century Cape Town probate inventories, whatever – the real improvement was of course that all this stuff came online. I do not need any kind of university to do time series analysis or to travel to Cape Town to study the use of money in a settler colony, great! I use Excel nowadays – this was not the greatest improvement.

    1. McMike

      The access to all that data is wonderful. The ability to get answers in real time is great, allowing me to keep momentum, and to complete things as I go, rather than leave a big “to do” list of nits and facts to be checked.

      However, this ability raises the question of how much of the stuff I do is worthwhile or adds value. Perhaps if I were more judicious with my time, I’d focus on the more important things. One trick fathers use teaching their sons to hunt is to send them out with a single-shot gun – learn to make that first shot count. Semi-autos come with a surplus that makes some shooters lazy and careless with their first shot.

      Also, this processing and research ability comes with a price: I spend God knows how much time messing with the “overhead,” backing up data, on hold with tech support, on my hands and knees rebooting a router, dealing with crashes and patches and upgrades and downloading new viewers or whatever.

      1. LifelongLib

        Yet when I suggested (on another forum) that cloud computing services would be a good deal for a lot of people (who could then just use a very robust internet appliance rather than a PC) I was roundly reviled by people who think everyone should own their computers. It’s like saying everyone should have a car rather than taking a bus or a taxi. There’s still a big mystique about personal computers and freedom.

        1. Lambert Strether, Post author

          I think anybody who puts their data in the Cloud is demented.* They’ll be able to charge whatever rent they want on it, whenever they choose to. Because it’s always all about the rents!

          NOTE * Except for the case where you want to take a clean computer across an international border, of course.

          1. McMike

            And, having charged you rent to store the data, they will also sell it to data miners without your permission or knowledge, then they will also find ways to charge you to use your data, then, finally after they lose it, they will charge you to get it back, or absent that, they will walk away and not take your call.

          2. Yves Smith

            There are places where access to the cloud is limited and erratic, like coastal Maine. Or on the ocean. And in the US we have laggard, overpriced broadband.

        2. They didn't leave me a choice

          Cloud computing, at least the way it’s being sold to the customers these days, is nothing but rent extraction, plain and simple. Feel free to be a digital slave on top of being a debt slave if you want. I’ll keep my own reserve of computing power, thank you very much.

          1. LifelongLib

            I don’t dispute your criticisms about the current state of cloud computing. But I continue to believe that an honest, reliable “cloud” would be a better solution for many people’s information management needs than personal computing is. And when you try to discuss this, it’s evident that opposition to the cloud comes from something other than its existing defects. Like Republicans who don’t believe in government and are therefore not interested in making it better, there are those who have a philosophical commitment to personal computing and will not accept any sort of cloud service. But there is no reason computing should be different from any other human activity. Some will want to handle it themselves, some will want to pay somebody else to do it. Both should have the option.

          2. hunkerdown

            “Honest” and “reliable” are the weasel words.

            Reliable isn’t going to happen because it cramps the style of business and government. How many cyberlockers just up and decided to tell their paid users “None for US” or “We’re done” after the Megaupload seizure? Even the new Mega isn’t all that reliable, as they have proactively removed files indexed by shady-looking indexers. RapidShare recently enacted a storage limit for certain account classes and has stated its intention to delete the uploaded data of users exceeding a certain threshold.

            Cloud storage is more or less reliable at serving requests. Business is unreliable at serving anyone’s needs but its own.

      2. McMike

        Re lifelonglib. I agree about public transport, if the system is user-friendly and well-run.

        My concern about the cloud, though, comes down to a simple question: would you put all your most important business information, your most important personal secrets, and your most important financial data, along with your ability to navigate the daily business and financial world, completely in the hands of Microsoft or Google, without backup, without recourse, without oversight, without protection, and on the basis of unilateral “contracts”?

        MS and Google have already answered that question for me. NFW.

        I sometimes hate this stone around my neck that sits on my desk. But, while an application on the cloud here or there is okay (with backup and competition), I would rather go back to pen and paper than put myself completely and 100% in the hands of, and at the mercy of, Google.

        I have this to say to people who are 100% in the cloud (i.e. 100% on Google or FB servers): you will live to regret it.

        1. LifelongLib

          Given what I’ve seen of how some people manage their computers, their info isn’t safe at home either. At least the Cloud would presumably be run to better standards. I concede that Cloud computing today is about where banking was in 1930 — maybe better than keeping your money under a mattress, but no guarantees. I want a government-regulated cloud with rules about privacy, access, backup, etc. Obviously personal information is not as fungible as money (if yours is lost it can’t be replaced by an equivalent amount), but I think we could arrive at a Cloud system that’s safer than home systems for the non-tech-savvy.

          1. McMike

            Re safe clouding.

            Let me know when that happens. I see no incentives for Google or FB to do so, and I see no inclination on the part of the government to force them.

            In the meantime, yes, at-home computing is fraught with danger and inefficiency. But at least it is mainly my danger and inefficiency.

        1. DANNYBOY

          Yves, can you ask Uncle Smith to write the check for $600 more? I’m seeing myself on a Tern Link D8 folding bike.

          I’m still on topic here, right? I mean, I’m still all about Naked Capitalism with this.

          Or did I just go over to the dark side?

          I wonder?

  9. Paul Tioxon

    The Advanced Research Projects Agency (ARPA), now DARPA, the Defense Advanced Research Projects Agency, is the government-funded source of pioneering military technology. If you look at the second screen shot at the very opening, it shows the joint sponsorship. The Pentagon wanted a redundant communication system that would survive continental nuclear strikes in order to preserve the command and control of the military and the government. So Mr. Engelbart, like many scientists, got his money from the military budget.

    http://www.darpa.mil/About/History/PARTIAL_BIBLIOGRAPHY_OF_THE_INTERNETARPANET.aspx

    This technology moved on to Xerox PARC, where Steve Jobs encountered it.

    “Xerox PARC was the innovation arm of the Xerox Corporation. It was, and remains, on Coyote Hill Road, in Palo Alto, nestled in the foothills on the edge of town, in a long, low concrete building, with enormous terraces looking out over the jewels of Silicon Valley. To the northwest was Stanford University’s Hoover Tower. To the north was Hewlett-Packard’s sprawling campus. All around were scores of the other chip designers, software firms, venture capitalists, and hardware-makers. A visitor to PARC, taking in that view, could easily imagine that it was the computer world’s castle, lording over the valley below—and, at the time, this wasn’t far from the truth. In 1970, Xerox had assembled the world’s greatest computer engineers and programmers, and for the next ten years they had an unparalleled run of innovation and invention. If you were obsessed with the future in the seventies, you were obsessed with Xerox PARC—which was why the young Steve Jobs had driven to Coyote Hill Road.”

    Read more: http://www.newyorker.com/reporting/2011/05/16/110516fa_fact_gladwell#ixzz2OU2rxATS

    And of course, every US Senator and Congressperson who sat on appropriation committees for ARPA can claim support for helping to bring the internet into existence. Thank you, Al Gore, inter alia. Today, Rand Paul would vote against garbage collection.

    1. LifelongLib

      The other week where I live the state government’s internet service went down. It turned out that some homeless people had lit a fire under a freeway overpass and ignited the insulation around the service provider’s cables, which were run along the bottom. It’s difficult to believe that we could actually maintain communication during a nuclear war.

      1. Paul Tioxon

        A while back, a few guys with box cutters took over jet airliners and crashed them into the tallest buildings in NYC and the Pentagon. I find it difficult to believe that the US Military can defend us against anything.

        1. Paul Tioxon

          Really, it is hard to be sarcastic when you can document the level of stupidity that is in charge. To continue with the history of the computer and the internet: the DoD wanted to sell the internet to AT&T, but they declined, saying the technology absolutely would not work. And you wonder why Facebook was invented by 20-year-olds.

          From the history of the internet in 1971:

          “19 nodes on ARPANet including UCLA, SRI, UCSB, Uni Utah, BBN, MIT, RAND, SDC, Harvard, Lincoln Lab, Stanford, U of Ill Urbana, Case Western Reserve, CMU, NASA-Ames [Hauben]

          Steve Crocker joins IPTO as a program manager.

          Larry Roberts wants to avoid DoD owning and operating the Internet. Therefore Roberts approaches AT&T, offering to give them the network and have the USG as an anchor tenant customer. “AT&T could have owned the network as a monopoly service, but in the end declined.” “They finally concluded that the packet technology was incompatible with the AT&T network,” Roberts said. AT&T would not build its first packet-switched network until 1982. [History of Telenet p. 29]

          “Bob Taylor also tried to talk to AT&T about the venture. “When I asked AT&T to participate in the ARPANet, they assured me that packet switching wouldn’t work. So that didn’t go very far.” ” [Nerds2.0 p 74]

          Larry Roberts said, “They wouldn’t buy it when we were done. We had decided that it was best if industry ran it, because the government had done its experiment and didn’t need to run it anymore. I went to AT&T and I made an official offer to them to buy the network from us and take it over. We’d give it to them basically. Let them take it over and they could continue to expand it commercially and sell the service back to the government. So they would have a huge contract to buy service back. And they had a huge meeting and they went through Bell Labs and they made a serious decision and they said it was incompatible with their network. They couldn’t possibly consider it. It was not something they could use. Or sell.” [Nerds p 109] [See also Vanity Fair (quoting Baran: “The one hurdle packet switching faced was AT&T. They fought it tooth and nail at the beginning. They tried all sorts of things to stop it.”)]

          “Roberts discussed the issue with Bernie Strassburg, Chief of the Common Carrier Bureau of the FCC. Strassburg advised that the best approach would be to form a new company and apply for an operating license from the FCC.” [History of Telenet p. 29] This would be Telenet.

          See Western Union’s decision not to buy the Bell telephone patents in 1876.

          Stephen Lukasik becomes Chief of ARPA. He was a major proponent of network research and of electronic mail. He would become FCC Chief Scientist in 1979.

          Alex McKenzie took charge of the Network Control Center at BBN. He envisioned the ARPANET as a “computing utility.”

          http://www.cybertelecom.org/notes/internet_history70s.htm#att

  10. H. Alexander Ivey

    “…had to move the mouse”, oh man, that brought back memories!

    Thanks for the trip down memory lane…

Comments are closed.