At the risk of losing all cred with tech enthusiast readers, let me use a suggestion (from Dwight) as a point of departure:
I thought I would recommend that you both consider adopting Twitter as part of your blogging presence in 2009.
Twitter was organized in 2006 and hit 6 million registered users by the end of 2008, a 600% increase over the prior year. Facebook recently offered to purchase Twitter for $500 million in stock, but the offer was rejected. Google has also been thinking about Twitter [many examples of who is using it and why, with links]…
At a minimum, even if you choose not to tweet on Twitter, you can at least feed your blog posts to Twitter via Twitterfeed.
Now even though I will probably relent in the end and wind up feeding posts to Twitter, I am deeply troubled by a communication medium that limits messages to 140 characters, and I’ll return to that shortly.
But before readers brand me as a hopeless Luddite, let me stress that I am fussy about technology. Compared to other mere mortals (as opposed to developers and serious technologists), I tend to be either bleeding edge or late and reluctant. I had a NeXT computer as soon as it was out of beta, and was using e-mail and the Internet in 1991. I tried cell phones and the Palm early, and concluded I didn’t have much use for either. I was in Verizon’s New York DSL trial (the first broadband available here, and the one and only time I have ever seen Verizon show up for an appointment promptly and perform efficiently). I was one of Vonage’s (VOIP) very early customers and went through the brain damage of making it work from Australia.
So why do I hate Twitter? Twitter is troublingly reminiscent of Newspeak, the language being developed by Oceania in George Orwell’s 1984 to control thought.
Orwell, in an appendix, describes the principles of Newspeak, which are directed towards simplifying language so as to strip it of propensities of thought that are inconvenient for the power structure:
The purpose of Newspeak was not only to provide a medium of expression for the world-view and mental habits proper to the devotees of IngSoc, but to make all other modes of thought impossible. It was intended that when Newspeak had been adopted once and for all and Oldspeak forgotten, a heretical thought — that is, a thought diverging from the principles of IngSoc — should be literally unthinkable, at least so far as thought is dependent on words. Its vocabulary was so constructed as to give exact and often very subtle expression to every meaning that a Party member could properly wish to express, while excluding all other meaning and also the possibility of arriving at them by indirect methods.
Now what does that have to do with Twitter, one might ask? Well, while the main means by which Newspeak was implemented was simplifying and subtly changing the meanings of words, another element was the extreme condensation of communication:
Regularity of grammar was always sacrificed to it when it seemed necessary. And rightly so, since what was required, above all for political purposes, was short clipped words of unmistakable meaning which could be uttered rapidly and which roused the minimum of echoes in the speaker’s mind… So did the fact of having very few words to choose from. Relative to our own, the Newspeak vocabulary was tiny, and new ways of reducing it were constantly being devised. Newspeak, indeed, differed from most all other languages in that its vocabulary grew smaller instead of larger every year. Each reduction was a gain, since the smaller the area of choice, the smaller the temptation to take thought. Ultimately it was hoped to make articulate speech issue from the larynx without involving the higher brain centers at all…
And it was to be foreseen that with the passage of time the distinguishing characteristics of Newspeak would become more and more pronounced — its words growing fewer and fewer, their meanings more and more rigid, and the chance of putting them to improper uses always diminishing.
Now the link between having people communicate within 140 characters and thought control seems awfully remote, no? Particularly since this is voluntary and customer driven, right?
I am not at all certain. I notice in reactions to my blog posts, which are often pretty lengthy, that readers sometimes miss important nuance in what I or the readers I have cited say, or (just as bad) project onto what I have written something I never said. As I noted earlier today, I have written often about growing unrest in China, and too often I get comments arguing that I am all wet to be predicting that China will fall apart. Huh? I never said anything about violent overthrow of the government.
Now this could just be normal comprehension issues. But I notice how the Internet has affected how I read. I have become impatient with longer stories (unless I am on an airplane). I spend most of my time on the Internet, and the vast majority of what I read fits within the browser window. I find that has conditioned my expectations. When confronted with a longer piece (say a Sunday New York Times magazine feature or something of New Yorker length), I find myself after the first page wondering whether it really had to be this long, and I often do not finish the piece. Five years ago, I never would have responded this way.
You can’t say anything complicated or nuanced in 140 characters. I am sure readers will provide some cute counterexamples, but try explaining Plato’s cave in those confines. Can’t be done. You might allude to it, but you could not present it to someone who didn’t know about it already. And Twitter encourages people to accept a medium that severely constrains communication, and calls a defect a virtue.
Marshall McLuhan was right.
I have a second issue with Twitter, and mobile communications generally. I can’t control how they are used, but I see them as having a corrosive effect on interpersonal relations.
It’s one thing to take calls, check texts, tweets, or the news when out and about by yourself. But it has become the norm to take them when meeting with others. That reduces the quality of the interaction and sends a message that the person you are with is merely an option; other options are ever present and must be assessed, maybe exercised.
For those in high urgency professions (doctors, traders) I can see this being acceptable. And everyone has occasions when they need to be on the alert for news, a call, or a text. But this has become routine.
Humans are a social species, with very big limbic brains (the emotional center) and smaller cerebral cortexes (the seat of higher reasoning). I cannot prove the connection, and doubtless many factors are in play, but the US is a society where enormous numbers of people take anti-depressants and other brain-chemistry-altering drugs, either to elevate their mood or improve performance in some way (and those are just the legal drug users). The most recent data I could find, from 2005, showed that anti-depressants were the most widely prescribed drugs in the US, with 118 million prescriptions written that year. That says something is deeply amiss.
We have a lot of other factors contributing to the erosion of social structures: high divorce rates, short job tenure (and now high unemployment), rising demands for on-the-job productivity (computers and the Internet are a double-edged sword: you can do more, but expectations have risen accordingly). These are clearly the big drivers, but I have to think that the degrading of routine interactions, and the expectation (in at least some circles) that people multi-task when the evidence is that it does not increase productivity, have to play a role.
Twitter feeds that addiction, that false sense of urgency. Most things can wait. Indeed, a lot of things are better off waiting. But we are encouraged to be plugged in, overstimulated all the time, at the expense of higher quality human relations.
I don’t want to contribute to the problem by participating in this sort of thing, but I suspect I will give in to practical realities.