begriffs

Humane Computing and the Eras of Information

November 2, 2013

Yesterday I watched some old horror movies for Halloween. Surprisingly, what made them seem most dated wasn’t the special effects or the fashion trends. It was the way people related to information. Back then, information was opaque and constantly human-evaluated. Let me explain.

The two movies I watched were The Hills Have Eyes (1977) and IT (1990). These movies, although thirteen years apart, feel to me like they are in the same information-era.

The Hills was explicitly about isolation and invisibility. The way people talked was either face to face, over walkie-talkies, or over a CB radio. Notice that none of these methods has any persistence. Radio converts air pressure to electromagnetic pulses. It’s not really different from talking; there just happens to be an electromagnetic isomorphism with sound waves. Like sound, this data is analog and opaque. There is no meta-data here, no machine interpretation. The data makes sense only when actively intercepted and interpreted by a person.

In the movie, the agency of producing, consuming, and acting on the information is completely human. It’s like riding a bike: it’s human power all the way down. To get messages you have to poll. The messages are easily garbled, and fleeting.

A different information-era begins to emerge with telephone answering machines. The information is still analog, but it is persistent. Even more significantly, machines begin to share our agency. A machine records, along with the message, the fact that the message is present, and when it was recorded. Suddenly the machine can interrupt you with meta-data. Machine data. You get home and it’s beeping.

Back to the movies. IT, although made well into the answering machine days, really doesn’t show it. People place synchronous phone calls, wait for the other end to pick up, then speak analog audio. “IT is back! Return to our hometown and face it.” The person who gets the call then tells their friend or family, face to face, that they have to go. When nobody is talking at someone else, nothing is being communicated. No machines are watching, measuring, alerting.

At one point a character does pull out a basketball-sized cell phone in a cab, but he does it so ceremoniously that it looks like a rare occurrence. Also, his use of the cell phone happens because of a hierarchical power structure – his company. The company provided the phone and authorized him to use it for their own purposes.

The centralized control of information is another striking feature of that era. If information ever gets replicated beyond like one copy, that happens because a powerful organization allowed it. The group of protagonists in IT turns on the TV and a reporter tells them about some gruesome, relevant happening. The reporter wears formal clothing to show their inclusion in a media hierarchy. Copying information used to be ceremonial.

However, whether information arrives by newspaper clipping, or a classroom paper airplane (which is basically texting), or over the phone, it is always only human-evaluated. Somebody reads it or hears it. When the information is ignored it is dead, or at least sleeping.

This is a pain in the ass, but it also simplifies life. Nowadays the information we store online and share digitally is full of unfinished business. Copying is virtually effortless, and doesn’t require the oversight and discretion of a human hierarchy. It is laden with meta-data which machines will process in who knows what ways. The bits you reveal are now part of an unknown, unpredictable chain of non-human events.

Deep down this makes us uneasy. Actually really disturbed and wary. Horror movies show this even in that earlier information-age. Scary things live in TV static, or in unexplained telephone noises. The action of these systems is mysterious, and in a way we are trying to comfort ourselves by imagining ghosts living in the machine because at least ghosts act like humans.

Watching old movies is like traveling to a foreign land. You come back and see things differently. You see you have a choice. Our habits are not inevitable.

There are things we can do differently to regain control over our information and our knowledge of its use. But there’s one big problem: the human mind weirdly enjoys being interrupted. The fun of alerts and interruptions becomes addictive and causes us to voluntarily expose information to uncontrolled copying and processing.

This addiction even causes us to stifle our own personalities in order to assist machines in processing our data. One big way is by converting our fluid written expression into strings of character codes. What I type with my keyboard are standardized letters. What I read with my eyes are shapes curving across the page. The algorithm used to draw the character codes on the screen draws them all the same. AAAA. BBBB. Whereas if I write them by hand I don’t just tell you things, I show you something about myself and how I feel.

Adapting to homogeneous character codes is actually robbing us of this form of handwritten expression. My handwriting is crude and undeveloped. Where I might draw graceful lines I write uneven, jagged scars. And what’s worse, I actually try to imitate the uniform letters on the screen by printing letters. How strange. I have been maimed to serve an algorithm.

Why do we use (unicode) characters rather than sending bitmapped handwritten graphics? Initially we had teletypes (actual typewriters), and then we hijacked that protocol to drive terminals. It saves bandwidth. We are also imitating the way letters look on the printing press, which incidentally changed our lettering to make information easier to copy.

There’s no reason to use character codes for all documents nowadays. We use them universally because it saves machines’ decoding time. If you want stuff indexed by Google, for instance, you use unicode. Google has enough work to do without running optical character recognition on some handwritten web page.

But what if you specifically do not want machines to process certain communication? Seems like a good time to use handwriting. It helps make the data more inert while reclaiming your individuality. It returns that data to the human-evaluation era. It’s more…humane.

There’s a healthy balance. Some data might be better expressed as characters. This blog is public, for instance, and I’d like to share it widely and get organic search traffic. Perfect.

There might be a deeper reason to choose handwriting as the default, though. Throughout history the vast majority of great writers did not input ASCII or unicode into a computer. What they wrote was significant enough that somebody took the time to put it online. If you’re writing something really worth reading then it’s going to spread.

Besides silencing our individuality, the vanity of writing in easily copyable character codes lures us into devaluing ourselves. Think of all the time people have spent accommodating presentation languages like HTML and LaTeX.

Perhaps because of the legacy of centralized control over information copying, we feel like a meticulous, polished layout makes us appear more reputable, and we slavishly fight with HTML and CSS to make a good impression. Who’s going to use an online bank that has a handwritten interface?

But honestly, what about this plucky option: be yourself, have something important to say, and let transcription take care of itself. Just handwrite things. Are you an academic? Submit your groundbreaking paper to a journal in your own writing. If it’s the real thing then they’re just going to have to deal with it. And they will deal with it. Conformity lulls us. Time spent learning LaTeX feels confusingly similar to useful work; it’s not. Or word-process the bulk of the paper for ease of editing, but scan in hand drawn figures and equations.

The first technique of humane computing, then, is to be yourself and embrace analog communication when it makes sense. If you want text to be indexed and copied, then enter it that way. When you want to express your personality and warmth to a limited audience then bust out the penmanship.

The second technique is to restrain your craving for alerts. Here we can turn algorithms to our advantage. Imagine an email client that fetches mail only once a day. Maybe you choose to get it all in the evening. The rest of the day you’re free to get into a calm flow. While writing this article, for instance, I have disabled all alerts on my computer and have been writing happily for hours. I feel freer to expand my thoughts through a broader arc.
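To make that concrete, here’s a minimal sketch of once-a-day mail fetching in Python, meant to be run from a single scheduled evening job instead of a client that polls all day. The host name, account, and password are made-up placeholders, not settings for any real provider.

    # Minimal sketch: fetch mail once, on demand, instead of polling all day.
    # IMAP_HOST, USER, and PASSWORD are made-up placeholders.
    import email
    import imaplib

    IMAP_HOST = "imap.example.com"
    USER = "you@example.com"
    PASSWORD = "app-specific-password"

    def fetch_once():
        """Connect, list unseen messages, and disconnect."""
        with imaplib.IMAP4_SSL(IMAP_HOST) as conn:
            conn.login(USER, PASSWORD)
            conn.select("INBOX")
            _, data = conn.search(None, "UNSEEN")
            for num in data[0].split():
                # Peek at the headers so messages stay marked unread
                # until you actually sit down with them.
                _, parts = conn.fetch(num, "(BODY.PEEK[HEADER])")
                msg = email.message_from_bytes(parts[0][1])
                print(msg["From"], "-", msg["Subject"])

    if __name__ == "__main__":
        fetch_once()

Schedule it to run once in the evening (with cron, say) and for the rest of the day your inbox simply doesn’t interrupt you.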

Somehow having technology that can deliver fast-paced continuous communication has created the expectation that we use it. What if you create an auto-responder that tells people you won’t read their message until tomorrow? What if you include the same message in your voicemail and turn your phone ringer off? Sometimes information is time-sensitive, but think about it, and assume it can wait unless you know it’s urgent.
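In the same spirit, here is a rough sketch of such an auto-responder, again in Python and again with made-up host names and credentials. It peeks at unread mail over IMAP, without marking anything read, and sends one canned reply per sender over SMTP; treat it as an illustration of the idea rather than a finished tool.

    # Rough sketch of a "read it tomorrow" auto-responder.
    # IMAP_HOST, SMTP_HOST, USER, and PASSWORD are made-up placeholders.
    import email
    import imaplib
    import smtplib
    from email.message import EmailMessage
    from email.utils import parseaddr

    IMAP_HOST = "imap.example.com"
    SMTP_HOST = "smtp.example.com"
    USER = "you@example.com"
    PASSWORD = "app-specific-password"

    CANNED = ("I read mail once a day, in the evening. "
              "I'll see your message then; call me if it truly can't wait.")

    def autorespond():
        # Collect the senders of unread messages without marking them read.
        senders = set()
        with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
            imap.login(USER, PASSWORD)
            imap.select("INBOX")
            _, data = imap.search(None, "UNSEEN")
            for num in data[0].split():
                _, parts = imap.fetch(num, "(BODY.PEEK[HEADER])")
                headers = email.message_from_bytes(parts[0][1])
                senders.add(parseaddr(headers.get("From", ""))[1])
        senders.discard("")  # skip messages without a usable From header

        # Send each sender a single canned reply.
        with smtplib.SMTP_SSL(SMTP_HOST) as smtp:
            smtp.login(USER, PASSWORD)
            for addr in senders:
                reply = EmailMessage()
                reply["From"] = USER
                reply["To"] = addr
                reply["Subject"] = "Away from my inbox until this evening"
                reply.set_content(CANNED)
                smtp.send_message(reply)

    if __name__ == "__main__":
        autorespond()

Run it from the same daily schedule as the fetcher, and the expectation of instant replies starts to fade on its own.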

The third technique is to use specialized machines with the least computational power necessary. Especially avoid Turing-complete machines. Using them when you don’t need them leads to two kinds of stress.

The first kind is the unknown consequences of consuming and producing data. For instance if you use a smartphone app as an alarm clock then perhaps your sleep schedule is being processed and sold to advertisers.

The second kind of stress is that the machine can break itself. Your smartphone alarm may fail to run after you upgrade the device because of a new security policy or who knows what reason. So use a dedicated clock; in fact, a wind-up mechanical clock probably has the least computational power.

In conclusion, observe how people used to live and how that’s changing. Take charge of the way you deal with information, so you achieve what you want without the consequences you don’t. We live, now, in a complicated era of information.