Tuesday, July 28, 2009

Welcoming our posthuman overlords

I have discovered that it's probably impossible for me to ever like scifi, but I am willing to make my peace with it, as long as its devotees promise to maintain a clear distinction between their fantasies of becoming virile space robots and actual politics. I know that some people claim it's a philosophical genre that confronts us with the most important questions about the meaning of humanity, and that seems possible and maybe is true in some cases. But the only signs of its philosophical influence I've ever seen have been found in arguments like this.

There was another, nearly equally bad article about intelligence augmentation recently in the New Yorker hailing our incipient age of Harrison Bergeron-esque equality brought to you courtesy of psychopharmacology, but let's focus on this one, primarily because it accomplishes the amazing feat of being stupid in nearly every single sentence.

Basically, in case you haven't heard, teh internets is making us all geniuses right now while rendering obsolete our old "capacity to memorize and recite facts" that used to be called intelligence. This is a little weird, since the new intelligence is still actually based on facts, only now Wikipedia knows them so you don't have to. Part of the new intelligence is that we can use computers to predict the weather (even correctly, about 25% of the time), see "signs" of things that may possibly mean that "changes" are coming, and "debate models of an 11-dimension universe." Also, we can Twitter. And here you thought my roommate who put hot dog buns in the ice-maker and expressed great surprise when the machine broke was a moron. No indeed. He is a genius, for he knows how to access information.

Now, at first, children, the preponderance of media may cause you to experience feelings of inattention and being overwhelmed with information, but actually, they are all "providing a form of cognitive calisthenics" that builds your "capacity to make connections and to see patterns—precisely the kinds of skills we need for managing an information glut." What this seems to be saying is that watching a lot of TV will at first bombard you with apparently puerile sounds and pictures, but, if you practice hard enough at it, you might reach a stage of sophisticated TV watching, in which you will see Friends and Will and Grace not just as part of the endless racket coming from a box in your living room, but as part of a pattern, and you will baptize that pattern "sitcom." But don't get too wrapped up in all this, because actually, observing patterns is really something computers do better than you anyway: "Any occupation requiring pattern-matching and the ability to find obscure connections will quickly morph from the domain of experts to that of ordinary people whose intelligence has been augmented by cheap digital tools." Guess you built up the wrong muscles with all those "cognitive calisthenics" after all.

So now that we have this new intelligence, whatever it actually consists in, what should we use it for? First, we use it to replace our own lame minds with
individualized systems that augment our capacity for planning and foresight, letting us play “what-if” with our life choices: where to live, what to study, maybe even where to go for dinner...These systems, perhaps linked to the cameras and microphones in our mobile devices, would eventually be able to pay attention to what we’re doing, and to our habits and language quirks, and learn to interpret our sometimes ambiguous desires. With enough time and complexity, they would be able to make useful suggestions without explicit prompting.
Wouldn't it be awesome if a computer could know what you want even when you don't know? Except how would we ever know if the computer was correct?

Second, even though we haven't figured out what augmented intelligence is good for in the first place, we will use it to compete for more augmented intelligence. Some people will discover Provigil, and then they will bully everyone else into using it so that, "Little by little, people who don’t know about drugs like modafinil or don’t want to use them will face stiffer competition from the people who do." Stiffer competition to do what, exactly? What is worth doing? Making more videos of your cats for YouTube? Again, a question our scifi maniac seems unable to answer.

But no worries, because then there will be the Singularity and all questions will become answers and all answers will become rooms full of hot Anime characters come alive and all socially awkward software designers will be allowed into the orgy! Ok, problems solved! Now we are in the future, which, while "solid predictions about artificial intelligence are notoriously hard," should have us at full posthumanity by 2030.

And what will our posthuman superintelligence look like? Well, mostly, robots will take care of our work for us and we'll spend most of our time having really amazing arguments--
buttressed not just by strongly held opinions but by intricate reasoning. People in 2030 will look back aghast at how ridiculously unsubtle the political and cultural disputes of our present were, just as we might today snicker at simplistic advertising from a generation ago. Conversely, the debates of the 2030s would be remarkable for us to behold. Nuance and multiple layers will characterize even casual disputes; our digital assistants will be there to catch any references we might miss.
It's not totally clear how we, empty of any accumulated knowledge, will be able to make references to anything in the first place, but if we somehow manage this feat, we'll at least have a computer to help our companion cope with our incredible accomplishment. It's also generally unclear why we would form opinions about anything much in the first place when 1) our computers will be better at it than we are, and 2) our opinions have no purpose since politics is administered by models and simulations (and robots?).

This great future may seem daunting at first, but "we shouldn’t let the stresses associated with a transition to a new era blind us to that era’s astonishing potential. We swim in an ocean of data, accessible from nearly anywhere, generated by billions of devices." Which means, you shouldn't complain about what I think is awesome, and I know it's awesome because it has created so many new toys for me already! Besides, even if you wanted to complain, "there’s no going back." Evolution is a fixed one-way road, children, and I happen to know all the stops ahead. Don't even think about turning off onto a side-street! There are no side-streets! There's nothing left for mankind to do but exactly what I say. "The information sea isn’t going to dry up, and relying on cognitive habits evolved and perfected in an era of limited information flow—and limited information access—is futile." What exactly are the cognitive habits appropriate to an era of limited information-flow? Memory? I guess it's time to turn that creaky machine off now that we've got us some Google (though it's not clear why we forgot to do that when we developed writing in the first place). I'm positive we can become more efficient pattern recognizers if we forget everything we ever knew about ourselves or anything else. This is a belief that intelligence requires absolutely no individual or social cultivation beyond acquiring the best gadgets and popping the right pills.

In the end, "there's no going back" is the only argument this vision has to lean on. Cascio can't give a single reason why the new intelligence is better than the old, what its purpose is, or why we should embrace anything he advocates except that, if we don't, some apparently predetermined course of history and evolution will steamroll over us (or he will stay awake longer than us to pump out dozens more articles like this one thanks to Provigil). I am struck by the similarities between these kinds of "the posthuman future is coming and there's nothing you can do about it" claims and what Arendt described as the view of nature and history taken by totalitarianism, which is, not coincidentally, very tied up with the cracked-out Darwinism that comes up in this article. The belief that nature has predestined human society to develop along a fixed trajectory and that politics--through science--must serve as nature's handmaiden in carrying out this plan is straight out of The Origins of Totalitarianism. The rhetorical strategy is, of course, to make resistance appear futile, and to make accommodation seem the only reasonable response. It is also, of course, unimaginably dangerous.

Tuesday, July 14, 2009

Mail call

From a reader letter:
There really is no such word as "mentee." "Mentor" was a figure in Greek mythology; he was the teacher of Telemachus, son of Odysseus. Rather than "mentor-mentee," you might try "Mentor-Telemachus."
I eagerly await the day when "the Mentor-Telemachus relationship" becomes everyday parlance.

Monday, July 13, 2009

Recap

College Summit went well, I think. The kids were better this year than last, even if the campus was abysmal. I learned that 1) modeling is a competitive sport in PG County public schools, 2) the current hip life aspiration of the ambitious is to own a hair salon (last year, it was music production), and 3) being a straight-A student does not preclude totally stupid conclusions, like "Maybe I'll just have a baby instead of going to college because that's what my sisters did." Also reinforced from last year: apparently only rich people want to study liberal arts.

The program has a pretty fool-proof curriculum not unlike the Henry Ford model of production--it breaks down the personal statement into discrete parts so that anyone, no matter how plainly illiterate, can produce something resembling an essay in three days (and, on the other hand, anyone can teach it). I've discovered that the seeming success of the workshop can create a false sense of optimism. One thinks, "Oh, my students have a finished essay, they filled out a sample application, they met with a college counselor--they are well on their way to college!" But actually, no. Between now and the application deadline this December, they will forget to sign up for the SAT, or to upload the essay they wrote, or to ask for recommendations, or maybe the economy will collapse, or any number of things that will result in only one out of my four students going somewhere other than community college. At least, that's how it turned out with last year's crop. This year's at least had better grades to start from. But I am not getting my hopes up.

Friday, July 10, 2009

What is happening

I am at the most dismal college campus ever for the weekend, teaching the science of the college essay to the ubiquitous "kids from PG County." The beds are crunchy and the bathroom is all rust, exposed pipes, and missing stall doors. Also, it's freezing here. But, on the bright side, it's in the Alleghenies.

In the meantime, UChiBlogo has a post about my Hum professor (and me!). Maybe he merits his own tag in this blog by now?

Tuesday, July 07, 2009

Neurosis returns!

I haven't had a lot of time to worry about being in grad school since I've spent most of it worrying about the logistics of getting to grad school in the first place. First applying, and then leaving my job and moving, have clogged my neurosis pathways so that I haven't been able to be particularly anxious about classes and books and things like that. Moving books, yes. Reading them, no. But now, for two straight nights, I've had dreams about 1) going to class and 2) realizing after I've arrived that I have no pants on. This is a classic symptom of Miss Self-Important's academic derangement. It's been so long that I've almost begun to miss it.