I am composing the first draft of this post through dictation. For two years, I’ve been looking at the keyboard of my iPad without noticing that it contained a microphone icon. Finally, I studied the button (we’re supposed to learn from pictures nowadays—that I have noticed!), got a little curious, and began experimenting. When I discovered that the dictation technology which I had wanted to explore for years was right here at my fingertips (yes, literally), I was elated, yet clung to a certain reserve. I had indeed heard that problems lingered. Sure enough, my maiden voyage took water heavily, if it didn’t exactly end on the rocks. Yesterday, in reading back some of my dictation from the previous evening, I found several embarrassing errors (a phrase which I see my digital amanuensis has recorded as “in Barris saying hours”). “Digression” was a challenge for the software: it came out “big Russian”. “Repudiate” emerged as “rape you a date”. The single most irritating misfire was the relatively simple word “enhanced”. I observed that a German-looking proper noun kept showing up in the phrase, “in Hanst”, no matter how often I repeated the word. So where the hell is Hanst? At last I couldn’t contain myself. I muttered to the screen, “You dumb s**t”; and, of course, when I came to copy and paste my dictation from the previous evening, there was my obscenity staring me placidly in the face. “You dumb s**t.”
In this one regard, if no other, artificial intelligence is already vastly more mature than the human variety. It doesn’t reciprocate in name-calling. Granted, it may be too stupid to do so; but just possibly, it may also be so extremely clever that it understands the infuriating effectiveness of mirroring an obscenity right back at the sender. Now, if I had said those foul words to a human being, fur would have been flying instantly.
The truth is that you don’t have to say anything insulting to a human being these days in order to register as a beast, a cad (you dumb… you poor digital blockhead: not Computer-Assisted Design), or a cruel, heartless boor (no, you stupid… no, sweetheart: not boar; actually, the word “Boer” is an intentional slur aimed at white South Africans—but we’re assured that slurring them is almost a moral obligation). A few of my readers may remember the incident, about twenty years ago now, when a hapless DC bureaucrat used the word of Scandinavian derivation, “niggardly”. He lost his job, and for a while he must have wondered if he would lose his life. Today we can’t say things like “spic and span” or “chink in the armor”. No, they’re not racist: any idiot could tell that they are expressions with no racial content whatever. Yet our society doesn’t breed just any sort of idiot.
We have a special variety of sensitive plant that sends its roots deep into our academic institutions and proliferates in our broadcast media. These delicate flowers process everything we say as it comes from the mouth, scanning for any resemblance to any word on a list of forbidden terms. Once a similarity is identified (no, not “a Denna five”), the long knives come out. The perpetrator is defamed, shamed publicly, humiliated for life, driven from his job, rendered unemployable—all because he used the phrase, “those people”, or blew some kind of racial dog whistle wherein the words “monkeying around” figured.
Given my newfound familiarity with dictation technology, I’m struck by how much our degenerating human mind resembles the rudimentary kind of artificial intelligence on display in this fallible wizardry. We have in our memory bank some thousands of words and phrases that we’ve encountered before: everything that we hear is judged on the basis of its relationship to the words in that depository. We no longer apply any power of analysis to the lexicon, so if you say, “She should stay home and look after her kids,” you instantly and irredeemably become guilty of at least a dozen vile sexist transgressions. The receptor’s circuits perceive no context for your remark and seek out none. The raw text of what you said is what you meant. What else could you mean? You used words X, Y, and Z; those words are stored in the warehouse; and this is what they mean when unpacked.
Words like “target” and “trigger” are incitements to gun violence. An expression like “tough it out” points to toxic masculinity. Employing the word “mailman” or “chairman” designates you as what used to be called a “male chauvinist pig” back in the Seventies; nowadays the noun “male” suffices to capture the same sense. The word “Christmas” is hate speech: it implies disparagement of Islam. Saying “Peking” instead of “Beijing” or “Bombay” instead of “Mumbai” is rank colonialism. Sometimes you don’t have to utter a syllable; simply wearing a sombrero, whether or not you try to pronounce the word’s trilling r’s, is cultural appropriation.
Digital Dictaphone is a good sport about this sort of thing. It ties your thoughts to its available vocabulary without any sort of invidious inference. But the contemporary human version of this artificial operation is painfully artificial in all the wrong ways. We refuse to supply context, to research words used in an unfamiliar manner, to give somebody the benefit of the doubt based upon the person’s previous clean record. We develop the kind of closed mind that only primitive computers will preserve in the future. The machine will pass us up as we descend to its level of weakest performance.
When I have written in the past about human fusion with robots becoming increasingly easy as people grow blunter and robots grow subtler, this is the sort of thing I had in mind. Imagine my iPad’s stupid little dictation device, and then imagine its marriage to the prickly, politically correct consciousness of a graduate student in English. We are dumbing down, and we’re not doing it gracefully. The day is already at hand when a minimally functional computer like my iPad could be programmed with polite responses, and the result would be superior to the new wave of “woke” people (what an idiot word for an idiot generation!) emerging from our colleges.
God only knows what my digital mirror is writing down. I’ll find out tomorrow (today: you’ll have noticed a few of my many changes as I edit the final draft), because I’m not actually reading the words that pop up on the screen. I don’t want to: they would distract me—would probably make me angry. Yet I know all the while, even though I’ll probably grumble a few curses tomorrow, that the machine is just a machine. It did as well as it could.
I almost wanted to say, “She did as well as she could.” Why is that? Is it because of my need to belittle females… or is it because I attribute to the female, in a traditionalist’s presumption, the desire to please, to compromise, to mediate, to make peace? I wonder if males who come after me, perhaps those of my son’s generation, will deck their computers out in a feminine face and program them with feminine politeness. I wonder if they will find the result more feminine, more lovable, more companionable than the “manly women” who grow up beside them. I wonder how soon we’re going to be reduced to embracing our screens for human companionship.