This is a two-part post reflecting on how our writing technology reshapes our creative minds — sometimes in unanticipated and even unwanted ways. This first installment discusses my personal relationship with writing technology, as it evolved from pen and paper in the 1980s to Internet-connected computers in the 1990s.
I. I Do Not Want What I Haven’t Got: Ireland in the 1980s
I came of age in the Ireland of the 1980s, a time when the country, although on the cusp of a major economic and social transformation, still held fast to its deeply conservative Catholic roots. The Irish constitution entirely prohibited abortion and divorce. The highest court in the land had recently overturned a decades-long blanket ban on contraception, on grounds of marital privacy, but the government had adopted an “Irish solution to an Irish problem” by making contraceptives available by prescription to married couples only. Young women still lived in fear of being sent to the country’s notorious mother-and-baby homes if they became pregnant out of wedlock. The unemployment rate hovered around 17%, with one in four young people out of work. Nobody had much money — including colleges and universities.
The small liberal arts college where I started in the late 1980s was mired in the same recession. Affiliated with Ireland’s oldest Catholic seminary — the largest in the world in the mid-19th century, before a long, steady decline as the numbers of young Irish people pursuing religious vocations dwindled to a trickle — its campus featured majestic late-18th-century architecture coexisting awkwardly with brutalist concrete buildings and shabby prefabs thrown up to accommodate the influx of “lay” students (i.e., non-seminarians, including women) that the college had begun admitting in the mid-1960s. By the time I arrived, the campus hosted a heady mix of goths and clerics, punks and bishops, an amalgam of the Catholic hierarchy who had ruled the country’s past and the so-called “pope’s children” — those of us who had been kids when John Paul II made the first papal visit to Ireland in 1979 — who would define its future.
Despite this, our college education remained highly traditional. Several of our professors were ordained clergy, some of them Jesuits, who possessed unrivaled mastery of Plato, Aristotle, and moral philosophy. We learned Old and Middle English so that we could read texts such as Beowulf and Chaucer’s Canterbury Tales in the original. We read extensively in British and Irish poetry, with a notable stress on theological poets such as Milton, Donne, George Herbert, Gerard Manley Hopkins, and T. S. Eliot. Although we hardly read any American literature — meaning that I discovered the likes of Melville, Steinbeck, and F. Scott Fitzgerald later in life — we did get a formative and life-changing grounding in canonical English literature from the seventh century through the twentieth.
As important as all that reading was, the process of writing about it was just as vital — and that process was exclusively paper-based. None of my classmates owned a computer, and there were precious few computing resources available on campus, meaning that technology simply didn’t feature in our lives or educations. A visiting American speaker mentioned that Dartmouth had become a “wired campus,” but that seemed laughably distant and futuristic. For us, using the library meant thumbing through a tattered card catalog, while “looking something up” meant dragging a heavy, dusty reference volume — such as the Oxford Dictionary of National Biography — off its high shelf. Sitting in a gorgeous old sunlit library, armed with nothing but pen and paper, we let thoughts emerge and arguments form. We submitted handwritten papers to our professors and received handwritten comments in return. Examinations were also conducted as long-form essays. Pen, paper, language, and uninterrupted thought became connected in our minds in a fundamental, seemingly unbreakable way.
In addition to this large volume of hand-produced academic work, I did a lot of social writing. Instead of queuing for the few payphones on campus, I wrote letters to friends, family, and penpals. Handwritten letters were — as I recently explained to my incredulous 15-year-old — the Snapchat of the 1980s, except that they allowed for deeper and more sustained relationships with a smaller circle of correspondents. We even had multimedia — photographs, newspaper and magazine clippings, and mixtapes could all be stuffed into envelopes alongside letters. My best friend, then studying computer science at another Irish university, confidently predicted a future in which we would communicate through connected digital devices, but that felt like something out of Star Trek. I went back to translating passages from Beowulf.
II. Rattle and Hum: New York in the 1990s
By the time I graduated from college, I had earned a master’s degree in literature, but I still couldn’t touch type. Upon moving to the United States to escape the Irish recession, I soon secured my first real job, working for Random House book publishers in New York City. Based on my job interviews, I knew I’d be doing a lot of editing work with pencil on paper manuscripts, but my job also came with unexpected typing requirements that I was ill-positioned to discharge.
My deficiencies became evident on day two when my new boss spotted me hunting and pecking over my keyboard. “You can’t type?” she said, aghast. “How did you pass the typing test?” I offered that nobody in HR had administered a typing test. “But how did you get through college?” she asked. “Didn’t you have to write papers?” Upon hearing about my tech-free college experience, she sat with her head in her hands. “Everything is going digital now,” she said. “Everything. You can’t work here and not type.” Concerned that my tenure at Random House would last less than 48 hours, I reassured her that I could and would get up to speed with typing.
Soon, I was in a nearby Barnes & Noble, armed with a corporate credit card, checking out typing tutorials. Picking a program didn’t take long. Mavis Beacon Teaches Typing, branded “the finest typing program in the world,” was head and shoulders above the competition. On the box cover, a gleaming African American woman, immaculately attired in a cream-colored business suit, exuded professionalism, competence, and you-can-do-it vibes. Mavis Beacon and I had the same initials, I realized. If Mavis could type, I could type, too. Convinced at the time that Mavis Beacon was a real typing instructor, I was disappointed many years later to discover that “she” was merely a corporate trademark and that the photograph on the box cover was of a model — you live and learn.
At the time, I shared my small office with a vivacious and immaculately dressed Korean American temp, who was covering for the job I had been hired to do full-time. She had just landed her dream job as an assistant buyer at Bloomingdale’s but had agreed to stay on and orient me to the workflow. I watched her fingers dance effortlessly across the keyboard, text flowing magically onto her computer screen. “Oh, are you improving your typing speed?” she asked, as I plunked Mavis Beacon’s gleaming visage down on my desk. “I can do about 105 words a minute, how about you?” I confessed what had happened — HR had screwed up and failed to administer a routine typing test. Now, I urgently needed to learn how to type, in a “do or die” type situation. My officemate’s expression said it all: wide-eyed disbelief that I had been hired for a professional office job without possessing any typing skills whatsoever, like a lifeguard who couldn’t swim. Nevertheless, she helped me install Mavis Beacon on my work Macintosh, and I was soon up and running. ASDF, ASDF, ASDF, ASDF went my left hand. I could already touch type, I realized after half an hour. At least I could touch type “fads sad as dads.” It was a start.
With steady devotion to Mavis Beacon and useful tips from my temp officemate, I made rapid progress. Never looking down at the keyboard was important, I learned; instead, I oriented myself by touch to the little nubs on the F and J keys. I discovered that I could “practice” typing anywhere — on the train, at the dinner table, even walking, fingers moving steadily to the rhythm of ASDFG HJKL; and then expanding beyond the home row. This helped my muscle memory adapt rapidly — each time I sat down with Mavis Beacon, I could do her exercises faster and with fewer blunders than before. Soon, I had achieved a respectable if not spectacular 60 words per minute, which was sufficient to handle the typing duties of my job once my officemate had departed. Another shift had happened, too: Without fully realizing it, I was up and running in the digital world.
From there, I could embrace modern 1990s office life, quickly mastering Word, Excel, QuarkXPress, and other programs key to my job, while also becoming initiated into the fledgling Internet via email and Mosaic (soon replaced by Netscape). It amazed me that an email could traverse the world in seconds, or that up-to-date information about practically anything was just a few mouse clicks away. It all felt new and exhilarating, and it was, back then. Editors quibbled about whether we should capitalize terminology such as “Internet” and “World Wide Web” or hyphenate the word e-mail. I attended an in-house seminar on a new digital format called the e-book. All books will be digital within 15 years, a young executive enthused. (This proved wildly over-optimistic — printed books still outsell e-books by around 4 to 1.)
But even then, some were not so enthused about this prospect of everything going digital. A long-time penpal and I, who had been exchanging letters since we were 15, traded experimental emails via our respective new work email addresses. “I hate this,” she confessed, in reply to my third message. “I like the rhythm of letter-writing. I like waiting for your letters and taking time to respond. I like reading your handwriting and I like knowing you’ve touched and folded the pages you send me. An e-mail feels like something you’d get from your boss.” We resumed writing letters. She sent me a mixtape that ended with Christy Moore’s “City of Chicago” — the chorus of which ran “In the city of Chicago / As the evening shadows fall / There are people dreaming / Of the hills of Donegal.” I felt homesick. There was a material world out there as well that I missed.
The world was changing as the digital revolution unfolded — but we still maintained a clear separation between work life and home life. Laptops, which might have bridged the two, were still relatively rare: expensive, heavy, and underpowered. During my time in publishing, my office computer sat stolidly on my desk, booted up in the morning and turned off in the evening. I had no computer access at home, and I rarely thought about email or the Internet when not in the office. Working in publishing meant that I got lots of complimentary books, and there were many more treasures available in New York’s used bookstores for anyone willing to go scavenging. I had more than enough to read — and anyone who needed to get in touch with me over the weekend could use the phone. Having Internet access at home felt entirely unnecessary.
III. No Guru, No Method, No Teacher: The Digital Humanities
In the mid-1990s, I left book publishing to pursue a doctorate in English literature at the University of Pennsylvania. After a scrambled last-minute move to Philadelphia, during which I sampled my first cheesesteak and ran up the steps of the Philadelphia Museum of Art à la Rocky, I met my grad cohort for the first time, at a reception at the graduate chair’s home. I watched as my fellow students sized each other up over wine and cheese, the academic competitiveness palpable. They name-dropped their former professors, people I had never heard of, and discussed other programs they had turned down in favor of this one. One young man, decked out in a tweed jacket, opined that no serious PhD student could afford to sleep for more than six hours a night. A young woman wondered whether her recent marriage would survive grad school. During the course of the evening, it emerged that everyone else already had a computer and home modem — they also had university email accounts and had been trading messages on something called a listserv. Someone offered to ask the department’s resident computing guru to subscribe me. I didn’t know what any of this meant. Again, I had catching up to do.
I bought my first computer the following week, a Macintosh Performa, similar to the model I’d used at Random House. My then-girlfriend helped me lug the computer, CRT monitor, printer, and assorted peripherals from the campus computer store to her car, and then from our parking garage to our high-rise apartment. “It has a CD-ROM drive,” I told her as we struggled into the elevator, repeating what the salesman had told me. “And it has eight megabytes of RAM.” But regardless of whatever impressive mid-90s features my new Macintosh Performa boasted, my enthusiasm for opening the boxes just wasn’t there. The computer had breached an invisible barrier between my work and personal life. On Sunday evening, I unpacked it, connected all the cables, and turned it on — after a peppy chime, its smiling Mac face appeared on the screen, friendly and reassuring. I installed Microsoft Word 6.0 from a stack of floppy disks. No more pen and paper. This is how it’s done now, I told myself. Grad school was going to be serious business, although maybe not serious enough yet for a tweed jacket.
I quickly learned that my undergraduate experience of writing everything longhand had been a quirky anomaly born of a prolonged economic recession that had left institutions running on a shoestring budget. In the United States, it simply wasn’t acceptable to submit handwritten work in college, or even in high school. I tried to adapt to what I saw my peers doing by writing on the screen and occasionally printing out drafts for review. But over the next few months, something strange started to happen — I noticed that the more time I spent word processing at my Macintosh Performa, the more fragmented and nonlinear my writing process became. It was all too easy to add a few words here, delete some there, cut and paste a sentence or an entire paragraph — but I found that tinkering endlessly with prose in this manner was coming to supplant the process of having and articulating ideas. I no longer ended a writing session with the satisfying feeling of having got everything out of my head and down on paper. The more I shifted words around on the page, the more muddled they seemed.
My fellow grad students, all Americans who had been word processing as a matter of course since high school, seemed to have no issue with this way of working. But I felt the shift so acutely in comparison to my undergraduate days that I became curious about whether how we write affects what we write — and about whether “word processing” represented not a handy technological advance but an ontological shift away from the linear thought process that had defined much of Western culture. We were moving into uncharted territory, embracing the nonlinear reading and writing processes — defined by skimming, skipping, and discontinuous writing — that would come to dominate the digital era. I wondered, per my penpal’s comments about touching physical paper and reading words intimately formed by hand, about the implications of divorcing our digital writing experience from the tactile materiality of pen, ink, and paper. And I was reminded of Truman Capote’s caustic dismissal of Beat Generation writers: “That’s not writing; it’s just typing.”
As my first semester of grad school drew to a close, I found myself with multiple lengthy papers due simultaneously, just as my relationship with my computer and Microsoft Word was rapidly deteriorating. My writing sessions had become briefer, more fragmented, and less productive; my previous steady flow was no longer reliably there. To make things worse, I often found myself flipping between Microsoft Word and email. Some friends in the grad program had entered a state of blind panic as our deadlines loomed — and the panic was becoming contagious, especially via late-night email exchanges. I was also using my computer to look up information online, often disappearing down rabbit holes and reading webpages far removed from what I was researching.
In the end, I gave up. I went back to my tried and trusted low-tech routine — I took a notepad and pen to the university’s beautiful Fisher Fine Arts Library and sat there in the sunlight, writing uninterruptedly for hours. Then I came home and typed in everything I had written by hand during the day. To my surprise, I had written more with a $0.30 ballpoint pen in one day than in the entire previous week on my $2,000 Macintosh. More importantly, I had recaptured an important sense of flow.
Over the following semesters, I experimented with different writing technologies, some material, some computer-based. I bought, and briefly used, a portable manual typewriter. I tried numerous brands of pens and pencils. I adopted, for a while, a little-known Macintosh word processor called Nisus Writer. By the time I reached the dissertation stage of my PhD, my cobbled-together writing setup comprised a 1970s-era text editor called vi (pronounced “vee eye”), written by Sun Microsystems co-founder Bill Joy; the TeX markup language created by renowned Stanford computer scientist Donald Knuth; and an early incarnation of the Linux operating system built by a Finnish computer nerd named Linus Torvalds. The vi editor hid its many powerful functions from view, leaving the user with only a black screen — no bells or whistles, no zillions of contextual menus — that encouraged the same immersion and linear composition as handwriting, without the need to type everything in after writing it. The typesetting capabilities of TeX, while demanding to master, were far beyond those of word processors. The combination of these technologies, while inexplicable to my Microsoft Word–using peers, felt right for me. But I still wanted to know why. I had begun to sense that the various experiences we tended to bundle together under the rubric of “writing” could vary enormously depending on the tools used.
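For the curious, here is roughly what that setup looked like in practice. The snippet below is a toy plain TeX file (invented purely for illustration, not an excerpt from my dissertation), of the kind one would type in vi and compile from the Linux command line:

```tex
% chapter1.tex -- an illustrative plain TeX file, not a dissertation excerpt.
% Bare markup in a bare editor: every formatting decision is deferred
% until the file is compiled with the "tex" command.
\font\chaptitle=cmbx12      % load Computer Modern bold extended at 12pt

{\chaptitle 1. Writing Tools, Writing Thoughts}
\medskip
\noindent The word processor promised {\it convenience}, but it also
changed the relationship between writer and text.\footnote*{A theme
this chapter takes up at length.}

\bye                        % plain TeX's end-of-document command
```

Running “tex chapter1.tex” turned that markup into a typeset, device-independent (DVI) file ready for previewing or printing; until then, the screen held nothing but words.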
I read about the philosopher Friedrich Nietzsche, whose failing eyesight forced him to stop writing with pen and ink. To compensate, he purchased a Malling-Hansen writing ball typewriter directly from its Danish inventor, taught himself to touch type, and wrote thereafter with his eyes closed and fingers on the keys. Many scholars have since noted that this shift from longhand composition to typing marked the emergence of Nietzsche’s “late style” — the lengthy and complex sentences of his earlier works were replaced by staccato prose that became increasingly aphoristic and enigmatic. Nietzsche himself acknowledged that this shift related to his adoption of the typewriter, observing in a letter that “Our writing tools are also working on our thoughts,” which felt to me like an early articulation of Marshall McLuhan’s famous claim that “the medium is the message.” A seemingly transparent shift from one technology to another was anything but — the shift in medium altered our thought patterns, changed our consciousness. A similar thing happened to Henry James over the last decade of his life. From 1907 onward, he had typist Theodora Bosanquet come regularly to his home; he would pace and dictate as she typed, once commenting that prose was being “pulled out” of him by what he called the “music” of her clicking Remington. This corresponded to a noted change in his late prose, which became more convoluted, complicated, and circuitous, even after he had edited it by hand.
I became fascinated by modern holdouts who had never used the Internet, such as journalist and critic Ron Rosenbaum — who, at the turn of the century, wrote a series of articles for Slate entitled “The Last Luddite Gets Wired,” documenting his efforts to begin using a Macintosh PowerBook laptop. I was curious about the writing practices of modern-day authors such as J. K. Rowling and Joyce Carol Oates, who continued drafting their books in longhand despite the ready availability of word processing. Jennifer Egan, a Pulitzer Prize–winning novelist, gave an explanation for her longhand writing process that resonated deeply with me, saying that it “seems to do a much better job of unlocking my unconscious, which is where all the good ideas seem to be.” That dovetailed with my own instinctive sense, even if it conflicted with the “future is digital” spirit that prevailed in the 1990s.
I’ve returned to this particular moment in the 1990s, when personal computing and Internet connectivity were becoming mainstream and advancing rapidly beyond the office desk into people’s private lives and spaces, because it was then an article of faith, amid the technophilia and financial bubble of the dot-com boom, that all this new technology would make us more creative, more productive, more connected. Writers such as Rosenbaum, already sardonically identifying himself as “the last Luddite,” were seen as relics of a bygone era. Partly fearing that we would be similarly regarded as dinosaurs if we resisted, the rest of us moved with the technology as it advanced — we upgraded to ever more powerful desktops, laptops, and operating systems. We had Napster and CD burners, Web 2.0 and social media, and then iPhones, iPads, smart speakers, and smart homes. When new technology came out, we simply adopted it and changed with the times. But the times were also changing us.
As devices became smaller, more portable, and wireless, they came to infiltrate spaces that the office technology of old — or even home computers — did not permeate. Nine in ten 18-to-29-year-olds now sleep with smartphones in their beds or within arm’s reach. Around four in ten say they wake up during the night to check electronic devices, and two-thirds say that the first thing they do on waking is check their phones. We have, over the space of three decades, quietly erased the idea of any time and space separate from technology — the quiet time we once devoted to sleeping, reading, reflecting, or bonding with a partner must now compete with an endless barrage of email, text messages, social media, and online video that is rarely further away than the end of our arm.
This shift is profound, and it goes way beyond writing, but it can affect creative professionals in especially telling ways. On Saturday, I’ll post the second part of this piece, detailing how writers must navigate this always-on, perennially distracting personal technology — but more importantly, discussing what we can do about it.