Why Is Everyone Saying "100%"?

Decades ago, when I lived in Montreal to do my master’s, I didn’t bother learning French. I thought I’d just “pick it up,” naturally, effortlessly. “Through osmosis,” I half-joked. It didn’t work out. I never learned any French—except for one thing: the phrase ché pas, Quebec slang for “I don’t know,” which I heard everywhere and frequently used myself.

Picking up such words and idioms is, of course, central to the way in which language works. William Burroughs once wrote that “language is a virus,” and he was right: words and phrases seem to float in the air; they get picked up and passed around, and sometimes they become fashionable for a time before they fade away again (think of such old-timey phrases as how d’you do?, groovy, and dig, which now make one cringe); or, sometimes they become cemented into the structure of language (like the post-1960s way in which the word hopefully is now used, or the acceptance of the split infinitive—“to boldly go where no man has gone before”—or, more recently, the general acceptance of the singular they). When I returned to Canada in 2006 after living in South Korea for nine years, I was perplexed by the strange popularity of the phrase Wait for it. Where did it come from? And why was everyone saying it? (I’ve since learned the answer.) It was around that time that the word dude also became popular. For years, it was a word I reviled. It was too young for me, too “slang-y” and illiterate-sounding, and so at odds with how I spoke until, one day, maybe ten or so years later, it suddenly wasn’t, and I adopted it as my own (usually as a slightly patronizing but lighthearted jab, as in, “Dude, what are you doing?”).

More recent examples of fashionable words and phrases include that inane pleonasm oftentimes; the endlessly repeated phrase talks about (you need to be an English teacher who marks essays for a living, or a listener of podcasts, to recognize the incessant repetition of this phrase); and the Tweedledee and Tweedledum of language, literally and honestly, along with their offspring, To tell you the truth and I’m not gonna lie. Discussion of literally and honestly deserves its own blog post, and I’ll save that for next time. What I wish to focus on here, however, is the sudden rise of the brand-new phrase 100%.

The first time I heard it, a few weeks ago, I immediately fell in love with it. When I asked my property manager if something in my unit was going to get fixed, he replied, “100%.” When I asked a store clerk if it was still possible to get a refund on a used item, he too replied, “100%.” And when I pushed back a rental car reservation and asked the agent on the phone if I would still be guaranteed a car at that hour, she too dispelled my fears with that simple answer: “100%.” The first few times I heard it, I thought it was highly original and therefore—to use Merriam-Webster’s 2023 Word of the Year—authentic. Above all, I loved the absolute and total assurance conveyed by this simple response. It was effectively a promise or, even better, a money-back guarantee! And in an era that’s short on trust in our institutions, 100% signalled that not only could you trust the speaker, but that the speaker was wholly on your side and that your request or favour or proposition wasn’t in the least unreasonable or stupid. It was as if you had unwittingly taken a test—a pop quiz—and not only had you passed it, but in fact you’d scored a perfect 100! And who doesn’t like 100%? I hesitate to admit (I won’t say honestly) that I even felt a little burst of affection for those first few speakers who said it to me, a little dopamine rush that, I understood, was not unlike a thumbs-up emoji: something that makes you feel good in the moment, especially if a sizable number is next to it.

*

But after a few weeks the novelty wore off, and I grew weary of 100%. In that short span of time, I’d gone from wholehearted approbation to disillusionment, and I imagine it won’t be long before it too goes the way of literally and honestly: phatic verbal reflexes that not only say little but, like so much else of what we encounter today, are of dubious sincerity and little more than the auto-fill of speech. Far from being a sign of originality, the phrase is really a reflection of the broader culture and of the role technology plays in our lives, shaping our thought and speech.

By definition, 100% is an answer that rules out all other possibilities; indeed, that the response is 100% and not 99% conveys far more emphatically than a simple yes or no ever could that there is no wiggle room for nuance, subtlety, complexity, exception, or compromise. What it offers is an all-or-nothing, zero-sum response not unlike the way much of contemporary thought, and political discourse in particular, has been both characterized and criticized. It is a reflection, in other words, of our cultural tendency—spurred on by the way in which social media categorizes, separates, and polarizes people and ideas—to interpret and respond to the world in an increasingly simplistic, increasingly dichotomous, even Manichean way: good/bad, friend/enemy, thumbs-up/thumbs-down, blue/red, oppressor/oppressed. In a technologically dominated world that suffers from shortened attention spans and eschews complexity and difficulty while valorizing brevity, convenience, efficiency, and “plain-and-simple English,” the message that 100% conveys is reassuringly short and simple: there is no ambiguity.

That a reply to a yes/no question should take the form of not just a number but a percentage also seems to underscore the role that numbers—data—play in the way we perceive the world around us. Nearly every aspect of our lives—the algorithms that keep track of us; our preoccupation with the number of likes, followers, friends, downloads, views, and shares on social media; the number of stars we assign to almost every transaction or service we encounter; the grades that students value more highly than what they learn; our obsession with tracking the minutiae of every aspect of our health; the endless stats and polls that fill the news; not to mention the financial and economic numbers we all fret over—has become increasingly dominated by a veritable spreadsheet of numerical figures that preoccupy our thinking, tell us what is good, and determine how we should think and act.

As Walter Ong has argued, our tools shape our consciousness, so it’s not surprising that our speech should be influenced in this way. One hundred percent is a good number. It recalls the very building blocks of computer language, with its endless combinations of ones and zeros; and 100% is the simplest of codes: one-zero-zero. The irony, however, is that as computers “learn” to speak like humans, we are learning to speak like computers. In our technologically driven world, 100% means your computer is fully updated and protected against all viruses and malware; 100% means an app has completely downloaded and is ready to use; 100% means that your phone battery is fully charged and you can confidently go through your day without worry. One hundred percent is the score the speaker has given your request, favour, or idea, and it thereby suggests that a number is a better substitute for words, that it communicates more clearly than words can. Most significantly, it’s a response that would have been incomprehensible in an earlier decade or century.

So what are the words that this phrase has supplanted? “Absolutely,” “Oh yeah, for sure,” “Without a doubt,” “No question,” “Certainly,” “Of course,” “Don’t worry”—plus any number of other phrases unique to a particular context. Perhaps some of these expressions have lost their credibility; perhaps we have come to doubt their value or earnestness the way many of us have grown to distrust much of what we hear. Or perhaps it’s a reflection of a culture that is constantly texting and tapping on keys: just as thumbing out full words and sentences on one’s phone or hitting the shift key is annoying and time-consuming, speaking actual words and full sentences has become just as troublesome and mentally taxing. (It’s one of the great ironies that with every increase in convenience brought about by some technological advancement, whatever labour remains is regarded as intolerably vexing.) Numbers, though, are both more reliable and more instantaneous. As writers like Jacques Ellul and Neil Postman have described, modernity has given way to a world that values efficiency above all else, believes that technical calculation is superior to human judgment, sees subjectivity as an obstacle, and regards whatever can’t be measured as non-existent or a pseudo-problem.

It may appear as though I’m making far too much of a seemingly benign little phrase, but I see the prevalence of words like 100%, literally, and honestly as emblematic of a larger societal shift toward an increasingly simplistic, grammatically impoverished use of English, one that, instead of freeing up expression, only limits our ability to articulate ourselves. And given the role that AI has begun to play in many of our lives, with its ability to produce text that is at once far more sophisticated than what I’d ordinarily see from the average college student yet also extraordinarily banal, flat, and hollow-sounding, I expect our language will continue to change in strange and unexpected ways. It’s something I’d like to explore in other posts.

Technology Is Not a Thing but a Mindset

For Heidegger, the essence of technology has nothing to do with the technological. For him, technology is a mindset, a way of looking at the world as “standing reserve,” a stock of exploitable resources. He calls this mindset “enframing,” and it has come to encompass everything. A river, for instance, reveals itself as a power supplier; a tract of land “reveals itself as a coal mining district, the soil as a mineral deposit” (p. 14). Enframing has even captured man (we are, after all, in the eyes of Big Tech, the sum of our data). And now, as demonstrated by the advent of ChatGPT, enframing has swallowed up language, turning it from something uniquely human into a large language model, a complex series of statistical outcomes.

Others have put forth similar ideas. There’s Marshall McLuhan’s famous dictum, “The medium is the message,” and Walter Ong’s observation that “technologies are not mere exterior aids but also interior transformations of consciousness.” Or, as Caitlin Flanagan (2021) simply put it in a piece for The Atlantic, “Twitter didn’t live in the phone. It lived in me.”

What, then, is the mindset that governs our word-processing technology? For one thing, that spelling, punctuation, and grammar no longer matter. Ask any student and you will discover that in the medium of texting, proper capitalization, punctuation, and even proper spacing are frowned upon, even regarded as prissy and pretentious (though these are not the words they use). A period, I’m told, is seen as “aggressive” or a display of anger. It comes as no surprise, then, that this mindset is reflected in the lockdown-browser quiz environment and in the handwritten work of many of our students: work in which everything is in lower case, including the first-person pronoun; apostrophes are nowhere to be found; strange, idiosyncratic spacing appears before and after punctuation; and lines begin with a comma or period. And let’s not even get into the issues of grammar and spelling. (Even my own spelling, I’m forced to confront week after week when standing at the whiteboard, has atrophied as a result of auto-correct.)

Is this simply laziness or do the students really not know the rules?

The answer, of course, is both. In a world in which hitting the shift key or space bar is too much effort, and in which our reliance on auto-correct, auto-fill, Grammarly, and now AI has come to dominate, the technological mindset that Heidegger identified means that there is no real incentive to even know the rules in the first place, that little in fact needs to be remembered or internalized, that carelessness is the norm, that attention to detail is no longer valued, and that independent thought can now be outsourced to technology whose vast sweep has beguiled all of us to varying degrees. If technology is a mindset, it means that a kind of somnambulism governs the classroom, and English class in particular: one need not pay attention (or even attend) if online classes are recorded, since they can later be watched and rewatched at 1.25x speed, skipping all the “boring bits.” Note-taking, too, has become obsolete because PowerPoint slides and videos are posted in the course shell and, more recently, because AI-generated summaries of the lesson are available for online classes. And if notes are required, students will often take photos of the whiteboard or screenshots in an online class. Even the idea of writing by hand has become alien, not just to our students but to many teachers as well. (Although many other issues are at play here, one thing is certain: when we abandoned the teaching of cursive, we did so because we believed it no longer served a practical purpose; what we didn’t realize was that it taught precisely those things that are currently lacking: attention to detail, the importance of rules (and their internalization through frequent repetition), and the appreciation of beauty and the striving for it. It taught us that even the physical act of writing—the tangible feel of it—can be a joy.)

We call all our so-called technological advancements “convenience” and delude ourselves into believing this is “progress,” yet we fail to realize that the tyranny of convenience has a corrosive effect: in making things easier and more convenient, we also do away with motivation. In fact, the technological mindset only instills the idea that reading and writing are tedious, difficult, and boring. But as we all know, there needs to be a degree of difficulty, of pain—of failure—without which there can be no learning and ultimately no reward. No pain, no gain, as they say. After all, anything worth doing or having must be difficult to achieve, and essay writing is supposed to be difficult. But if the very basics haven’t been learned and internalized, if remembering anything is too onerous and overwhelming, all learning will only become more difficult, not less; and what gets taught in the classroom will necessarily have to become more and more remedial—“dumbed down,” as it were. This is not speculation; this is happening now.

But there is another, more insidious, effect: when even language has become subject to enframing, reduced to something that can be mobilized via AI to answer the most esoteric prompt in the form of a well-written essay in a matter of seconds, language itself becomes cheapened and our curiosity is deadened. This is the real danger. When all reading and writing become difficult and boring, who will want to explore the great works of the past? Who will even know of them or be interested enough to read them, be inspired by them, and be driven to write or think about them? Who will be excited by books or take pleasure in their ideas? If technology is not a thing but a mindset, that mindset has increasingly been characterized by apathy and indolence. And if the most recent PISA report—in which student scores in literacy, math, and science have, for the first time ever, shown an “unprecedented drop in performance” globally—is any indication, we stand on the brink of a worrying trend, one in which literacy and the concentration it demands will cease to be taken for granted, as it is now, and will instead become a highly valued skill that, just as in the Middle Ages, might once again be held in the hands of a small group of highly trained individuals.

Chris Hedges' Empire of Illusion

Chris Hedges’ 2009 book, Empire of Illusion: The End of Literacy and the Triumph of Spectacle, was not a book I planned on reading. It wasn’t in my ever-growing pile of “to-read” books but was something I found in a different sort of pile: a stack of discarded books someone had placed beside the recycling dumpster in my building’s garbage room. Naturally, I flipped through its pages, and what I found were a number of striking passages its previous owner had highlighted in bright yellow: “America has become a façade. It has become the greatest illusion in a culture of illusions”; “At no period in American history has our democracy been in such peril or the possibility of totalitarianism as real”; and “This endless, mindless diversion is a necessity in a society that prizes entertainment above substance.” I was intrigued. And given how often one hears Americans described as “divorced from reality” (not to mention all the Nietzsche I’ve been reading), I knew this was something I had to read.

Gilles Deleuze's Nietzsche and Philosophy

Since the pandemic began, I’ve dedicated much of my reading to slowly working through the works of Nietzsche, occasionally taking in an academic text on his philosophy along the way. Of the latter, no book has had a more eye-opening impact on my understanding of the German philosopher than Gilles Deleuze’s Nietzsche and Philosophy. What Deleuze offers is in no way the usual summation of Nietzsche’s key concepts typically found in books aimed at lay readers like me or at undergraduate students. Instead, Deleuze offers a unique and exciting interpretation that is equal parts Nietzsche, Spinoza, and Deleuze’s own brand of philosophy. I wish to focus here on one aspect of Deleuze’s book: his interpretation of the eternal return.

Novella Acceptance!

I’m thrilled to share some good news: my novella “Massive” has been accepted for publication at The Write Launch. It’s the longest piece I’ve written thus far—19,000 words—and I’m really happy to have found a home for this story. Thanks to Sandra and Justine Fluck for accepting the piece, and to all those who provided feedback on earlier drafts, most especially Isabel Matwawana.

The Will to Nothingness

The other day, when I opened my closet and looked at all the clothes hanging in there, at the dress shirts and dress pants, the blazers and ties, the dusty shoes, it struck me that I hadn’t worn ninety percent of what was there in a year. It’s like someone died, I thought, and I remembered my mother’s closet after she died and how I had to go through her clothes, deciding what was to be thrown out and what was to be donated.

Gratitude in the Time of the Pandemic

After nine years of living in South Korea, I moved back to Canada for good in 2006. I’d grown tired of always being perceived as a foreigner, and as a gay man I felt increasingly uncomfortable as my life came under greater scrutiny the longer I remained a “bachelor.” It was time to go home, time for a fresh start, and I looked to the future with excitement and optimism. What I didn’t expect was how difficult the subsequent years in Canada would be. I hadn’t anticipated the extent of the “reverse culture shock” I’d experience, how financially difficult those years would be, how deeply unhappy I would become and, most surprising of all, how much of a foreigner I would feel in Canada. In short, those were “bad” years. And then I remember one Pride weekend, as I was negotiating my way through the crowded gay village in Toronto, hearing a woman shout: “Yes! 2011 is the best year ever!” What news had she received that added to what sounded like an already wonderful year? I envied her, I remember thinking. Not that my own life by that point was all bad, but it certainly wasn’t as jubilant as hers. It was a year full of the usual ups and downs, just like any other. And although I can’t remember any specific highlights or lowlights offhand, I do recall resolving to stop dubbing years as either “good” or “bad,” a resolution that has freed me from a lot of unnecessary expectation and disappointment.