Why Is Everyone Saying "100%"?

Decades ago, when I lived in Montreal to do my master’s, I didn’t bother learning French. I thought I’d just “pick it up,” naturally, effortlessly. “Through osmosis,” I half-joked. It didn’t work out. I never learned any French—except for one thing: the phrase ché pas, Quebec slang for “I don’t know,” which I heard everywhere and used frequently myself.

Picking up such words and idiomatic phrases is, of course, central to the way in which language works. William Burroughs once wrote that “language is a virus,” and he was right: words and phrases seem to float in the air; they get picked up and passed around, and sometimes they become fashionable for a time before they fade away again (think of such old-timey phrases as how d’you do?, groovy, and dig that, which now make one cringe); or, sometimes, they become cemented into the structure of language (like the post-1960s way in which the word hopefully is now used, or the acceptance of the split infinitive—“to boldly go where no man has gone before”—or, more recently, the general acceptance of the singular they). When I returned to Canada in 2006 after living in South Korea for nine years, I was perplexed by the strange popularity of the phrase Wait for it. Where did it come from? And why was everyone saying it? (I’ve since learned the answer.) It was also around that time that the word dude became popular. For years, it was a word I reviled. It was too young for me, too “slang-y” and illiterate-sounding, and so at odds with how I spoke and the kinds of words I used until, one day, maybe ten or so years later, it suddenly wasn’t, and I adopted it as my own (usually as a slightly patronizing but lighthearted jab, as in, “Dude, what are you doing?”).

More recent examples of fashionable words and phrases include that inane compound oftentimes, the endlessly repeated phrase talks about (you need to be an English teacher who marks essays for a living or a listener of podcasts to recognize the sheer repetition of this phrase), and the Tweedledee and Tweedledum of language, literally and honestly, along with the offspring of the latter: to tell you the truth and I’m not gonna lie. Discussion of literally and honestly deserves its own blog post, and I’ll save that for next time. What I wish to focus on here, however, is the sudden rise of the phrase 100%.

The first time I heard it a few weeks ago, I immediately fell in love with it. When I asked my property manager if something in my unit was going to get fixed, he replied, “100%.” When I asked a store clerk if it was still possible to get a refund on a used item, he too replied, “100%.” And when I pushed back a rental car reservation and asked the agent on the phone if I would still be guaranteed a car at that hour, she too dispelled my fears with that simple answer: “100%.” The first few times I heard it, I thought it was highly original and therefore—to use Merriam-Webster’s 2023 Word of the Year—authentic. Above all, I loved the absolute and total assurance conveyed by this simple response. It was effectively a promise or, even better, a money-back guarantee! And in an era that’s short on trust in our institutions, 100% signalled that not only could you trust the speaker, but that the speaker was wholly on your side, and that your request or favour or proposition wasn’t in the least unreasonable, stupid, or outlandish. It was as if you had unwittingly taken a test—a pop quiz—and not only had you passed it, but you had scored a perfect mark! And who doesn’t like 100%? I hesitate to admit (I won’t say honestly) that I even felt a little burst of affection for those first few speakers who said it to me, a little dopamine rush that, I understood, was not unlike a thumbs-up emoji: something that made you feel good, especially if a sizable number was next to it.

*

But after a few weeks, the novelty wore off and I grew weary of hearing 100%. In that short span of time, I’d gone from wholehearted approbation to disillusionment, and I imagine it won’t be long before it too goes the way of literally and honestly: thoughtless verbal reflexes that not only say little but, like so much else of what we encounter today, are of dubious sincerity—like the auto-fill of speech. Far from a sign of originality, this phrase is really a reflection not just of the broader culture in which we live but of the role technology plays in our lives and its influence on our thought and speech.

By definition, 100% is an answer that rules out all other possibilities; indeed, that the response is 100% and not 99% emphasizes in a quantifiable way that there is no wiggle-room for nuance, subtlety, complexity, exception, or compromise, much more so than a simple yes or no answer might convey. What it offers is an all-or-nothing or zero-sum response that is not unlike the way much of contemporary thought, and political discourse in particular, has been both characterized and criticized. (One need look no further than the Israel-Gaza conflict, the war in Ukraine, and recent US elections for examples.) It is a reflection, in other words, of our cultural tendency—spurred on by the way in which social media categorizes, separates, and polarizes people and ideas—to interpret and respond to the world in an increasingly simplistic, increasingly dichotomous, even Manichean way: good/bad, friend/enemy, thumbs-up/thumbs-down, blue/red, oppressor/oppressed. In a technologically dominated world that eschews complexity and difficulty but valorizes brevity, simplicity, convenience, efficiency, easy answers, and “plain-and-simple English”—not to mention a populace suffering from shortened attention spans—the message that 100% conveys is reassuringly short and simple: there is no ambiguity.

That a reply to a yes/no question should take the form of a number also seems to underscore the extent to which we rely on numbers—on data—to make sense of the world. We are a culture obsessed with, for example, the number of likes, followers, friends, downloads, views, and shares. There are the numbers attached to nearly every aspect of our lives, including products on Amazon, books on Goodreads, our Uber drivers, restaurants we’re thinking of going to, movies we might watch, even transactions conducted on Kijiji (“Rate your seller!”). There are the endless stats, polls, and surveys we are confronted with daily, and the endless requests to fill out surveys (“Tell us what you think!” “How satisfied were you?” “Rate us on Facebook!”). Even when we go to the doctor, we are asked to rate the level of pain we experience.

There is the vast scope of numbers related to health and our preoccupation with “tracking” everything that can be empirically tracked: the number of calories, hours of sleep, glasses of water, ounces of alcohol, cups of coffee, steps taken, laps swum in a pool or run around a track; not to mention the number of days of resistance training per week, the number of sets, reps, and minutes of rest. We are a society obsessed with our weight and BMI, and we turn to food labels to determine if the grams of sugar, fat, carbohydrates, protein, and sodium make something worth consuming. And then there are the milligrams or international units of calcium, beta-carotene, magnesium, iron, zinc, and the alphabet soup of vitamins we ingest to supplement whatever may be deficient in our diets—deficiencies that have often been determined by another vast array of mysterious numbers yielded by a blood or urine analysis.

Similarly, students are fixated on the numbers of their test scores (and less so on actually learning something), on their GPA, and on the class average. Shoppers are concerned with prices, sales, and points earned, points redeemed. Investors keep careful track of the stock markets, quarterly and annual earnings reports, inflation and interest rates, the exchange rate, the bond markets. There is the temperature, the day’s high and overnight low, the chance of precipitation. And as individuals, we don’t exist without a SIN, a student or employee number, or a phone number.

Above all, our obsession with numbers stems from a cultural addiction to speed and efficiency, something that began, as Lewis Mumford argues in Technics and Civilization, in the 12th century with the invention of the mechanical clock, and that has only accelerated ever since.

Numbers—and I feel I’ve only scratched the surface of the possible examples—can either provide assurance or cause alarm; they largely determine what course of action we’ll take. One hundred percent is a good number. It’s like the very building blocks of computer language and its endless combinations of ones and zeros. And 100% is the simplest of codes: one-zero-zero. We are, one could say, speaking like computers—an irony given that as computers “learn” to speak like humans, we are learning to speak like computers. And in the world of computers, 100% means your computer is updated and fully protected against all viruses and malware; 100% means an app has completely downloaded and is ready to use; 100% means that your phone is fully charged and you can confidently go through your day without worry. One hundred percent is the score the speaker has given your request, favour, or idea, and saying it suggests that a number is a better substitute for words and communicates more clearly than they do.

So what are the words that are implicit but unspoken? “Absolutely,” “Oh yeah, for sure,” “Without a doubt,” “No question,” “Certainly”—plus any number of other phrases unique to a particular context. Just as thumbing out full words and sentences on a phone, or even hitting the shift key, is annoying, speaking actual words in full sentences, it would seem, has become just as troublesome. (It’s one of the great ironies that with every increase in convenience brought about by some technological advancement, whatever labour remains is regarded as increasingly vexing.)

If this cute, benign-seeming, popular phrase is, as I have suggested above, a kind of verbal emoji, it’s also an indication of how much we seem to rely on emojis, GIFs, and auto-fill in our communication. The vast and growing choice of “pictograms” embedded in our phones that many of us turn to when texting or emailing may be cute or funny or simply convenient, but the fact remains that those who can’t convey through the careful use of words and punctuation something said in jest without resorting to a winking or smiling emoji are limited in their ability to communicate. It’s a small sign, as Marshall McLuhan foresaw in the 1960s, that the electronic age would usher in a post-literate age.

Although it may seem as though I’m making far too much of this harmless little phrase that has recently become popular, it’s important to keep in mind what Günther Anders, the philosopher of technology, recognized in the 1950s: that the things that cause the greatest harm often appear as the most ordinary. In fact, I see the popularity of such phrases as 100% and literally and honestly as emblematic of a larger, societal shift toward an increasingly simplistic, grammatically ambiguous use of English that, instead of freeing up expression, only limits our ability to articulate ourselves, leaving us with fewer words with which to express ourselves—and given that ChatGPT can produce answers far more sophisticated than what I see from the average college student, that’s a danger worth keeping in mind. It’s something I’d like to explore in other posts.


Technology Is Not a Thing but a Mindset

For Heidegger, the essence of technology has nothing to do with the technological. For him, technology is a mindset, a way of looking at the world as “standing reserve,” a stock of exploitable resources. He calls this mindset “enframing,” and it is something that has come to encompass everything. A river, for instance, reveals itself as a power supplier; a tract of land “reveals itself as a coal mining district, the soil as a mineral deposit” (p. 14). Enframing has even encapsulated man (we are, after all, in the eyes of Big Tech, the sum of our data). And now, as demonstrated by the advent of ChatGPT, enframing has swallowed up language, turning it from something uniquely human into a large language model and a complex series of statistical outcomes.

Others have put forth similar ideas. There is Marshall McLuhan’s famous dictum, “The medium is the message,” and Walter Ong’s observation that “technologies are not mere exterior aids but also interior transformations of consciousness.” Or, as Caitlin Flanagan (2021) simply put it in a piece for The Atlantic, “Twitter didn’t live in the phone. It lived in me.”

What, then, is the mindset that governs our word-processing technology? For one thing, that spelling, punctuation, and grammar no longer matter. Ask any student and you will discover that in the medium of texting, proper capitalization, punctuation, and even proper spacing are frowned upon, even regarded as prissy and pretentious (though these are not the words they used). A period, I’m told, is seen as “aggressive” or a display of anger. It comes as no surprise, then, that this mindset is reflected in the lockdown browser quiz environment, or in the handwritten work of many of our students: work in which everything is in lower case, including the first-person pronoun; apostrophes are nowhere to be found; spacing before and after punctuation is strange and idiosyncratic; and lines begin with a comma or period. And let’s not even get into the issues of grammar and spelling. (Even my own spelling, I’m forced to confront week after week when standing at the whiteboard, has atrophied as a result of auto-correct.)

Is this simply laziness or do the students really not know the rules?

The answer, of course, is both. In a world in which hitting the shift key or space bar is too much effort, and in which auto-correct, auto-fill, Grammarly, and now AI have come to dominate, the technological mindset that Heidegger identified means that there is no real incentive even to know the rules in the first place: little in fact needs to be remembered or internalized, carelessness is the norm, attention to detail is no longer valued, and independent thought can now be outsourced to technology whose vast sweep has beguiled all of us to varying degrees. If technology is a mindset, it means that a kind of somnambulism governs the classroom, and English class in particular: one need not pay attention (or even attend) if online classes are recorded and can later be watched and rewatched at 1.25 speed, skipping all the “boring bits.” Note-taking, too, has become obsolete because PowerPoint slides and videos are posted in the course shell and, more recently, because AI-generated summaries of the lesson are available for online classes. And if notes are required, students will often take photos of the whiteboard or screenshots in an online class. Even the idea of writing by hand has become alien, not just to our students but to many teachers as well. (Although many other issues are at play here, one thing is certain: when we abandoned the teaching of cursive, we did so because we believed it no longer served a practical purpose; but what we didn’t realize was that it specifically taught those things that are currently lacking: attention to detail, the importance of rules (and their internalization through frequent repetition), and the appreciation of beauty and the striving for it. It taught us that even the physical act of writing—the tangible feel of it—can be a joy.)

We call all our so-called technological advancements “convenience” and delude ourselves into believing this is “progress,” yet we fail to realize that the tyranny of convenience has a corrosive effect: in making things easier and more convenient, we also do away with motivation. In fact, the technological mindset only instills the idea that reading and writing are tedious, difficult, and boring. But as we all know, there needs to be a degree of difficulty, of pain—of failure—without which there can be no learning and ultimately no reward. No pain, no gain, as they say. After all, anything worth doing or having must be difficult to achieve, and essay writing is supposed to be difficult. But if the very basics haven’t been learned and internalized, if remembering anything is too onerous and overwhelming, all learning will only become more difficult, not less; and what gets taught in the classroom will necessarily have to become more and more remedial—“dumbed down,” as it were. This is not speculation; this is happening now.

But there is another, more insidious, effect: when even language itself has become subject to enframing and reduced to something that can be mobilized via AI to answer the most esoteric prompt in the form of a well-written essay in a matter of seconds, language becomes cheapened and our curiosity deadened. This is the real danger. When all reading and writing become difficult and boring, who will want to explore the great works of the past? Who will even know of them or be interested enough to read them, be inspired by them, and driven to write or think about them? Who will be excited by books or take pleasure in their ideas? If technology is not a thing but a mindset, that mindset has increasingly been characterized by apathy and indolence. And if the most recent PISA report—in which student scores in literacy, math, and science have, for the first time ever, shown an “unprecedented drop in performance” globally—is any indication, we stand on the brink of a worrying trend, one in which literacy and the concentration it demands will cease to be taken for granted, as it is now, and will instead become a highly valued skill that, just as in the Middle Ages, might once again be held in the hands of a small group of highly trained individuals.

Chris Hedges' Empire of Illusion

Chris Hedges’ 2009 book, Empire of Illusion: The End of Literacy and the Triumph of Spectacle, was not a book I planned on reading. It wasn’t in my ever-growing pile of “to-read” books but was something I found in a different sort of pile: a stack of discarded books someone had placed beside the recycling dumpster in my building’s garbage room. Naturally, I flipped through its pages, and what I found were a number of striking passages its previous owner had highlighted in bright yellow: “America has become a façade. It has become the greatest illusion in a culture of illusions”; “At no period in American history has our democracy been in such peril or the possibility of totalitarianism as real”; and: “This endless, mindless diversion is a necessity in a society that prizes entertainment above substance.” I was intrigued. And given how often one hears Americans described as “divorced from reality” (not to mention all the Nietzsche I’ve been reading), I knew this was something I had to read.

Gilles Deleuze's Nietzsche and Philosophy

Since the pandemic began, I’ve dedicated much of my reading to slowly going through the works of Nietzsche, occasionally taking in an academic text on his philosophy along the way. Of the latter, no other book has had a more eye-opening impact on my understanding of the German philosopher than Gilles Deleuze’s Nietzsche and Philosophy. What Deleuze offers is in no way the usual summation of Nietzsche’s key concepts typically found in books aimed at either lay readers like myself or undergraduate students. Instead, Deleuze offers a unique and exciting interpretation that is equal parts Nietzsche, Spinoza, and Deleuze’s own brand of philosophy. I wish to focus here on one aspect of Deleuze’s book: his interpretation of the eternal return.

Novella Acceptance!

I’m thrilled to share some good news: my novella “Massive” has been accepted for publication at The Write Launch. It’s the longest piece I’ve written thus far—19,000 words—and I’m really happy to have found a home for this story. Thanks to Sandra and Justine Fluck for accepting this piece, and to all those who gave me feedback on the earlier drafts, most especially Isabel Matwawana.

The Will to Nothingness

The other day, when I opened my closet and looked at all the clothes hanging in there, at the dress shirts and dress pants, the blazers and ties, the dusty shoes, it struck me that I hadn’t worn ninety percent of what was there in a year. It’s like someone died, I thought, and I remembered my mother’s closet after she had died and how I had to go through her clothes, deciding what was to be thrown out and what was to be donated.

Gratitude in the Time of the Pandemic

After nine years of living in South Korea, I moved back to Canada for good in 2006. I’d grown tired of always being perceived as a foreigner, and as a gay man I felt increasingly uncomfortable as my life came under greater scrutiny the longer I remained a “bachelor.” It was time to go home, time for a fresh start, and I looked to the future with excitement and optimism. What I didn’t expect was how difficult the subsequent years in Canada would be. I had not expected the extent to which I’d experience “reverse culture shock,” how financially difficult it would be, how deeply unhappy I would become and, most surprising of all, how much of a foreigner I would feel in Canada. In short, those were “bad” years. And then I remember one Pride weekend, as I was negotiating my way through the crowded gay village in Toronto, hearing a woman shout: “Yes! 2011 is the best year ever!” What news had she received that added to what sounded like an already wonderful year? I envied her, I remember thinking. Not that my own life by that point was all bad, but it certainly wasn’t as jubilant as hers. It was a year full of the usual ups and downs, just like any other. And although I can’t remember any specific highlights or lowlights offhand, I do recall resolving to stop dubbing years as either “good” or “bad,” a resolution that has freed me from a lot of unnecessary expectation and disappointment.