When Lynne Truss wrote, in her best-selling 2003 grammar screed Eats, Shoots & Leaves, of “a world of plummeting punctuation standards,” she was (perhaps unwittingly) joining an ancient tradition. How long, exactly, have shortsighted curmudgeons been bemoaning the poor grammar of the generations that follow theirs? According to Steven Pinker’s The Sense of Style, the answer is, like, forever: “Some of the clay tablets deciphered from ancient Sumerian include complaints about the deteriorating writing skills of the young.”
The notion of being taught language has always been oxymoronic because language is in a constant state of flux, a restless, malleable, impatient entity that, like the idea of now, can never be fixed in place. Take, for instance, the journey of the semicolon as chronicled in the delightful, enlightening new book by Cecelia Watson, Semicolon: The Past, Present, and Future of a Misunderstood Mark. The twisty history of the hybrid divider perfectly embodies the transience of language, the ways it can be shaped by cultural shifts that have nothing to do with correctness or clarity. Invented by the Italian humanist and font pioneer Aldus Manutius in the late 15th century, the semicolon was originally “meant to signify a pause of a length somewhere between that of the comma and that of the colon” (hence its design).
Other punctuation marks — such as the “punctus percontativus, or the rhetorical question mark, which was a mirror-image version of the question mark” — turned out to be passing fads, but the semicolon lasted, owing partly to its usefulness and partly to the trends of the day. For much of the early 1800s, usage of the parenthesis and the colon declined drastically. Two grammar guides of the time declared the parenthesis “nearly obsolete,” while another noted, “The COLON is now so seldom used by good writers that rules for its use are unnecessary.” As those marks waned, the semicolon waxed, flourishing to the point of overuse.
In the meantime, another trend developed, one with equally lasting consequences. In 1762, Robert Lowth’s influential A Short Introduction to English Grammar treated punctuation as “analogous to the rests in a piece of music”; the “comma thus was a pause shorter than the semicolon, and the semicolon was a pause shorter than the colon.” Though his book was a forerunner of what we call grammatical prescriptivism (he “corrected” writers like Shakespeare, Milton, and Swift), Lowth included no mandates for punctuation except for a vague deferral to “the judgment and taste of the writer.” But by 1926, with H.W. Fowler’s A Dictionary of Modern English Usage, suggestions had hardened into rules. Fowler’s prescription for the semicolon is more like the one we have now — as Watson puts it, “a means to delineate clauses properly.” No more music or prosody; now language was scientific. The meaning of the semicolon, like all linguistic conventions, has been twisted and distorted by evolving paradigms and trends, from a pause indicator to a clause connector.
The distinction between these two ways of looking at a mark may seem academic, but it matters. Watson’s use of the word properly signals the transition from “This is how we do things” to “This is how everyone should do things.” Which brings us to the ubiquitous and notorious The Elements of Style, a 1918 primer by William Strunk, which E.B. White padded out and republished in 1959. In one breath, Strunk & White tell you how to correctly use a parenthesis; in the next they warn against “abominations” like personalize, and in yet another they decree, “Prefer the standard to the offbeat.” Are they teaching the best ways to communicate effectively, or merely passing on the preferences of certain editors, writers, and linguists at a fixed point in time? And if language ceaselessly changes, can a grouping of informed suggestions remain useful? If, as I’m inclined to believe, they don’t help much at all, what can? How the hell can people improve their writing?
Let’s back up a bit: Why isn’t Strunk & White’s classic called The Elements of Grammar? For one, it dispenses with grammar in a total of nine pages.
For another, it arrived at the culmination of two centuries in which grammar and style had become synonymous — or, more accurately, had switched places. Grammar, in Lowth’s understanding, was style; since no Ur-grammar existed, even a book of so-called rules was understood to reflect the tastes of its author. But as guide after guide proliferated, and as academic consensus grew (or maybe shrank), the English language was systematized into a “logical” set of rubrics and procedures. By Fowler’s time, grammar had become Grammar, and style was what one did with it. Or should do with it: Where grammar and style were once considered to be sets of suggestions, both are now regarded as sets of commandments.
One explanation for the continued conflation of style and grammar — of advice and decree — is surprisingly simple and a little bit crass. There’s money in it. According to Watson in Semicolon, Lindley Murray’s 1797 work English Grammar “went into 24 editions, reprinted by 16 different American publishers,” and sold so successfully that between 1800 and 1840 Murray was “the best-selling producer of books in the world” (italics are Watson’s). Samuel Kirkham’s 1823 guide (basically an update of Murray’s) went into 110 (!) editions. In the 20th century, the guides by Fowler and Strunk & White became bibles for generations of writers, never going out of print. Patricia O’Connor’s 1996 instructional guide Woe Is I was recently published in its fourth edition. A 2004 Forbes article estimated that Lynne Truss would net $3 million from Eats, Shoots & Leaves before it even went into paperback. Dreyer’s English, by Random House copy chief Benjamin Dreyer, spent six weeks on this year’s New York Times best-seller list. Make no mistake: Grammar is a lucrative game.
The reason we pay happily for these manuals is straightforward, if a little sad. We’ve been convinced that we need them — that without them, we’d be lost. Readers aren’t drawn to in-depth arguments on punctuation and conjugation for the sheer fun of it; they’re sold on the promise of progress, of betterment. These books benefit from the dire misconception that they are for everyday people, when, in fact, they’re for editors and educators.
Take this year’s Dreyer’s English, whose jacket description reads in part, “We all write, all the time: blogs, books, emails. Lots and lots of emails. And we all want to write better.” Even if we accept the idea that we all (or most of us) want to become clearer and more interesting writers, is grammar truly the key to such improvements?
No, it’s not. And to prove it, let’s take a look at some of the more useful tidbits from a small sampling of guides. The most famous injunction from Strunk & White — “Omit needless words” — is, of course, a style suggestion. But it is good advice nonetheless, and a vigilance against superfluity can legitimately improve your writing. Dreyer implores us to cut back on what he calls “Wan Intensifiers and Throat Clearers”: very, rather, really, quite, in fact, etc. This too is practical wisdom. But Strunk & White’s specific instruction to, for instance, “use a dash to set off an abrupt break or interruption and to announce a long appositive or summary” will only help you avoid a minor error, since using a parenthesis instead won’t make your writing less clear. And although the lucidity of Dreyer’s explanation of em and en dashes obviously comes from hard-lived experience, how exactly is it going to help me articulate the murky thoughts in my head?
It turns out the most useful elements of grammar guides are the least measurable. It might be useful to know more about the subjunctive mood; it’s more fruitful to be made aware of bad habits, lazy tendencies, and hackneyed expressions — flaws of character that are much harder to correct than gaps in knowledge. And yet, because learning something new is more appealing than being reprimanded for our known faults, these books are not marketed as behavioral nudges. Instead, we have the subtitle to Dreyer’s English: “An Utterly Correct Guide to Clarity and Style.” (Sure, Dreyer’s being a bit wry, but I imagine the wit misses the target reader.) It’s the difference between saying “This is how you write better” and “This is how you write.”
From the perspective of a nonwriter or non-editor, the cumulative effect of these books is to suggest that writing requires either a top-tier education in literary English or the memorization of reams of rules. How could an ordinary person who’s interested in writing not feel intimidated by the pedigree of the authors? These grammarians are the very definition of erudite; if even they have to fill hundreds of pages deconstructing the nuances of the language in order to reach conclusions about their own preferences, where do the rest of us even start?
It’s enough to elicit flashbacks of being quizzed in front of the other students in an elementary-school class with the strangely imprecise designation Language Arts. A dizzying array of terms floods the memory: split infinitives, the accusative case, the pluperfect tense, misplaced modifiers, antecedents, predicates, nominatives. Out of this traumatic early experience, a deceptive chasm forms, whereby a preadolescent is taught to assume that every professional writer actually knows this shit and somehow employs the grammar and enforces the rules.
But we writers know this isn’t true. Even those scribes who regard grammar as a noble tradition learned the intricacies of English not via stuffy edicts but through living, reading, listening, talking, and, most important, writing. Any formalization of such learning came only afterward, once the foundations of grammatical convention had been osmotically absorbed. It’s like falling in love with someone: You can say you love them for x and y traits, but those reasons only articulated themselves after the feelings had formed. It doesn’t mean the explanations are false; it’s just that they didn’t create the love. There’s a reason that those who advocate for the idea of a living language are deemed “descriptivists.” (Though the dichotomy between them and prescriptivists, as popularized by David Foster Wallace’s essay “Authority and American Usage,” is overblown and oversimplified.)
When Watson was young, she was a grammar snob, but as soon as she began teaching college at age 23, she discovered that “rules, even when explained very carefully and consistently, didn’t seem to be a good way to teach students what they wanted to know, which was how to have control and mastery of the language.” Watson also notes that, as grammar guides proliferated, “a strange thing happened”: “instead of making people less confused about grammar, rule books seemed to cause more problems.”
A grammar guide will never make you a better writer. Nope. Not a little bit. It may, perhaps, help you avoid making what other people have deemed a mistake, but that’s about it. The very best of them may even inspire, but they won’t teach you good writing any more than learning how a basketball is made will help NBA players improve their game; the ball has to start rolling first. Style guides will never instill more beauty, more personality, more human verve into your writing than you can, provided you work at expressing yourself as clearly and freely as only you can. If you want to write well, then write constantly, read rapaciously, and use grammar not to make your writing better but to make it sound more like you. Your editor will do the rest, anyway.