A month ago, Robert Hartwell Fiske, editor and publisher of The Vocabula Review, very kindly sent me a link to a new article, “Making Peace in the Language Wars” by Bryan A. Garner. I told Fiske I was definitely going to write about it, and he must be thinking (if he remembers it at all) that I’m completely feckless. Well, I’m not (not completely, anyway); I am a procrastinator, but it’s mainly that the subject kept expanding in my mind and I wasn’t quite sure how to deal with it. Now, prodded by a recent discussion of who can be called a linguist and a NY Times article that won’t stay online free for long (and I’m afraid the Garner piece is now available only to subscribers—mea culpa!), I’m finally getting around to it. Warning: this entry will be long and full of ambivalence.
If Garner’s name sounds familiar, it may be because his book A Dictionary of Modern American Usage was the pretext for David Foster Wallace’s notorious Harper’s screed “Tense Present,” wherein he tried manfully to demolish the citadel of scientific linguistics using his patented arsenal of whimsy, faux-plebeian rhetoric, rambling footnotes, and willful distortion. (For more detail, see my own screed here.) Garner, like Wallace, wants to present himself as the honest broker, bringing both sides together in a ring-dance of reconciliation; in fact, both of them are dyed-in-the-wool prescriptivists whose contempt for science is continually breaking through. Here’s a representative passage from Garner:
In other words, the spirit of the day demands that you not think critically — or at least not think ill — of anyone else’s use of language. If you believe in good grammar and linguistic sensitivity, you’re the problem. And there is a large, powerful contingent in higher education today — larger and more powerful than ever before — trying to eradicate any thoughts about good and bad grammar, correct and incorrect word choices, effective and ineffective style.
Of course this “large, powerful contingent” consists of linguists and those inspired by them, and he names names:
Yet several linguists assert, essentially, that there is no right and wrong in language. Consider what one well-known linguist, Robert A. Hall, Jr., famously said: “There is no such thing as good and bad (or correct and incorrect, grammatical and ungrammatical, right and wrong) in language. … A dictionary or grammar is not as good an authority for your speech as the way you yourself speak.” Some of the better theorists in the mid-twentieth century rejected this extremism. Here, for example, is how Max Black responded:
“This extreme position … involves a confusion between investigating rules (or standards, norms) and prescribing or laying down such rules. Let us grant that a linguist, qua theoretical and dispassionate scientist, is not in the business of telling people how to talk; it by no means follows that the speakers he is studying are free from rules which ought to be recorded in any faithful and accurate report of their practices. A student of law is not a legislator; but it would be a gross fallacy to argue that therefore there can be no right or wrong in legal matters.”
But he’s not totally blinded by prejudice; he’s willing to admit the failings of the home team:
Describers have always tried to amass linguistic evidence — the more the better. Prescribers are often content to issue their opinions ex cathedra. In fact, inadequate consideration of linguistic evidence has traditionally been the prescribers’ greatest vulnerability.
Here’s his proposed division of labor:
Prescribers should be free to advocate a realistic level of linguistic tidiness — without being molested for it — even as the describers are free to describe the mess all around them. If the prescribers have moderate success, then the describers should simply describe those successes. Education entailing normative values has always been a part of literate society. Why should it suddenly stop merely because describers see this kind of education as meddling with natural forces?
Meanwhile, prescribers need to be realistic. They can’t expect perfection or permanence, and they must bow to universal usage. But when an expression is in transition — when only part of the population has adopted a new usage that seems genuinely undesirable — prescribers should be allowed, within reason, to stigmatize it. There’s no reason to tolerate wreckless driving in place of reckless driving. Or wasteband in place of waistband. Or corollary when misused for correlation. Multiply these things by 10,000, and you have an idea of what we’re dealing with. There are legitimate objections to the slippage based not just on widespread confusion but also on imprecision of thought, on the spread of linguistic uncertainty, on the etymological disembodiment of words, and on decaying standards generally.
As a matter of fact, I’m not entirely averse to such a division. It’s quite true that most linguists are not interested in “good usage” or competent to decide it, and I believe there is such a thing and it’s worth cultivating. Ideally, linguists would describe the attested facts of language and style experts would build on their evidence to make recommendations about which usages should be preferred.
The problem is that the mavens, the likes of Bryan A. Garner and David Foster Wallace, don’t really believe that linguists know what they’re talking about—don’t, in fact, understand what scientific linguistics is or how it works. This is the gaping hole in the division-of-labor idea. It is as if the people who drew up recommendations for healthy diet and lifestyle held doctors at arm’s length and accused them of not accepting the idea of right and wrong in diet. Ideally, students would be exposed to introductory linguistics classes at an early age so that they would have a basic grasp of language variety, language change, and the arbitrariness of the linguistic sign and would know how to distinguish valuable inheritances from invented myths. Instead, people swallow whatever some self-designated expert says, and we get nonsense like the alleged misuse of such. From Merriam-Webster’s Concise Dictionary of English Usage (valuable precisely because it carefully investigates the historical facts before making recommendations), s.v. “such”:
Back in the 18th and 19th centuries a few commentators managed to puzzle themselves about the word order in constructions like these:
…said that he never remembered such a severe winter as this —Jane Austen, letter, 17 Jan. 1809
…but such a dismal Sight I never saw —Daniel Defoe, Robinson Crusoe, 1719
They convinced themselves that such in this construction must be a misuse for so. They were wrong and nobody believes it is a misuse any more, but since the subject had been started, almost nobody was willing to forget it, which they should have. The 20th-century focus was on the use of such as an intensive, as in “He’s such a nice boy” and “She has such beautiful manners.” The assertion is that this use of such is informal and not to be used in formal writing… The tortured reasoning of the 18th- and 19th-century pundits was irrelevant, and the 20th-century concerns are unnecessary. You need not worry about adverbial such at all.
Even worse than deciding that a perfectly good usage is wrong is confusing people about words that, left to themselves, they have no problem using; again from the Concise Dictionary of English Usage, s.v. “between”: “Actually, the enormous amount of ink spilled in the explication of the subtleties of between and among has been largely a waste; it is difficult for a native speaker of English who is not distracted by irrelevant considerations to misuse the two words.”
So for the time being we must depend upon people with both linguistic training and a sense of style; this is a niche I try to fill here at Languagehat, and it is part of what linguist John McWhorter is trying to do with his new book Doing Our Own Thing: The Degradation of Language and Music and Why We Should, Like, Care, discussed in the NY Times article (by Emily Eakin) I mentioned earlier:
Mr. McWhorter, 38, a professor of linguistics at the University of California at Berkeley and a senior fellow at the Manhattan Institute, a policy research group in New York City, is hardly the first to complain about Americans’ brazen disregard for their native tongue. But unlike many others, he says the problem is not an epidemic of bad grammar.
As a linguist, he says, he knows that grammatical rules are arbitrary and that in casual conversation people have never abided by them. Rather, he argues, the fault lies with the collapse of the distinction between the written and the oral. Where formal, well-honed English was once de rigueur in public life, he argues, it has all but disappeared, supplanted by the indifferent cadences of speech and ultimately impairing our ability to think.
Now, I suspect that McWhorter is exaggerating, and I certainly deplore the tired “blame the ’60s” approach, but he has the background and the chops to make the case. If more deplorers were like him, I would be more inclined to pay attention.
As for the “who’s a linguist” question: with understanding of even the basic elements of linguistic science as rare as it is, I award the title to anyone who has a good grasp of them, just as we call anyone who knows how to use a telescope and a star chart and spends time at it an astronomer, regardless of their day job. This is, of course, a thoroughly self-serving definition, because it means that yes, Virginia, I am a linguist.
I will doubtless have more to say about these matters, but it’s late, I’m tired of writing, and you’re doubtless even more tired of reading me if you’ve made it this far. So let’s call it a day, and I’ll deal with any issues that may be brought up in the comments.