Subject-verb agreement is a deceptively easy grammar rule: a single subject gets a single verb, and a plural subject gets a plural verb. Yet deciding when a noun is singular and when it’s a plural is harder than it looks. In “10 Subject-Verb Agreement Rules,” I outlined 10 subcategories of subject-verb agreement, noting that Amy Einsohn lists 25 subject-verb agreement rules in her Copyeditor’s Handbook.
You’d think 25 rules would be enough.
The Media Problem
Amy recently shared with me a list of nine more rules she’s identified and didn’t have room for in the newest edition of The Copyeditor’s Handbook. This one jumped out at me:
Media. In Latin, media is a plural noun (the singular is medium). In English, media is sometimes treated as a singular noncount noun (“The mass media has ignored the issue”), sometimes as a plural noun (“The media have been complicit in ignoring the issues”).
Some of you are already shaking your heads. “Media is the plural of medium. It’s never a singular noun.” Certainly, that’s still the case if we’re talking about “an intervening substance through which something else is transmitted or carried on,” as American Heritage Dictionary, 5th edition (AHD5) defines it, such as photography, painting, and sculpture.
But that’s not the sense that English speakers are torn over. Rather, it’s the sense of “a means of mass communication, such as newspapers, magazines, radio, or television” (AHD5). No one disputes this meaning; the question is whether media in this sense should be singular or plural. And as usual, the answer depends on whom you ask.
Polling the Experts
AHD5 advises against treating media as a singular noun, noting that only 38% of its Usage Panel would do so. However, this is a change from the fourth edition, which did not allow media (meaning the news industry) to be used as a singular noun at all.
Merriam-Webster’s Collegiate Dictionary and Merriam-Webster’s Dictionary of English Usage (MWDEU) agree that using media as a singular noun would generally be regarded as an error, except perhaps in the advertising field, where the term seems to have originated.
However, several other dictionaries accept media as a singular noun. In addition, Garner’s Modern American Usage states that media, “as a shortened form of communications media,” is “increasingly used as a mass noun. While that usage still makes some squeamish, it must be accepted as standard.” Garner puts this change at stage 5 of his Language Change Index: “fully accepted.” He does draw the line at medias, however.
Usage in the Wild
I hate it when my favorite resources argue: how can I choose between AHD and Garner’s? Sometimes the answer is to see what other writers and editors have decided.
I searched for media has and media have in Google News and Google Books for the period January 1, 2000–October 18, 2011. Perhaps unsurprisingly, usage of the two constructions was almost evenly split:
| Source | media has | media have |
|--------|-----------|------------|
All of this brings us right back to Amy’s rule: sometimes media is a singular noun and sometimes it is a plural noun. It’s a word whose usage is in flux. Until it settles down—if it does—either usage is correct, depending on audience. MWDEU notes that many people still recognize media as plural in a way that they no longer do with data. Know your audience. Using media as a singular noun in advertising, news, business, and casual writing is acceptable. Using it in academic and other formal writing may not be.
MWDEU reminds us that “media and medium are English words, even if naturalized, and are no longer subject to the rules of Latin.” Given the evidence, I’d say media is finding its form within English rules.
Last week, a reader asked me to write a Word Story for prevaricate. Here you go, Mari.
The verb prevaricate means “to speak or act evasively; to avoid telling the truth; to lie.” Although a prevaricating person might tell an outright lie, more often he’s dancing around the topic, saying or doing something to put off that moment when he must either absolutely lie or tell the truth.
The term entered English in 1582 as a backformation of prevarication, according to Chambers Dictionary of Etymology. English speakers may have been influenced by the Latin praevāricāri, which Online Etymology Dictionary defines as “to make a sham accusation, deviate.” Prevaricate picked up its modern meaning of “to speak evasively” in John Donne’s Selections (published before 1631).
Perhaps because of this idea of evasiveness, prevaricate is sometimes confused with procrastinate. The latter term, meaning “to put off doing something, especially from habit; delay,” entered English in 1588, says Chambers, and is a likely backformation of procrastination. It comes from the Latin prōcrāstināre, “to put off until tomorrow; defer.”
Garner’s Modern American Usage puts the error of using prevaricate for procrastinate at stage 1, which is “a new form emerges as an innovation (or a dialectal form persists) among a small minority of the language community, perhaps displacing a traditional usage.” Scanning several databases, I found examples that used prevaricate in its accepted sense:
But when asked to tell the military if Taliban pass through, they prevaricate. “They tell us, ‘The Taliban will kill us if we do,’” he said. “They are not fence-sitters, these people,” Captain Reed said. —The New York Times (2010)
Interestingly, The Oxford English Dictionary lists “to procrastinate” as one definition of prevaricate, calling it the “usual sense.” It’s the only dictionary I checked of a dozen or so that gave such a definition. Avoid using prevaricate to mean procrastinate, as most English speakers will not share that meaning.
What word do you want to know more about? Send it to me!
In a LinkedIn Group recently, someone asked:
Which one of the two is the correct usage? “The people that knew about the ceremony” or “The people who knew about the ceremony”?
There was a bit of back and forth over whether that was an acceptable pronoun for people. I wrote about this question in the article “Who’s That,” explaining that that has been an acceptable pronoun for animate objects for over a thousand years.
One of the LinkedIn participants didn’t think that a thousand years of usage was a valid argument for correct grammar:
To claim that either is correct in formal English is absurd … I’m sure that “irregardless” has been in use for over a thousand years, but that doesn’t make it right! Perhaps we could start saying “the chair who stood in the corner”?
The fact is, though, that consistent, accepted usage for an extended period of time is exactly what makes something legitimate.
The point of language is to communicate. Shared meaning leads to successful communication. Speakers make language. We decide collectively on the meaning of words and on a shared grammar: the rules of how words can be put together in a sentence in a way that communicates meaning.
“The vocabulary and grammar that we use to communicate,” says Longman Grammar of Spoken and Written English, “are influenced by a number of factors, such as the reason for the communication, the context, the people with whom we are communicating, and whether we are speaking or writing. Taken together, these choices give rise to systematic patterns of use in English.”
So a decision made repeatedly by millions of people over a long period qualifies as a systematic pattern of use. And that use becomes a rule.
People love language and love to play with language. We make up words all the time. We move them around. We play with them. We change them. We discard them when they’re no longer useful or fun.
People have an innate ability to create and learn languages. We don’t need to wait to go to school to learn how to talk or how to use words. By age two, most of us are stringing together simple sentences. By the time school rolls around a year or three later, we’re masters of communication. So what are we learning in school? We’re (hopefully) learning the rules behind our communication skills to better control those skills and become more sophisticated at using language.
Why, then, do we disagree so much about language?
First, no one can accurately and thoroughly explain everything about language. The system is too complex, too much an innate part of us, for any one person to know it all. We misunderstand rules and pass them on, propagating the error. Over time, we’ve come to understand a lot better how language works and what’s happening when we speak or write. But we’re nowhere close to total understanding.
Then there’s variation in how we speak and write to different audiences. We speak one way to our parents, another way to our friends, another way to our bosses, and so on. We make style choices in our speech and writing to appeal to certain audiences. That doesn’t make any one way of speaking wrong if communication is happening. It does make some styles of communicating unacceptable to certain groups. Some people don’t understand these different registers of language; to them, only one style is acceptable. There are no nuances in their rules.
A third reason we disagree is that language is alive; it changes with its speakers. Each generation molds language to fit its purposes and its experiences. In this digital age, in which we are connected with and influenced by more people than ever before, language changes even faster. Frankly, some people stopped learning about language in elementary school or high school or even college. They missed the lesson that taught about language changes. They aren’t keeping up with the changes and flatly refuse to accept the evidence that their eyes and ears give them. They learned it one way, and that’s the only way it can be. Period.
We language speakers are the ones who make the rules. The experts and resources can only do one of two things: describe what they see or prescribe their preferences. We can choose to follow those preferences or not. Common usage comes from a significant portion of the language community accepting a shared meaning and a shared usage.
By the way, irregardless has been around since at least 1912, yet most English speakers still regard it as nonstandard. It isn’t just longevity that counts but also acceptance by a significant number of language users.
I started the Word Stories series because of a new project I’m working on. The project has me writing definitions for over a thousand words. It’s great to dig into so many words, but I only get to use maybe eight words per term defined. That’s hardly enough for a basic definition, never mind nuances and stories. Here, then, is where I will dive deeper into a project word that has captured my attention.
A cacophony is “a collection of loud, harsh sounds heard all at once.” It’s disharmony. Discord. Dissonance. Noise.
Chambers Dictionary of Etymology notes that cacophony entered English in 1656, and Online Etymology Dictionary and The Oxford English Dictionary largely concur (it’s great when we can all get along, isn’t it?). It comes to us from the Greek kakóphōnos, meaning “ill-sounding,” and it might have been influenced by the French cacophonie.
Two of the first printed uses of cacophony, as recorded by The OED, are:
Cacophony, an ill, harsh, or unpleasing sound, (in words) a vitious utterance or pronunciation. —Glossographia (1656)
Alter rhymes, and grammar, and triplets, and cacophonies of all kinds. —Alexander Pope in a letter to Jonathan Swift (1733)
Glossographia was written by one Thomas Blount. Interestingly, Blount’s descendant, Roy Blount Jr., has written a couple of glossaries of his own, Alphabet Juice and Alphabetter Juice. In Alphabet Juice, he introduces the idea of sonicky:
The quality of a word whose sound doesn’t imitate sound, like boom or poof, but does somehow sensuously evoke the essence of the word: queasy or rickety or zest or sluggish or vim.
I’d vote cacophony a sonicky word. Those c sounds are hard on the ears and are emphasized by their proximity to each other. Loud, harsh sounds, indeed.
The Corpus of Contemporary American English shows that today, we use cacophony as we always have, in relation to voices, language, music, and general background noise.
You are creating a cacophony in which it is impossible to hear your own voice, whether it’s yourself you’re thinking about or anything else. —American Scholar (Spring 2010)
Through a small speaker emerges the symphonic cacophony I couldn’t hear when I was in the water: a rain-forest chatter of squeaks, clicks and trills. —Popular Mechanics (February 2011)
We may be using cacophony the way we always have, but we’re using it more often.
Maybe its popularity is growing because our mechanized world is so loud, or maybe it’s because of all the noise we’re making online. Either way, despite its age, cacophony strikes me as a very apt term for the digital world.
What word do you want to know more about? Email it to me!
Does the following sentence from Adweek bother you?
It is tempting—as well as, in liberal circles, heretical—to try and separate Roger Ailes from his politics.
Some language pedants will be immediately drawn to try and and will insist that it should be try to. But should it be? Let’s take a look.
Try And’s History
The Oxford English Dictionary (OED) and Online Etymology Dictionary both date try as in “to attempt, to test” to the 14th century. The OED’s first reference for try and dates to 1686:
They try and express their love to God by their thankfulness to him. —The history of monastical conventions and military institutions (1686)
Right away, we can see we’re not dealing with a new concept. Try and has been in common usage for over 300 years. This Google Ngram shows that although try to has been more common than try and, the difference was fairly consistent until the early 19th century.
After 1820, usage of try and remained stable, but that of try to shot way up. According to Merriam-Webster’s Dictionary of English Usage (MWDEU), try and started to be criticized as ungrammatical at that time. Never mind that by then, the usage had been in circulation for over 100 years and similar constructions (e.g., go and) had been around since the 13th century. Clearly, people started believing it to be ungrammatical, yet try and continued to be used to some extent.
What the Usage Experts Say
By the early 20th century, usage experts were advising that try and was a legitimate, if casual or idiomatic, construction. In 1926, H. W. Fowler wrote in his Dictionary of Modern English Usage that try and “is an idiom that should be not discountenanced, but used when it comes natural.” In 1957, Bergen Evans and Cornelia Evans wrote in A Dictionary of Contemporary American Usage that try and is “standard English.”
While R. W. Burchfield, in The New Fowler’s Modern English Usage (1996), is undecided about try and’s legitimacy, he notes many uses of it in literature. Finally, Bryan Garner (2009) puts try and at stage 4 of his Language Change Index: “The form becomes virtually universal but is opposed on cogent grounds by a few linguistic stalwarts (die-hard snoots).”
What We Really Say
The experts can tell us to use one construction over another all they want. The question is whether we listen.
| Source | try and | try to |
|--------|---------|--------|
| Corpus of Contemporary American English (COCA) | 2,341 | 38,650 |
| Google Books (1990s) | 70,321 | 1.3 million |
| Google Books (2000s) | 125,545 | 1.9 million |
| Google News | 436 million | 900 million |
All of the results in the table above are from written modern texts (spoken texts were filtered out of COCA), and all but the Google results are from edited texts. Clearly we are still using try and, but we continue to use try to more often. Surprisingly, the gap between the two is smallest within the Google News results. Newspapers tend to be written less formally to appeal to a mass audience, and Google News results also link to many, many web pages that were not written by professional writers, let alone edited. Still, the results seem to indicate that we use try to in our most formal writing, while try and is very common in less formal situations, which backs up what Garner and MWDEU tell us.
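At bottom, the corpus comparisons above amount to counting whole-phrase matches and comparing the tallies. As a rough, hypothetical illustration (this is not how Google's or COCA's search tools work internally), here's a minimal Python sketch that tallies the two constructions in any plain text you feed it:

```python
import re
from collections import Counter

def tally_constructions(text, phrases=("try and", "try to")):
    """Count case-insensitive, whole-word occurrences of each phrase."""
    counts = Counter()
    for phrase in phrases:
        # \b anchors keep us from matching inside longer words.
        pattern = r"\b" + re.escape(phrase) + r"\b"
        counts[phrase] = len(re.findall(pattern, text, flags=re.IGNORECASE))
    return counts

sample = ("We try to keep up, but editors try and fail to agree. "
          "Try to listen; try and see.")
print(tally_constructions(sample))
```

Run against a real corpus file, the same function would give you the raw counts a table like the one above summarizes; the interesting editorial question is never the counts themselves but which registers each construction dominates.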
Try And’s Informality
Try and is neither a new construction nor an ungrammatical one. It’s been consistently used and accepted for centuries. However, it is typically used in casual writing, intimate writing, and speech rather than in formal writing. Before using try and, think about your text’s tone and style and your audience. If you’re writing something formal, such as an academic paper, stick with try to. However, if you’re writing something less formal, such as a magazine article, go ahead and use try and.
What other usage questions do you have? Send them to me!
Update: Thanks to reader KK for pointing out typos in this post. They’ve been corrected. Even editors need editors.
A Copyediting reader recently asked me about the difference between testimony and testament. Although both terms relate to evidence, testimony specifically refers to evidence from a witness, while testament is “tangible proof or evidence,” according to American Heritage Dictionary. The terms share a common, if distant, root that reveals their relationship.
Testament entered English in 1290 from the Latin testāmentum, “a will,” which is from the Latin testārī, “to make a will,” according to the Chambers Dictionary of Etymology. Testārī, Chambers continues, comes from the Latin testis, “witness.” The idea is that the testament, the will, was a witness in a lawsuit to the deceased’s wishes.
Testimony entered English before 1382, referring to the Ten Commandments, as used in the Wycliffe Bible. It wasn’t until about 1425, says Chambers, that testimony picked up its legal meaning from a borrowing of the Old French testimonie and the Latin testimōnium, both meaning “evidence, proof.” Now, watch what happens when we pull apart testimony’s Latin ancestor.
Testimōnium, says Online Etymology, is from testis, “witness,” and -monium, a suffix that means an “action, state, condition.” The testi- hiding in both testament and testimony ultimately comes from the Proto-Indo-European (PIE) *tris-, “three,” as in the third party who gives witness.
From three to witness to evidence, it’s all relative.
I’ve reviewed several serious reference books in this blog. But language books don’t all have to be serious to be worth reading. Sometimes they can be fun and entertaining—and still teach you something. Here are three books that fit that description.
Roy Blount (pronounced blunt) Jr. is the author of many books on various topics, a frequent panelist on Wait, Wait … Don’t Tell Me!, and a member of the American Heritage Dictionary’s usage panel. Alphabetter Juice is a follow-up to Alphabet Juice, both humorous works of word stories.
Blount espouses the concept of sonicky, a word he created from “sonic (evocative of sound) and kinesthetic (evocative of body movement).” Sonicky words, “the most expressive English words,” he writes, “engage the ear, the vocal apparatus, and by implication other parts of the body: call me suggestible, but I can feel hump, which first appeared in English as part of humpback, in my upper back and shoulders.” It’s an interesting concept, one that leads Blount to look at all manner of words, including ad hominy, gag, and knickknack.
The entries are arranged alphabetically (more or less), so you can jump in and out any way you’d like, making it a great read when you don’t have much time to read. Read a couple entries while your computer boots up. Read a couple more while the cupcakes bake. Read a couple of pages just before bed. (Now you know how I get my reading done. Cupcakes, anyone?)
One note: I read this book on my Kindle. I love the advantages of an e-reader, but Blount’s book deserved better treatment than it got. There were many odd line breaks and some typos, neither of which exist in the print edition. I’ve read other e-editions that don’t have this problem, so I know this isn’t a universal problem. Whoever prepped the book for e-readers needs to hire a proofreader (I’m available). Do yourself a favor with this particular book: read it in print.
On the Dot
On the Dot by Alexander Humez and Nicholas Humez is ostensibly about that mark we use for periods. Really, though, chapters just start with the different uses of the dot, such as in Morse Code, as bullets, with musical notes, and as part of several punctuation marks. The discussion then goes wherever the words lead the authors. You might get a brief history of Morse Code and ciphers, such as the cipher George Washington used, or an explanation of bullet lists followed by an entertaining history of the word bullet to mean the projectile that comes from a firearm.
The brothers Humez know their stuff, too. Writers in their own right independently, they’ve also coauthored several books on language, including Latin for People and Alpha to Omega. Their writing is rhythmic and witty, their topic light and entertaining. On the Dot is a fun read for a dreary day.
How to Read a Word
Historical lexicographer Elizabeth Knowles knows words and how to build “word files.” She’s worked on the Oxford English Dictionary Supplement and the Shorter Oxford English Dictionary. And she knows how dictionaries work.
Her book How to Read a Word introduces readers to the art of creating word files and, thus, dictionaries. Knowles walks us through the process, sharing plenty of word stories along the way. You can read the stories behind wordhoard, Twitterati, vocabulary, and more.
Knowles encourages readers to build their own word files. In the introduction, she writes:
Encountering an unfamiliar word or phrase … is a provocation to find out more. … There has probably never been a time when someone who wishes to explore words has had richer resources to hand. I hope that How to Read a Word will offer its readers a chance to make full use of what is now available to us all.
The book is smartly written. A little more formal than either On the Dot or Alphabetter Juice, it is nevertheless accessible to general readers and is a wealth of information for word lovers.
Don’t forget to vote for The Writing Resource in Grammar.net’s Best Grammar Blog of 2011 contest!
For quite a while in this blog, I’ve been writing Vocab Builders to help readers learn new words. Most Vocab Builders included four or five words, their definitions, and sample sentences found in the wild. Studying words in this manner is one way to improve your vocabulary.
Lately, though, I’ve been drawn to learning the story behind a word to learn the word. Where has the word been? What was its original meaning and how was it used? This column will give you the story behind the word.
Adverse
This adjective means “unfavorable or harmful,” as with an adverse reaction. It also means “acting in opposition.” Chambers Dictionary of Etymology says that adverse first appeared in Geoffrey Chaucer’s Troilus and Criseyde (spelling modernized):
Then said he thus, fulfilled of high disdain,
“O cruel love, and thou, Fortune adverse,
This all and some, that falsely have thee slain
Criseyde, and sin thee may do me no worse,
Fie on your might and works so diverse!”
It’s thought that Chaucer borrowed adverse from the Old French avers, meaning “unfriendly” or “contrary,” according to Online Etymology. Avers itself descends from the Latin adversus, “turned against” or “turned toward.”
These days, you’re likely to find adverse paired with effects, reactions, or impact, among other nouns.
Not cryogenics, just a carefully maintained cold period above freezing, but low enough to slow metabolism almost to a standstill, prevent cellular damage, other adverse effects. —The Magazine of Fantasy and Science Fiction, Jan-Feb 2011
That’s especially important if you have kidney or liver problems, since both ailments multiply the risk of adverse drug reactions. —Consumer Reports, July 2008
If the costs to society of an action by the agency exceed the benefits, that situation has no immediate adverse impact on the agency. —USA Today Magazine, November 2008
What word stories are you interested in hearing? Let me know in the comments section below.
Voting has begun over at Grammar.net for the Best Grammar Blog of 2011. Please vote now for The Writing Resource. Thanks!
Tomorrow is National Punctuation Day. Jeff Rubin, creator of the holiday, offers lots of ways to celebrate the day on his site. In addition, you might view punctuation videos, like this one from the original Electric Company:
If the new Electric Company is more your style, LL Cool J has a song for you, too:
For the more serious among you, check out my post on Copyediting for punctuation reading, plus a new contest.
Recently on Twitter, fellow copyeditor CopyCurmudgeon offered this advice to writers and editors:
Tip: Put down Strunk and White and slowly back away. Then forget everything they told you.
CopyCurmudgeon then links to an article by Geoffrey Pullum, a well-known and well-respected linguist, on what’s wrong with Strunk and White’s famous little book. Writes Pullum, “Its advice ranges from limp platitudes to inconsistent nonsense. Its enormous influence has not improved American students’ grasp of English grammar; it has significantly degraded it.” He then details what he dislikes about the book. What he likes about the book he labels, quoting Douglas Adams, “mostly harmless.”
The real problem is that many people (including some writers and editors) suffer under the illusion that if you can speak you can write. Not so. Language is complicated, and we’re nowhere near understanding it completely. Writing is hard work. Trying to wrestle an entire discipline that concerns something we don’t understand perfectly and that constantly changes into short, pithy rules is bound to fail. Sure, you can create some rules, but they’ll never be comprehensive or complex enough. Even in this blog, where I can write as much as I want, I can’t write enough to cover something completely. Pullum’s grammar book is almost 2,000 pages!
I’ve read of teachers defending Strunk & White because it’s easy for students to follow. But if the content is misleading or downright wrong, what are they following? It’s like telling students not to start a sentence with because. There’s more to the story than that. Starting a sentence with because can sometimes create incomplete sentences. But it’s actually OK to start a sentence with because if (1) it’s part of a complete sentence or (2) you are intentionally writing an incomplete sentence and you don’t overdo the incomplete sentences.
But instead of teaching a more comprehensive point, many simply teach “don’t start a sentence with because.” Maybe that works for middle-school students, but even by high school, students should be learning more than that. And in a book for college-aged and adult writers, writing such a simplistic rule is irresponsible.
This has long been a complaint of mine: many grammar and writing books written for adults simplify the rules too much. People follow the rules slavishly, but they miss the nuances. As a result, they write poorly but think they write well.
Writing is not easy. It’s a skill and an art. You need some aptitude and lots of training, as well as editors and readers to help you see the writing as separate from the meaning. That perspective helps you correct the writing so the meaning comes through more clearly.
Given that I’d steer writers and editors away from Strunk & White, what would I recommend? Here are a few books that are readable and knowledgeable but that don’t skimp on the meaning:
- Woe Is I
- Lapsing Into a Comma
- The Elephants of Style
- Style: 10 Lessons in Clarity and Grace
- The Oxford Essential Guide to Writing
- Write Right (if you’re doing business writing)
- The Glamour of Grammar
What’s your favorite writing or grammar advice book? Share it in the comments section below.