Posted on October 27, 2011 Posted by Bertha Perez


Word Story: Bludgeon

by Erin Brenner on October 27, 2011

This week’s word story is perfect for writers looking for a synonym for club that isn’t overused.


Bludgeon can be a noun or a verb. As a noun it means “a heavy, short club that is thicker at one end or is weighted at one end.” Think of the clichéd caveman’s club, and you’ve got the right idea.

As a verb, bludgeon means “to hit someone or something with or as with a heavy club.” As in:

A man who tried to bludgeon his neighbour to death with a claw hammer has been jailed for 18 years. —Essex Echo (2011)

The noun form appeared first, in 1730, in Nathan Bailey's Dictionarium Britannicum. No one seems to know where Bailey picked it up. The Oxford English Dictionary gives us the reference:

Bludgeon, an oaken stick or club.

The verb form came later, in 1868. By 1888, says Chambers Etymology, a figurative sense appeared: “to bully or threaten.”

One of the Rangers’ surprising postseason heroes continued to bludgeon opposing pitchers, delivering a key three-run home run in the sixth inning. —Shreveport Times (2011)


Unfortunately, no one knows where bludgeon came from. A couple of other words have the same -udgeon stem and are also of unknown origin: curmudgeon, “someone who is cranky, stubborn, resentful,” and dudgeon, which Michael Quinion of World Wide Words defines as “a state of anger, resentment, or offence.”

Are they all from the same source? They could be. Perhaps a curmudgeon, in a dudgeon, would use a bludgeon to quell his anger.

Current Usage

Perhaps it’s because we’re a kinder, gentler people (more like it’s because we have more efficient weapons), but bludgeon’s popularity seems to be on the wane. A search in the Corpus of Contemporary American English produces only 124 hits (among 424 million words), a good chunk of them from transcripts.

A Google Ngram comparison shows that bludgeon is also appearing less often in books.

But I like bludgeon. It’s got that sonicky quality that Roy Blount writes about. It starts with a small effort (bl-), as when one raises a bludgeon. Then it gets forceful in the middle (-dge-), when the impact of such a blow hits a person. It ends on a downbeat, with the -on almost getting swallowed up. Rather like being bludgeoned. After the initial impact, you probably aren’t aware if the club has been lifted for another blow or not.

How would you use bludgeon in your writing?



Posted on October 27, 2011 Posted by Bertha Perez


Voice, Accountability, and a Quiz

Over at Copyediting, I write a daily News Roundup. Each roundup is a collection of links that copyeditors would find useful. But I read a lot of great stuff every day that writers would find helpful, too. Forthwith, a collection of links to tips, tricks, and advice just for writers.



About Erin Brenner

With a BA and an MA in English, Erin has been an editing professional for 15 years, working on a variety of media, especially online. Her niche is business/marketing and online. In addition, she has experience teaching editing to non-editors and coaching writers. In 2008, Erin was bitten by the social media bug…hard. Follow her on Twitter, @ebrenner, and get a daily vocabulary word, a link to the article of the day, and much more. You can also find her on Facebook and LinkedIn.

This entry was posted in Writing Tips. Bookmark the permalink.


Posted on October 26, 2011 Posted by Bertha Perez


Grammar Bite: Making Media Agree

Subject-verb agreement is a deceptively easy grammar rule: a singular subject gets a singular verb, and a plural subject gets a plural verb. Yet deciding when a noun is singular and when it's plural is harder than it looks. In “10 Subject-Verb Agreement Rules,” I outlined 10 subcategories of subject-verb agreement, noting that Amy Einsohn lists 25 subject-verb agreement rules in her Copyeditor’s Handbook.

You’d think 25 rules would be enough.

The Media Problem

Amy recently shared with me a list of nine more rules she has identified but didn't have room for in the newest edition of The Copyeditor’s Handbook. This one jumped out at me:

Media. In Latin, media is a plural noun (the singular is medium). In English, media is sometimes treated as a singular noncount noun (“The mass media has ignored the issue”), sometimes as a plural noun (“The media have been complicit in ignoring the issues”).

Some of you are already shaking your heads. “Media is the plural of medium. It’s never a singular noun.” Certainly, that’s still the case if we’re talking about “an intervening substance through which something else is transmitted or carried on,” as American Heritage Dictionary, 5th edition (AHD5) defines it, such as photography, painting, and sculpture.

But that's not the sense that English speakers are torn over. Rather, it's the sense of “a means of mass communication, such as newspapers, magazines, radio, or television” (AHD5). No one denies this meaning; the question is whether media in this sense should be singular or plural. And as usual, the answer depends on whom you ask.

Polling the Experts

AHD5 advises against treating media as a singular noun, noting that only 38% of its Usage Panel would do so. Even so, this is a change from the fourth edition, which did not allow media (meaning the news industry) to be used as a singular noun at all.

Merriam-Webster’s Collegiate Dictionary and Merriam-Webster’s Dictionary of English Usage agree that using media as a singular noun would generally be regarded as an error, except perhaps in the advertising field where the term seems to have originated.

However, several other dictionaries accept media as a singular noun. In addition, Garner’s Modern American Usage states that media, “as a shortened form of communications media—is increasingly used as a mass noun. While that usage still makes some squeamish, it must be accepted as standard.” Garner puts this change at stage 5 of his Language Change Index: “fully accepted.” He does draw the line at medias, however.

Usage in the Wild

I hate when my favorite resources argue: how can I choose between AHD and Garner’s? Sometimes the answer is to see what other writers and editors have decided.

I searched for media has and media have in Google News and Google Books for the period January 1, 2000–October 18, 2011. Perhaps unsurprisingly, usage of the two constructions was almost evenly split:

Source          media has   media have
Google News     45,500      48,400
Google Books    70,800      102,000
Total           116,300     150,400
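To see just how even the split is, here is a quick back-of-the-envelope tally in Python, with the counts transcribed from the table above:

```python
# "media has" vs. "media have" hits, January 1, 2000-October 18, 2011,
# as transcribed from the table above.
counts = {
    "Google News":  {"media has": 45_500, "media have": 48_400},
    "Google Books": {"media has": 70_800, "media have": 102_000},
}

# Sum each construction across both sources.
totals = {
    phrase: sum(source[phrase] for source in counts.values())
    for phrase in ("media has", "media have")
}

grand_total = sum(totals.values())
for phrase, n in totals.items():
    print(f"{phrase}: {n:,} ({n / grand_total:.1%})")
# media has: 116,300 (43.6%)
# media have: 150,400 (56.4%)
```

A 44/56 split is close enough that neither construction can be called rare, which is the point: usage is genuinely divided.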


All of this brings us right back to Amy’s rule: sometimes media is a singular noun and sometimes it is a plural noun. It’s a word whose usage is in flux. Until it settles down—if it does—either usage is correct, depending on audience. MWDEU notes that many people still recognize media as plural in a way that they no longer do with data. Know your audience. Using media as a singular noun in advertising, news, business, and casual writing is acceptable. Using it in academic and other formal writing may not be.

MWDEU reminds us that “media and medium are English words, even if naturalized, and are no longer subject to the rules of Latin.” Given the evidence, I’d say media is finding its form within English rules.




This entry was posted in Grammar Bites. Bookmark the permalink.


Posted on October 21, 2011 Posted by Bertha Perez


Word Story: Prevaricate

Last week, a reader asked me to write a Word Story for prevaricate. Here you go, Mari.


The verb prevaricate means “to speak or act evasively; to avoid telling the truth; to lie.” Although a prevaricating person might tell an outright lie, more often he’s dancing around the topic, saying or doing something to put off that moment when he must either absolutely lie or tell the truth.

The term entered English in 1582 as a backformation of prevarication, according to Chambers Dictionary of Etymology. English speakers may have been influenced by the Latin praevāricāri, which the Online Etymology Dictionary defines as “to make a sham accusation, deviate.” Prevaricate picked up its modern meaning of “to speak evasively” in John Donne's Selections (published before 1631).

Perhaps because of this idea of evasiveness, prevaricate is sometimes confused with procrastinate. The latter term, meaning “to put off doing something, especially from habit; delay,” entered English in 1588, says Chambers, and is a likely backformation of procrastination. It comes from the Latin prōcrāstināre, “to put off until tomorrow; defer.”

Garner’s Modern American Usage puts the error of using prevaricate for procrastinate at stage 1, which is “a new form emerges as an innovation (or a dialectal form persists) among a small minority of the language community, perhaps displacing a traditional usage.” Scanning several databases, I found examples that used prevaricate in its accepted sense:

But when asked to tell the military if Taliban pass through, they prevaricate. “They tell us, ‘The Taliban will kill us if we do,’” he said. “They are not fence-sitters, these people,” Captain Reed said. —The New York Times (2010)

Interestingly, The Oxford English Dictionary lists “to procrastinate” as one definition of prevaricate, calling it the “usual sense.” Of the dozen or so dictionaries I checked, it's the only one that gives such a definition. Avoid using prevaricate to mean procrastinate, as most English speakers will not share that meaning.

What word do you want to know more about? Send it to me!




This entry was posted in Word Stories. Bookmark the permalink.




Language by the People

by Erin Brenner on October 14, 2011

In a LinkedIn Group recently, someone asked:

Which one of the two is the correct usage? “The people that knew about the ceremony” or “The people who knew about the ceremony”?

There was a bit of back and forth over whether that was an acceptable pronoun for people. I wrote about this question in the article “Who’s That,” explaining that that has been an acceptable pronoun for animate referents for over a thousand years.

One of the LinkedIn participants didn’t think that a thousand years of usage was a valid argument for correct grammar:

To claim that either is correct in formal English is absurd … I’m sure that “irregardless” has been in use for over a thousand years, but that doesn’t make it right! Perhaps we could start saying “the chair who stood in the corner”?

The fact is, though, that consistent, accepted usage for an extended period of time is exactly what makes something legitimate.

The point of language is to communicate. Shared meaning leads to successful communication. Speakers make language. We decide collectively on the meaning of words and on a shared grammar: the rules of how words can be put together in a sentence in a way that communicates meaning.

“The vocabulary and grammar that we use to communicate,” says Longman Grammar of Spoken and Written English, “are influenced by a number of factors, such as the reason for the communication, the context, the people with whom we are communicating, and whether we are speaking or writing. Taken together, these choices give rise to systematic patterns of use in English.”

So a decision made repeatedly by millions of people over a long period qualifies as a systematic pattern of use. And that use becomes a rule.

People love language and love to play with language. We make up words all the time. We move them around. We play with them. We change them. We discard them when they’re no longer useful or fun.

People have an innate ability to create and learn languages. We don’t need to wait to go to school to learn how to talk or how to use words. By age two, most of us are stringing together simple sentences. By the time school rolls around a year or three later, we’re masters of communication. So what are we learning in school? We’re (hopefully) learning the rules behind our communication skills to better control those skills and become more sophisticated at using language.

Why, then, do we disagree so much about language?

First, no one can accurately and thoroughly explain everything about language. The system is too complex, too much an innate part of us, for any one person to know it all. We misunderstand rules and pass them on, propagating the error. Over time, we’ve come to understand a lot better how language works and what’s happening when we speak or write. But we’re nowhere close to total understanding.

Then there’s variation in how we speak and write to different audiences. We speak one way to our parents, another way to our friends, another way to our bosses, and so on. We make style choices in our speech and writing to appeal to certain audiences. That doesn’t make any one way of speaking wrong if communication is happening. It does make some styles of communicating unacceptable to certain groups. Some people don’t understand these different registers of language; to them, only one style is acceptable. There are no nuances in their rules.

A third reason we disagree is that language is alive; it changes with its speakers. Each generation molds language to fit its purposes and its experiences. In this digital age, in which we are connected with and influenced by more people than ever before, language changes even faster. Frankly, some people stopped learning about language in elementary school or high school or even college. They missed the lesson that taught about language changes. They aren’t keeping up with the changes and flatly refuse to accept the evidence that their eyes and ears give them. They learned it one way, and that’s the only way it can be. Period.

We language speakers are the ones who make the rules. The experts and resources can only do one of two things: describe what they see or prescribe their preferences. We can choose to follow those preferences or not. Common usage comes from a significant portion of the language community accepting a shared meaning and a shared usage.

By the way, irregardless has been around since at least 1912, yet most English speakers still regard it as nonstandard. It isn't just longevity that counts but also acceptance by a significant number of language users.





A Word for the Digital World

by Erin Brenner on October 13, 2011

I started the Word Stories series because of a new project I'm working on. The project has me writing definitions for over a thousand words. It's great to dig into so many words, but I only get to use maybe eight words per term defined. That's hardly enough for a basic definition, never mind nuances and stories. Here, then, is where I will dive deeper into a project word that has captured my attention.


A cacophony is “a collection of loud, harsh sounds heard all at once.” It’s disharmony. Discord. Dissonance. Noise.

Chambers Dictionary of Etymology notes that cacophony entered English in 1656, and Online Etymology Dictionary and The Oxford English Dictionary largely concur (it’s great when we can all get along, isn’t it?). It comes to us from the Greek kakóphōnos, meaning “ill-sounding,” and it might have been influenced by the French cacophonie.

Two of the first printed uses of cacophony, as recorded by The OED, are:

Cacophony, an ill, harsh, or unpleasing sound, (in words) a vitious utterance or pronunciation. —Glossographia (1656)

Alter rhymes, and grammar, and triplets, and cacophonies of all kinds. —Alexander Pope in a letter to Jonathan Swift (1733)

Glossographia was written by one Thomas Blount. Interestingly, Blount's descendant, Roy Blount Jr., has written a couple of glossaries of his own, Alphabet Juice and Alphabetter Juice. In Alphabet Juice, he introduces the idea of sonicky:

The quality of a word whose sound doesn’t imitate sound, like boom or poof, but does somehow sensuously evoke the essence of the word: queasy or rickety or zest or sluggish or vim.

I’d vote cacophony a sonicky word. Those c sounds are hard on the ears and are emphasized by their proximity to each other. Loud, harsh sounds, indeed.

The Corpus of Contemporary American English shows that today, we use cacophony as we always have, in relation to voices, language, music, and general background noise.

You are creating a cacophony in which it is impossible to hear your own voice, whether it’s yourself you’re thinking about or anything else. —American Scholar (Spring 2010)

Through a small speaker emerges the symphonic cacophony I couldn’t hear when I was in the water: a rain-forest chatter of squeaks, clicks and trills. —Popular Mechanics (February 2011)

We may be using cacophony the way we always have, but we’re using it more often.

Maybe its popularity is growing because our mechanized world is so loud, or maybe it’s because of all the noise we’re making online. Either way, despite its age, cacophony strikes me as a very apt term for the digital world.

What word do you want to know more about? Email it to me!





Try and Understand

by Erin Brenner on October 6, 2011

Does the following sentence from Adweek bother you?

It is tempting—as well as, in liberal circles, heretical—to try and separate Roger Ailes from his politics.

Some language pedants will be immediately drawn to try and and will insist that it should be try to. But should it be? Let’s take a look.

Try And’s History

The Oxford English Dictionary (OED) and Online Etymology Dictionary both date try as in “to attempt, to test” to the 14th century. The OED’s first reference for try and dates to 1686:

They try and express their love to God by their thankfulness to him. —The history of monastical conventions and military institutions (1686)

Right away, we can see we're not dealing with a new concept. Try and has been in common usage for over 300 years. A Google Ngram comparison shows that although try to has been more common than try and, the difference was fairly consistent until the early 19th century.

After 1820, usage of try and remained stable, but that of try to shot way up. According to Merriam-Webster’s Dictionary of English Usage (MWDEU), try and started to be criticized as ungrammatical at that time. Never mind that by then, the usage had been in circulation for over 100 years and similar constructions (e.g., go and) had been around since the 13th century. Clearly, people started believing it to be ungrammatical, yet try and continued to be used to some extent.

What the Usage Experts Say

By the early 20th century, usage experts were advising that try and was a legitimate, if casual or idiomatic, construction. In 1926, H. W. Fowler wrote in his Dictionary of Modern English Usage that try and “is an idiom that should be not discountenanced, but used when it comes natural.” In 1957, Bergen Evans and Cornelia Evans wrote in A Dictionary of Contemporary American Usage that try and is “standard English.”

While R.W. Burchfield, in The New Fowler’s Modern English Usage (1996), is undecided about try and’s legitimacy, he notes many uses of it in literature. Finally, Bryan Garner (2009) puts try and at stage 4 of his Language Change Index: “The form becomes virtually universal but is opposed on cogent grounds by a few linguistic stalwarts (die-hard snoots).”

What We Really Say

The experts can tell us to use one construction over another all they want. The question is whether we listen.

Source                                           try and      try to
Corpus of Contemporary American English (COCA)   2,341        38,650
Time magazine                                    243          8,549
Google Books (1990s)                             70,321       1.3 million
Google Books (2000s)                             125,545      1.9 million
Google News                                      35,100       55,000
Google                                           436 million  900 million

All results in the table above are from written modern texts (spoken texts were filtered out of COCA), and all but the Google results are from edited texts. Clearly we are still using try and, but we continue to use try to more often. Surprisingly, the gap between the two is smallest in the Google News results, even though raw Google results include many, many web pages that were never touched by a professional writer, let alone an editor. Newspapers tend to be written less formally to appeal to a mass audience, which may explain the narrow gap there. Either way, the results indicate that we use try to in our most formal writing, while try and is very common in less formal situations, which backs up what Garner and MWDEU tell us.
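The pattern is easier to see as percentages. This short Python sketch, with the counts transcribed from the table (the Google Books and Google figures are approximate, as reported), computes try and's share of each source's combined hits:

```python
# ("try and" hits, "try to" hits) per source, transcribed from the
# table above; the larger figures are approximate, as reported.
counts = {
    "COCA":                 (2_341,       38_650),
    "Time magazine":        (243,         8_549),
    "Google Books (1990s)": (70_321,      1_300_000),
    "Google Books (2000s)": (125_545,     1_900_000),
    "Google News":          (35_100,      55_000),
    "Google":               (436_000_000, 900_000_000),
}

# Share of "try and" among all "try and"/"try to" hits for each source.
shares = {
    source: try_and / (try_and + try_to)
    for source, (try_and, try_to) in counts.items()
}

for source, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{source:22} {share:.1%}")
# Google News tops the list at 39.0%; the edited corpora
# (COCA, Time, Google Books) all sit below 7%.
```

The edited corpora cluster in the single digits while the two raw Google sources run roughly 30–40%, which is the formal-versus-informal split described above in numeric form.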

Try And’s Informality

Try and is neither a new construction nor an ungrammatical one. It’s been consistently used and accepted for centuries. However, it is typically used in casual writing, intimate writing, and speech rather than in formal writing. Before using try and, think about your text’s tone and style and your audience. If you’re writing something formal, such as an academic paper, stick with try to. However, if you’re writing something less formal, such as a magazine article, go ahead and use try and.

What other usage questions do you have? Send them to me!

Update: Thanks to reader KK for pointing out typos in this post. They’ve been corrected. Even editors need editors.



Posted on October 6, 2011 Posted by Bertha Perez


Testament vs. Testimony: It’s All Relative

A Copyediting reader recently asked me about the difference between testimony and testament. Although both terms relate to evidence, testimony specifically refers to evidence from a witness, while testament is “tangible proof or evidence,” according to American Heritage Dictionary. The terms share a common, if distant, root that reveals their relationship.

Testament entered English in 1290 from the Latin testāmentum, “a will,” which is from the Latin testārī, “to make a will,” according to the Chambers Dictionary of Etymology. Testārī, Chambers continues, comes from the Latin testis, “witness.” The idea is that the testament, the will, was a witness in a lawsuit to the deceased’s wishes.

Testimony entered English before 1382, referring to the Ten Commandments, as used in the Wycliffe Bible. It wasn’t until about 1425, says Chambers, that testimony picked up its legal meaning from a borrowing of the Old French testimonie and the Latin testimōnium, both meaning “evidence, proof.” Now, watch what happens when we pull apart testimony’s Latin ancestor.

Testimōnium, says the Online Etymology Dictionary, is from testis, “witness,” and -monium, a suffix that means an “action, state, condition.” The testi- hiding in both testament and testimony ultimately comes from the Proto-Indo-European (PIE) *tris-, “three,” as in the third party who gives witness.

From three to witness to evidence, it’s all relative.




This entry was posted in Word Stories. Bookmark the permalink.