Forgetting the infrequent things

I’m pleased that so many people found my last post on forgetting and language change interesting. Ariel Cohen-Goldberg in particular noted this about forgetting:

Cohen-Goldberg is absolutely right, and this stems from forgetting. The more frequently we do something, the more likely we are to do it the same way, without forgetting how. I never forget which train to take to get to Times Square, which way to turn the key in my apartment door, or which spices to use when cooking a steak, because I do all these things on a regular basis.

It is the same with language: I say “I had a pen in my pocket,” and never “I haved.” I always say “there were three children,” and never “three childs.” I say “was he there yesterday?” and never “did he be there yesterday?” This is what Joan Bybee and Sandy Thompson (2000) called the “conserving effect” of frequency, and Ron Langacker (1987) called “entrenchment.”
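The conserving effect can be caricatured in a few lines of code. This is a toy model of my own, not Bybee and Thompson's: assume that in each generation an irregular form is forgotten and rebuilt by the regular rule with a probability that shrinks as the form's frequency grows. The frequencies, the constant k, and the 50-generation horizon are all invented for illustration.

```python
# Toy model of the conserving effect: the per-generation chance of
# "forgetting" an irregular form shrinks as its frequency grows.

def p_forget(freq_per_million, k=2.0):
    """Per-generation probability that a form is forgotten and
    regularized; rarer forms are forgotten more often."""
    return k / (k + freq_per_million)

def p_survive(freq_per_million, generations=50, k=2.0):
    """Probability the irregular form is still intact after the
    given number of generations."""
    return (1 - p_forget(freq_per_million, k)) ** generations

# An entrenched form like "had" (very frequent) is all but immortal;
# a rare one like "chid" is all but doomed.
print(round(p_survive(40000), 3))  # 0.998
print(round(p_survive(1), 6))      # 0.0
```

The exact numbers mean nothing; the point is the shape of the curve, which makes entrenchment fall out of nothing more than frequency plus ordinary forgetting.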

I learned about entrenchment from Joan Bybee in a course on frequency effects. She discusses it in more detail in her 1995 paper on regular morphology. In her 1985 book, she credits Witold Mańczak (1980), but Mark Aronoff suggests that it may go back to Zipf (1949). I went to check Zipf’s book; someone has it out of the library, but I put in a request for it.

This course in frequency effects actually changed my life. My term paper for the course, on the shift from ne alone to ne … pas in French, provided a good starting point for my dissertation. In section 7.3.2 of my dissertation I look at the entrenchment of high-frequency phrases like je ne sais “I don’t know,” je ne peux “I can’t,” and je n’ose “I daren’t.”

The study of entrenchment has also brought us the Google Ngram Viewer, a tool that linguists feel decidedly ambivalent about. Earlier this month, Elizabeth Weingarten profiled the Ngram Viewer in Slate, focusing in particular on its founders, mathematician Erez Lieberman Aiden and biologist Jean-Baptiste Michel.

And that was the question that set Aiden and Michel, co-founders of the field of Culturomics, on the path to creating such a tool in the first place. Back in 2007, Aiden, Michel, and a crew of undergraduate students decided to test the word evolution hypothesis by tracking irregular verbs over the past 1,000 years. They found 177 that were traceable (for instance, go and went, run and ran), plotted them manually, and discovered that the verbs did undergo a kind of evolutionary process. “The less frequent the verb, the more rapidly it becomes regular,” Aiden explains. “Our work became this demo of how evolution by natural selection might work in a cultural study.”
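As I recall it, Lieberman Aiden and his colleagues reported that an irregular verb's "half-life" against regularization grows roughly with the square root of its usage frequency. Here is a minimal sketch of that scaling; the function is my own illustration, not theirs, and the constant c is made up purely to give plausible-looking magnitudes.

```python
import math

def half_life(freq_per_million, c=500.0):
    """Sketch of the square-root scaling: a verb used 100 times as
    often resists regularization about 10 times as long. The constant
    c (nominally in years) is invented for illustration."""
    return c * math.sqrt(freq_per_million)

# A verb 100x more frequent hangs on 10x longer:
print(half_life(100) / half_life(1))  # 10.0
```

This is the quantitative face of the conserving effect: frequency doesn't just make forgetting less likely, it stretches the whole timescale of the change.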

In their paper, which came out while I was examining entrenchment in my corpus, Lieberman and his colleagues cited Bybee’s work on entrenchment, but somehow Bybee didn’t make it into Weingarten’s article, just as Mańczak didn’t make it into Lieberman et al.’s paper (or my dissertation), and Zipf (if he did write about it) didn’t make it into Bybee’s book. The main thing: it came from linguists.

Entrenchment is a very important effect, but many people forget to take it into account in their studies. At the 2008 conference of the American Association for Corpus Linguistics I was That Annoying Guy who asked everyone “If you take out this handful of high-frequency items, is there any evidence in your study that the change is still happening?” The other presenters were surprisingly tolerant of these questions.
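That Annoying Guy question can be turned into a routine sanity check. In this sketch, with invented counts, hypothetical lexical items, and the simplifying assumption of equal-sized subcorpora, the apparent growth of an innovative variant vanishes once the two most frequent items are excluded, which is exactly the entrenchment worry.

```python
from collections import Counter

# Invented token counts for an innovative variant, by lexical item,
# in an "early" and a "late" subcorpus of equal size.
early = Counter({"know": 900, "can": 700, "dare": 5, "budge": 2, "blink": 1})
late = Counter({"know": 950, "can": 725, "dare": 5, "budge": 2, "blink": 1})

def growth(early, late, exclude=frozenset()):
    """Relative growth in token counts, optionally excluding items."""
    e = sum(n for w, n in early.items() if w not in exclude)
    l = sum(n for w, n in late.items() if w not in exclude)
    return (l - e) / e

print(round(growth(early, late), 3))                 # 0.047
print(growth(early, late, exclude={"know", "can"}))  # 0.0
```

In this made-up case all of the apparent change lives in two entrenched items; drop them and the "ongoing change" is flat.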

You may be familiar with another effect of frequency, what Bybee and Thompson call the “reduction effect.” I’ll talk about that in a future post. And I’ll definitely get around to analogy as well. In the meantime, don’t forget to forget your low-frequency verbs!

Third grade class working hard on their art history assignment. Photo: Bliss Chan / Flickr.

The power of forgetfulness

Emily Brewster remarked the other day on the emergence and resurgence of irregular verb forms like “snuck,” “dreamt” and “awoke.” Stan Carey calls these forms unusual, and they are less common than innovative regular forms, but they are not surprising if you know the mechanisms underlying morphological change, in particular the role of forgetting and how we use analogy to overcome it.

For years, many linguists assumed that all change happened in the imperfect transmission of language from parents to children, because they heard small children produce over-regularized forms like “he keeped running.” In 1982 Joan Bybee and Dan Slobin published “Rules and schemas in the development and use of the English past tense,” but I prefer the title of an earlier version they presented at the ICHL, “Why small children cannot change language on their own.”

Bybee and Slobin asked English-speaking preschoolers, third graders (ages 8-10) and adults to produce past tense forms under time pressure. They found that the preschoolers almost always made errors like “blowed” instead of “blew,” but the third graders and adults hardly ever did. On the other hand, the third graders and adults did create novel irregular forms like “glew” as the past tense of “glow” and “snoze” as the past tense of “snooze.” They concluded that changes like the rise of “snuck” can only be driven by adults and older children.

What was this condition of language change that Bybee and Slobin recreated in the laboratory? Forgetting. We forget all kinds of things. We forget where we left our keys, we forget where our second cousin is going to college, we forget how to hammer a nail or how to sing “Cielito Lindo.” It shouldn’t surprise us that once in a while we forget the past tense of “dive,” or the plural of “rhinoceros.” We’ve all been there.

So what do you do when you forget? Do you stand there like a moron with your mouth open? Well, yes, we all do sometimes. But after a while, or if you’re thinking quick, you’ll improvise. You’ll think of all the similar things and do something like that. You’ll look in the places you’ve found your keys in the past. You’ll mention another, similar college. You’ll swing the hammer the way you swing a tennis racket or you’ll substitute a word that fits in the song.

That’s what we do with the past tense. We think of how we’ve made the past tense of all the similar verbs and do something like that. We linguists call that analogy.
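The analogical step can be sketched as nearest-neighbor matching on word endings. This is my own toy, not a model from Bybee and Slobin's papers: when the stored past tense is forgotten, look up the known verb whose ending overlaps most with the forgotten one, copy its alternation, and fall back on the regular -ed rule when the pattern doesn't fit.

```python
# Toy analogical past-tense former: copy the alternation of the known
# verb with the longest shared ending; otherwise apply the regular rule.
known = {"blow": "blew", "grow": "grew", "sing": "sang", "ring": "rang"}

def shared_suffix_len(a, b):
    """Length of the longest common ending of two strings."""
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

def alternation(base, past):
    """Extract an ending-rewrite rule, e.g. blow/blew -> ('ow', 'ew')."""
    i = 0
    while i < min(len(base), len(past)) and base[i] == past[i]:
        i += 1
    return base[i:], past[i:]

def analogize(verb):
    """Form a past tense by analogy to the closest-sounding known verb."""
    model = max(known, key=lambda k: shared_suffix_len(verb, k))
    old, new = alternation(model, known[model])
    if old and verb.endswith(old):
        return verb[: -len(old)] + new
    return verb + "ed"  # no analogical foothold: use the regular rule

print(analogize("glow"))   # glew
print(analogize("fling"))  # flang
print(analogize("talk"))   # talked
```

With a lexicon of only four verbs this model is absurdly crude, but it already produces novel irregulars like “glew” from the same mechanism that gives the third graders theirs.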

In researching the title for this post, I discovered that it comes from Nietzsche, of all people. Funny enough, I agree with Nietzsche that cultural change comes from forgetting, but I disagree with him that it needs to be an “active forgetting.” Passive forgetting seems to work just fine.

If you want to find out how analogy leads to “snoze” and “snuck,” you can read Bybee and Slobin’s papers at the links above, or watch this space and I’ll post more about it soon.

Categorizing people

Earlier this year I talked about Wittgenstein’s family resemblances, which Eleanor Rosch developed into prototype theory and George Lakoff into radial categories. I’ve also talked about how categorization is used in arguments, with a layer of “category fight” superimposed on an underlying conflict, and often obscuring that underlying conflict.

I’ve used this in class with my students when we’re studying semantics. I think an understanding of polysemy and an ability to see beyond category fights is a hugely important skill, one that I don’t think they’re likely to get elsewhere. I use real examples, and I can find several new ones every week. But I try to stick to fights over non-human entities, because whenever people try to categorize humans it causes problems.

I don’t know for sure why fights over categorizing humans are so much more fraught than categorizing, say, food, but I have a couple of guesses. First of all, humans are just a lot more complex than almost anything else we categorize. We’re hard to pin down, and thus more likely to belong to radial or complex categories. Second, we’re humans, so the stakes are higher. The human you categorize may well turn out to be yourself.

Some of the most contentious categorization projects are the bitter wars that have been fought over various categories of people with transgender feelings, thoughts and actions, which I discuss extensively in another blog (but even there I only scratch the surface). It has become a commonplace of political correctness to refer to categories of people with adjectives rather than nouns, because the nouns tend to have more negative connotations (for example, “Jewish person” vs. “Jew”). I recently had a discussion on this blog and on Twitter with some editors about the categories of “prescriptivist” and “descriptivist,” and I grew more and more convinced that the root of the problem was that we were trying to categorize people.

A few months ago I was thinking about all the problems that come from categorizing people, and I wondered, “what if we just stopped, and didn’t categorize any more people? Why do we categorize people, anyway?” In thinking about it some more, I realized that we have a deep and ancient need to categorize people. When we see someone we ask ourselves a number of questions in rapid-fire succession:

  • Is this a stranger or a friend?
  • Is this person dangerous?
  • Is this a potential mate?
  • Is this someone who might want to buy what I’m selling?
  • Is this someone who might have something valuable to offer?

We use categories to help us answer these questions: Is the person one of us? One of the bad people? Man or woman? Old or young? Rich or poor? But it’s important to note that these categorizations don’t actually answer the question. They’re only good for an immediate first pass. They’re kludges.

Even in these contact situations, when we have the time and energy, we should probably look beyond our initial categorization to see what our kludges might have missed. But whenever there isn’t that immediate face-to-face sizing-up, such as when we’re setting up rules to allocate resources, we should definitely look beyond categorizing people.

In the transgender case I came to the conclusion that it’s better to think of transgender feelings, thoughts and actions than to try to categorize people. Today I decided that that’s true of prescriptivism, too. It’s probably true of Judaism and team captaincy as well.

On advising descriptively

Some nice people retweeted my post about being a humble prescriptivist, and I had some interesting reactions in the comments and on Twitter, but Peter Sokolowski had one that I wasn’t prepared for.

Jonathon Owen held up Robert Hall’s Leave Your Language Alone as an example of the kind of pure descriptivist that I was referring to, and Sokolowski tweeted:

After thinking it over, I’ve come to the conclusion that there really are two ideas of “descriptivism.” When writing my post I was thinking of the Robert Hall kind, which is the kind that most linguists talk about and aspire to – although I would agree with Sokolowski that we only wind up as hypocrites, loudly decrying prescriptivism even as we prescribe left and right. I think Sokolowski was thinking of a different kind of descriptivism, as described by Jesse Sheidlower in an article that Sokolowski tweeted last year:

Descriptivism involves the objective description of the way a language works as observed in actual examples of the language. Descriptive advice — almost an oxymoron — about the acceptability of a word or construction is based solely on usage. If a word or expression is not found in careful or formal speech or writing, good descriptive practice requires the reporting of this information.

This kind of “descriptive advice” (I saw how you ducked “prescription” there, Sheidlower) is a venerable tradition with a long history in second language instruction. Most second language learners aspire to speak and write like native speakers, so it makes sense for their teachers to study the speech and writing of native speakers. As Battye, Hintze and Rowlett tell us, it was applied to instructing native speakers on “good usage” by Claude Favre de Vaugelas in 1647:


These are not just laws I made for our language based on some personal prerogative of mine. That would be reckless, some would say insane, because what authority, what basis do I have for claiming a privilege that is the sole right of Usage – the power that everyone recognizes as the Lord and Master of modern languages?

Vaugelas’ point – the reason people bought his book – was not to base these laws on all usage, but on “good usage,” le bon Vsage, which he explicitly defined as the usage of the members of King Louis XIV’s court. His book contained “descriptive advice” for people who were already literate in French – and thus presumably upwardly mobile – and wanted to write like courtiers so that they would fit in better, and maybe even be admired, at court. Write like these people and you’ll get ahead.

Somewhere along the line Vaugelas’ bon Vsage became Sokolowski’s “standards of good English.” The goal is still to write like these people and get ahead – Sokolowski tweeted, “I bet [Hall’s] kids speak good English.” I bet, but I doubt they needed any descriptive advice to do it. They spoke good English because they were raised as members of the elite. Sokolowski’s job as an editor at Merriam-Webster is to describe the writing of the elites and make prescriptions (aka descriptive advice) that upwardly mobile people can follow when they want to fit in.
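Stripped to its mechanics, this kind of descriptive advice is counting: tally the competing variants in a corpus of the writing you want people to emulate, and report the shares. A minimal sketch, using an invented five-sentence “corpus” standing in for the writing of the elites:

```python
import re
from collections import Counter

# An invented miniature corpus, purely for illustration.
corpus = """
He snuck out before dawn. She sneaked past the guard.
They snuck into the theater. We snuck away. He sneaked a look.
"""

def variant_shares(variants, text):
    """Report each variant's share of all variant tokens in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in variants)
    total = sum(counts.values())
    return {v: counts[v] / total for v in variants}

shares = variant_shares({"snuck", "sneaked"}, corpus)
print(shares["snuck"], shares["sneaked"])  # 0.6 0.4
```

The descriptive advice then writes itself: in this sample, “snuck” outnumbers “sneaked” three to two, so neither can be dismissed as an error.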

The main difference between France in 1647 and the United States in 2013 is that there’s no explicit reference to a court. There are still elites, and people are still striving to fit in with them, but the old court all went to the guillotine, so nobody wants to name the new court. Instead they just handwave in the direction of “standards.”

If we’re using this definition of “descriptivist” – someone who describes the way elites talk and sells that descriptive advice to strivers – then the descriptivist-as-chemist analogy from my earlier post is not accurate. I think that’s a perfectly valid definition of “descriptivist” and I’m not judging (even if I am teasing a little) – I may be looking for a job doing that at some point.

I think it is important for linguists to be clear when we are actually attempting to describe language objectively as scientists, when we are advising descriptively, when we are humbly prescribing language with a political goal in mind, and when we’re being the kind of crotchety traditionalists that Vaugelas thought were insane back in 1647.