Chat vs Comment

2013-07-11. 5 comments

Hello my friend! How are you doing? It is good to see you around these parts, the Stack Exchange network is a lovely part of the Internet where we can all help each other to learn.

Of course, though, there are rules by which we should abide if we want to keep this place friendly and free of noise. Hey, now, do not look like that. It is true there are many rules, but they are normally quite useful.

Today I want to tell you about how to communicate with your fellow Stack Exchange users. For example: you have seen something someone has written and you disagree. You are a polite person, so you do not think you should just downvote and leave. I admire your style. Communication is good. At Stack Exchange we are allowed to make comments on people’s posts to ask for clarification and to point out mistakes. Obviously we do this in as friendly a way as possible.

Oh? You cannot comment? I see, I see. Do not be troubled. Comments are a privilege, which is earned by attaining 50 reputation points on this site. It is a mere trifle. You will find that by contributing good quality answers and questions you will be there in a few days, maybe even less.

It is important to note that comments are not permanent parts of the site and they get deleted when they outlive their usefulness. Ideally the useful information in a comment will be integrated into the post the comment is on, thus making the comment redundant.

Also, you should understand that the main site is not for discussion. Comments should not be used to discuss a topic at length.

Yes, you are quite right. Sometimes discussion is useful, or necessary. For that there is another place! It is a wonderful place, really. There is much adventure to be had. We call this place chat. The ability to chat is also a privilege, but it has a lower bar. Only 20 reputation points are required.

Chat is used for discussion, yes, but the discussion can roam beyond topics on the main site. Many times people wander in with simple questions, questions that might not be suitable for the main site, and ask them in the chat room. This is fine, encouraged even. There is of course a lot of other discussion going on there too, and it can get quite frenetic, but do not be afraid to jump in. Be courteous and not pushy and you will be fine.

One excellent trick you can do with chat is to take a discussion in comments and move it to a chat room. This is useful if you think the discussion will become long. Also, if you would like to chat with someone, but they and you are finding the main room too difficult, you can create your own room. The new room will be public, but people generally do not stray from the main room unless they are invited.

Well, my friend, I hope you stay a while. There is a lot you can learn, and maybe a lot you can teach! We will be glad to hear from you.

One Language, Many Voices

2013-06-13. 1 comment

‘One Language, Many Voices’ was the title of an exhibition at the British Library in 2010-11. It sums up what English is and always has been. This simple truth is overlooked by those who take a one-size-fits-all approach to language. An historical perspective may help to set the record straight.

English has its origins in the various north-west European dialects spoken by the tribes who invaded England from the middle of the fifth century, displacing the native Celtic, which remains only in a few words like brock (badger), cwm (valley) and some place names. The surviving literature from the period allows us to identify an Anglo-Saxon language, now usually known as Old English. However, the texts we have still show dialectal differences, and it seems likely that the spoken dialects of Old English were sometimes mutually incomprehensible.

For a period, the Wessex dialect was the most prestigious, showing that then, as now, any one variety of the language predominates not for linguistic reasons, but for political, economic and social reasons. However, the language was by no means static during this period, because the rule of the Wessex king, Alfred the Great, coincided with the Viking invasions, which brought new words and new grammar into English that remain with us today.

The next most significant influence on the language was the Norman invasion of 1066. So pervasive was Norman French that it eclipsed English for many years in the administration of the country, but although Norman French was the official language, we may suppose that English, in all its varieties, continued to be spoken by the majority. English later resurfaced in public discourse, and in 1362 the Statute of Pleading allowed it to be used in Parliament. Soon after this time, Chaucer was writing the first great works of English literature in a form of the language that is much more recognizably English than its Old English predecessor. Chaucer wrote in his own dialect, which happened to be that of the east Midlands, spoken in the triangle formed by Oxford, Cambridge and London. The language spoken and written at this time is known as Middle English, but great literature survives in dialects other than Chaucer’s, including William Langland’s ‘Vision of Piers Plowman’.

Chaucer had a stroke of luck when William Caxton, the first English printer, came to print Chaucer’s works. Because of the proliferation of dialects, Caxton was unsure which to use in his printed books, so he just chose the one he was most familiar with, his own. This happened to be Chaucer’s as well, so the combination of a great writer and the first printer determined the course of English ever after. This particular dialect, which was to become the basis of what we now know as Standard English, was not chosen because it had some particular linguistic merit that other dialects lacked. Any other dialect would have served just as well.

Middle English turned into Early Modern English, the language of Shakespeare, but there is evidence from his plays that other dialects existed alongside what was becoming the standard one. The conscious process of standardisation didn’t begin until the eighteenth century, when speakers of English, most of whom until then had probably never given the matter a second thought, started to become self-conscious about their use of language and sought guidance. As today, there was no shortage of self-styled experts willing to help them out. They made up rules about English, which reflected their own personal preferences, and were based on Latin, a language which has a quite different grammar from that of English and other Germanic languages. Their idiosyncratic prescriptions remain with us. To some they are as holy writ and are stoutly defended by people who know little of their origins. In truth, they are shibboleths, whose main purpose is to allow those with a little education to show their assumed superiority over those who have been unfortunate enough to have had less.

Since the eighteenth century, English has changed, and become more widely spoken, in ways that earlier speakers could not have imagined. It has absorbed vocabulary from around the world and, thanks first to the British Empire and, since the start of the twentieth century, to the global influence of the United States, has become the first international language since Latin. The history of English is complex and long, but this very brief summary is necessary for countering the prejudices that all too often typify discussion of the language today. It is the failure to appreciate that English exists in many varieties, as it has always done, that is behind much misunderstanding. Within the United Kingdom alone, some regional dialects can be almost incomprehensible to those from other regions, and so can some social dialects to those from different social classes. More widely, there are varieties of English spoken in Singapore, New Zealand, Nigeria, India, Canada and the United States, to name just a few places where English is either a first or second language, and within these English-speaking communities there will be further sub-varieties.

All varieties, standard and nonstandard alike, have an internally consistent system of grammar, and speakers of nonstandard varieties are not, by that fact alone, inarticulate, unintelligent or ignorant. The difficulty in understanding those who speak differently from ourselves often lies in accent rather than grammar or vocabulary. As Peter Trudgill has shown, the grammar of nonstandard varieties does not differ so very greatly from Standard English. Where it does, it can be more complex. For example, as Trudgill says,

‘Standard English fails to distinguish between the forms of the auxiliary verb “do” and its main verb forms. This is true both of the present tense, where many other dialects distinguish between auxiliary I do, he do and main verb I does, he does or similar, and the past tense, where most other dialects distinguish between auxiliary did and main verb done, as in You done it, did you?’

Nevertheless, some form of commonly understandable norm is essential, and Standard English fills the role. It is the variety of the language used in published work, and in education, journalism and broadcasting, the law and public administration, and by the small minority of people for whom it is a native spoken variety. Provided it is understood as a neutral term, not implying ‘high standard’, Standard English is preferable to alternatives such as BBC English, Oxford English or The Queen’s English, and is the one used by professional linguists. If not universally spoken and written, it is widely understood, and for that reason schools have a duty to teach it.

None of this should be seen as undervaluing the linguistic merits of nonstandard varieties. They contribute to the richness of the language which we have inherited from those diverse tribes who came to Britain so long ago. We should celebrate rather than condemn them.

Filed under Etymology, Linguistics

The Give That Keeps On Gifting: The Protean Nature of English Words, and Why That’s A Good Thing

2012-12-31. 8 comments

English is constantly adding, modifying, and repurposing words.

Look, there’s one right now: repurpose. According to The American Heritage Dictionary, it is officially a word:

re·pur·pose tr.v. re·pur·posed, re·pur·pos·ing, re·pur·pos·es
To use or convert for use in another format or product: repurposed the book as a compact disk.

Etymonline cites its usage from 1995, so it is relatively new. It is made by adding re- to the word purpose, which can be either a noun or a verb. My money is on noun, because the new construct comes out of business or technical jargon and purpose as a verb is pretty seldom heard. Not that it would be wrong. In fact, it would not be surprising if purpose used as a verb were revived as a back-formation of repurpose.

In this season of giving, let’s look at the words give and gift. Give is a verb and gift is a noun, right? True. But haven’t you ever found some mechanical thing to be loose when it was supposed to be tight and said, “I feel a little give in it”? Of course you have. Verb has been nouned. You nouned it yourself.

Similarly, we’ve all heard someone use the word gift as a verb: “She gifted them all with front-row seats to the concert.” And whether our inner fussbudget winces or not when we hear it said that way, it is still a legitimate usage. Face it: the traditional verb form give doesn’t say as much, and simply isn’t as precise. “She gave them all front-row seats to the concert.” That doesn’t exactly carry the connotation of presenting them with a gift, does it? She could have been paying them back for prior favors, or because she lost a bet — any number of non-giftish reasons come to mind.

Who can forget the “regifting” episode of the immensely popular TV show Seinfeld? (Hmm, that occurred right around the time repurpose came into the lexicon. Coincidence?) And if we admit that the writers and actors on that show were all very gifted, we’ve now adjectived a noun. We might even have adjectived the noun very giftedly, in which case we’ve adverbed the adjective. It goes on.

There really is nothing to be afraid of. Languages change, and words get overloaded, adapted, modified. Some people abhor this condition. Some feel language should be as precise as mathematics: see John Quijada’s artificially constructed language, Ithkuil, if you don’t believe me. Me, I prefer the richness of everyday speech, and the creative way people adapt words to mean new things. Isn’t it more colorful and descriptive to say a basketball player bricked a shot, rather than falling back on the boring and pedestrian missed? A horrible shot in basketball looks like someone throwing a brick, not a ball, and if we verb the noun we get a shot that has been bricked.

Language is a living thing. Let’s never forget that. If words stop changing, a language starts dying, just as our bodies do if our cells stop dying and being reborn.

While we’re on the subject, let’s look at that verb: live. The noun form is, of course, life. Since the 1830s, the noun lifer has referred to a prisoner serving a life sentence. But wait a sec, didn’t it have to become a verb first? Isn’t a verb at least implied there: lifer, one who lifes? No? Let’s move forward in time and notice a shift in meaning: lifer now includes someone who is serving “for life” in the military. I recently read the book Generation Kill by Evan Wright, who was embedded with a platoon of Recon Marines that participated in the assault on Baghdad in 2003. After Saddam’s army was defeated, one of the Marines, Corporal Ray Person, is quoted as he grouses about the battalion first sergeant’s return to his meddlesome “lifer” ways. Person complains:

“The second they stop shooting at us, [that] motherfucker’s lifin’ us in his stupid fucking retardese.”

Ahh, there it is. Crude though his statement may be, his verbing of a noun that arose from the prior verbing of the same noun is pure poetry. And exactly right for the sentiment the soldier wished to express. What is a lifer sergeant doing to Marines when he makes their lives miserable with a lot of petty regulations? He is lifing them. And he is doing so in the imaginary-yet-somehow-very-real language Cpl. Person vulgarly calls retardese, which consists, presumably, of one stupid, ungrammatical statement after another spoken in a near-incomprehensible hillbilly drawl.

I hope you can’t find any give in my arguments. But I wish to gift you with one final thought on the protean nature of English. Wallace Stevens said it about poetry, but it goes for language in general as well:

    It has to be living, to learn the speech of the place.
    It has to face the men of the time and to meet
    The women of the time. It has to think about war
    And it has to find what will suffice. It has
    To construct a new stage.
In other words, it has to be a living, changing entity, continually adapting, being adapted, constructing a new stage. Fortunately for all of us, it is.

Filed under Etymology, Grammar, Linguistics

You Could Look It Up

2012-12-17. 5 comments

Your question has been “Closed as General Reference”. That raises more questions: What does that mean? Why was it closed? What should you do about it?

What Does It Mean?

First, what it doesn’t mean. It doesn’t mean “Your question is worthless. Don’t bother us.” It certainly doesn’t mean “You are an illiterate cretin. Go away” — although some people take it that way.

Closed means “Closed for repairs”.  And General Reference means “You could look it up.”

Why Was it Closed?

That’s very easy to answer: we believe that your question (as it stands) can be answered by consulting a standard online reference work.

It makes a lot more sense for you to do that than us. If you look it up you will find not only the answer to the question you asked but also the answers to many other questions you might have intended to ask that we don’t know about.

You will also get your answer faster, since you won’t have to wait for one (or more) of us to perform the lookup and incorporate the results in a Witty and Incisive Response. (Wit and Incisiveness are hard to achieve, and a good Response can take a long time to compose.)

And: you may also learn something about what online resources are available to you, and what they offer which might satisfy future needs.

What Should You Do?

Depends.  Very often people will have posted an answer to your question, or will have posted what amounts to an answer in the comments. If all you’re interested in is the answer, you’ve got it: you’re done.

If you didn’t get a satisfactory answer this way, do what the Closed banner tells you:

Look it up.

Again, people will often post a link directing you to an appropriate online reference. If not, a lot of useful references are listed here and here. These lists are particularly valuable for the comments which accompany them. The works fall mostly into four broad categories:

  • Dictionaries provide far more than just definitions: etymologies, examples, citations, and often brief notes on “standard” usage (debate rages over what exactly that means, but that’s instructive, too). Don’t consult just one: Dictionaries vary greatly not just in overall quality but in the value of individual entries.
  • Thesauruses (or thesauri, or even more piquantly thesauroi) are useful for recalling words you can’t quite remember, but they don’t usually tell you much about which synonym you should use where. But they can be fun.
  • Corpora provide many more examples of actual use of a word or phrase than dictionaries, and can be particularly valuable guides to when and how synonyms differ.
  • Style guides are the best source for prescriptive rules of grammar, syntax, spelling, punctuation, and documentation. They all differ in many details, however; select the one that is recommended by your school or discipline or (if you are so fortunate as to have one) your publisher.

A hint:  OneLook is a very useful tool: enter a word or phrase and it returns links to many dictionaries and other references, conveniently listed on one page.

But what if these references don’t provide you what you need? — no reference work can answer all questions. In that case, come back to ELU and

Fix your question.

Click edit immediately beneath your post and rewrite it.

  • Tell us what you’ve found out, and focus our attention on what your research leaves unanswered.
  • If anybody left useful comments, address those.
  • Give us as much context as you can. What is it you’re trying to understand (or say)? Who said it (or to whom do you want to say it), where and when? What register are you concerned with? — formal, colloquial, vulgar?
  • Don’t forget to change your question’s title to fit the new content.

The more you can tell us, the better we can answer.

If you’ve got at least 20 rep, you can pop over to Chat (the link’s at the top of the ELU page) for help. There’s usually somebody around to hold your hand. And if you fix it all by yourself, come by Chat when you’re done and report it. It takes a moderator or five high-rep users to get your question reopened, so you want to draw their attention to the work you’ve done.

Trust me. That’s how you get a Witty and Incisive Response—or several. That’s how you get Upvotes and Reputation. That’s how you learn to use resources you never knew about. That’s how you Make Friends and Influence People.

You could look it up.


Good English = Effective English

2012-12-03. 2 comments

Speech and the written language differ in many ways. Speech developed before writing and we learn to speak before we learn to write. For a long time there was no written language at all, and there are languages that have no written form. That is not to say we can say what we like and hope to be understood. Speech has its rules. In English, we must say, ‘Shut the door’ rather than, ‘Shut door the’ or ‘Shut of door’, and we must say ‘streets’ rather than ‘street, street’ when we mean more than one. Anyone who applies such rules consistently speaks correct English. The only people who don’t are those who have yet to learn them: infants and those who are learning English as a foreign language.

There are many varieties of spoken English and there is no reason to suppose that one variety is linguistically superior to any other. At the same time, we do well to use a spoken language that is tolerably close to that of the people with whom we expect to spend most of our lives. For the middle class, that means adopting the dialect known as Standard English. It can be spoken in any accent, but is often associated with the accent of educated people living in London and the south-east of England. But it’s no more and no less correct than Midlands, Liverpool, Tyneside, Indian, Australian or Caribbean English.

Written language derives from speech, but we have to make a deliberate attempt to learn it. Some fail to do so, even when they speak their native language fluently. We have to encode our thoughts as arbitrary marks on paper or the screen and interpret similar marks produced by others. Like speech, different kinds of written language suit different circumstances. An email or text message in a variety of language that many of us would not understand is perfectly appropriate between people who do understand such language. The question of whether or not it is correct simply does not arise. However, such language in, say, a job application or a Times leader would be unacceptable, and consequently ineffective, if it was incomprehensible to its readers, or if it simply antagonized them. That seems to me a more important consideration than whether or not it conforms to someone’s idea of correctness.

Those who commit words to print should consider what they are trying to express, who their readers are, and whether the chosen language will succeed in conveying the message clearly without hesitation, repetition or deviation. And it is helpful if, in writing which is destined to be read by a large number of people whose linguistic backgrounds we cannot know, we agree on certain conventions. These conventions include punctuation, spelling, and choice of vocabulary and structures. In speech we generally know personally our audiences. In writing, too, we will sometimes know our readers and we can adapt our language accordingly. Quite often, we will not. In those cases, a certain commonality is required to avoid chaos.

When I read a sentence I ask not so much, ‘Is it correct?’ but, ‘Do I want to read any more of this stuff?’ ‘Getting it right’ means successfully using language to achieve the purpose intended, not necessarily complying with a set of rules. Achieving the purpose intended includes producing the response on the part of our readers that we want them to have. Placing the emphasis on effectiveness rather than correctness seems to me more likely to produce the desired result. The alternative seems to suppose that once you have complied with the rules laid down by this or that authority you have done all you need to. That is far from the truth.

Filed under Linguistics

Much Ado About Possessive Apostrophes

2012-11-19. 10 comments

Apostrophes are lovely little critters, but they tend to boggle the mind if you think about them too much.  One of the most common questions on EL&U concerns the proper use of an apostrophe to indicate possession.

The basics.

How do we use an apostrophe to indicate possession?

If the possessing noun is singular, add an 's (apostrophe-s).

Sara's beast friends were all balrogs.

If the possessing noun is plural and ends in s, add an ' (apostrophe).

The beasty balrogs' game was very fun.

Well, now, that’s pretty straightforward, right? Except that apostrophes have this annoying habit of jumping into your brain and scrambling your thoughts.  There are lots of ways to get confused.

What if the possessing noun is plural and does not end in s?

Then treat it like the singular case, and add an 's (apostrophe-s).

The children's books were tucked away in their cubbies. The geese's honking alerted the dog to the fox's presence.

What if the possessing noun is not plural, but ends in s?

Well, golly, it turns out this one is complicated.  Generally speaking, these are treated just the same as other singular nouns:

The glass's rim was cracked.

But this has not always been the case.  Historically, names ending in s followed the plural rule:

Seamus' writings were well-known throughout Galway.

For proper nouns, this is considered a stylistic choice, but following the singular form is more common these days:

Seamus's writings were well-known throughout Galway.

You’d think with just four rules (which are really just two if you think about it) that no one would have much trouble with possessive apostrophes. But those apostrophes sure are pernicious.
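In fact, those two-rules-masquerading-as-four collapse into a tiny decision procedure. The sketch below is purely illustrative (the function name and the is_plural flag are my inventions, and real software would need genuine grammatical-number detection, which spelling alone can't provide); it follows the modern convention of giving singular nouns ending in s a full apostrophe-s:

```python
def possessive(noun: str, is_plural: bool = False) -> str:
    """Form the possessive of a noun using the basic rules above."""
    if is_plural and noun.endswith("s"):
        # Plural ending in s: add an apostrophe alone (the balrogs' game)
        return noun + "'"
    # Everything else, including singulars ending in s and plurals
    # that don't end in s: add apostrophe-s (glass's, children's)
    return noun + "'s"

print(possessive("Sara"))            # Sara's
print(possessive("balrogs", True))   # balrogs'
print(possessive("children", True))  # children's
print(possessive("glass"))           # glass's
```

Note that the historical treatment of names like Seamus would need an extra branch; treating them like any other singular, as here, is the more common choice these days.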

The Advanced.

What if the possessing noun is a conjoined phrase like “my wife and I”?

Kosmonaut gives an excellent answer to this question.


Those rules are all well and good, but how do I decide whether the possessing noun should be plural or not in the first place?

There are a lot of questions about this very sticky wicket on EL&U.  Some examples are:

User’s or Users’ Guide

User or Users Account

User’s/Users’/Users Group

Happy Mothers’ Day or Happy Mother’s Day

Members’ or Member’s Benefits

Beginner’s or Beginners’ Guide

Baker’s Dozen

Does the guide belong to one user or many users?  Is the day for one mother or all mothers?  Either way is technically acceptable, but generally speaking, we treat the possessor as a single instance of an abstract entity.  So one copy of the guide for one abstract generalization of “user” means we usually say “User’s Guide.”  Mother’s Day is trickier, because we could celebrate all mothers on that day, but it is supposed to be a day on which we honor our own mother, so “Happy Mother’s Day”, unless you have two mommies.

Finally, we see that possessive apostrophes are disappearing for plural nouns that demonstrate affiliation, so it is acceptable practice to use phrases like “User Group” instead of “Users’ Group.”

That is a little summary of possessive apostrophes, along with some fun links for further reading.

Filed under Orthography

Looking Up a Gun: Common English Words with Nordic Origins

2012-11-05. 2 comments

Old Norse words in the English language are much more numerous than many would suspect. Many common words, such as gun, craze, and equip, are of Nordic origin. Because the two languages were so similar, often mutually intelligible to a considerable degree, many loanwords passed between them. In this post, I’m going to trace the origins of these three common English words rooted in the Old Norse language.

There were two main ways that Old Norse words made their way into the English language. First, between 865 and 954 (the period of the Danelaw), the Vikings colonized eastern and northern England. During this time, many of their Old Norse words entered Old English and have been in use ever since. Other words entered Norman French and were passed on from there to Middle English after the Norman Conquest of 1066. The parallels between Old Norse and Old English facilitated the trading of words between the two languages.

Gun

In Nordic culture, the name Gunnhildr was fairly common. It had the meaning “war battle maid” and is cognate with the more modern name Gunhild. In 1330, an inventory was made of the munitions at Windsor Castle. In the inventory, a specific siege engine was called the Lady Gunilda, a shortening of Gunnhildr. Later, the word gonnilde, yet another variation of Gunnhildr, became more generalized to mean “cannon” in Middle English. By the mid-fourteenth century, this had been shortened to gunne. It did not yet have the modern meaning of “gun”, though. It meant simply “an engine of war that throws rocks, arrows or other missiles”, so the ballista and the trebuchet both fell within this definition. It wasn’t until the fifteenth century, when firearms first came into major use, that gunne came to mean “firearm”. Around that time, it was finally shortened to “gun”.

Craze

Old Norse had a word krasa, which meant “shatter”. Around the mid-fourteenth century, it entered the Middle French language as the word ecraser, which meant “to squash”. This evolved into both the modern French écraser and the Middle English crasen, which meant “to break in pieces; to crack” and, secondarily, “to be diseased or deformed”. Crasen evolved into the modern English crase (now obsolete), which carried only the first meaning, “to break in pieces; to crack”. Crasen also evolved into another modern English word, craze, which carried the second meaning, “to be diseased or deformed”, and later shifted to mean “mental breakdown”. The current meaning of the word, “to become insane; go mad”, is not a far cry from that. The first reference to craze meaning “mania, fad” was in 1813. The original meaning, “to make cracks”, is also still in use, with the slightly narrower sense “to make small cracks on the surface of”, used when referring to ceramic pottery.

Equip

The Nordic word skip meant “ship”. Skipa, another Norse word, was derived from it, with the meaning “fit out a ship”. In the twelfth century, it entered Old French as esquiper. By the 1520s, it was used in Middle French as équiper, meaning “to supply, fit out”; it was no longer specific to ships. In the late sixteenth century, it made its way into English as esquippe. In the seventeenth century, a p was dropped and the word became esquip. Later in the century, the s was dropped too, and it was shortened to “equip”, as we know it today. It was also spelled acquip for a time, but that spelling never really caught on.

Estimates vary, but by some (admittedly non-scientific) reckonings, 15–25% of English words originate from Old Norse. Given the size of the English language, that is quite a considerable amount. Only Latin and French have contributed more words to English than Old Norse. Our language owes a great deal to those ruthless Scandinavian seafarers. Without their contributions, I would not be able to say, “He often fumbled for words, which amused people greatly.” (Kylfdi mᴊǫk til orðanna, ok hǫfðu margir menn þat mᴊǫk at spotti.)



Filed under Etymology

Prescriptivism and Descriptivism

2012-10-15. 4 comments

Imagine you are reading something on the Internet (I know, it’s a stretch), and you come across the following passage:

I want to be sure that you and me are on the same page. When you ask how I feel about grammar, you are begging the question, Are you a prescriptivist or a descriptivist? The problem is that that question isn’t even something sensible to really ask about. It think it would help you if those definitions were reviewed.

How would you characterize the quality of the writing?

  • It is just fine
  • It has some style issues
  • It has some grammar issues
  • It is horrid writing for a number of reasons, including both style and grammar

Of course, the correct answer is… well, hold on, now. It’s not quite that simple.

A Prescriptivist’s View

If you cringed while reading the example passage above and ached to break out the red pen, then chances are that you fall into the prescriptivist camp. The general take of a prescriptivist is that there are rules that define how language should be used, and that mistakes result when those rules are broken. You might hear this idea of prescriptive linguistics described as normative, meaning that the rules set a norm: they determine the way things (spelling, grammar, etc.) ought to be. Some examples of prescriptive rules are:

  • Don’t end a sentence with a preposition
  • Don’t split infinitives
  • Don’t use the passive voice
  • Don’t use the pronoun ‘I’ in object position

Of course, not all prescriptivists agree on what the rules (and exceptions) should be. Many derive their rules from authoritative works, like Fowler’s 1926 work A Dictionary of Modern English Usage, or Strunk and White’s The Elements of Style, now in its 53rd year of printing. Others rely on their intuitions, informed by the forces of society and class, or aphorisms passed on by their elders (my grandmother was fond of saying, Cakes are ‘done’, people are ‘finished’!). The English Language and Usage Stack Exchange site has seen many questions on prescriptivist rules, for example:

The keen observer will have noticed that prescriptive rules tend to cover not just what is allowed by language, but also (and often) what is preferred. The rules are not restricted to grammar, but can extend to concerns like spelling and formatting (all of which are, for lack of a better phrase, elements of style). For example, a prescriptivist might tell you that a sentence beginning after a colon must start with a capital letter, or that the word ‘like’ should not be used as a subordinating conjunction.

A Descriptivist’s View

You may have gotten through the passage at the beginning of this post and thought that there was nothing wrong with it. Or, perhaps you thought it was not the best prose you’d ever seen, but that there weren’t any real errors, simply style choices that you wouldn’t have made. Maybe you even saw some things that you really didn’t like, but know that sometimes people choose to write that way, and as long as it’s understandable, you can deal with it. If any of that sounds like you, then you are probably somewhat of a descriptivist.

The idea behind descriptive linguistics is that a language is defined by what people do with it. In other words, you begin by studying and listening to native speakers. Then, when you notice patterns in the ways that they communicate, you can record those patterns as guesses about the principles of a language. If you rarely (or never) observe someone breaking those patterns, then your guess is more likely to be an accurate representation of the language. Those guesses are called hypotheses, and when they are well-supported by evidence, they can be accepted as correctness conditions for a language. For example, a correctness condition about Standard English is the notion of a Subject-Verb-Object (SVO) word order. It is very difficult (if not impossible) to observe a native English speaker saying something like, “*I an apple ate,” so it is a safe bet that if you hear that, you aren’t hearing Standard English. Of course, it also means that if enough people start using a new construction, then your grammatical model should adapt to accommodate it.

The main difference between a correctness condition and a prescriptive rule is that a rule is, by its very nature, regulatory. A correctness condition, on the other hand, is constitutive. I like to think about it in terms of cooking: If I serve chicken cacciatore with raw chicken, that’s an error. The dish is still chicken cacciatore, but I’ve made it incorrectly. I’ve broken a prescriptive rule that governs how to make the dish (specifically, the one that says that the chicken should be braised until it is cooked through rather than served raw). On the other hand, if I make cacciatore with rabbit instead of chicken, that’s not chicken cacciatore with mistakes. It’s simply rabbit cacciatore. A descriptivist would look at the situation and conclude that cooking alla cacciatora is defined by searing meat in oil, then simmering it with tomatoes, onions, peppers, and seasoning, rather than by the choice of meat (perhaps with a caveat that some meats are more common than others).

The Middle Ground

So, you seem to be at an impasse. On the one hand, you have generations of grade school English teachers rightly warning their pupils that people might chuckle at them if they use the word ‘irregardless’. On the other hand, you have the scientific rigor of the modern linguistic community touting descriptivism as the torch-bearer of truth and enlightenment. Are you doomed to choose between a democracy of solecisms and a library of thousand-page tomes of writer’s regulations? Are things really that bleak?

Of course not. You have the luxury of picking the view that suits you at any moment. You can leave it to the descriptivists to confirm what makes up the language, and the prescriptivists to guide you on how to make it flow sweetly and clearly into the minds of others. Members of these groups tend to bicker, each accusing the other of destroying the language or poisoning the minds of the children. Such claims are rarely valid. As long as you keep your wits about you, it is not so hard to tell when a descriptivist is being overly forgiving of bad writing or a prescriptivist is blindly spouting advice on language that hasn’t been relevant for the last sixty years. Neither is it a bad idea to keep an open mind towards new ways of saying something, or to consult a style manual for tips about how to communicate your ideas effectively. As is so often the case, the most important advice in the ‘prescriptivist vs. descriptivist’ debate is to keep your head up and use the right tool for the job.

Going Further (or is it farther?)

Interested in diving deeper into the matter? Here are some resources that I think are interesting:

Filed under Linguistics

That vs Which: A Pragmatic Approach

2012-10-01 by . 6 comments

 “There’s glory for you!”

H. Dumpty, founder of linguistic pragmatics

If you’re looking for a balanced discussion of the That vs Who/whom/whose/which controversy, go here. I’m not interested.

A hundred years ago the Fowlers put forward a modest proposal. Linguistic bureaucrats elevated this proposal to a Rule, linguistic libertarians resisted; and today the Fowlers’ proposal is an Issue hotly contested by Conservative and Liberal ideologues.

I have no taste for political disputation. While my sympathies lie with the Liberals (who in the Fowlers’ day would have been the Reactionaries), my experience is that I am never profoundly disturbed by the actual usage of the Conservatives (who a hundred years ago were the Radicals). And neither side is going to budge from its position, each is deaf to the other’s arguments and writes or redacts according to its own judgment; so I see little point in rehashing the arguments.

I’d like instead to adopt a non-partisan and non-ideological approach, and come at the T/W question from a different angle. I’m a writer, my concern is to make the most effective use I can of the tools which come to my hand. The Fowlers themselves grounded their proposal in the economic argument that “if we are to be at the expense of maintaining two different relatives, we may as well give each of them definite work to do (The King’s English, 1908).” I may hope, then, that others will find some value in exploring the pragmatic and non-doctrinal considerations which govern my usage in my writing.

“Writing”, I say; but the spoken language is both historically and methodologically prior to the written, and most of us aspire to something of the spontaneity and freshness presumed to reside in oral usage; so it may be useful to see what ordinary speech tells us.

I happen to possess a modest corpus of semiformal speech—videotapes of impromptu interviews with a dozen college-educated U.S. speakers from various regions and callings. Scanning the transcripts for uses of relative pronouns (and consulting the tapes where there was any ambiguity) yielded three interesting findings:

  1. Ordinary U.S. speech does not distinguish lexically between restrictive and non-restrictive clauses. Indeed, the paratactic construction imposed by improvisation makes the distinction itself difficult to maintain. How do you categorize a clause which is clearly, to the ear, an afterthought, but which could make sense as a restrictive clause? —Here’s an example; the speaker is discussing a table of numbers (a dash represents a pause):
 The difference between 154?—(points) that is actually available?—and  and the 149—(points) which the budgeting exercise produced?— is another opportunity for life insurance . . .
  2. All the W forms are very rare: that is absolutely predominant by at least fifty to one. If the W forms disappeared from the spoken language they would never be missed.
  3. Again and again I heard that after a clause followed by a pause (and sometimes repetitions of that)—and then the speaker settled on what he was going to say—which might be a relative clause (restrictive or not), an adverbial clause, or a clause to which that was entirely irrelevant.

So—should the written language follow Liberal logic, and abandon the W forms altogether?

Of course not. No craftsman forsakes the use of a tool simply because amateurs do not use it. Finding 3. above is instructive: speakers prefer that because it’s the all-purpose tool, adequate in all circumstances. But the writer has an entire workbench of specialized tools, and leisure to choose between them.

Should we then unite behind the Conservatives, and use T and W to distinguish restrictive and non-restrictive clauses?

Again, I think not. The usage is not distinctive either for the ordinary reader or for many of the ideologues. And it is redundant: all of us distinguish these clause-types by means of the comma. The T/W distinction is unnecessary here.

Let’s instead use the W forms where they’re most useful: in any relative clause. The W distinctions, between who, whom, whose and which, allow us to signal reference and syntax more clearly and more smoothly. When it can be done gracefully, omit the relative pronoun altogether; but let’s use that as a relative pronoun only under pressure of what the Fowlers call “considerations of euphony”.

This not only exploits the W distinctions more fully, it makes that more effective and efficient, too. that is horribly overworked: it takes 17 columns in the OED to discriminate its uses. No word except to is more likely to appear multiple times in a sentence with different meanings. As Dumpty noted, that comes at a cost: no word is more likely to confuse the reader’s eye and mind.

I have for thirty-five years avoided the use of that as a relative pronoun. I use W forms almost exclusively, in all contexts: marketing copy, stage plays, voiceovers, business proposals, legal drafts, training videos, my doctoral dissertation.

And you know what?—I’ve never been called out for it. Not by clients—not by actors—not by academics.

I commend this approach to your consideration.

“Impenetrability! That’s what I say!”

H. Dumpty


Filed under Grammar

Typography: Striking Language (part 1)

Typography is all around us, every minute of every day.  I’m willing to bet that, this blog post notwithstanding, there are at least five different typefaces within reach of you at this moment.  I’m hedging my bet, because it’s probably closer to ten or fifteen.  You may think little or not at all of typography, but it is as important to language as the spoken word.  By definition, typography is the study, use, and design of identical repeated letterforms.  Throughout history, these forms have taken shape and morphed with the shifting popularity and availability of writing implements and surfaces.

Language itself is rooted in visual communication.  Before we had written language, we communicated with imagery.  The story of typography begins with man’s initial use of petroglyphs (rock engravings), pictographs (cave paintings), and pictograms to preserve business transactions, tell stories, give warning, and record history.

fig 1. Sumerian cuneiform tablet

The ancestral awakenings of modern typography are found in cave drawings, tablets, and Egyptian hieroglyphics, ideography that served to support and preserve spoken tales of the joys and threats that comprised the daily existence of early man. From the oldest Sumerian cuneiform tablets (fig. 1) to the digital typefaces of today, written language has persevered as a crucial auxiliary to the spoken word.  The written word is invaluable as a means of preserving the past and language itself.  The beautifully-written word adds a nuance of artistic flair and reveals a concurrent history of its own.

The English alphabet evolved from the Latin one used by the Romans and is thought to have grown from Greek, Semitic, and Etruscan influences.  Our modern letterforms developed around 100 BCE, and by 100 CE, two common forms of Roman scripts were in use.  About one hundred years later, parchment and then paper were developed, and their portability was revolutionary to the spread of written language and of literacy in general.  Over the next few hundred years, the Greeks wrote with reed and quill pens with nibs, whose changing widths altered the pens’ strokes and the resulting letterforms.

The earliest form of typographical mass production, relief printing, was developed in China around 700 CE.  By 1300 CE it had reached the Europeans, who by 1440 CE were using the technology to print entire books comprised of woodblock-cut text and images.  A German blacksmith, Johannes Gutenberg, set the printing world on fire with his invention of a moveable type printing press, which allowed individual letterforms and punctuation to be set and reused again and again.  Gutenberg’s contribution introduced the identical repetition that defines typography and largely satisfied the printing industry for the next 400 years.  Printing technologies have advanced ever since to serve the growing literate population that Gutenberg’s invention spawned.

As a discipline, typography is as complex as any other serious subject: the magic of readability and presentation is rooted in an often mathematically-calculated aesthetic.  In selecting type, the main concern is the matter of readability.  More than just legibility, readability is a marriage of legibility and good design.  You’re likely reading this post in a typeface called Georgia, a very popular face designed for the ease of screen reading.  Just as Gutenberg’s moveable type served to facilitate the changing times, digital type must capitulate to the limits of the medium to maintain the base principle of legibility.  Bad type stands out and good type goes unnoticed, quietly serving and preserving ideas with beauty and aplomb.


Filed under Orthography