On editing and self-editing

Edit Ruthlessly (Photo credit: Dan Patterson)

My husband and I are redoing our front yard to convert from high to low water use. Being situated in the alluvial zone at the foot of the San Gabriel Mountains, our little property is “blessed” with an abundance of rocks – in all sizes. Hence our brilliant idea: Do a rock garden.

So I’ve been spending a little time out there several days a week placing rocks to hold and frame the dirt for planting succulents. I place those rocks very carefully. The effect I’m looking for is not exactly a natural arrangement, but a visually interesting and attractive one. Sometimes I go out in the morning and rearrange some of the rocks I placed the day before – or the ones I placed several days ago. Sometimes I find myself sitting inside looking out the front window and thinking, no… that one would be better a little bit to the right, or, yes… but I’m going to need a bigger one right about there to balance that other little grouping…

Eventually it dawned on me: I’m editing the rocks.

Then there’s the Christmas tree. We have a string of little white lights and an eclectic assortment of ornaments. Every year it takes me a couple of hours to decorate the tree to something approaching visual perfection  – and then I spend the rest of the holiday season tweaking the lights and rearranging the ornaments to better balance the sizes and the colors, and to fill those annoying little “holes” that somehow weren’t apparent when I finished the original arrangement. Sometimes I’m sure I move ornaments to fill holes that were created by moving ornaments to fill other holes…

Yes, you got it. I edit the Christmas tree.

I think I’m a natural born editor.

Given that I’ve been writing things of one kind or another for most of my life, it’s hardly surprising that I edit my own writing. In fact, I do so constantly. Sometimes it feels like I can’t leave any two previously written words together. It also probably shouldn’t come as a surprise that I have become, here in my middle years, a professional editor – for scientific texts. I do scientific texts because I have a background in science that gives me added credibility for that task, but I could easily edit other things. The only thing remotely remarkable about my being an editor, really, is that it took me so long to come around to it.

After giving the matter some thought, I’ve concluded there are three basic requirements for being a good editor (of the text variety).

  1. An outstanding command of the language.
  2. The kind of patient and meticulous nature that makes giving “attention to detail” a foregone conclusion.
  3. A clear concept of what the final product ought to look like.

Obviously I wasn’t born with the first. I must have acquired it, although I do think a certain amount of talent must have been involved because it certainly required very little effort on my part. The second, I was apparently born with – in spades.  The third has always seemed pretty obvious to me and not particularly difficult to achieve. My method of achieving it can be summed up in three words:  Read good examples. (Of course, if your client has a manual or style sheet he or she wants you to use, it goes without saying that you use the manual or style sheet.)

It is a little bit surprising that it has taken me until fairly recently to really appreciate the need to have someone else edit my writing.  Well, maybe it’s not that surprising. When I started writing scientific papers, after all, I always edited myself – over and over – both before and after my thesis advisor had put in his two cents’ worth.  And when the proofs came back from the journal, I never noticed that any changes had been made to what I’d sent. There might have been some that I didn’t notice of course, but basically I think I wrote (and self-edited) well enough that my work didn’t need a whole lot of editing.

In other words, being that I’m a natural born editor, I think I might be forgiven for getting the impression that it was the writer’s job to get it right in the first place.

The trouble with this concept is, you can’t.

I mean, you can’t reliably get it completely right in the first place.  Not when you’re writing the tens of thousands of words – the dozens or hundreds of pages – that go into something the length of a book. The average essay or scientific paper is only a few pages or a few thousand words.  A writer who happens to be a pretty good editor has a fighting chance of catching all the errors in something that length, but not in a book.

Many people have noticed that we all tend to have trouble seeing our own mistakes on the page (or the screen). You know what you wrote, after all. The fact that you didn’t actually write what you know you wrote can be really quite shocking when someone finally points it out to you. It can be positively mortifying – especially if you’re an editor, believe me.

Even when it comes to other people’s writing, there are certain kinds of errors that we have trouble seeing. The absence of small common words where a line is broken, for example, or the same word repeated at the end of one line and the beginning of the next.

Why are we error prone in this particular way?

Well, it obviously has something to do with expectations.  But I think it also has something to do with how our brains work. The brain is famous for filling in gaps in our perceptions to create the impression of a seamless and coherent world. Studies have shown that our visual systems only take samples of what’s out there. The fact that our eyes are constantly moving and taking samples, combined with the vast amount of experience our brains have in interpreting those samples, gives us the impression that our minds simply look out through the windows of our eyes and see the world as it is.

Did you know that the design of the eye is such that the image cast by the lens on the retina is inverted, top to bottom, and left to right? As far as the eye is concerned, objects appear to fall up. Why doesn’t it look as if objects fall up?  Because, at a very early age, your brain learned to turn the images around.  It’s an amazing thing, the brain – quite miraculous. Right up to the point where that miracle prevents us from being able to see our own mistakes.

So, even though I’m an editor, I know that I still need an editor. When it comes time to get my manuscript ready for publication, I know I’m going to have to hire one. (And that’s not just me trying to promote my own profession.)

Wag the dog… on thought and language

 

The Thinking Man sculpture at Musée Rodin in Paris

 

It seems pretty obvious that the way we think influences our language. What is less obvious is that our language also influences the way we think.

 

I remember having an argument once with my mother over whether it is possible to think without using words. She said it wasn’t, and I thought it was. Looking back, I think the real basis of our disagreement may have been a difference in what we each meant by the word “think.” To my mother, it just wasn’t really thinking if it didn’t involve words. It was something else, something more nebulous, like feeling, perhaps, or something more primitive, like reacting. On the other hand, I’m darn sure I can think without words. I’m reminded of the fact every time I get stuck because I can’t think of the right word for whatever I’m trying to say. I know I’m looking for a word that means just exactly… well, that… and it seems that there must be one, or at least there ought to be one…

 

Does that ever happen to you?  (Let’s see a show of hands…)

 

I’m getting off the subject, but the point is that our language and our thought processes are very intimately connected.  So much so that we often make the mistake of thinking that a thing must exist simply because we have a word for it – or that a thing must be possible just because we can say that it is. We fall into the error of believing that words or phrases define the world, rather than merely being imperfect tools used to describe it.

Examples:

“Safe.” I once read an entire book on the subject of “acceptable risk,” the whole point of which was that nothing is absolutely safe – totally without risk of any kind. Yet people who ask, “is it safe?” routinely expect to be given a yes or no answer. When the doctor, scientist, or government official comes back with, “the levels are too low to pose a significant health hazard,” people aren’t satisfied. They think that’s weasel-wording, or government-speak for, “we want you to think it’s safe, even though it really isn’t.” In fact, the poor guy is just doing his best not to lie to you.

Other words like “clean” and “pure” – or any word that implies some absolute condition – have similar limitations. Did you know there is a maximum number of insect parts allowed per standard volume of ketchup? Yuck! Why doesn’t the government insist that there not be any insect parts in there? Because there is no possible way in any real universe for the manufacturer to ensure that there won’t be any. The best you can do is to establish a level that is as low as possible while still being reasonably achievable.

“Freedom.” Increasingly cavalier use of this word as a thing that is always desirable and good is saddling it with so much emotional baggage that it’s in danger of becoming an empty shibboleth – a catch-word thrown about to make you feel good, hook your emotions, or convince you that someone is on the “right” side. We’re starting to believe that freedom is always good, and so anything that limits anyone’s freedom must automatically be bad.  In fact, “freedom” really just means the absence of coercion or constraint in any choice or action. In short, it means being able to do what you want. This is fine as long as it’s you getting to do what you want, but what if it’s someone else and what he wants to do is to hurt you? It’s perfectly legitimate linguistically to talk about freedom to rob, freedom to rape, freedom to kill, etc. We’ve begun to think that “freedom” is a treasured value of our democracy when in fact it is specific freedoms, such as freedom of speech, that are our treasured values.

There are two questions you always should ask when you hear the word “freedom” being bandied about: Whose freedom are we talking about? And, freedom to do what, exactly?

I heard a sound bite in which a member of the U. S. Congress said something like, “government should protect our freedom, not tell us what to do.”  I’m sorry; a government that doesn’t tell us what to do creates a society with no rules. And who is likely to benefit in the absence of rules? The strong, the rich, and the clever will benefit for starters – also the irresponsible, the unprincipled, and the ruthless. Government can’t protect any freedoms for the weak, the poor, and the well-meaning but perhaps a bit naive nice guys, except by curtailing some of the freedoms of those who would otherwise take advantage of people less able to defend their own freedoms.

“Making money.” Let’s face it, the only people, apart from counterfeiters, who actually make money are the people who work in a mint. The rest of us don’t make money; we acquire it from other people – hopefully in exchange for having done an appropriate amount of useful work, or having provided the other person with a product of appropriate value. Why make this point? Because the word “make” implies something is being produced or created, and it’s hard to see any possible moral issue with that kind of activity. Once you realize that all the money you’ve accumulated came ultimately from other people – directly or indirectly – it puts things in a different light.

“We can all be rich.” While it’s possible to say this, it isn’t actually true, because the word “rich,” in monetary terms, is defined as one end of a scale. “Rich” has no meaning in the absence of “poor.”  Simply put, “rich” implies having significantly more money than a significant number of other people. We could potentially all be prosperous, since “prosperous” implies having enough to meet one’s needs, with some to spare. I think we could all achieve that, especially if we helped each other. Yet I heard that, in a recent poll, half of the 2012 college graduates surveyed expressed a desire to become rich. I don’t blame them; I blame us older folks who are giving them the wrong message. We use “rich” in a non-monetary sense to mean all kinds of good things, from “a rich cream sauce,” to “a rich cultural heritage.” We’ve lost track of the negative moral implications of becoming rich monetarily. (There was something about camels fitting through narrow openings…)

With that, I think I’ve probably gotten myself into quite enough trouble.

 

Clarity First – on understanding one another

What is difficult? [ about A Cognitive Substrate for Natural Language Understanding ] (Photo credit: brewbooks)

 

I think I’ve already said that clarity is the first priority in communication (especially written communication, which potentially could transcend the ages). I’ll probably say it again. What I won’t say, though, is that there’s no excuse for not being clear. There are lots of excuses.

Here are some of the things that limit clear communication:

 

  1. Language is an imperfect tool under the best of circumstances. It was invented by a bunch of rank amateurs, using a process of trial and error, and is constantly being reshaped by its users, most of whom are also amateurs. It’s a complex system of sounds (or visual code) associated with meaning, and while we sometimes make the mistake of thinking that our language necessarily must be able to express anything, this is in fact baloney.
  2. People (the users of language) are imperfect. They may be tired, rushed, or in the throes of some strong emotion. They also can vary widely in their natural language ability or acquired level of skill.
  3. No language, and especially not English, is a uniform beast. Not only does it change over time, but it also contains variants at any given time (regional dialects, cultural idioms, jargon).  In order to understand each other, we have to agree on what the words mean, as well as on the basic grammatical structure – and we don’t always do either one.
  4. The interpretation of language is terribly context-dependent. Basically, we’re not all coming from the same place, and where we’re coming from varies with who we are, where we are, and what we’re doing on a moment-to-moment basis.

All things considered, it’s amazing we actually manage to understand each other fairly well most of the time.

So I’ll always forgive you for not being clear. It’s a little harder to forgive people for not at least trying to be clear, but even then I know there are times when communication isn’t really a person’s top priority. And whenever I realize that I’ve just been misunderstood, the first thing I do (well usually) is reexamine what I actually said to see if I can identify the problem. Did I just say it badly? Or is there some possible alternate context that I failed to take into account? Of course, if I’ve got the other person face-to-face, I can also explore their insights into the issue – or just try again with a slightly different approach.

One thing I learned from being on the teacher side of the education fence is that, no matter how carefully you word the question, someone will manage to misinterpret it. My reaction when this happened was always to feel bad. Not because I assumed the misunderstanding was my fault – because of course it wasn’t necessarily – but because it meant my attempt to find out what the person had learned about the subject of the question had failed. (That’s failure of the assessment tool, rather than failure of the student.) The student may or may not have known the answer – and I’ll never know which it was. (I always hated to mark those questions wrong, and tended to be very generous with any partial credit I felt I could assign.)

Now, I have encountered teachers – and, of course, others – whose reaction to being misunderstood went something like, “well I know what I was trying to say, so if you didn’t understand me it must be your fault.”

That is a position I find pretty unforgivable.

Had been there, had done that…. In search of the perfect past

Verbs Territory (Photo credit: Ecstatic Mark)

This is a post about the past perfect verb tense. Why, you may ask, would I want to write about a thing like that, and what the heck is the past perfect, anyway?

Why is because I’m noticing a disturbing number of would-be writers and self-published writers who evidently don’t know how to use the past perfect.

As for what, well, we all know what past means, so that leaves the perfect part to be explained.

That’s perfect, from the Latin perfectus, past participle of perficere meaning to carry out or perfect.

That’s courtesy of Webster’s Seventh New Collegiate Dictionary. (Yes, that’s a book. Sorry, I didn’t get it off the web, but I’m pretty sure it’s true anyway.) And once you get past the meanings like “expert”, “flawless”, “pure”, and “mature”, you get to meaning number 5, which goes as follows: “of, relating to, or constituting a verb form… …that expresses an action or state completed at the time of speaking or at a time spoken of.”

I translate that as, basically, relating to action completed in the past. (Please don’t be overly impressed: I had to ask a friend who actually has a degree in English to be sure I had the right name for the tense I was talking about.  I know what I do, I just don’t always know what to call it.) The thing to remember about the past perfect tense is that it combines had with the past participle form of a verb, like…

…had been,

…had done,

…had said, thought, walked, talked, hopped, skipped, jumped, or whatever.

And of course, just to make it more confusing, the past participle in English is often – but not always – the same  in spelling and pronunciation as the simple past tense of the verb.  So in some cases you only have to stick “had” in front of the past tense form of the verb, while in other cases you have to actually know what you’re doing.

Examples of the former:

Walked; had walked

Jumped; had jumped

Cried; had cried

Said; had said

Led; had led

Examples of the latter:

Was; had been

Did; had done

Ran; had run

Saw; had seen

Rode; had ridden

Why is this important? Well, it isn’t always. The need for the past perfect rarely comes up in scientific writing, for example. But in story-telling it comes up on a regular basis. Wait a second, you say. If the regular old past tense deals with the past, why isn’t it good enough to just use the past tense in a story for past events? It isn’t, because in stories we normally use the simple past tense for ongoing action.

We do? Yes, we do.

I mean, you can use the present tense for ongoing action in a story – “He goes into the bar. He sits down. He orders a drink.” Some people do that to create a greater sense of immediacy. But it’s much more natural to use the past tense – “He went into the bar. He sat down. He ordered a drink.”

I think this goes back to the origins of human story-telling among those hunter-gatherer ancestors of ours I’m always harping about. When those folks sat around the campfire at the end of the day recounting their experiences, they would naturally be speaking of things that had happened in the past – either the day just past or on some other day in the more distant past. In the present, they were just sitting around the fire telling stories. The action was all in the past.

The problem arises when your story needs to include references to things that happened before the currently ongoing action – whether it’s a moment before, a day or a week before, or perhaps before the story began. If you’re using the simple past for ongoing action, you need some other way to differentiate the past events from the ongoing events in order to avoid a potential crisis in clarity. That’s what the past perfect is for.

Realistically, writers don’t rely only on the past perfect for clarity in these cases. The past perfect only tells you the action was completed in the past, after all; it doesn’t tell you how far in the past, or exactly when or over what span of time, and those things are frequently important. So people also use time tags – things like, “yesterday,” “last week,” “the previous time,” or “in all his life up until that moment,” to provide the appropriate precision. Because the time tags do part of the work even without the past perfect tense, I can usually figure out what the writer must have meant and identify the places where he or she should have used the past perfect. In fact, my brain frequently screams “had!” before I even get out of the offending sentence. In other cases I have to backtrack a sentence or two, and in some cases I don’t even figure out that I misunderstood something until much further on in the story.

Given the way the time tags work, I suppose some people might wonder if the past perfect is really necessary. The problem is that people don’t always realize they should have used a time tag. Also, it’s cumbersome to have to keep repeating the tags in each sentence if the past perfect narrative goes on for two or more sentences, and the reader can’t always tell when to switch back to ongoing action.

Consider the following example (without past perfect):

“He walked into the bar, sat down, and ordered a drink. After a few minutes, his former girlfriend walked in. The same thing happened the week before. Since he didn’t want to make a scene, he gulped his drink and left. Unfortunately, he forgot to pay his bar tab. Resolving not to make the same mistake, he called for the check.”

My brain screams, “The same thing had happened the week before…” because of the time tag. But then I assume that “Since he didn’t want to make a scene…” returns to ongoing action. I don’t discover my mistake until I get to the apparent disconnect of, “Resolving not to make the same mistake…”
I then realize that forgetting to pay the bar tab must be the “mistake” referred to, and therefore must have happened the previous week.

Using the past perfect, the excerpt becomes…

“He walked into the bar, sat down, and ordered a drink. After a few minutes, his former girlfriend walked in. The same thing had happened the week before. Since he hadn’t wanted to make a scene, he had gulped his drink and left. Unfortunately, he had forgotten to pay his bar tab. Resolving not to make the same mistake, he called for the check.”

Adding another time tag, such as, “Since he didn’t want to make a scene on the previous occasion, he gulped his drink and left,” would probably have given me the clue I needed to unscramble the action a bit sooner, but my brain would still have been screaming had, had, had!

To all you would-be writers out there, please don’t make my brain scream. If my brain is screaming, I can’t enjoy your wonderful story. And I have seen some wonderful stories that were ruined for me because my brain was screaming had, had, had! I have yet to read a novel published by a traditional publishing house that made my brain scream this way, and fortunately the verb tense problem is something a good editor can fix.

Why punctuate?

 

Punctuation Cookies For National Punctuation Day (Photo credit: DavidErickson)

Why should anyone bother to use punctuation? It’s such a hassle. What difference does it make?  Well, compare the following:

 

“Come and eat people.”

 

“Come and eat, people.”

 

One little comma makes the difference between an invitation to cannibalism and a simple call to dinner.

 

Have you noticed that someone out there has decided voice actors should pause at arbitrary intervals when reading parts in scripted ads that are supposed to represent ordinary people talking about the advantages of the product? I suppose they think it will sound more natural that way; it doesn’t. It sounds wrong, and whenever I hear it, I know the part is scripted.

 

Ordinary people speaking naturally do not pause at arbitrary intervals. They do, however, pause. They pause for three reasons that I can think of:

 

  1. To catch their breath
  2. To work out some hitch in their train of thought
  3. To punctuate what they are saying

None of those pauses is random or arbitrary, and a listener can usually tell which one is happening. The first two can potentially interfere with understanding. The third, however, is often essential to it.

 

Yes, punctuation is a natural animal.

 

In addition to the words themselves, we use inflection, stress patterns, and pauses of various lengths to convey meaning. The pauses are perhaps less dramatic than the rising inflection of the question or the emphatic stress of the exclamation, but they are no less important. Pauses group words together and separate the groups from other groups based on relationships of meaning.  Most of the time we do this without thinking about it or even being aware we are doing it.  Sometimes we do it very consciously to make sure we are not misunderstood.

Most of the punctuation marks we use in writing ( . , : ; – ) stand for pauses – for modulations in the spacing between words – and they function to clarify meaning. Writing without punctuation is like talking in a monotone with exactly equal spacing between all the words, not pausing between sentences, clauses, or other units of thought. Speech like that would sound like a robot (a poorly programmed robot), not like a human being. It would also be hard to follow, hard to understand. It would not, in fact, be at all natural.

So why would anyone want to leave the punctuation out of their writing?

Okay, I understand about text messages and Twitter tweets where you only have 140 characters/spaces to work with. Texting on an old-fashioned cell phone like mine, you have to go to some real effort to find and enter those punctuation marks – it’s a hassle.  I understand being in a hurry. I can also understand that some people don’t like all the rules of punctuation. Maybe they can’t remember when they’re supposed to use a comma or a semicolon. Rules may seem arbitrary, fussy, confusing.

So why not just declare our independence and do without punctuation altogether?

Well, ah, because it’s an abandonment of a major tool in the clarity-in-writing arsenal.

It’s tantamount to saying, “I don’t care whether people understand me or not,” or at least, “I don’t care how hard I make my readers work to figure out what I’m trying to say.” I consider that a bit rude. So, while I understand the impulses that lead people in that direction, I personally have no desire to go there. I value clarity and I respect my readers too much to do that.

The truth is that the rules of punctuation, like the rules of grammar, have their limitations. They can get a bit involved, especially if you try to get fancy. Also, the “experts” don’t all agree, and the rules don’t remain constant over time. I think most people get into trouble with punctuation because they focus too much on the rules (which they understandably have trouble remembering) instead of focusing on what punctuation is for. If you keep your sentences fairly simple, remember that a comma is a pause and a period is a full stop, and then just think about what you’re doing when you write, it’s really not so very hard.

For a really clear and very entertaining explanation of punctuation, I highly recommend Lynne Truss’s little book, Eats, Shoots & Leaves.

To lay or not to lay… (or, remember the eggs!)

Brown chicken eggs (Photo credit: Wikipedia)

I can’t help it; “Lay Lady Lay” will always sound to me like someone talking to a chicken…

…a rather fancy, well brought up chicken, perhaps, but still a chicken.

That’s Bob Dylan being ungrammatical there with the chicken lady, and of course what I’m alluding to here is the whole “lie” versus “lay” debacle. This burning issue more or less divides the English-speaking population into two groups: those who have difficulty with these two words, and those who don’t. I’m one of the latter. I take no credit for that fact. It’s just that when I was acquiring language, the people I learned it from (my parents) used the two verbs correctly and so I learned to use them correctly. I’m sure this was reinforced by all the reading I did as a child. (I was an incorrigible bookworm.) While it has spared me a lot of grammatical grief over the years in my own writing, it also has the unfortunate consequence that whenever I read something written by someone else who has gotten it wrong, I notice. It hangs me up. It makes me pause and mentally insert the correction. I don’t like having to do that; it takes me out of the story. It spoils my enjoyment. I therefore have a very selfish interest in keeping lie and lay in their place at least in the writing that actually makes it to print.

There are lots of places to get explanations of lie and lay, but of course I just have to offer my two cents’ worth for anyone who may find it helpful.

Lie and lay are two completely different verbs with non-overlapping meanings.

Lie means to assume a recumbent orientation (generally on some more or less horizontal surface).

Lay means to place an object on a horizontal surface (one on which it will not slide or roll away).

Lay requires an object (something to be laid), while lie distinctly does not want one.

Since someone or something tends to end up resting on a horizontal surface in either case, it’s understandable that some confusion might arise. Add to this the fact that the past tense of lie is lay, and confusion becomes really quite forgivable. The most problematic tenses break down like this:

Lie, lay, lain

Lay, laid, laid

(In each case, that’s present tense, past tense, and past participle.)

The problem is simply that many people are mistakenly using lay for both meanings.

If the trend accelerates, we could be looking at language change here, in which English loses one verb (lie) while a second verb (lay) broadens its meaning and becomes less precise. Would this really be so bad? Well, probably not, since actual ambiguity or confusion of meaning rarely if ever occurs in this case.  I, however, would not be a happy camper. I would feel even more like a dinosaur than I do already, because I will probably keep doing it the way I learned to do it until I write my last word. The other way just feels too wrong.

So, everyone repeat after me:

People lie down; chickens lay eggs.

People lie down; chickens lay eggs.

People lie down; chickens lay eggs.

Notice that “eggs” is the object of lay. If the action is being done to something or someone, then laying is what’s going on. (No sexual innuendo intended.) If the person or animal or thing is doing the action all by itself, it’s lying. So, remember the eggs! If you’re contemplating using some form of the verb “lay,” there had better be an egg-equivalent somewhere in sight.

But be careful; there are nuances. (In the examples below, keep track of which verb is lie, which is lay, and what the object of lay is.)

Inanimate things can lie, or lay. So can things that are not concrete nouns.

The knife had lain so long in the weather that the blade was half rust.

The mist lay like a shroud over the fields. (past tense of lie)

I waited for night to lay its cloak across the land. (present tense of lay)

We never know what lies ahead.

What bounties had providence laid in store for us?

It is entirely possible to lay oneself, or parts of oneself.

Let me lay my head on your shoulder.

Lay your body next to mine. (Compare with: Come and lie down by my side.)

I laid myself down to rest in a little hollow among the leaves.

Now I lay me down to sleep…  (Yes, the object of lay can be an object pronoun: me, us, them, him, her, it, or you).

“Lie” can also mean to be in a place or in a given direction.

The village lies just over yonder.

The road lay straight before them. (past tense)

My heart lies beyond the sea.

Finally, the subject of the sentence can be implied rather than explicitly stated, which can be particularly confusing when “lay” is involved.

Please lie down. (The subject, “you,” is implied here and in the next two examples.)

Lay it down over there.

Lay the timbers straight.

The dead were laid in a common grave. (Someone had to do the laying. We’re not dealing with zombies.)

What about you? Would you be glad to see lie supplanted by lay, or would you become a dinosaur like me if that happened?

English: the Good, the Bad, and the Spelling

For better or for worse, English currently functions as a lingua franca in many parts of the world. That’s nice for us and really annoying, I’m sure, for everybody else. It could be worse, though. English has its good points and its bad points.

On the one hand English tends to be a pretty open and inclusive language. Speakers of English generally aren’t fussy about how you pronounce it. (We even elect presidents who say nook-yoo-ler for “nuclear”.) Most of us tend to regard foreign accents as colorful or charming, and foreign words are always welcome. No snobbery here; we welcome armada, gestalt, karma, milieu, and tsunami.

We also must believe that brevity is the soul of wit because we shorten everything. Frequently down to an acronym (TV, CD, DVD). And we’re never happier than when we can reduce some polysyllabic mouthful to a monosyllable (sync, hype, nuke). This is good for people who happen to be in a hurry, who like their efforts at communication to be short and sweet, and, hopefully, quickly understood. It’s also potentially confusing, however, since the process goes on continually and usually at a break-brain rate of speed. Just blink and people are talking gibberish.

English offers some familiarity in the form of cognates (words of common origin) to folks whose native tongues hail from either of two major branches of the Indo-European language super-family – the Germanic languages and the Latin-based Romance languages. This is a consequence of its having been born out of the forcible collision of Anglo-Saxon and Norman French that occurred as a result of the Normans’ conquest of England in 1066 AD. (At this point a picture of the Bayeux Tapestry would seem in order.)

Anglo-Saxon collides with Norman French (Battle of Hastings, from the Bayeux Tapestry) Image via Wikipedia

On the up-side, English’s assortment of language sounds (phones) contains relatively few really rare ones to trip up the tongues of people whose native languages don’t happen to contain them. We have only one major offender: th. (Actually, that’s two major offenders because English contains two distinct sounds represented as “th”, the unvoiced and the voiced forms, exemplified by the words thin and this. We just aren’t aware that the two sounds are different because they never make a difference between words in English.)

On the down-side, English has an unusually large lexicon – that’s the sum-total of all the words it can claim as its own. The linguistic collision resulting from the Norman Conquest may be at least partly to blame, as well as all that free and easy borrowing. It really is an absurdly large lexicon. Pick up any unabridged dictionary and it’s pretty obvious there are more words in there than anyone could ever have any reasonable use for. (Ha! Yes, I just ended a sentence with a preposition, and I’m not a bit sorry!)

Then there’s the Seventh Circle of Hell known as English Spelling.

Whenever I meet some poor non-native speaker who is trying to master English I feel a compulsion to apologize for what has to be the least phonetic spelling (orthography) of any language that has ever been committed to a nominally phonetic written form. If English has any serious competition for this dubious honor, I would love to hear about it. It’s not just the stunning illogic of words like could, island, and knight, it’s the appalling inconsistency, especially in our spelling of vowel sounds. English spelling is both redundant and degenerate, meaning we have both more than one spelling for the same sound and more than one sound for the same spelling. (I may have gotten those turned around.) It’s bad enough that we have only 26 letters to work with while we have about 40 phonemes (sounds that make a difference between words). Then we make matters worse by completely wasting three of the letters – C, Q, and X – all of which are used to spell sounds that could be covered by other letters (kow, senter, kwilt, ekstra). We double-up some letters to represent consonants not covered by a single letter, like ch, sh, and th – which is okay. But when it comes to the vowels, for which we have only five letters to work with, we get really creative – to the point of total chaos.

Here’s a little game: Start with a simple little English word and try to think of another English word that has either a different spelling for some sound in the first word or a different sound corresponding to some part of the first word’s spelling. Then try to do the same with the second word, and so on. See how long you can keep the string going. Here are four seven-word strings:

be bee been bin bind pined signed
right rite write isle aisle stile style
fluff tough thought through though throw chow
sew so hoe shoe too to few

I could go on like this all day.

We try to find rules to help us deal with some of this insanity. Rules like the familiar:

“I before E, except after C, or when sounded like A as in neighbor and weigh.”

Notice that the “rule” already embodies two exceptions. Unfortunately, these exceptions don’t cover everything, as demonstrated by this mnemonic intended to help us remember how to spell five exceptions to it:

“The weird foreigner seizes leisure at its height.”

I don’t think this list of exceptional exceptions is exhaustive, and notice that these five examples include three different vowel pronunciations for a single spelling.

We also have:

“When two vowels go walking, the first one does the talking.”

If that were really true, what would we need the second one for? It isn’t consistently true, of course. The neighbor/weigh group is an exception, for starters. So is “height.”  So is “believe,” although “believe” does at least follow the “I before E” rule.

Honestly, it’s just about hopeless.  It seems as if the inconsistency of English spelling is its most consistent feature. I grew up with this mess, which puts me ahead of that pitiable soul who is tackling English as a second language. Even so, spelling is my frequent downfall. (If you are similarly frustrated by it, I will gladly join you in a rant any day of the week.)  I was not gifted with total visual recall for the spelling of words – unlike some annoying people, such as my mother. I love my mother dearly, but one of her more annoying habits when I was growing up was telling me to look up words when I asked her how to spell them rather than just graciously answering my question. I knew how to use a dictionary, for Pete’s sake. Asking her was just a whole lot faster – or it should have been. It can be amazingly difficult to find some English words in the dictionary if you don’t already know how to spell them, and what’s the use of having someone with that kind of talent around if he/she won’t act as a resource?

At this point, fairness compels me to mention that there is one upside to this spelling morass: Having different spellings that correspond to a single pronunciation allows us to distinguish between homonyms in our written language (“rite,” “write,” and “right;” or “isle,” “aisle,” and “I’ll”). If we had completely consistent spelling, we wouldn’t even be able to have homonyms, though we would still, of course, have homophones (words that sound alike). (And without those, we couldn’t have puns – and just think how much duller life would be.) Seriously, being able to distinguish homonyms in writing can aid clarity, although usually the context provides adequate clarification. Even here, spelling lets us down sometimes, as in distinguishing between “right” meaning the opposite of “left”, “right” meaning “correct”, and “right” referring to one of those inalienable thingies guaranteed us by the Constitution. (That’s a highly technical term, “thingy.”)

How did English spelling get this way? That’s a subject for another day. I do know some of the answers, thanks to having once taken a Linguistics course (Dr. Elizabeth Barber’s class called “Natural and Artificial Languages” at Occidental College, in 1975), but I would like to know more. I recently spotted a book on the subject that I’d like to buy and read before making a further report. It’s called Spellbound: The surprising origins and astonishing secrets of English spelling by James Essinger. Let me check that out, and get back to you. (I’ll give you a hint, though; language change and dictionaries are partially to blame…)

Double Braino

Typographical Error (Photo credit: futuraprime)

You will find errors in these posts, I’m sure, despite my best efforts. I’ve found some already and corrected them. One was a real doozy.

I found that where I had intended to write “loses sight”, I had written “looses site.”

Ouch!  Right there, staring at me: a double braino. (And here I am blogging about writing. Talk about major, major embarrassment!)

I’m trying for a neologism here with “braino”.  Maybe even an internet meme. (That would be really cool, but of course it assumes that someone actually reads this…)

A braino, you see, is intended to be somewhat similar to a typo. Both are inadvertent errors and not, I repeat, not, misspellings.

A misspelling happens, for example, when a person believes that he/she knows how to spell something and is simply wrong. Or alternatively it could happen when a person simply does not know how to spell a word, and makes a good-faith conscious effort, but unfortunately doesn’t get it right.

In the case of typos and brainos, the person does in fact know how to spell the word, but a glitch occurs somewhere in the process that begins with retrieval of the word from the memory banks and carries on through to the mechanical movements of the fingers that get the word typed onto the page (or keyboarded onto the screen, or whatever).

Typos, of course, occur in the typing process. The movement of a finger is made inaccurately and the wrong key is struck, or rapid-fire sequences of finger movements are made in the wrong order with a similar result.

Brainos are errors that occur further upstream. Although the person knows what he or she means, the brain retrieves the wrong word and sends incorrect information to the fingers, which accurately type the incorrect information. How exactly does that happen? Well, I’m not sure, but I think that my typing “loose” instead of “lose” may relate to the fact that “lose” ought to be spelled “looze” – phonetically speaking. The sound of the word suggests the double “o.” As for “site” instead of “sight,” well, I had been dealing with a number of web-sites that day and thinking about how this site differed from that site. I think I may have just had site on the brain.

The spelling of “braino,” of course, reminds one of “typo,” and it’s meant to. This neologism is formed by analogy. It’s not a really good analogy, however, since “typo” is short for typographical error and there is no corresponding “brainological error” – nor, in fact, any such word as “brainological.” If braino were to become popular and the usage of “brainological” or “brainological error” were to subsequently appear, that would be an example of the linguistic phenomenon known as back-formation.

(And now I’m sure you know way more about this subject than you ever wanted to.)

And now for something a little different…

Neologism generator (Photo credit: Peter Forret)

A few days ago, my son introduced me to the word “philosoraptor.” It’s a neologism (a newly-coined word) that refers to a humorous, pseudo-philosophical bit of wordplay such as:

“If pro is the opposite of con, is progress the opposite of congress?”

(Sorry, I don’t know the origin of this example, though it’s unfortunately very apropos at the moment.)

According to my son, “philosoraptor” is an example of an “internet meme“, which is an idea that is propagated on the internet. There is usually an image that accompanies an internet meme and in the case of philosoraptor it’s a charming little picture of a thoughtful-looking Velociraptor. (I had the image here, but when I hit “Publish” I lost the entire post – the second time that’s happened when I tried to include an image in a post. Got to figure that one out.)

I learned all about the origins of the philosoraptor internet meme at:

http://knowyourmeme.com/memes/philosoraptor

The image is credited to (and copyrighted by) Sam Smith, who designed it to put on T-shirts. The word probably has multiple origins.

I also became interested in the word “meme,” because I was not familiar with it. It turns out to also be a neologism, coined back in 1976 by the evolutionary biologist Richard Dawkins. He conceived of a meme as being somewhat analogous to a gene. The best current working definition of “meme” seems to be “an idea that is passed from person to person through imitation,” although Dawkins’s usage included the rather bizarre notion that memes were infectious, like viruses. Regardless, “meme” has definitely gone mainstream. There’s a related field called “memetics.”

Neologisms are great examples of a way in which language change happens, in this case through the creation of new words. It’s fun, it’s easy, and best of all, anyone can do it.

You can’t get away from grammar

Lest anyone conclude that I have a general contempt for grammar or grammarians, let me clarify.

Every language has grammar and every speaker/writer uses it.  All the time.  You can’t get away from it.

Grammar is just the structure of a language, as opposed to the words.  It’s a set of patterns you learned before you knew you were learning them, a set of patterns you unconsciously recognize and use. And without them, you would not be able to encode or decode any but the most rudimentary of utterances.

Grammar is all about pattern recognition.  Knowledge of the grammatical patterns of English leads both you and your listener/reader to have certain expectations about where your words are going, and if you violate those expectations too seriously you won’t be understood.  The “rules” of grammar are just an effort on the part of some well-meaning people to save us all from incoherence.

(Actually, I believe that pattern recognition makes up a large part of what we call intelligence.)

Grammar tells us what role a word is playing.  It tells us how the different bits of a sentence are related to each other.  There are two main ways I’m aware of for a language to “do” grammar.  They are:

1) word order

2) word modification

Word order is pretty obvious.  Word modification is all the various forms that are based on a single word-root (such as write, writes, writer, writing, written, wrote, and so on).  Most languages, like English, use both approaches.  Latin, I am told, relies so heavily on word modification that it virtually doesn’t matter how you order the words.  (Try to wrap your mind around that concept!)

Anyway, I really don’t have a serious quarrel with grammarians in general.  I just get a bit annoyed when I encounter someone who is so fixated on the rules that he or she loses sight of the purpose.

Grammar should be your servant, not your master.

(Grammar Police, photo credit: the_munificent_sasquatch)