Lessons in the Past Perfect 4: Ditching It?

I have a confession to make. There are situations where the rules seem to call for the past perfect but I actually find that I substitute the simple past and it doesn’t feel wrong to me. I know; shocking, isn’t it? I guess Miss Past Perfect (me) isn’t quite so perfect.

I got to wondering whether this was a deficiency in my usually natural ability to “feel” the need for past perfect, or whether these are places where other grammatically knowledgeable writers would make the same call. Was there a pattern? I set out to analyze some of these cases.
Some of them turned out to be examples of a situation I’ve already claimed was legitimate, though I’m not sure it is.

In these cases, I wasn’t really using the simple past tense; I was using the past participle without the helping verb “had.” The simple past and the past participle are identical for many English verbs, so it isn’t always obvious what’s going on. Basically, these cases occur when a sentence has two or more parallel verbs all in past perfect. It only feels necessary to me to use the helping verb once – with the first past participle. The others feel okay to me without it, like this:

She had gone to the window, looked outside, and seen no one.

Note that “looked” and “seen” are both past participles here, although “looked” is identical to the simple past tense of “look.” Would anyone really feel it necessary to write “had gone,” “had looked,” and “had seen” in this case? Is what I do legitimate, or not?

And what about this:

He had been in a terrible mood because the repairman arrived two hours late.

I used that example in an earlier post, but wrote “had arrived” in order to make a point, even though I didn’t feel the “had” was really necessary. This is a different situation. “Arrived,” here, is not a past participle.

Or, another example, also a variant on one I used in a previous post:

He had finished it just two days before he died.

I wrote, “just two days before his death,” in the post to avoid the issue, but I like the ring of the above version better – and also better than “He had finished it just two days before he had died.” And once again, I’m not using “died” as a past participle. I might say “he had finished it two days before he went,” but never “he had finished it two days before he gone.”

What’s going on with these last two cases? All I can say in my defense is that both examples involve the second verb in a sentence where the first verb has clearly placed the action in the past relative to the ongoing action by use of the past perfect. In both cases there is no reasonable ambiguity. The word “because” in the first instance makes it impossible to imagine that the repairman being late could be part of the ongoing action, since it was the cause of a past situation (the character’s terrible mood). The second example may not be quite so clear-cut, but for me it would take a break in the sentence to wrench that second verb into the ongoing action – like this:

He had finished it just two days before. At a little before noon, the old man died.

And finally, here’s another example from a previous post. This is how I originally wrote it:

It hadn’t always been that way. There had been a time when he had noticed the trees and the flower gardens, the picket fences, even the cracks in the sidewalk.

The truth is, though, I haven’t really got a problem with switching the first verb in the second sentence to simple past:

It hadn’t always been that way. There was a time when he had noticed the trees and the flower gardens, the picket fences, even the cracks in the sidewalk.

What can I say? There’s a past perfect verb in that second sentence that anchors the sentence in time. The past perfect verb in the preceding sentence reinforces that and leads the reader to anticipate some further explanation of the past situation. So, again, I don’t think there’s any risk of ambiguity. (I might switch the second verb in that sentence instead, but not both at once.)

Am I just being hypocritical to allow myself these reversions to the simple past while demanding that the past perfect be used to anchor both sentences? You can tell me what you think, but for me, switching that entire last passage to simple past suggests a different meaning:

It wasn’t always that way. There was a time when he noticed the trees and the flower gardens, the picket fences, even the cracks in the sidewalk.

Now I think it’s possible that the “time when” referred to might be every Saturday afternoon when he walked to the park, rather than some earlier period of his life.

So what do you think? Should I be hanged, drawn, and quartered? What would you do in these instances? Do you have other situations where you break the rules and feel okay about it? If so, tell me why.


Icons have their place – it just isn’t everywhere

When is a picture not worth a thousand words?

When it’s an icon.

I just had (another) negative icon experience in Microsoft Word. At such times, when my ire is at its height, I tend to go off into long rants to the effect that the present proliferation of icons is threatening to return our civilization to the Stone Age, or at least to a time in history before the invention of writing — which I consider to be one of humankind’s greatest achievements. On this particular occasion, however, I was brought around to a slightly more balanced position by a serious conversation with my Millennial-generation son.

My son pointed out that he had grown up with icons and more or less takes them for granted. Some of them are pretty widely recognized and are used across different platforms. He also observed that they are not likely to go away any time soon. As he said, they have their place.

Yes, they do. They save space on the small screens of many electronic devices where it would be totally impractical to spell everything out in writing. They are not tied to a specific language, and so can potentially be more “universal” than written labels. On the other hand, this potential is limited by the fact that images have cultural context too. In fact, it’s pretty hard to come up with a universally understood icon.

Some of the better icons are:
The arrow, used to denote direction. This one probably goes back to the Stone Age. It’s practically a dinosaur. Since bows and arrows are not commonplace anymore, however, its meaning has become culturally determined to a large extent.
The skull and crossbones, used to indicate a poison or other potentially deadly threat. It’s pretty hard to argue with this one, although I once read about a tribe somewhere that keeps the bones of ancestors lying around their houses. Skulls might have a rather different meaning for them.
The “no (whatever) allowed” symbol, by which I mean the red circle with the slanting line across it, superimposed on an image of whatever is meant to be disallowed. The meaning here derives from the fairly universal destructive gesture of crossing something out. Of course, the full meaning depends on the iconic quality of the picture of whatever is behind it.
The scissors to stand for the word “cut.” This is by far the clearest icon image to have come out of the computer age. The trouble with it is, you still have to know what “cut” means in the digital context, so it is language-dependent.

Maybe you can think of more or better examples.

Here’s the thing about images and icons: Not every image makes a good icon. To be good for this, the image has to be, well, iconic. That is, it needs to be visually simple, memorable, and endowed with relatively unambiguous meaning.

I’ll say it again. A good icon must be:
1. visually simple
2. memorable
3. endowed with relatively unambiguous meaning

That’s a tall order. Not very many images can live up to it, and an awful lot of the icons that are strewn willy-nilly across our computer screens fall woefully short. My son and I concluded that icons work best when they are widely used over long periods of time, so that they come to have general instant recognition. We agreed that the practice of concocting novel icons to represent specialized functions in specific software applications is just plain wrong-headed. Such icons have no generally accepted meaning, and users expend effort to memorize them only to have them disappear in the next incarnation of the program. They’re particularly useless when they aren’t even good icons by the three criteria stated above – and most of them aren’t.

So this is for whoever it was at Microsoft who decided to substitute a totally un-memorable and not very descriptive icon for the “new style” button in Microsoft Word: You know who you are, and you blew it! You failed the useful-new-icon-creation test. (Cue sound of rude, annoying buzzer.)

So what do you think? Are there any icons you have come to know and love? Any you think should be relegated to icon hell?

(Take heed, oh ye Microsoft designers and programmers.)


Clarity and the ambiguous pronoun

Caterpillar using a hookah. An illustration from Alice in Wonderland (Photo credit: Wikipedia)

When I read people’s fiction manuscripts I’m often surprised at how frequently I encounter things that just aren’t clear. (I probably shouldn’t be. As a writer you always know what you meant to say, and it can be hard to tell in the heat of the moment that you haven’t said it.) This is a much rarer flaw in published works of fiction – although I have had the same experience recently with published books and ebooks. It never used to happen, or almost never. I suspect the proliferation of self-published books and books from small indie publishers is at least partly to blame. The author may or may not have hired an editor, or may have used an inexperienced one. A small publisher may run the manuscript past one editor, whereas I’m told the major houses used to run them past several. More eyes are better. It’s that simple.

When I say things aren’t clear, I’m not talking about places where the writer was obviously trying to imply things, rather than explicitly state them, or deliberately trying to be ambiguous. I’m talking about ambiguity that’s obviously not intended.

One of the most frequent causes of unintended lack of clarity comes from ambiguous pronoun reference, something like this:

As Jim pedaled down the street, his friend Bob was sitting at the bus stop. He smiled and waved. “Where are you going?” he called.

Who smiled and waved? Was it Jim or Bob? Is Jim asking where Bob is going on the bus, or is Bob asking where Jim is going on his bicycle?

Other things being equal, pronouns tend to attach themselves to the nearest preceding noun. “His” therefore refers to Jim. There’s really no other possibility. Both instances of “he” are most likely to refer to Bob, making Bob the one smiling and waving and also the one calling. If you as the writer meant otherwise, you had better say so, like this:

As Jim pedaled down the street, his friend Bob was sitting at the bus stop. Jim smiled and waved. “Where are you going?” he called.

Now the remaining “he” feels like it refers to Jim because Jim is closer, so Jim is doing the calling as well as the smiling and waving. Again, if you didn’t mean that, you had better say so:

As Jim pedaled down the street, his friend Bob was sitting at the bus stop. Jim smiled and waved. “Where are you going?” Bob called.

But remember, I said “other things being equal.” Consider this rewriting of the original sentence:

As Jim pedaled down the street, he saw his friend Bob sitting at the bus stop. He smiled and waved. “Where are you going?” he called.

Now Jim and Bob no longer have equal weight because “Jim” is being used grammatically as a subject whereas “Bob” is being used as an object. I’m not certain, but I feel as if all three instances of “he” more likely refer to Jim. Jim was the subject of the first sentence, so I tend to assume it’s Jim whose actions are being described as the narrative proceeds. If I intend otherwise, I must say so:

As Jim pedaled down the street, he saw his friend Bob sitting at the bus stop. Bob smiled and waved. “Where are you going?” he called.

And again, I’ve now got Bob doing the calling because his is the closest name and it was used as a subject. If I meant to switch back to Jim, I should have written, “Where are you going?” Jim called.

All right, now consider this variant:

As Jim pedaled down the street, he saw his friend Bob sitting at the bus stop. His face broke into a smile and he waved. “Where are you going?” he called.

Now, I tend to feel as if the “his” in “his face” could quite possibly refer to Bob. I think this is because “Bob” is the nearest name, and “his,” being a possessive rather than a subject pronoun, doesn’t carry the subject of the previous sentence forward as strongly. It could still be Jim, but the connection is weakened and the sentence has really become ambiguous. Also, the last “he” now feels like it ought to have the same referent as the one in “he waved.” So again, I have to check to be sure that’s what I intended.

What’s the upshot here?

When you are describing action involving multiple characters of the same gender, the pronoun is not your friend. This doesn’t mean you should avoid all pronouns. You obviously need them sometimes, and repeating names over and over sounds clunky. It just means that you have to regard all pronouns as suspect, potentially ambiguous until their possible referents have been checked and cleared. And if there’s any chance of confusion, out they go.

It’s a good idea to have alternative identifiers for your characters to help you avoid repeating the same name over and over. Alternative identifiers are things like: “the boy,” “the old man,” “the dark-haired girl,” “the fat woman,” “the farmer,” “the merchant,” “the Italian” – or even things like “his friend,” “the other man,” or “the speaker.”

I know you’re thoroughly tired of this sentence by now, but just to illustrate:

As Jim pedaled down the street, he saw his friend Bob sitting at the bus stop. His friend’s face broke into a smile, and he waved. “Where are you going?” he called.

Then, of course, there are the people who don’t like to use dialog tags, who want to just write, “Where are you going?” Well, here’s one alternative fix for that approach:

“Where are you going, Jim?”

It’s remarkable how easy it is to end up with ambiguous pronouns. I know I find them all the time when reviewing my own writing. How about you? Have you noticed this problem in your own writing or in other people’s? Do you have your own tricks for dealing with it?

On editing and self-editing

Edit Ruthlessly (Photo credit: Dan Patterson)

My husband and I are redoing our front yard to convert from high to low water use. Being situated in the alluvial zone at the foot of the San Gabriel Mountains, our little property is “blessed” with an abundance of rocks – in all sizes. Hence our brilliant idea: Do a rock garden.

So I’ve been spending a little time out there several days a week placing rocks to hold and frame the dirt for planting succulents. I place those rocks very carefully. The effect I’m looking for is not exactly a natural arrangement, but a visually interesting and attractive one. Sometimes I go out in the morning and rearrange some of the rocks I placed the day before – or the ones I placed several days ago. Sometimes I find myself sitting inside looking out the front window and thinking, no… that one would be better a little bit to the right, or, yes… but I’m going to need a bigger one right about there to balance that other little grouping…

Eventually it dawned on me: I’m editing the rocks.

Then there’s the Christmas tree. We have a string of little white lights and an eclectic assortment of ornaments. Every year it takes me a couple of hours to decorate the tree to something approaching visual perfection  – and then I spend the rest of the holiday season tweaking the lights and rearranging the ornaments to better balance the sizes and the colors, and to fill those annoying little “holes” that somehow weren’t apparent when I finished the original arrangement. Sometimes I’m sure I move ornaments to fill holes that were created by moving ornaments to fill other holes…

Yes, you got it. I edit the Christmas tree.

I think I’m a natural born editor.

Given that I’ve been writing things of one kind or another for most of my life, it’s hardly surprising that I edit my own writing. In fact, I do so constantly. Sometimes it feels like I can’t leave any two previously written words together. It also probably shouldn’t come as a surprise that I have become, here in my middle years, a professional editor – for scientific texts. I do scientific texts because I have a background in science that gives me added credibility for that task, but I could easily edit other things. The only thing remotely remarkable about my being an editor, really, is that it took me so long to come around to it.

After giving the matter some thought, I’ve concluded there are three basic requirements for being a good editor (of the text variety).

  1. An outstanding command of the language.
  2. The kind of patient and meticulous nature that makes giving “attention to detail” a foregone conclusion.
  3. A clear concept of what the final product ought to look like.

Obviously I wasn’t born with the first. I must have acquired it, although I do think a certain amount of talent must have been involved because it certainly required very little effort on my part. The second, I was apparently born with – in spades.  The third has always seemed pretty obvious to me and not particularly difficult to achieve. My method of achieving it can be summed up in three words:  Read good examples. (Of course, if your client has a manual or style sheet he or she wants you to use, it goes without saying that you use the manual or style sheet.)

It is a little bit surprising that it has taken me until fairly recently to really appreciate the need to have someone else edit my writing. Well, maybe it’s not that surprising. When I started writing scientific papers, after all, I always edited myself – over and over – both before and after my thesis advisor had put in his two cents’ worth. And when the proofs came back from the journal, I never noticed that any changes had been made to what I’d sent. There might have been some that I didn’t notice, of course, but basically I think I wrote (and self-edited) well enough that my work didn’t need a whole lot of editing.

In other words, being that I’m a natural born editor, I think I might be forgiven for getting the impression that it was the writer’s job to get it right in the first place.

The trouble with this concept is, you can’t.

I mean, you can’t reliably get it completely right in the first place.  Not when you’re writing the tens of thousands of words – the dozens or hundreds of pages – that go into something the length of a book. The average essay or scientific paper is only a few pages or a few thousand words.  A writer who happens to be a pretty good editor has a fighting chance of catching all the errors in something that length, but not in a book.

Many people have noticed that we all tend to have trouble seeing our own mistakes on the page (or the screen). You know what you wrote, after all. The fact that you didn’t actually write what you know you wrote can be really quite shocking when someone finally points it out to you. It can be positively mortifying – especially if you’re an editor, believe me.

Even when it comes to other people’s writing, there are certain kinds of errors that we have trouble seeing. The absence of small common words where a line is broken, for example, or the same word repeated at the end of one line and the beginning of the next.

Why are we error prone in this particular way?

Well, it obviously has something to do with expectations. But I think it also has something to do with how our brains work. The brain is famous for filling in gaps in our perceptions to create the impression of a seamless and coherent world. Studies have shown that our visual systems only take samples of what’s out there. The fact that our eyes are constantly moving and taking samples, together with the vast amount of experience our brains have with interpreting those samples, combines to give us the impression that our minds simply look out through the windows of our eyes and see the world as it is.

Did you know that the design of the eye is such that the image cast by the lens on the retina is inverted, top to bottom, and left to right? As far as the eye is concerned, objects appear to fall up. Why doesn’t it look as if objects fall up?  Because, at a very early age, your brain learned to turn the images around.  It’s an amazing thing, the brain – quite miraculous. Right up to the point where that miracle prevents us from being able to see our own mistakes.

So, even though I’m an editor, I know that I still need an editor. When it comes time to get my manuscript ready for publication, I know I’m going to have to hire one. (And that’s not just me trying to promote my own profession.)

And now for something really controversial: The serial comma

Life, Liberty, and the Serial Comma (Photo credit: MBIMOTMOG)

Today I’m going to take on a really controversial subject:

The serial comma

I’m for it.

I’m speaking about the comma before the “and” that precedes the last item in a series, such as “red, white, and blue,” or “oats, peas, beans, and barley.”

I was taught in school to put it in and so were my sons. Lately, however, there seems to be a trend to leave it out, and this is shaping up to be one of the major punctuational battles of our times.  Seriously.  If you make the mistake of bringing the subject up in any group of serious writers (or editors), you’re likely to find that people start taking sides and the temperature begins to rise.

So what’s the big deal?

The argument for omitting that last comma goes (I believe) like this:  Since you have the “and” there, what do you need the comma for?  In other words, the claim is that the comma is redundant.

The problem with this argument is that commas logically separate things while “and” joins things together. That means if you leave that last comma out, you’re logically connecting the last two items of the series while separating them from all the other items. It would make more sense to omit the “and” and keep the comma if you don’t think you need them both – except, of course, that we’re so used to the “and” being there.

Possibly the “and” at the end of the series is the last vestige of a more cumbersome construction: “red, and white, and blue.” After all, the items in the series are linked in the sense that they all belong in the series, even while they are also each separate and distinct from each other. In practical terms, the “and” gives the reader (or listener) a little heads-up that the end of the series is at hand. It says, “okay, you can stop making your little mental list after this next item and prepare yourself for a change of direction.”

If logic were the only casualty of serial comma omission it might not be worth getting up in arms about. The trouble is, clarity can also take a hit. The comma represents a pause. When you’re speaking, it’s the pause that separates the items in a series in your listener’s perception. And when speaking, you do pause before the “and,” whether you’re aware of it or not. You do it instinctively – and you do it for clarity. If you didn’t pause, you’d make it sound as if the last two items were parts of a single item, like “peanut butter and jelly” (one item) – which is not the same as “peanut butter, and jelly” (two items), or the same as “peanut, butter, and jelly” (three items). (Say each of those aloud and listen to the difference.)

(You punctuate your speech. Why would you not punctuate your writing?)

Confusion is unlikely in a short, familiar list like, “red, white and blue.” Everyone knows these are the three colors of the American flag. (And there’s no such thing as a “white and blue.”)  But confusion can occur when the items are not familiar, when one or more of the items contains an internal “and,” or when the items are long phrases so that the nature of the series is not immediately obvious.

To illustrate the ambiguity that can arise when items contain internal “ands,” take this example of a hypothetical list of menu options at an ice cream parlor:

 “The choice of toppings includes strawberries and sliced bananas, chocolate chips, sugar sprinkles and peanuts and chocolate sauce.”

So strawberries and bananas apparently come together, and the chocolate chips stand alone. But do the sugar sprinkles come with the peanuts, or do the peanuts come with the chocolate sauce? You can’t tell, because the missing serial comma could go either after the word “sprinkles” or after the word “peanuts.” This is, of course, a rather trivial example (unless you happen to be allergic to peanuts). You could always ask for clarification, and in all probability the person behind the counter in any real ice cream parlor would give you whatever combination you wanted. I cooked up this example to make a point, and hopefully you can imagine a less contrived and less trivial example.

The point is that when you write something, your intention is presumably to communicate. If your attempt to communicate requires further clarification, you have failed.

To illustrate clarity issues arising from sentence complexity and lack of familiarity of terms, consider the following example drawn from a recent grant announcement posted by the National Institutes of Health (in which the writer very sensibly does not omit the serial commas):

National Heart, Lung, and Blood Institute (NHLBI) invites applications to develop multidisciplinary career development programs that will equip new MD and PhD (or equivalent) investigators with the knowledge and skills to apply pan-omics and integrated approaches to elucidating genomic and molecular bases of lung disease, including heterogeneity, key regulatory networks, and relevant disease biomarkers, with the goal of advancing understanding of lung disease pathobiology and lung disease personalized medicine applications.

(Whew!)

The passage contains two three-item lists.  Did you spot the two serial commas?  One is after the first occurrence of the word “lung” and the other is after “networks.” If the first one were missing it probably wouldn’t trip up most readers because we easily understand that “lung” and “blood” are separate things. But removing the second would make “key regulatory networks” and “relevant disease biomarkers” run together as if they were potentially a single item.  And for all most of us know, they might be. It wouldn’t be until we saw that the comma after “biomarkers” is followed by “with” that we would realize we must have come to the end of the series. Then we would probably have to go back to re-read the sentence so we could figure out what the list of items was intended to be.

Whenever your readers have to go back over a sentence to figure out what you meant, you just flunked Clarity 101.

This is why most technical or scientific writers tend to favor the use of the serial comma. It really helps in long complex sentences. Using it may not always resolve all possible ambiguities, but it never makes things worse.

One final point:

Authorities who advocate omitting the serial comma will often qualify their stance by saying, “unless required for clarity,” which amounts to advocating inconsistency. It means the poor reader never knows what to expect – especially since writers notoriously tend to believe they were clear simply because they know what they were trying to say. It means the writer has to pause to consider possible ambiguity every time he or she composes a list. This requires thoughtful attention to detail, accurate perception, and sound judgment. And can we really expect such traits in a person who would under any circumstances consider omitting a useful piece of punctuation?

Honestly, isn’t it easier and more reliable to simply make a habit of putting in the comma?

Come on, guys; it’s one little keystroke. You can do it!

So where do you stand on this burning issue? Did I put you to sleep in the second paragraph? Are you at this moment cudgeling your brain to come up with something about which you care less?

What’s in a name (a post-9/11-anniversary reflection)

Words (Photo credit: sirwiseowl)

What’s in a name?

The word “liberal” is derived from the same root as the word “liberty.”

The word “conservative” comes from the same root as the word “conservation.”

“Democrat,” of course, is derived from “democracy,” meaning “rule by the people.” That’s using the Greek root “demos” for “the people.”

“Republican” is derived from “republic,” also meaning a form of government in which the ultimate authority rests with the people. It uses the Latin root for “the people,” as in the word “public.” (Those Romans loved to copy the Greeks.)

I don’t know. It seems to me we all started from the same place. And we’re not that different. So, what are we arguing about?  Why is there so much heat and so little light?  Why such a need to vilify the other side? To make it sound like, if they win, it will be the end of the world? (It won’t, you know, because we are the people, and we won’t let it happen.)

After 9/11, we saw a lot of the slogan, “united, we stand.”

You know how the other half of that goes…  Yeah, that’s right:

Divided, we fall.

I’m not talking about silencing dissent, here. We’re never all going to agree, and the day we stop speaking our minds, or stop being allowed to speak our minds, will be a dark day indeed. But with any freedom (of speech, for example) comes responsibility (to at least try to communicate, in this case).

There will always be differences of opinion in any large, diverse group of people. There will always be conflicting interests.  But we ought to be able to talk about these things – really talk about them, in clear, honest, practical terms.

The essence of government by the people should be that a group of elected representatives – representing all the conflicting interests – gets together to talk things out, honestly, respectfully, and in good faith. And – yes – they have to be willing to compromise, if they’re ever going to be able to balance those conflicting interests. The idea that one side can “win” at the ballot box with 52% of the vote and thereby get everything its own way, ignoring whatever the other 48% wants, is as destructive as it is absurd. It pretty much guarantees that the other side is going to get mad, rally, and come back to “win” the next election and stick it right back to them. The U.S. government is not a football, guys. This isn’t a game.

Divided, we fall…

I really don’t want to think that I might be watching the “fall” of the United States of America, but I don’t like what I’m seeing (or hearing).  Has there ever been a time in our history when our political leaders were so rigidly and uncompromisingly divided?  As an advocate of clear communication, I am appalled by the dearth of civil discourse, the scarcity of honest efforts at persuasion, the stunning lack of simple, clear communication with respect to anything concerning politics in this country.

I, for one, am sick of it.

And, frankly, I’m a little bit scared.

Wag the dog… on thought and language

The Thinking Man sculpture at Musée Rodin in Paris

It seems pretty obvious that the way we think influences our language. What is less obvious is that our language also influences the way we think.

I remember having an argument once with my mother over whether it is possible to think without using words. She said it wasn’t, and I thought it was. Looking back, I think the real basis of our disagreement may have been a difference in what we each meant by the word “think.” To my mother, it just wasn’t really thinking if it didn’t involve words. It was something else, something more nebulous, like feeling, perhaps, or something more primitive, like reacting. On the other hand, I’m darn sure I can think without words. I’m reminded of the fact every time I get stuck because I can’t think of the right word for whatever I’m trying to say. I know I’m looking for a word that means just exactly… well, that… and it seems that there must be one, or at least there ought to be one…

Does that ever happen to you?  (Let’s see a show of hands…)

I’m getting off the subject, but the point is that our language and our thought processes are very intimately connected.  So much so that we often make the mistake of thinking that a thing must exist simply because we have a word for it – or that a thing must be possible just because we can say that it is. We fall into the error of believing that words or phrases define the world, rather than merely being imperfect tools used to describe it.

Examples:

“Safe.” I once read an entire book on the subject of “acceptable risk,” the whole point of which was that nothing is absolutely safe – totally without risk of any kind. Yet people who ask, “is it safe?” routinely expect to be given a yes or no answer. When the doctor, scientist, or government official comes back with, “the levels are too low to pose a significant health hazard,” people aren’t satisfied. They think that’s weasel-wording, or government-speak for, “we want you to think it’s safe, even though it really isn’t.” In fact, the poor guy is just doing his best not to lie to you.

Other words like “clean” and “pure” – or any word that implies some absolute condition – have similar limitations. Did you know there is a maximum number of insect parts allowed per standard volume of ketchup? Yuck! Why doesn’t the government insist that there not be any insect parts in there? Because there is no possible way in any real universe for the manufacturer to ensure that there won’t be any. The best you can do is to establish a level that is as low as possible while still being reasonably achievable.

Freedom.” Increasingly cavalier use of this word as a thing that is always desirable and good is saddling it with so much emotional baggage that it’s in danger of becoming an empty shibboleth – a catch-word thrown about to make you feel good, hook your emotions, or convince people that someone is on the “right” side. We’re starting to believe that freedom is always good, and so anything that limits anyone’s freedom must automatically be bad.  In fact “freedom” really just means the absence of coercion or constraint in any choice or action. In short, it means being able to do what you want. This is fine as long as it’s you getting to do what you want, but what if it’s someone else and what he wants to do is to hurt you? It’s perfectly legitimate linguistically to talk about freedom to rob, freedom to rape, freedom to kill, etc. We’ve begun to think that “freedom” is a treasured value of our democracy when in fact it is specific freedoms, such as freedom of speech, that are our treasured values.

There are two questions you always should ask when you hear the word “freedom” being bandied about: Whose freedom are we talking about? And, freedom to do what, exactly?

I heard a sound bite in which a member of the U.S. Congress said something like, “government should protect our freedom, not tell us what to do.” I’m sorry; a government that doesn’t tell us what to do creates a society with no rules. And who is likely to benefit in the absence of rules? The strong, the rich, and the clever will benefit for starters – also the irresponsible, the unprincipled, and the ruthless. Government can’t protect any freedoms for the weak, the poor, and the well-meaning but perhaps slightly naive nice guys, except by curtailing some of the freedoms of those who would otherwise take advantage of people less able to defend their own freedoms.

“Making money.” Let’s face it, the only people, apart from counterfeiters, who actually make money are the people who work in a mint. The rest of us don’t make money; we acquire it from other people – hopefully in exchange for having done an appropriate amount of useful work, or having provided the other person with a product of appropriate value. Why make this point? Because the word “make” implies something is being produced or created, and it’s hard to see any possible moral issue with that kind of activity. Once you realize that all the money you’ve accumulated came ultimately from other people – directly or indirectly – it puts things in a different light.

We can all be rich.” While it’s possible to say this, it isn’t actually true, because the word “rich,” in monetary terms, is defined as one end of a scale. “Rich” has no meaning in the absence of “poor.”  Simply put, “rich” implies having significantly more money than a significant number of other people. We could potentially all be prosperous, since “prosperous” implies having enough to meet one’s needs, with some to spare. I think we could all achieve that, especially if we helped each other. Yet I heard that half of the 2012 college graduates in a recent poll expressed a desire to become rich. I don’t blame them; I blame us older folks who are giving them the wrong message. We use “rich” in a non-monetary sense to mean all kinds of good things, from “a rich cream sauce,” to “a rich cultural heritage.” We’ve lost track of the negative moral implications of becoming rich monetarily. (There was something about camels fitting through narrow openings…)

With that, I think I’ve probably gotten myself into quite enough trouble.

Clarity First – on understanding one another

What is difficult? [ about A Cognitive Substrate for Natural Language Understanding ] (Photo credit: brewbooks)

I think I’ve already said that clarity is the first priority in communication (especially written communication, which potentially could transcend the ages). I’ll probably say it again. What I won’t say, though, is that there’s no excuse for not being clear. There are lots of excuses.

Here are some of the things that limit clear communication:

  1. Language is an imperfect tool under the best of circumstances. It was invented by a bunch of rank amateurs, using a process of trial and error, and is constantly being reshaped by its users, most of whom are also amateurs. It’s a complex system of sounds and visual symbols associated with meaning, and while we sometimes make the mistake of thinking that our language necessarily must be able to express anything, this is in fact baloney.
  2. People (the users of language) are imperfect. They may be tired, rushed, or in the throes of some strong emotion. They also can vary widely in their natural language ability or acquired level of skill.
  3. No language – and certainly not English – is a uniform beast. Not only does it change over time, but it also contains variants at any given time (regional dialects, cultural idioms, jargon). In order to understand each other, we have to agree on what the words mean, as well as on the basic grammatical structure – and we don’t always do either one.
  4. The interpretation of language is terribly context-dependent. Basically, we’re not all coming from the same place, and where we’re coming from varies with who we are, where we are, and what we’re doing on a moment-to-moment basis.

All things considered, it’s amazing we actually manage to understand each other fairly well most of the time.

So I’ll always forgive you for not being clear. It’s a little harder to forgive people for not at least trying to be clear, but even then I know there are times when communication isn’t really a person’s top priority. And whenever I realize that I’ve just been misunderstood, the first thing I do (well, usually) is reexamine what I actually said to see if I can identify the problem. Did I just say it badly? Or is there some possible alternate context that I failed to take into account? Of course, if I’ve got the other person face-to-face, I can also explore their insights into the issue – or just try again with a slightly different approach.

One thing I learned from being on the teacher side of the education fence is that, no matter how carefully you word the question, someone will manage to misinterpret it. My reaction when this happened was always to feel bad. Not because I assumed the misunderstanding was my fault – because of course it wasn’t necessarily – but because it meant my attempt to find out what the person had learned about the subject of the question had failed. (That’s failure of the assessment tool, rather than failure of the student.) The student may or may not have known the answer – and I’ll never know which it was. (I always hated to mark those questions wrong, and tended to be very generous with any partial credit I felt I could assign.)

Now, I have encountered teachers – and, of course, others – whose reaction to being misunderstood went something like, “Well, I know what I was trying to say, so if you didn’t understand me it must be your fault.”

That is a position I find pretty unforgivable.