Tune That Name

Our household uses very little butter — mostly just to prepare eggs for breakfast — so it’s not surprising that I’ve only just had my first encounter with Challenge Butter. (The brand, I’ve since found out, has been around for more than a century.) I don’t know how the package ended up in our kitchen — it was probably bought on a day when the plain store brand was unavailable — but it certainly gave me a start when I opened the refrigerator door. The box portrays a pristine lake surrounded by evergreens, similar in style and coloring to the illustration on the Land O’Lakes box, but the placid scene is dominated by an outsized, heavily antlered deer at its center. This is not one of your meek, Bambi-like deer; it’s a deer that would cause you to freeze in its headlights (if it had headlights). This is a deer that says, “I challenge you to eat this butter.”

The thing is, I don’t want my butter to be challenging. Butter is supposed to be compliant — it’s supposed to spread when you want it to spread, and melt when you want it to melt. I’m trying to imagine the focus group at which they tested the name “Challenge Butter”:

“That is the stupidest name for a dairy product that I’ve ever heard.”

“Oh, yeah? Want to take it outside?”

Contemplating this badly branded foodstuff takes me back to my childhood, when I made frequent use of an unappealingly named product called “Testor’s Pla.” Pla was an oil-based paint, an enamel, that was most often used for detailing plastic model kits. It was sold in tiny bottles, holding a fraction of a fluid ounce, for about twelve cents apiece. At the five-and-ten-cent store, a rack displaying Pla bottles in dozens of different colors made purchasing one almost irresistible.

But the name! Our neighbor Jackie, who used to babysit for my sister and me, would always make me laugh when she looked sadly at my row of bottles and drawled “Pla-a-a-ah,” as if she were trying to eject something distasteful from her throat. The only way I could account for the name was to imagine that it was originally something like Placenta or Plantagenet, but that the full word was too long to fit on one of those tiny labels.

The Testor Corporation is still around, but the name Pla appears to have been retired in favor of “Testor’s Enamel.” I’m hoping that the trend will continue, and that we can look forward to eventually seeing “Encouragement Butter.”

I can’t conclude a discussion of unfortunate product names without returning to one of my pet peeves, “cheese.” Obviously, “cheese” is a generic term, not a brand name, but that makes it all the worse, because every derivative of milk curds therefore receives that unappetizing designation. Any word that consists of a harsh consonant followed by a whining, constricted vowel sound can’t refer to anything good. (See “pet peeve,” above.)

Try saying it out loud: “Che-e-e-e-se.” Does that really sound like something you’d want to ingest? The fact that it echoes the universal childhood expression of distaste, “E-e-e-e-w,” can’t be coincidental. With so many English food terms having been borrowed from French, I wish we could have gone with the lovely word “fromage.” I’ve always been resistant to putting cream cheese (“cre-e-e-m che-e-e-se”) on my bagels, but I’m sure my taste would have developed differently if the traditional topping had been “le fromage à la crème.” As it is, I generally eat them cheeseless, topped instead with the most docile butter I can find.


Food for Thought

“You deserve this spoon cake,” said the headline on the LifeHacker website. I wondered what this implied about the quality of the spoon cake, given that I’d accomplished nothing worthwhile that week. But then I remembered that LifeHacker had no means to assess my degree of merit — it didn’t even know who I was. The headline was meant to suggest that everybody deserves this spoon cake (and, by implication, that the spoon cake is delicious).

Here’s the problem: To “deserve” something generally means that one has done something to earn the thing (or at least has done nothing to forfeit the privilege of having it). The word’s purpose is to distinguish those who are deserving from those who aren’t. But if everyone deserves something, the word becomes meaningless.

I first encountered this problem many years ago when McDonald’s began running commercials saying “You deserve a break today.” I was a teenager when this slogan came into being, and even then, I found it insulting. Clearly, McDonald’s was trying to flatter me, to contrast me with those sluggards who hadn’t been doing their work and therefore were unworthy of getting a break. But McDonald’s had no way to know that I wasn’t a sluggard, and therefore their claim was disingenuous.

“Why would anybody take those commercials seriously?” I asked my father.

“Those ads are intended for people who aren’t going to think about them too much,” he said. “You’re not one of those people.”

I’m reminded of this, oddly enough, because I recently encountered a young woman wearing extremely torn jeans. When I say “extremely,” I mean that pretty much the entire front of each pants leg was missing, from the lower thigh to the upper calf.

Now, I can think of two practical reasons to wear pants: One is to protect your legs from rain, cold, or sun; the other is to cover your legs for the sake of modesty or dignity. Clearly, these jeans served neither purpose, so the only other reason I could imagine for her choice of wardrobe was to make a statement.

But what sort of statement? Did she mean to communicate that she was a rebel, too cool to care what people like me thought? Did she wish to demonstrate that she was too spiritual and idealistic to concern herself with material things? Did she simply want to fit in, because all of her friends were wearing extremely torn jeans?

I suppose you could say that — as with the McDonald’s ads — my failure to understand her message means that I was not part of her intended audience. She was wearing those jeans solely to appeal to people who, unlike me, would understand why she was wearing them. As for me, I’m presumed to just continue along my way: Nothing to see here!

But something about that conclusion feels a little too facile — too close to the logical fallacy known as “no true Scotsman.” For those of you who aren’t acquainted with the catalog of logical fallacies, the traditional illustration is this: One man states a rule or generalization, such as “No Scotsman puts sugar on his porridge.” Another objects, “Well, I’m a Scotsman, and I put sugar on my porridge.” To which the first one responds, “Well, no true Scotsman puts sugar on his porridge.” In other words, the first person contrives to make his rule unfalsifiable by specifically excluding any counterexamples, thereby making the initial statement pointless.[1]

To say that “if you don’t understand the message, then it wasn’t intended for you” has a similar effect: It automatically excludes the possibility that the message is incompletely thought out, or badly expressed. If “you deserve this spoon cake” is meaningful only to people who already believe that they deserve that spoon cake, it’s not a very useful assertion. There are plenty of people who don’t feel worthy of spoon cake, but would likely still enjoy it if it were offered to them.

Imagine how much more effective our political discourse would be if we could find ways to express things that are clear to everyone, regardless of their preconceptions. (Perhaps something along the lines of “Lots of people think this spoon cake is really yummy!” or “If fast food is a treat for you, consider getting it at McDonald’s!”) People would still disagree, but at least they would have a shared understanding of what they’re disagreeing about.


[1] For another example of “no true Scotsman” — this one involving concealed gold — see Atmosphere (3).


Iconoclasm

When I bought my first PC in 1984, the salesperson cautioned me that I wouldn’t be able to use it right away. It was missing an essential component — something called a “disk operating system” — which was required for the computer to do any computing. The operating system (more succinctly known as DOS) came on a floppy disk, which was backordered and expected to arrive in a few days. In the meantime, all I could do was play the computer’s demo disk, designed to show off its capabilities on the showroom floor, over and over and over.

Despite that annoyance, once I learned to use DOS, I quickly became a fan. I loved the simple elegance of the C:\> prompt with the flashing cursor, patiently waiting for me to type in a command. But my romance with DOS was doomed. The same year I bought my PC, Apple introduced the Macintosh, with a graphical user interface that allowed users to do much of their work simply by clicking or dragging with a mouse. I dismissed the Mac as a toy — something that was appealing to beginners, but not suitable for serious work — and assumed that it would be a passing fad. Instead, it was quickly and widely accepted as the model for what a desktop computer ought to be.

My friend Brad, an early advocate of graphical interfaces, urged me to come aboard. He said that the Macintosh’s intuitive way of doing things represented the future of computing.

“On the contrary,” I said, “it’s like going back to the Stone Age. Back then, if you wanted to refer to something, you had to point to it. That was inefficient. That’s why we invented language.”

“But it’s easy to make mistakes when you type in commands,” he insisted.

“And it’s just as easy to make mistakes if you’re not good at handling a mouse,” I said, having had that experience in my experiments with using one.

Once I accepted that change was inevitable, I dodged Microsoft’s weak replacement for DOS, called Windows, by switching to a Macintosh. I’ve had nothing but Macs for 40 years now, and I confess that I’ve come to like using a mouse. But one consequence of the industrywide switch from words to pictures drives me crazy: the need for everything to be represented by a visual symbol, whether useful or not.

The original Mac icons were simple and clear. For example, in the very first version of Photoshop, it was easy to grasp that one icon denoted a brush, another a pencil, and a third an eraser. But as Photoshop became more sophisticated and more features were added, providing an immediately recognizable icon became next to impossible.

For example, how would you visually represent the Content-Aware Move Tool (which allows you to move a selection from one place to another, with Photoshop magically repairing the place it was moved from), or the 3D Material Drop Tool (which allows you to sample the surface texture of a 3D object and apply it to another 3D object)? Photoshop has an icon for each, but I wouldn’t have been able to tell you what either one looks like without looking them up first. I find most icons in current programs to be completely useless, such that I usually have to ignore them entirely and just roll over them to see their names pop up.

At least the Mac interface offers a quick way to see the words that define an icon. That’s not true in other environments that have been gradually overtaken by symbols. For example, I recently found myself driving a rental car with “idiot lights” that were identified solely by icons. The most puzzling was a picture of a car that suddenly illuminated on the dashboard. What could it possibly have been trying to tell me? I already knew that I was in a car. And frustratingly, there was no way to get any further information without pulling over to look in the owner’s manual.

Relatively certain that I wasn’t in the midst of an emergency, I waited until I got home to look it up, at which point I found out that the icon meant that there was a car in front of me. (I’m so glad that the manufacturer thought to provide that warning — otherwise, I might have had to, you know, look through the windshield.) But even granting that the alert was useful, wouldn’t it have been even more useful if it consisted of the words “Car ahead”?

I seem to remember that prior to the age of icons, car dashboards used to display the warnings “Check Engine” and “Check Oil.” I don’t know about you, but when I look at the pictures that have supplanted them, I see a meat grinder and a genie’s lamp. This, I still maintain, is why we invented language.


On the Other Hand

Illustration by guest artist DALL·E (because why not?)

I must have been very young when my mother first referred to me as right-handed. I asked her what this meant, and she explained that a right-handed person accomplished most tasks with their right hand. I took this to be an important piece of information.

“Then what’s my left hand for?” I asked. Her answer — and I remember this distinctly — was, “Your left hand helps.”

That seems like an innocuous enough reply, but at the time it sent me into a days-long spiral of worry. I imagined myself encountering someone who had fallen in a hole, and extending my arm to pull them out. Under such urgent circumstances, would I remember that this was an instance of “helping,” and therefore know to do it with my left hand? What would happen if I didn’t? Would the person refuse my outstretched right hand? If they did take it, would I be unable to help them? Would I be able to switch hands at that point, or would it be too late? What if, heaven forbid, the person were heavy and I needed to use both hands?

Looking back, I can see that this was a simple category error: I misapprehended the concept of handedness, which is intended to be descriptive, by interpreting it as prescriptive. But I’ve come to realize that it’s an error that’s often repeated, and not just by me.

When I entered junior high school, the need to travel to a new classroom for each subject (unlike in elementary school, where we would spend the whole day in one room) necessitated carrying books through the hallways. Very early on, I was informed by my peers that I was holding my books wrong. I was cradling them against my chest with one arm, the way the Statue of Liberty holds her stone tablet. It was made clear to me, with abundant snickering, that this was the way girls carry their books. As a boy, I was supposed to carry them against my hip, with my arm down at my side.

“How do you look at your fingernails?” I was asked. I held out my hand, palm down, with my fingers splayed. That, too, was wrong. As a boy, I was supposed to look at them with my palm up and my fingers curled over. There were probably some other tests too — I can’t recall — but if there were, I almost certainly failed them.

Motivated by the threat of relentless teasing, I soon learned to adopt acceptable masculine habits. But this, clearly, was another instance of descriptive rules being applied prescriptively.

I’m reminded of the all-too-common situation in which my cat, Mary Beth, steals food off the kitchen counter. “That’s not cat food!” I’ll tell her. Her obvious reply (which I believe I first saw in a New Yorker cartoon) would be, “I’m a cat. I’m eating it. How is it not cat food?”

In the same way, I logically should have been able to say, “I’m a boy. I’m carrying books. How is this not the way a boy carries books?” But unlike using the left hand for helping, rules like this are — for no good reason — actively enforced through societal pressure.

Children are not the only ones who apply descriptive rules prescriptively, of course. When, as a kid, I expressed a liking for bologna sandwiches on white bread, my mother protested, “Jews don’t eat bologna, and they don’t eat white bread!” When, in my senior year in high school, I elected to take a typing class, she objected, “Smart kids don’t study typing!”

Whether these battles are won or lost really doesn’t matter. (As it turned out, I did get to eat bologna-on-white bread sandwiches for lunch, and typing turned out to be the only useful thing I studied in high school.) I suppose that if I really wanted to resist the pressure to carry my books in a certain way, I could have; in that case, it just wasn’t worth it to me.

But it would be nice if, every time we’re tempted to make a prescription out of something that’s merely descriptive (and of course I’m as guilty as anyone! How many times did I say to my students, “That’s not the way a professional would do it”?), we could stop for a moment and say, “Why not?”


Giving Voice

Royal Albert Hall (photo by Debra)

Debra and I went a few nights ago to an event at the Royal Albert Hall called “Letters Live,” in which noted actors (and a few non-actors — in this case, a completely unexpected John Kerry) read aloud from letters written by various correspondents over the centuries. The event takes place annually, and somehow is popular enough that the 5,000-seat venue was almost entirely sold out, but we weren’t sure whether we wanted to spend the money to grab up two of the few remaining seats.

“It’s a chance to visit the Royal Albert Hall,” I said.

“But it’s people reading letters,” said Debra.

“But one of the people is Benedict Cumberbatch,” I said. (Although the cast list is kept secret until the night of the event, Cumberbatch was an exception, and was featured prominently in the advertising.) “Have you heard his voice? I’d listen to him reading from the phone book.”

(In hindsight, I guess it’s time to retire that outdated cliché. When is the last time anybody saw a phone book? I should have said, “I’d listen to him reading Google search results.”)

In the end, we decided to go, and it was a mixed bag — some of the letters were less interesting than others, and some of the performers were less enthralling — but Cumberbatch was one of the standouts, assuming the personalities of three different people (one of whom was an American) from different times and places. His characterizations were so captivating that I didn’t even pay attention to the quality of his voice.

In a few of my earlier posts, I’ve alluded to my difficulty in processing spoken language. When listening to someone speak, I can focus intently on the meaning of the words, making sure I’m comprehending everything they say; or I can relax and just enjoy the voice, the manner, and the personality of the person doing the speaking. My tendency is to do the latter, which means that I often miss a lot of the content. A great performer can make those two aspects of speech so compelling and inseparable that I feel like I’m receiving it all in a single gulp. But unfortunately for my processing of everyday interactions, not everybody is Benedict Cumberbatch.

I remember driving from Princeton to my parents’ house on Long Island with my girlfriend at the time, Alex. She was telling me a long story and then stopped to apologize, saying “I guess I’m really going on, aren’t I?”

“That’s OK,” I said. “It doesn’t matter what you’re saying; I’m just comforted by the sound of your voice.” I meant that purely as an expression of affection, but she didn’t hear it that way.

“You mean that what I say isn’t important? That it’s all just babble?”

I quickly assured her that everything she said was indeed important, but I realized later that her anger was appropriate. I really didn’t remember anything about the story she told; I had just been delighting in the experience of being in the presence of Alex — the way she looked, the way she smelled, the way she sounded.

In recent years, I’ve come to realize that many of my relationships with people are similarly unbalanced. There have been many people that I’ve thought of as friends, but while each of them might think that our friendship centers around the things we say to each other, my perception is that our conversations are simply excuses for me to enjoy that person’s physical presence. And as much as I value honesty in a friendship, I can’t say so out loud, because that person is likely to (justifiably) react the way Alex did.

As a result, I’ve found myself largely withdrawing from the world of friendships. I think one of the reasons I’m so comfortable here in London is that everybody is a stranger, and I don’t have to pretend otherwise. When I strike up a conversation with a random person in a pub, it’s clear to both of us that what we’re saying is of no importance; we’re just appreciating the special moment of making a connection. And when there’s no connection to be had — as when I’m one of 5,000 people sitting and listening to the voice of Benedict Cumberbatch — I can guiltlessly sit back and enjoy the sensation.
