Service with a Smile

A natural smile is a beautiful thing, but the number of forced smiles in the world seems to greatly exceed the number of genuine ones. If you’re a performer of any sort, smiling insincerely is part of the job. As a member of the ensemble in high school musicals, I was always admonished by the choreographer to “Smile, dammit!” When I sang in a barbershop chorus (under the aegis of the Society for the Preservation and Encouragement of Barber Shop Quartet Singing in America, which prizes showmanship almost as much as vocal blending), our music director nearly always displayed a huge, frozen grin as he conducted us — a reminder that we should keep similar grins on our faces as we sang.

But even people who are not entertainers — such as customer-service representatives, flight attendants, and servers in restaurants — are compelled to smile, regardless of their (typically less sunny) emotional state. Women, I’m told, are frequently commanded to smile by male supervisors, coworkers, and even complete strangers.

In classic films from the 1930s and 40s, I see Black people sporting wide, toothy grins as a sign of deference to the white people around them. Although such exaggerated smiling is no longer expected, I still encounter people of color who feel the need to put on a smile to ensure that they’re perceived as nonthreatening. Even I tend to present a self-protective smile when I’m faced with potential anger or aggression.

The past 20 years have brought compulsory smiling to a whole new domain, in the form of emoji. For most of my life, a written communication was assumed to be friendly — or at least neutral — unless the language clearly indicated otherwise. Now an email or text is generally interpreted as stern or hostile unless it includes a yellow smiley face or some other pictorial signifier of light-heartedness. I resisted this trend as long as I could, but eventually I found myself sprinkling my messages with smiles and winks out of fear that my tone might otherwise be misinterpreted.

Of course, the most common occasion for artificial smiles is the taking of photos. In response to the photographer’s command to “Say ‘cheese,’” we all draw on our inner Method actor to bring ourselves into a fleeting state of happiness that will hopefully express itself in a natural-looking smile. Even if this exercise in emotional memory works — which it often doesn’t — the freshly displayed smile almost immediately settles into a frozen rictus accompanied by pleading eyes that say, “Click the shutter already!”

There ought to be a law against holding a smile past its sell-by date. An artificially preserved smile doesn’t communicate pleasure; it communicates creepiness. Watching someone hold a smile makes me squirm. I have this reaction particularly when I watch old movie musicals, in which a great dancer such as Gene Kelly or Eleanor Powell finishes a number and then adopts a beaming facial expression that remains pasted on as the camera zooms in. I don’t know who decided that this smile-and-zoom ending was a good idea, but it’s disconcertingly common. Somehow only Fred Astaire manages to avoid it.

When I was in college, I frequently signed up to be a subject in psychology experiments conducted by grad students. In one such experiment, I was asked to take a brief written test to assess my mood. The experimenter then brought out a machine with a number of electrodes attached, pasted them on my face, and asked me to contort my mouth in a certain way to keep them from falling off. With the electrodes in place and the machine turned on, I was asked to take the test again.

It was revealed to me afterward that the machine and its electrodes were no more than a ruse, and that the facial contortion I was asked to adopt was meant to approximate a smile. The point of the experiment was to find out whether the simple muscular act of smiling could improve someone’s mood. I never found out what the ultimate results of the experiment were, but I’ve since read about similar studies that have demonstrated such a causal link. Smiling — even unemotional, unmotivated smiling — can apparently cause the brain to release serotonin and dopamine, thereby reducing stress and increasing feelings of well-being.

So perhaps the fake smiles displayed by Hollywood stars, obsequious salespeople, and everyday subjects of photographic portraits are doing their bearers some good. Nevertheless, they continue to make me uneasy. Can’t we dispense with the idea that insincere smiles are desirable? People in portraits rarely smiled until a hundred or so years ago, and yet we can assume that they led contented lives outside of the photographer’s studio. I’m fine with seeing Eleanor Powell flash a brief grin when she finishes a dance number, but then I’d like to see her relax, take a breath, mop her forehead, and drink some water. Then I’d really believe she’s happy.


Underperforming

Headshot circa 1980

My first paid acting job came a year or two after I graduated from college, when I got the leading role in an educational video. (This was years before I became a producer of educational videos myself.) Given the momentousness of the occasion, it’s amazing how little I remember about the experience. I have no memory of who produced the video, who my fellow actors were, or even how I got the gig.

This last question is especially puzzling, because there’s no way I ever should have been cast in the role. I played an exchange student from a Spanish-speaking country who has trouble fitting in with his peers despite being a star member of the swim team. As a native English-speaking, non-Hispanic, non-athlete who hates being in the water, I was probably the least suitable person they could have chosen. However, I was slim and young, with brown hair and a mustache, and I was able to summon up a passable generic Spanish accent. (Fortunately, there were no actual swimming scenes in the script.) Such politically incorrect casting would never fly today, but it apparently didn’t bother anyone in the 1970s.

I (not surprisingly) didn’t feel like I played the part very well, but the director was satisfied, and I got my paycheck. A couple of weeks after shooting ended, I was surprised to get a call from the producer, inviting me to see a rough cut of the video. Watching it was a big boost to my self-esteem: My performance wasn’t nearly as bad as I’d thought it was. In fact, it was pretty damn good. Believable, even.

A rough cut, in those days, was done on an inexpensive video-editing machine that offered no color adjustments, fades, dissolves, effects, or image stabilization. It was an economical way to experiment with different ways of editing the footage and to decide what the final version would look like. The final “online” edit, based on the rough cut, would be done with broadcast-quality equipment in a professional editing suite with a high hourly price tag.

When I finally got to see the product of the online edit, I was appalled. My performance, with which I had formerly been so impressed, was embarrassingly terrible. It was immediately clear why: The editing of the final version was entirely different from the rough cut. It was simple and straightforward, with longer takes and fewer cuts. Instead of combining the best parts of several takes of a scene, the editor had just used one take and let it play out, regardless of inconsistencies in the acting and lapses in the rhythm of the scene.

The producer confessed to me that the project had gone over budget, and that he couldn’t afford to do an online edit that was as elaborate as what I’d seen in the rough cut. He nevertheless seemed satisfied with the final result. I was not. I hoped the master tape would meet with some horrible accident, and that I would be the last person ever to have viewed this video. I honestly don’t know what became of the video after that. If there was a horrible accident, I never heard about it, but I’m happy to say that no audience member ever tracked me down and pelted me with tomatoes.

What this experience left me with is a keen appreciation for what the editor contributes to a film or video. When we see a film, we tend to notice and comment on the acting, the story, the production design, and perhaps the special effects, but we’re generally not conscious of the editing. Even I, having spent time on both sides of the camera, will compliment an actor’s performance without thinking about the fact that the actor contributed only the raw material, and that the performance was largely constructed by someone else.

I was reminded of this sometime later when I saw a live performance by a local band that I was a fan of. It had been a couple of years since I’d last seen them perform, and suddenly I found myself wondering what I had liked about them. They were playing the same material as before, but it sounded — well, not very good. The music was flat and uninspired.

Midway through the performance, I found out why. The band’s frontman mentioned that they’d recently brought in a new bass player, but that their former bass player was in the audience — and would he like to come up and sit in for a few tunes? The former bass player accepted the invitation, and suddenly the band sounded like its old self again.

This was a revelation. When listening to a band, we tend to notice the melody and harmony, the lyrics, the rhythm — but who notices the bass part? It turns out that the bass line is like the foundation of a building. It holds the building up, but we never think about it unless it cracks.

It must be frustrating to be an artist in a role that’s invisible to most people, where the only way to know that you’re doing it well is when nobody notices what you do.


Spelling, Be

My boss in my first job out of college was a man named Bill West. He’d occasionally get annoyed when he’d give his name over the phone and the person on the other end would say, “Could you spell that, please?” Like, it’s Bill freakin’ West. What is there to spell?

I, on the other hand, have a last name that nobody can be expected to spell correctly on the first try. It’s Schaeffer, but in a world filled with Shaeffers, Shafers, Schaefers, and innumerable other variations, there’s no way for anybody to know what to do once they get past “S.”

For many years, when asked how to spell my name, I would patiently spell it out: “S–c–h–a–e–f–f–e–r.” Then it occurred to me that I could save a lot of time by just telling the person, “Put in all the letters you can.” That turned the problem into a game, which many people seemed to appreciate. (Among them was my wife Debra, who went a bit too far by coming up with the spelling Pschaephpherre.) Still, it didn’t seem fair to saddle a harried reservations clerk or receptionist with the task of puzzling out the correct spelling on their own.

After many years, I finally arrived at the most practical instruction: “Spell it however you want!” After all, unless I was engaging in some sort of legal transaction — in which case I would probably do it in writing — it really didn’t matter how somebody spelled my name. “Schaeffer, party of two, your table’s ready” sounds the same no matter how it’s written down.

My wife and I have different last names, which occasionally leads to one of us being identified by the other’s name. Debra, for feminist reasons, is irked when someone assumes that her last name is Schaeffer, but she answers to it when necessary. I, on the other hand, have no objection to being called Mr. Goldentyer when the clerk at Safeway reads it off our loyalty card. (What I’m actually called in that situation is “Mr. Guh… Mr. Go… uh, Gol…,” but having a difficult-to-decipher last name is a problem that poor Debra has had to cope with much longer than I have.)

The point is that it makes no difference what people call me, as long as they and I both understand who is being referred to. My students at Chabot College, on the first day of class, would often ask how they should address me: Mr. Schaeffer? Mark? Professor? My answer was always, “Whatever you’re most comfortable with.” As long as a student treats me with respect — the same respect that I am careful to offer in return — the particular phonemes that come out of the student’s mouth hardly matter.

I would think that this indifference toward arbitrary labels would be universal, but it quite evidently isn’t. Most people, so far as I’ve seen, are offended when someone innocently misspells or mispronounces their name, or calls them something other than what they prefer to be called. I find this attitude mysterious. If someone addresses me as “Mr. Guh… Mr. Go…,” my natural response is to tell them, “It’s Goldentyer.” I see no need to snap, “It’s actually Schaeffer.” The name that this person associates with me has no effect on who I am.

The thing in my wallet that we typically call an “ID card” shows my name, my picture, my date of birth, and perhaps my gender. But those things don’t constitute my identity; they’re just handy labels that people can use to identify me. If someone gets one of those labels wrong, they’re merely making an error; they’re not changing anything about who I am. Most of the time, the error — such as a misspelling of my name — has no consequences. In cases where the error does have consequences — where I might be denied a right due to someone’s interpretation of my age, gender, or ethnicity — the fault lies in the way society treats people with different labels unequally. The labels themselves are insignificant.


Sentencing Guidelines

I know that my writing style is stuffy and formal. Although I’m liberal in my beliefs, I’m ultra-conservative in the way I express them. For example, I’m fastidious about sandwiching every non-restrictive clause between commas, and beginning every dependent clause with “that.” I still make a distinction between “if” and “whether.” I still use the subjunctive (“if Puerto Rico were a state” rather than “if Puerto Rico was a state”) when talking about hypothetical situations.

It’s not that I want to appear stodgy. When I was young, having such a formal writing style helped me be taken more seriously, but now that I’m a senior citizen, it’s more likely to make me seem out of touch. (I’m reminded of a time when I was in my 40s and a new acquaintance asked me why I have a beard. “I grew it to look older,” I said. She looked me over and said, “I don’t think you need to do that anymore.”)

I think that the real reason I write this way lies in how my brain processes language. Most people, if they were composing the first sentence of this post, would write “I know my writing style is stuffy and formal,” leaving out the word “that.” Doing so makes the sentence feel friendlier and more casual. But the omitted word “that” is actually a meaningful conjunction whose purpose is to introduce a dependent clause. In order for the sentence to make sense, my mind has to consciously insert the missing word.

This re-parsing happens almost instantly — so quickly that it might legitimately be called insignificant. Having no way to get inside other people’s heads, I can only guess that it really is insignificant for most people. But for me, it’s annoying, like a speck of dirt on my eyeglasses. What reason is there to put unnecessary obstacles in the way of clarity?

In my post “Hat Check,” I opined that the function of a hat (or a finger) is more important than its style. I guess I’m saying the same thing here about language. Writing style is important, but it’s less so than the primary function of writing, which is to deliver meaning. Anything I can do to make the meaning of a sentence more immediately accessible, without requiring the reader to expend effort on reinterpretation, contributes to the sentence’s utility.

I once edited (and largely ghostwrote) a college textbook called “Sentence Combining: Shaping Ideas for Better Style.” Its thesis, put forth by a professor of English named John Clifford, was that anyone can become a better writer by starting with a list of independent statements and then arranging and combining them in different ways. For example, the statements “London’s first department stores seemed pleasant” and “They were disagreeable places to work” can be transformed into:

  • London’s first department stores seemed pleasant, but they were disagreeable places to work.
  • Although they were disagreeable places to work, London’s first department stores seemed pleasant.
  • London’s first department stores seemed pleasant even though they were disagreeable places to work.
  • It’s surprising that London’s first department stores were disagreeable places to work, since they seemed pleasant.

At the time, this concept seemed so obvious to me that I couldn’t imagine why anyone would need to publish a book about it. Writing, for me, has always been a laborious process of trying and rejecting different formulations until I find the one that most clearly represents the meaning and tone that I want to communicate. (In 1983, when I edited that book, I was still conducting all of those trials mentally and then writing down the result on paper. I didn’t get my first computer — which allowed me to do the work on a screen instead of totally in my head — until the following year.) To this day, writing an email or even a simple text takes me forever.

I didn’t think there was anything unusual about that until I met my wife Debra, who can write quickly and almost effortlessly. Considering that she made it through law school and then had several books published, there’s clearly nothing wrong with her writing; she communicates as well as anyone, and her style is considerably less labored than mine. She certainly doesn’t waste time recombining sentences à la John Clifford.

I suppose that this is just another example of how my brain is wired differently. (I struggled with whether to remove “that” in the preceding sentence, but left it in for consistency.) I’d love to write in a more casual style for this blog, but achieving precision and casualness is more work than I can manage for weekly posts. For those who haven’t met me, be assured that I’m not as stuffy in person as I might come across on the page.


Atmosphere (4)

(part four of four)

The standard scientific way to test for telepathy or clairvoyance is to use a randomly ordered deck of cards with bold, abstract symbols on them. A laboratory assistant looks at one card at a time and concentrates on it, and the self-identified psychic — who can see neither the assistant nor the card — guesses which symbol the assistant is looking at. By matching a record of the assistant’s cards against a record of the psychic’s guesses, an investigator can determine whether the psychic’s correct answers exceed what could be predicted by chance alone. (The test for precognition — the ability to predict future events — is similar, except that the psychic guesses the symbol on each card before the assistant looks at it.)
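For readers who want to see the arithmetic: the classic Zener deck used in these tests has 25 cards and five symbols, so chance alone predicts about five correct guesses per run. Treating each guess as an independent one-in-five shot (a simplification; the deck’s fixed makeup changes the odds slightly), a few lines of Python show how rarely luck alone produces a high score. The numbers here are illustrative, not drawn from any particular study.

    from math import comb

    # Probability of at least k correct guesses out of n cards,
    # assuming each guess is an independent 1-in-5 chance.
    def p_at_least(k, n=25, p=1/5):
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(f"P(10 or more correct) = {p_at_least(10):.4f}")  # about 0.017

A score of ten out of 25, in other words, would happen by chance less than two percent of the time; a genuine psychic should be able to beat those odds again and again.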

To my knowledge, no reputable investigator has yet found a person who can reliably and repeatedly pass this test. This doesn’t mean that such a person won’t one day be found, but as years go by, it appears less and less likely. Scientifically speaking, it seems safe to say that there’s no such thing as telepathy or precognition.

And yet, I can’t help but imagine myself in the psychic’s shoes. If I believed myself to be telepathic, but then failed the scientific test, how would I respond? I suppose it’s possible that I would accept the investigators’ conclusions and never again call myself a telepath. It’s possible, but it doesn’t seem likely. More likely, I’d say that the test didn’t have much to do with what I experience when I read minds.

This is pure speculation, of course. But if the telepath’s experience of other minds is anything like my experience of atmosphere, I can’t imagine a scientific test that could measure it. The best I can come up with is this: A bunch of volunteers — drama students, perhaps — sit in a room. The investigator holds up a card with the name of a mental state on it: “Anger,” “Grief,” “Love,” “Awe,” or something along those lines. The volunteers concentrate, trying to bring themselves to that state of mind. The room is darkened, and the volunteers are instructed to sit silently and very still. I’m given a signal to walk into the room.

“What atmosphere do you detect?” asks the investigator. I give the name of one of those mental states, and the investigator writes down my answer. I leave the room, and the process is repeated many more times. Later, the investigator compares my answers with the actual cards that were held up.

I have no reason to believe I could pass such a test. The environment would be so sterile, the atmospheres so artificial, the pressure so great. My answers would be no better than guesswork. “It doesn’t work under these conditions,” I would say, and the investigator would merely give me a condescending nod.

Eventually, after testing a hundred other people like me, the investigator would write a report. “There is no evidence that mental or emotional states can be communicated by extrasensory means,” the report would say. “There is no evidence of the existence of ‘emanations’ or ‘atmosphere.’”

I have great respect for the scientists who might conduct such a test. I can’t deny the soundness of the experimental method. But at the same time, I can’t deny the reality of what I’ve experienced in my life. In the case of this hypothetical experiment, I could only conclude that what went on in the laboratory was different from what went on when I did my mime shows, or what goes on when I walk into a house of prayer. I don’t think this makes me anti-scientific; it just makes me human.

After all, we all experience atmosphere. Whenever two or more people come together — sometimes even when one person is alone — an atmosphere is created. Homes, offices, classrooms, neighborhoods, and cities all have atmospheres, even if we don’t stop to acknowledge them. And each of us can influence the atmosphere around us. Imagine the heights to which the world’s singers, dancers, musicians, and actors would soar if everyone in the audience chose to greet their performances with openness and gratitude. Then imagine what the world might be like if every person, everywhere, chose to do the same thing. Given that there’s no scientific basis for believing in atmosphere as I’ve described it, the result might be nonexistent. But it sure couldn’t hurt to try.
