Under the Influence

I’ve been hearing lately about a video game that people seem to be excited about, called “The Legend of Zelda: Tears of the Kingdom.” Apparently it’s the latest of many installments of the Legend of Zelda series, which dramatizes the adventures of a young hero named Link and a wise princess named Zelda.

As someone who has never played video games — apart from some brief sessions of Pong, which came out when I was in college — I’m unable to comment on the merits of the Zelda series or its newest iteration. I do, however, have two reactions. First, a princess named Zelda? The name Zelda, for me, doesn’t conjure fantasies of medieval kingdoms; it makes me think of F. Scott Fitzgerald’s mentally unstable spouse or some Long Islander’s Jewish grandmother. But my more significant beef is with the full name “The Legend of Zelda.”

The story of Zelda, whatever it may be, is not a legend. A legend is a tale that has come to us from so distant a past that we have no way to trace its beginnings. In fact, it’s so embedded in our culture that its historical accuracy — or lack of it — has become irrelevant; the fact that it has been retold over so many generations grants it its own sort of authenticity. King Arthur’s court is a legend. Robin Hood’s band of merry men is a legend.

Zelda, by contrast, was invented (according to Wikipedia) by a pair of Japanese game designers in 1986, and her adventures have been made up as they go along. If her story is referred to as a legend, then it’s a faux legend, an imitation of a legend.

So why is this series of games called “The Legend of Zelda” rather than “The Story of Zelda” or “The Adventures of Zelda”? Obviously, it’s for reasons of marketing. The use of the word “legend” gives the series a feeling of weight and significance that it doesn’t actually merit. If I were a game player, my realization that the marketers of Zelda are trying to manipulate me by using that emotionally resonant word would immediately put me on my guard and make me less likely to want to buy the product.

I admit that most people would consider this an overreaction. I do confess to being unusually sensitive to the idea of being manipulated, and to becoming more so as time passes. Most recently, I’ve become irritated by the use of underscoring in movies to tell me how I’m supposed to feel about a scene.

If you watch the earliest sound films from the late 1920s and early 1930s, you’ll notice that there’s no background music at all. Partly this was for reasons of technology — originally, there was no way to add music to a scene after it was shot — but it was also because filmmakers were concerned that if music was heard behind the characters’ dialogue, audiences would wonder where the music was coming from. A film’s score, if any, was limited to “diegetic” music: music that comes from a specific source in the world of the film — such as a band in a nightclub or a phonograph in an apartment, which would almost always be shown onscreen.

I’ve really come to appreciate the spare sonic landscape of those early films, where often the only extraneous sound is the random crackle of the soundtrack. It gives the film a feeling of immediacy and authenticity that’s missing from the carefully crafted sound mixes that came later. When I watch a modern film, I often find myself mentally filtering out the background music to sense whether the scene works dramatically without it. Much of the time, it doesn’t — which makes me resent the director who, in my mind, is cheating by using music to manipulate my emotions in a way that the scene doesn’t accomplish on its own.

Given my feelings about manipulation, you can imagine how mystified I am by the fact that “influencer” is now an accepted job title in the world of social media. Of course, celebrity endorsements have always been used to market products, but the intent to manipulate wasn’t nearly as overt as it is now. If anyone came to me and said, “I am an influencer, and I’m here to influence you,” I’d run the other way. So who are the people who are agreeing — nay, demanding — to be influenced, and thus, by definition, handing over their free will? And why would anyone choose to be one of them?


Ruling Out

I’m generally known to be a rule-follower, but that’s only because I can see how most rules make sense. There’s a flip side to that coin, however: If I don’t see how a rule makes sense, and if no one can explain to me why the rule is in place, then I feel no obligation to follow it.

For example, at the community college where I taught digital media courses, there was a rule that we were not supposed to dismiss a class early. I was told that the rationale was that it could be seen as shirking on the part of the faculty member, depriving students of their money’s worth, and granting them academic credit that they hadn’t earned through time in the classroom. But none of that made any sense to me. Dismissing a class early wasn’t the same as requiring students to leave early; anybody who wanted to stay for the remaining time was welcome to, and would receive my full attention. And besides, I’d always told my students that instead of pushing themselves to solve a seemingly insoluble problem, the best thing is to go away, do something else, and then come back to it later, at which point they’d probably be able to solve it quickly and easily. To me, it often seemed that allowing them to forget about digital media for a while would be more productive than continuing to force-feed them with it. So I would ignore the rule and dismiss the class. The students did fine, and the educational system did not collapse.

Lately, however, I’ve come to realize that my attitude toward rules is sort of a self-centered one. Even if nobody is able to explain a rule in a way that makes sense to me, it doesn’t necessarily mean that there’s something wrong with the rule; it may just mean that I’m asking the wrong people to explain it.

I remember that when I was growing up, my parents used to take my sister and me out for an occasional afternoon at the bowling alley. Both of my parents were pretty good bowlers; I, unsurprisingly, was not. My father was constantly giving me advice to improve my game: Look at the dots on the lane, not at the pins; roll the ball so that it arcs toward the pins rather than traveling in a straight line, and so on. For the most part, I’d compliantly follow his instructions, but there was one step I strongly resisted: the thing he called “following through.”

“Following through” meant that after releasing the ball, I was supposed to continue to swing my arm upward until it reached about chin height. To me, this felt ridiculous. How could anything I would do after releasing the ball possibly affect its trajectory? I think the term “following through” was actually part of the problem — it implied that this was a separate step to be tacked on at the end. I certainly visualized it that way, which of course made it ineffective. After trying it a few times and feeling silly, I refused to do it anymore until my father could explain how the continued motion of my arm could influence a ball that was already partway down the lane.

My father worked at the time as a mechanical engineer, so you’d think he could have given me a quick lesson in physics and momentum. But for whatever reason, he didn’t — probably because the physics of it never occurred to him. He simply kept insisting that following through was part of good form, and I should just do it. I stubbornly declined to do a thing that made no sense to me. Naturally, my ball consistently ambled down the lane instead of hurtling the way my father’s did, and I reluctantly chalked it up to my general lack of athletic ability.

I swear that it wasn’t until I reached adulthood that I worked out what following through was about. I discovered that if I swung my arm in a fluid arc and released the ball midway, the ball traveled with much more speed and precision. Of course the motion of my arm had no effect on the ball after it was released — I had certainly been right about that. What I hadn’t realized, and what my father somehow failed to explain, was that creating the optimal conditions for launching the ball required my arm to keep moving afterward. As they say: Duh!

Recalling this outcome late in life has given me a lesson in humility. If a rule fails to make obvious sense, it’s not necessarily due to a deficiency in the rule; it may just be that I and whoever created the rule are using different frames of reference. I may be right based on the way I’m framing the problem; they may be right within their framing of the problem. The challenge is for each of us to see outside of our own frame of reference and understand the other’s. Until I’m able to do that, I need to do what I was unable to do with my father: give him the benefit of the doubt.


Double Door

As my old friend Regina could tell you, I’m usually able to appreciate a good pun. But there’s one pun-based riddle that has always annoyed me: “When is a door not a door?” The answer is, “When it’s a jar (i.e., ajar).” When I was a child, the riddle was incomprehensible and hence not funny, simply because I was unfamiliar with the word “ajar.” Even when the word eventually entered my vocabulary — in a house shared with cats and kittens, a door that’s ajar can be a bad thing — the riddle still irritated me, because its premise is clearly untrue. When a door is ajar, it doesn’t stop being a door. It’s a door and it’s a jar.

As unintuitive as it seems, two seemingly contradictory things can be true at the same time. The difference lies in the context. For example, in the context of ethnicity, I’m Jewish. In the context of religion, I’m not. A Nazi would say that I’m Jewish, a Hassid would say that I’m not, and both would be correct.

I belong to a chorus whose music, with few exceptions, is arranged for soprano, alto, tenor, and bass. My voice would normally be classified as baritone, but that’s not one of the options, so I found my home in the tenor section. In the context of the chorus, I’m a tenor; in the context of solo singing, I’m a baritone. Both can be true at the same time.

These examples — to me at least — feel uncontroversial. So I wonder why cases involving gender can’t be equally straightforward. My chorus (if I may use it in another example) used to include, in each season’s repertoire, one song performed just by the men and another performed just by the women. There’s a long tradition of this in choral music: Men’s voices and women’s voices have such distinctively different qualities that songs are often arranged for one or the other. (Think of the contrasting sounds of the Mills Brothers and the Andrews Sisters.) But a few years ago, we permanently dropped the men’s and women’s songs in order to avoid discomfort for gender-nonconforming people.

I’m all for sparing people unnecessary discomfort. But is it really necessary to eliminate entire categories of music? Just about every person I’ve ever met, regardless of their gender identity, has what is traditionally considered a man’s voice or a woman’s voice. Can’t we say to someone, “In the context of society, you’re a trans woman/nonbinary person/gender-fluid individual, but in the context of choral music, you’re a man”? (People whose voices are not neatly categorizable can choose the group that they fit into more comfortably, just as I, a baritone, chose tenor over bass.)

I hesitate to wander into politically disputed territory, but can’t we say the same thing about restrooms? I’m a great fan of urinals. They use less water, take up less space, and require fewer surrounding walls than toilets. Practically speaking, it makes sense for anyone who can physically use a urinal to use one. Perhaps one day there will be one big room where everyone can urinate into the fixture of their choice, but for now, urinals are almost universally found behind the door marked “Men.” So can’t we say, if only for environmental reasons, that — strictly in the context of elimination — anyone who can make use of a urinal is considered a man?

I understand that there is more at stake here than practicality. I can see how for someone who has been maligned, demeaned, threatened, or attacked for not conforming to traditional notions of gender, the idea of being asked to accept a label that they have so long fought against would be abhorrent. But if we could break from the habit of assigning a single label to each person and instead recognize that everyone can have multiple labels in multiple contexts, the meaning and force of any individual label would be reduced. If a door can simultaneously be a door and ajar, can’t a person simultaneously be a man and a woman, depending on the context? And wouldn’t that then be true of most of us?


This Is the Way We Watch Our Words

My wife Debra and I were at a recent social gathering where a friend remarked that he “felt badly” about something. Then he immediately stopped. “Wait,” he said. “Should that be ‘felt bad’ instead of ‘felt badly’?”

“It’s ‘felt bad,’ ” Debra said. “‘Felt badly’ means that you weren’t very good at the act of feeling.”

“But ‘felt’ is a verb,” someone said, “and so you have to use an adverb. And ‘badly’ is an adverb, right?”

“Yes,” said I, unable to contain myself. “But ‘to feel’ is a copulative verb.”

Given that this was a well-educated group, most of whom worked with language professionally, I guess I expected their response to be something along the lines of, “Ah, yes, of course.” But instead, all heads turned toward me incredulously. “A what?” said at least one person.

“A copulative verb,” I said. “A verb like ‘to be’ or ‘to appear.’ It takes an adjective rather than an adverb.”

“But what about ‘I feel well’?” said the original speaker. “ ‘Well’ is an adverb, isn’t it?”

“Sure,” I said, “but it can also be an adjective — as in ‘I’m not a well man.’ When you say ‘I feel well,’ you’re using it as an adjective.”

I shouldn’t be surprised at the general unawareness of copulative verbs. They’re not the kind of thing that comes up in everyday conversation. (Some Googling revealed that they’re now more commonly called “copular” verbs, presumably because it sounds less dirty.) I’m sure that I would never have heard of them, except that the concept was drilled into me when I was in fourth grade.

Yes, fourth grade! I’ll grant that I didn’t have a typical elementary school education, but for some reason my teacher was so convinced of the importance of copular verbs that she taught us a song about them (sung to the tune of “This Is the Way We Wash Our Clothes”):

Act and feel and get and grow,
Be, become, stay, and seem,
Look, sound, smell, and taste
Are all copulative verbs.

I have to wonder, though: Apart from allowing me to show off at parties, is this bit of knowledge an efficient use of my rapidly diminishing brain cells? I loved studying grammar when I was a kid, especially when we started learning to diagram sentences (also in fourth grade). Being able to precisely parse the structure of a sentence allowed me to write with increasing confidence and authority, a skill that got me surprisingly far in life.

But how important is it, really? So far as I can tell, kids in the 21st century study very little grammar — not much beyond learning the difference between a noun and a verb — and yet their communication skills appear adequate for most purposes. After all, when we use language, we generally don’t follow a conscious set of rules; we speak or write instinctively, based on example and habit. Even if no one at the aforementioned social gathering was familiar with the concept of copular verbs, they’d still most likely say “She looks young” rather than “She looks youngly.”

Knowing the rules can help when questions come up, but even those instances don’t seem to matter much. Smart, educated people can say “I feel badly” — and often do — without anyone questioning their intelligence or level of education. I’ve come to feel that the finer points of grammar are just something for language nerds to amuse themselves with, much as baseball fans argue about batting averages and RBIs. It’s a harmless activity, but it’s not going to contribute much to the progress of civilization.


School Work

There used to be a tradition among Princeton alumni of being coy about where we went to college. If the subject came up, we’d say something like “I went to school in New Jersey,” because merely admitting that we’d attended Princeton could be seen as boasting.

I’ve lately come to wonder why attending an elite university is considered something one might boast about. It’s not as if I worked harder than my high school classmates in order to earn my spot at Princeton — if anything, I slacked off more than I should have. I simply was endowed with some character traits that made me more likely to get in: a capacity to absorb and retain information, an ability to write convincingly, and a knack for test-taking. I didn’t strive to acquire any of those competencies; I just had them. So what is there to be proud of?

So much of our social structure is built on the idea that anyone who works diligently enough can achieve success. In many cases, I’m sure that’s true — there are plenty of people who work hard and are rewarded for it. But there are just as many people who work hard and are not rewarded, and then there are those few who are rewarded without having to work at all. Some of these different outcomes result from racism, classism, and other forms of discrimination, but it seems obvious that most of the differences are the product of simple chance: Some people come into the world more athletic than others, some more cerebral, some more aggressive, some more intuitive. Whether such traits are the result of nature, nurture, or a combination of both doesn’t matter — the fact is that the people who have them have done nothing to earn them.

I recently saw a documentary about Ava Gardner, who was a popular movie star in the 1940s and ’50s. She was the daughter of impoverished North Carolina sharecroppers, and had no thought of modeling or acting, but she was offered a contract at MGM simply because a talent scout had been stunned by a framed photo of her in a portrait photographer’s window. It’s said of her that after becoming a star, she worked hard to become an actress. (I’ve heard the same thing said of Marilyn Monroe.) But she never would have had the reason or opportunity to do that work — at which, by all accounts, she succeeded — if it hadn’t been for her extraordinary beauty. That beauty wasn’t acquired; she was born with it.

I try to imagine a society in which people are rewarded not for qualities that they have by random chance, but for what they’ve done to advance from the place where they started. I don’t think such a thing would be possible, since there’s no objective way to measure effort or sacrifice. I do know that when I taught digital arts courses at a community college, I made it clear that students who appeared to have a natural talent for art or design would not be given any advantage over those who didn’t. Everyone would be evaluated only for their ability to master and apply the specific skills I taught in my class. Other art teachers took the opposite approach — for them, the important thing was the overall quality of a student’s work, since that’s what would matter in the professional world. But I couldn’t reconcile myself to grading students according to criteria that — at least, in part — they had no control over.

I like to think that in a just world, my spot at Princeton would have been given to someone who worked harder than I did to get there. They probably would have made more of the opportunity than I did. I’m not saying that my time there was wasted — I contributed a lot to the Princeton community, and a Princeton education contributed a lot to me — but I’m sure that would be equally true of almost anyone who got the chance to be a student there. These days, if I’m reluctant to say where I went to college, it’s not necessarily to avoid sounding like I’m boasting, but more likely to avoid giving the impression that I was especially worthy.
