Store-ies

When I was in my early 20s, a new takeout place opened up on my street. I don’t remember the name of the shop, but a sign in the window announced that it sold gyros. Since my diet as a college student had consisted mostly of dining-hall food and pizza, the word “gyros” was new to me. I was curious to try one.

I walked into the store and encountered the proprietor, a large, bald, scowling Greek man who stood behind a high counter. “Could I have a gyro, please?” I said.

The man leaned forward indignantly. “A jye-ro?” he said, mimicking my pronunciation. “What’s a jye-ro?” Then, rising to his full height, he barked, “It’s a…” followed by a Greek word that bore no resemblance to “gyro.” It sounded to me like “hee-ros,” with a hard “s” at the end indicating that it was singular, not plural.

“Thank you, I’ll remember that,” I squeaked out. Despite the humiliation I experienced in acquiring this food item, it turned out to be delicious — definitely something I’d want again.

So a couple of weeks later, I strode confidently into the shop. “I’d like a hee-ros, please,” I said.

The man stared at me with bulging eyes. “Hee-ros?” he bellowed. “You want a hee-ros?!” He paused as I looked up at him helplessly. “The word is YEE-ros!” he said. “Hee-ros means PIG!”

To this day, whenever I order that particular Greek delicacy, I’m careful to ask for a “yee-ros.” And invariably, the person behind the counter looks at me quizzically and says, “You mean a gyro?”

There’s not much point to that tale; I just like it. But it makes me think of another store that opened a few years later on the same street. Not only don’t I remember the name of the store; I don’t think it had a name. This store’s only products were clocks and ceiling fans, which struck me as a delightfully random combination — like shoes and lightbulbs, or paper clips and accordions. Quirkier still was a little display in the corner that had nothing to do with either clocks or ceiling fans. It was called “Adopt an Ancestor,” and it consisted entirely of framed, antique photos of anonymous people, most likely acquired at estate sales. As was the style in the days those photos were taken, their subjects were stiff, well dressed, and stern-faced.

We think of ghosts as people who died but had unfinished business on earth, and therefore are unable to extricate themselves from the world of the living. It struck me that the invention of photography had turned all of the unsuspecting people in those pictures into ghosts. Their physical bodies had turned to dust years before, but their monochrome images continued to live among us, unidentified and untethered, flickering in the light of twirling fan blades.

Part of me wanted to adopt one of them, to make a place for him or her in my family tree. But none of these ghosts looked the least bit Jewish, so it would have been an awkward fit. Eventually the question was moot, because the store went out of business not long after it had opened.

I sometimes wonder where those photos are now, and whether their subjects have found a well-earned place of refuge. Even if they have, their place has been taken by the thousands — or more likely millions — of photographic ghosts that have been released into the netherworld since then.

When I went through my mother’s house after her death, I found a closet stacked from bottom to top with boxes of old family photos. I rescued a couple of boxes’ worth, but the rest stayed in the closet, waiting to be gathered by the company we’d hired to do the final cleaning out. Although I couldn’t bring myself to toss the photos into a dumpster, I do hope they’re decaying in a landfill somewhere, and not floating anonymously through the world, waiting for somebody to adopt them.


Hat Check

One spring day when I was in my early 20s, I decided that I needed to get a hat. I asked a female coworker — someone I knew to have good taste in clothing — what sort of hat she thought would look good on me.

“Mark,” she said with great sincerity, “you don’t have enough style to wear a hat.”

I can’t pretend that her remark didn’t hurt, but I suppose she was right. I’ve never been known for my fashion sense. Not only have I dressed the same way for my entire adult life — khakis or jeans, a button-down shirt, and a cotton sweater — but I hardly ever notice what other people are wearing (unless they happen to look exceptionally good or exceptionally bad). When I saw a headline the other day that said something like, “High-waisted jeans are out; low-rise jeans are coming back,” what I took away was that there apparently are different flavors of jeans. (I’d always thought jeans were just jeans.)

What struck me about the hat comment, though, was that the style of wearing a hat was evidently more important than the usefulness of the hat. Of course, appearance does matter — if I’m going to wear a hat, I want it to be one that looks good on me — but that consideration is secondary. It seems obvious that the primary consideration should be that the hat will do what I need it to do (which, in this case, was to protect me from the sun).

What feels obvious to me, however, appears not to be as obvious to everyone else. People who wear hats indoors (a behavior that I’ve found inexplicably common among jazz musicians), or people who turn their baseball caps around so that the sunshade is in the back, seem not to be particularly concerned about the purpose of a hat. Then there are the many people (mostly women, but some men as well — think Frank Sinatra) for whom being stylish requires wearing a hat at an angle. Take Lauren Bacall, in the 1940s photo at the top of this post, whose severely tilted beret leaves half of her head exposed to the elements for no good reason.

For an even more extreme example, consider this portrait of the entertainer Josephine Baker, taken in the 1930s (above, to the right of Lauren). The thing she’s wearing on her head does nothing to shield her from sun or rain, or to keep her warm, or to stop her hair from getting mussed (in fact, she probably had to use a dozen hairpins to make it stay on her hair at all). And yet, for lack of a more descriptive word, I suppose we’d still have to call it a “hat.”

Again, I have nothing against fashion. We humans have a hunger for novelty, and continually changing styles help to satisfy that hunger. What baffles me, though, is style at the expense of practicality.

There’s probably no better example of that tendency than women (it is almost entirely women) who have long, painted fingernails. If you know me, you know that I have trouble with the idea of painting body parts in general — I think it looks silly. But since a reliable source (OK, Wikipedia) tells me that women have been painting their nails since 3,000 BC, I have to accept that mine is a minority opinion, and that most people find colored nails to be genuinely attractive. That’s fine; putting pigment on one’s nails doesn’t hurt anybody.

Why, though, do nails have to be long? I’m always mystified about why so many women make everyday tasks — typing, buttoning, opening a can of soda, playing a stringed instrument — more difficult for themselves, when nobody is forcing them to do so. Not only do long nails make fingers less able to do the things that fingers are supposed to do; they also make the nails themselves less able to do the things that nails are supposed to do. When my goddaughter once asked me for help in opening a sealed package, I said, “You’re the one with the long fingernails, so you can do it more easily than I can.”

“Oh, no,” she said, “these nails are purely decorative. I can’t do anything with them.” She explained that her artificial nails were so tightly glued on that if she put too much pressure on them, they would yank her real nails right out of their beds.

The phrase “form follows function,” which was a guiding principle for 20th-century architecture, has somewhat fallen out of vogue. It’s become permissible for architects to add ornamentation to buildings even when it doesn’t serve any practical purpose. However, I don’t know of any serious architect who would design decorative elements that diminish the usefulness of a building. I wonder why that principle doesn’t also apply to hats, fingernails, and other elements of our daily lives.


Guys with Guns

Much has been written to debunk the thesis, popular among gun-rights advocates, that “only a good guy with a gun can stop a bad guy with a gun.” The refutations tend to rely on statistics showing that increased gun ownership leads to increases in rates of violence, or that the instances in which civilians are able to use guns to stop crimes are vanishingly rare. But there’s one simple, common-sense argument that I’ve never seen put forward: Practically speaking, there is no difference between a “good guy” and a “bad guy.”

Given a collection of random people, would you be able to sort them into two groups, one good and one bad? Of course not — there is no real-world marker of goodness or badness. Clearly, then, we can’t expect a bullet to know whether it’s been fired by a good guy’s gun or a bad guy’s gun. It’s just going to go wherever the gun is pointed.

Therefore, it’s meaningless to say that “a good guy with a gun can stop a bad guy with a gun.” All we can say is that one person with a gun can stop another person with a gun. It may be that one is a more skilled marksman than the other, or that one is luckier than the other, but which individual is “good” and which is “bad” doesn’t enter into it.

How, then, can we make it more likely that in any violent confrontation, the right person will prevail? Well, one possibility is that we, through our democratically elected government, could agree on who we want to represent our collective definition of goodness, and allow those people — and only those people — to have guns. We could train them to use their guns conscientiously and safely. We could make them responsible for protecting the rest of us (who are unarmed) against anyone who attempts to do us harm. We could make them easily identifiable as “good guys” by giving them uniforms… maybe even badges. (OK, you see where I’m going with this.)

Unfortunately, this proposal can only work if it’s built on a foundation of trust, and trust is in short supply. There are plenty of people who say, “I’m not going to put my safety in the hands of a government whose interests might be contrary to mine. The only person I can trust is myself. I refuse to relinquish the weapons that would allow me to defend myself against those who wish to hurt me — possibly including the government itself.”

Now, here’s where it gets interesting, because this is the point where I’d feel the urge to take sides: I’d want to say, “Yes, but in taking that position, you’re putting all of us in danger. If everyone has equal access to guns, there’s no guarantee that the ‘good’ person will defeat the ‘bad’ person in any given conflict — the outcome is just as likely to be the opposite.”

A thoughtful gun-rights advocate might then respond, “I acknowledge that if everyone has the right to use a gun, some people are going to die unnecessarily. But if that’s what’s necessary to preserve our absolute right to self-defense, I’m willing to make that tradeoff.”

And there lies the problem: Every policy decision involves a tradeoff. The difference between the two sides in an argument usually comes down to what each side is willing to trade away in exchange for something they consider more valuable.

This may seem like a weird change of topic, but consider the debate over encryption. Several digital communication companies protect their users’ privacy by providing end-to-end encryption — meaning that if I send you a message, it will be transmitted in a highly secure format that can be read only on your device. End-to-end encryption enrages law enforcement authorities, who have long been able to listen in on phone conversations and depend on being able to do the same with digital communications. In order to stop terrorist acts before they happen, they say, they must have a way to find out what potential terrorists are saying to each other. But companies such as Apple tell them, “We’re sorry, but messages sent via iMessage are so secure that even we can’t read them.”

The U.S. government has demanded that Apple build a back door into its communications software that would allow law enforcement to read encrypted messages in cases of potential threats to national security. Apple has refused, and privacy-rights advocates — myself among them — support Apple’s stance. The government, they say, could abuse its access by illegally tapping into conversations that have nothing to do with national security, and then using the gathered information for its own ends. To protect themselves from government agents who can’t be trusted, people should have the right to communicate privately.

Someone could easily say to a privacy-rights advocate, “But in taking that position, you’re putting us all in danger. You’re making it more difficult for law enforcement to predict and prevent terrorist acts.”

And I, as a privacy-rights advocate, would have to respond, “If that’s what’s necessary to preserve our absolute right to privacy, I’m willing to make that tradeoff.”

In other words, there’s no such thing as a “good guy with an argument” and a “bad guy with an argument.” Both sides are using the same argument, but we’re just filling in the blanks differently. In the absence of a perfect solution to a problem — which almost never exists — each of us has no choice but to weigh one potential outcome against another, using our own values as a guide.


Vote of Confidence

On November 8, 2016, an hour into teaching my regular Tuesday night class, I announced that I was going to end the lesson early and turn on coverage of the presidential election. The polls had finally closed across the country, meaning that news organizations could now release their projected results. “OK, turn it on, but you’re not going to like it,” said one student who was already staring at his phone.

Sure enough, when I switched the digital projector over to CNN, I was shocked to see a US map largely drenched in MAGA-red. “This wasn’t supposed to be possible!” I said to myself in horror. Then, a moment later, another thought occurred to me: Why had the student assumed that I wasn’t going to like it?

I had always been scrupulous about keeping politics out of the classroom. I wanted every student to feel included and accepted regardless of whether their political beliefs aligned with mine, and the easiest way to ensure that was never to discuss politics. On every Election Day, I would exhort students to go out and vote — even offering to excuse them from class if there was no other time they could make it to the polls — but never said a word about whom they ought to vote for. Eight years before, during a similar Tuesday night class, I had turned on the news just as Barack Obama was being declared the first Black person elected to the presidency, and felt constrained to show no reaction, even as I watched tears come to the eyes of several of my students of color. Now, watching Donald Trump emerge victorious, I had to avoid displaying my despair.

I don’t know where I got the idea that part of my role as a teacher was to remain politically dispassionate. Certainly, many of my colleagues strongly believed the opposite — that it was their duty to bring politics into the classroom, regardless of what subject they were teaching. The college administration had an explicit policy of advocating for social and environmental justice. Faculty members were expected to support and encourage activism among their students, many of whom came from lower socioeconomic backgrounds or immigrant families.

Not surprisingly, I share those progressive beliefs. Nevertheless, I chafed at being expected — to some degree, required — to hold a particular set of values, and I didn’t want to put my students in that same position. My job was to teach my students how to think, not what to think. If I were to present my values as true, I would be, in effect, telling some number of my students that theirs were false. I’m not prepared to do that.

I’ve long experienced queasiness at what was for a time called “virtue signaling” (and now appears to be called “performative behavior”). When people posted about their political beliefs on Facebook, I wouldn’t regard their posts as saying something about the world — at least not anything that hasn’t been said a million times before — but as saying something about themselves: “Look how noble I am to have these opinions.” In the words of P.T. Barnum, talk is cheap. If I’m not doing something to better the state of the world (and I honestly can’t claim to be doing much), then I haven’t earned any right to tell others what my feelings are about it, and they have no reason to care.

So when that student said, on election night, that I was “not going to like it,” I immediately wondered what led him to assume that I favored the liberal candidate over the conservative one. Had I failed in keeping my opinions to myself, and if so, how? I knew that I hadn’t said anything in class, or even outside of class, and I hadn’t posted about politics on social media. I couldn’t have communicated my desired outcome unconsciously — say, through body language — since until that moment, I didn’t have any idea who was ahead in the election.

All I could come up with was that, having spent a few months in my classroom, the student had intuited that I just didn’t seem like a Trump supporter. I treated my students with respect and was genuinely interested in their ideas. I was flexible. I was compassionate. I often initiated discussions of ethical issues in class, but I hadn’t attempted to impose my values on my students or make judgments about theirs.

If that was indeed the reason, then I could be happy. It meant that I hadn’t failed; I had, in some sense, succeeded. It meant that I didn’t have to explicitly express my values and beliefs; I could simply live them, and know that some people might benefit. It meant that even if I wasn’t doing as much as I should to make the world better, I could take solace in the fact that in my own limited way, I was at least not making it worse.


Give Me a Break

Today is Giving Tuesday, at least according to the many emails I’ve been receiving from charitable organizations. The idea appears to be that we can compensate for the consumerist excesses of Black Friday and Cyber Monday by contributing to the public good on Tuesday.

The sentiment is admirable, but one thing about Giving Tuesday has always irked me. Black Friday got its name because stores promoted heavy discounts for the day after Thanksgiving, and so many shoppers turned out that the retailers’ balance sheets instantly went “into the black.” Cyber Monday came about because, after having their appetites whetted by Black Friday sales, shoppers continued to buy things online when they returned to the office on Monday. The following day, however, became Giving Tuesday only when some nonprofit organization declared it so. Instead of being descriptive of consumers’ actual behavior, Giving Tuesday is prescriptive — it’s a day when you’re supposed to give to worthy causes. And I’ve always been wary of any holiday that has “supposed to” embedded in its premise.

On Martin Luther King Jr. Day, you’re supposed to engage in volunteer activities. On Mother’s Day and Father’s Day, you’re supposed to express your gratitude to your respective parents. On Memorial Day, you’re supposed to honor the men and women who died in military service. Hell, even on Christmas, you’re supposed to be merry. (It wasn’t enough for Scrooge to give Bob Cratchit the day off from work; he was expected to be happy about it.)

In college anthropology classes, I learned the difference between two kinds of social norms: mores (pronounced “morays”) and folkways. Both are sets of rules that members of a society are expected to follow, but they differ in degree of significance. Violating a more — for example, by engaging in racist or sexist behavior — may result in serious sanctions or punishment. Violating a folkway — for example, by chewing with one’s mouth open — usually results only in chiding, shaming, or quiet disapproval.

The “supposed-tos” that are associated with holidays are folkways. I once attended a Memorial Day parade where a man was going from spectator to spectator, handing them cheap little paper poppies with wire stems in exchange for a contribution to the American Legion. When he came to me, I — having no need or desire for a fake poppy — politely said, “No, thank you.” He looked at me with undisguised hostility and said, “What do you mean, ‘No, thank you’?” He shook his box of contributed coins. “Disabled veterans!”

Clearly, what I had interpreted as an optional transaction was actually a folkway, and I was violating it. I dropped some coins in the box and accepted a poppy. I suppose that doing so constituted a charitable act, but there’s no joy in charitable giving when it’s done under coercion.

It would be nice if holidays could be relieved of their coercive aspect. Ideally, a holiday should constitute a reminder — People gave their lives for our country! Your parents sacrificed for your well-being! — and an opportunity to act on that reminder. But if I choose not to give on Giving Tuesday — because I didn’t buy anything on Black Friday or Cyber Monday, and because I make charitable contributions regularly — there should be no shame in that.

Oddly enough, this problem with holidays makes me think of the Pledge of Allegiance. Every morning, from the time I first started attending school, we students were expected to rise, face the American flag with our hands over our hearts, and recite the pledge. As a young child, I had no idea what “allegiance” meant (nor, for that matter, who the mysterious Richard Stans was), but I did it because that’s what we were required to do.

Of course, making the pledge was pointless from a practical point of view. First, a promise made under coercion is not an enforceable promise. Second, because I’m a US citizen, my allegiance to the United States is legally required whether I make the pledge or not. (If I’m found to be giving aid and comfort to an enemy, I’ll be considered a traitor either way.) Third, even if the pledge had some legal or moral significance, it wouldn’t have to be renewed every day — it would be presumed to remain in effect until I specifically renounced it.

Eventually those objections became moot. Around the time I entered high school, it came to light that the Supreme Court had, in 1943, ruled that schools could not require students to salute the flag. Naturally, then, I stopped reciting the pledge, as did almost all of my classmates. But I wondered why the Pledge of Allegiance was still considered meaningful enough that people chose to recite it voluntarily at public events.

What I came to realize is that people like to express their deepest convictions — whether it’s their loyalty to their country, their love of their mother, or their caring for people less fortunate — at a time when they can see other people doing the same thing. It gives us a sense of solidarity and community. Like publicly reciting the Pledge of Allegiance, holidays serve that purpose. We can — and hopefully do — offer our gratitude for life’s blessings many times throughout the year, but there’s something special about doing it on Thanksgiving. I suppose there could be something similarly special about giving to charity on Giving Tuesday.
