Long Lived the Queen

Since most of my blog posts are not time-sensitive, dealing as they generally do with events that happened 40 or 50 years ago, I’ve never been in a hurry to make them public. Some of them were written weeks or even months before you see them, giving me time to rethink them and tweak them and sometimes even throw them away.

This is one of the rare posts that are written on the same day they are published. We’re in the UK, you see, and the queen is dead.

Debra and I arrived in London a week ago for a two-month stay. We settled into our basement flat in West Kensington and strolled over to North End Road, the nearby shopping street, to check out the neighborhood. I’m always self-conscious when I first arrive in a foreign country, feeling like everything I’m doing is wrong. Am I wearing the wrong clothing? Am I talking too loudly? Am I supposed to walk on the left side rather than the right? Suddenly an older woman stopped in the middle of the sidewalk and stared right at me. I was about to apologize for whatever I’d done wrong, when she gasped, “The Queen has died!”

I don’t know what one says in a situation like that. What I said was, “I’m sorry,” which is what you say when someone tells you that their grandmother has passed away. Hopefully my American accent made my response seem less inappropriate.

It’s not clear how the woman had gotten the news at that moment — was she in the middle of a phone call? — but clearly the queen’s death was not yet common knowledge. Debra and I looked at each other. “We’d better get some groceries, quick, before the news spreads and the whole city shuts down,” I said. We ran into a convenience store and bought a few prepackaged meals, then went next door and snagged some dinner from a fish-and-chips shop that was about to close. That was our first day in London.

As it turned out, the city did not shut down. I’m sure that plenty of people watched the 24-hour coverage on the BBC; many others gathered outside Buckingham Palace, despite the fact that no members of the royal family were inside. But life remained surprisingly normal in the subsequent days — the pubs and the theaters remained open, the stores engaged in business as usual, and young people continued to promenade along the south bank of the Thames.

It was only when you talked to Britons — particularly older ones — that you found out the truth. Life was normal on the surface, but not so in people’s hearts. People told us that they felt cast adrift, that the world suddenly felt unreal. “I’m not especially in favor of the monarchy,” was a common comment, “but even so, she was a source of stability and continuity. She’s been the Queen all my life!” King Charles III feels like a barely adequate replacement.

I can’t help thinking of the afternoon of November 22, 1963, when John F. Kennedy was assassinated. Students were dismissed from school early, and I arrived home to find my mother in tears, sitting at the kitchen table surrounded by crumpled tissues. It’s difficult to imagine anyone today having such an emotional reaction to the death of a national leader, but the connection between Queen Elizabeth and her subjects seems to come close.

I don’t mean to equate the two events. JFK’s death was sudden, shocking, and horrifying, while the queen’s had been anticipated for years. Her death was natural; his was not. But the one thing that both deaths seem to have in common is that for the citizens whose leader had been lost, the world never felt the same afterward.

In the case of Kennedy, America permanently lost its innocence — it was as if Adam and Eve had just eaten the fruit from the Tree of Knowledge and suddenly realized that they were naked. JFK had been a symbol of youth, energy, and optimism, of the best times that were yet to come, and now that vision of the future was exposed as an illusion. Even though I was a child, that shock of recognition felt very real to me.

Queen Elizabeth, for her part, was the British people’s last connection to a long-mythologized past — a time when Britain was at the center of an empire and a leader of the world, a small but noble nation that stood bravely against the Nazis in World War II, a symbol of the superiority of Western civilization. That whitewashed characterization of the UK’s global role is no longer accepted intellectually, but it has always remained potent emotionally. With the queen’s passing, the last tether of the present to the past has given way.

The queen’s funeral is scheduled for next Monday, and on that day the city — and the country — really will shut down, as I remember happening in the United States during the funeral of JFK. Just as my eyes were glued to the small, flickering screen of our black-and-white TV in 1963, I’ll be watching the ceremony intently — albeit this time on a large, bright, flat screen in vivid color. The first broadcast was about the death of the future; the new one will be about the death of the past.


Dead to Rights

When my mother was told, eight years ago, that nothing more could be done to treat her pancreatic cancer, she was undaunted. She was not about to let something as important as her death escape her control. She somehow managed to enter hospice over Labor Day weekend, making it as convenient as possible for all of the out-of-town relatives to fly to her Florida hospital room. She told us exactly who should cater her shiva, and instructed us to order food for 75 people. (Taking into account my mother’s popularity, we instead ordered food for 100 people, and ended up with precisely 25 people’s worth of food left over.)

Most important, she had long since arranged and paid for her funeral and burial. (She was proud of having nabbed “waterfront property” for her gravesite, which was her half-serious way to describe its location next to one of the cemetery’s small ponds.) All we, her offspring, had to do was go to the funeral home and sign some papers.

There was one small omission in her planning. “Did she belong to a synagogue?” the funeral director asked us. The answer was no — her second husband, Eddy, had never been a fan of attending services. “Well, we’ll need to get a rabbi to officiate at the funeral,” the director said. “What kind of rabbi do you want?”

I was not prepared for that question. It hadn’t occurred to me that rabbis came in kinds. For lack of a more sophisticated response, I said, “A rabbi with a sense of humor?” That turned out to be the perfect answer. “Ah, I have just the right person for you,” the funeral director said, and the rabbi we were matched with did turn out to be an ideal choice.

What strikes me now is how the funeral director phrased the question. She did not ask — as she well might have — “What kind of rabbi would your mother have wanted?” Clearly, the rabbi’s primary function was to make us, the mourners, feel comforted. My mother’s preferences, whatever they might have been, did not need to be taken into account. She was, after all, dead.

The reason I need to state this so bluntly is that people’s condition of being deceased rarely seems to get in the way of their wishes being carried out. I’m puzzled by the deference that’s given to the feelings of someone who is no longer equipped to have any.

I’m happy that my mother planned her funeral in advance — not because that allowed her to have the funeral she wanted, but because doing so took the burden of arranging it off her survivors. I’m grateful that she made a will — not because it means that her property was distributed in a way she would approve of, but because it relieved her survivors of having to squabble over who was entitled to what. For some reason, actions that I consider altruistic — done as a favor to the next generation — are routinely treated as if their sole purpose is to benefit the deceased. This makes no sense to me, because short of resurrection, nothing can be done to benefit the deceased.

These thoughts come to mind because of an article that I recently read in The Guardian, about the man who pretty much invented the idea of dead celebrities’ likenesses being owned by their estates. It had previously been established that celebrities were legally entitled to “publicity rights” — the right to decide who got to make use of their names and faces, and for what purposes. This made perfect sense, since living celebrities might object to being portrayed as endorsing a cause, or a product, that they didn’t in fact support. Secondarily, it gave those celebrities the sole right to profit — or to license others to profit — from their hard-won fame.

In the early 1980s, a lawyer named Roger Richman promoted the idea that since publicity rights constituted a financial asset, they could be inherited after a celebrity’s death, just like any other asset. He became well known for his aggressive representation of the estate of Albert Einstein, immediately suing anyone who put Einstein’s face on a T-shirt or otherwise used his image for a purpose not authorized by Einstein’s estate. His litigiousness is estimated to have earned $250 million for the Hebrew University, which currently owns Einstein’s publicity rights.

But publicly, this financial arrangement is portrayed as being for Einstein’s benefit — protecting his image from being sullied by association with ideas or organizations that he would not have approved of in life. For example, Einstein’s persona may not be used to promote tobacco, alcohol, or gambling (which is interesting, since he is well known for his habitual pipe-smoking).

It’s quite possible that Einstein’s heirs do feel a moral duty to maintain the purity of his reputation, apart from whatever financial gain that continued purity brings them. They’re certainly entitled to hold that belief, if it brings them comfort. But we need to dispense with the fiction that this protection is somehow owed to Einstein himself, because it’s “what he would have wanted.” Einstein died 67 years ago, and any wants he might have had died along with him. There are a number of things that dead people cannot do, and holding opinions is one of them.


Deception Reception

If you’ve ever commented on one of my blog posts and wondered why it took some time for the comment to show up, it’s because I have to review them all manually to screen out automated comments from Russia. Unlike the Google chatbot that’s been in the news lately, these Russian bots are nowhere close to sentient — their comments have nothing to do with the post they’re attached to, and they never say anything self-revealing. As a public service, I offer some samples (translated from the Russian by Google), with links omitted:

The most important component in the foundation of a bookmaker’s office are all silhouettes as well as lines.

An extremely important part of your outfit is a motorcycle helmet. It’s more important though that your riding gear fits you exactly and fits your size.

Blackbutt is a unique ecoregion inhabited by many peoples, with snowy mountain domes separating the subtropical coastline.

Although these comments can be informative — for example, I never knew that it was possible for mountain domes to separate a coastline — I’ve never been able to figure out why the bots (or, rather, their human overseers) go through the effort of posting them. Are there really readers who are so interested in a blog post that they go on to read the comments, encounter a comment that is clearly irrelevant and mercenary, and then are so stimulated by what the comment says that they go on to click a link? I want to know who those people are.

I don’t mean to be patronizing here. I understand that there are people who are inexperienced in the ways of the online world and are therefore vulnerable to scams. (I’ll admit that one phishing email was so well made that even I fell for it.) What I don’t get is why some people willingly trust strangers whose intent to deceive is right out in the open. For example, if you receive an email whose subject line says something like “You’ve won our grand prize!” but whose message, when you click on it, turns out to be an ad for generic Viagra, you’ll probably delete it immediately. But the continued existence of such emails implies that there are recipients who instead say, “Haha! You’ve successfully tricked me into opening your email! Therefore I will send you money.” The existence of such people mystifies me.

Slightly better than the sellers who openly deceive their potential customers are those who willfully annoy them. I’m thinking of the merchants who leave flyers tucked under the windshield wipers of my car, forcing me to (A) physically handle the flyer, (B) read it to make sure it’s not a parking ticket, and (C) carry it around until I can find a place to recycle it. My temptation is always to take such a flyer to the originating business and hand it back, saying “Excuse me, but you accidentally left this on my car, and I’m sure you want it returned,” but that response would only be effective if everybody did it. Instead, there are apparently people who treat the flyer as an incentive to order a pizza, go to a nightclub, or whatever, thereby rewarding the business for wanton littering.

Finally, there are those sellers — generally, but not always, online — who consider me the kind of person who would sell out my friends. “Here’s your personal link,” they’ll say. “Anytime a friend uses this link to buy our product, you’ll get a reward!” In other words, they’re saying that they know that my friend wouldn’t be interested in receiving advertising from them, but that perhaps my friend would be open to getting advertising from me. Even in the rare event that I would really have wanted to recommend a business to friends, this strategy makes it much less likely. (And if I do recommend it to a friend, I certainly won’t give them my “personal link,” which would constitute a clear case of conflict of interest.)

Speaking of recommendations, how do you handle those surveys that ask you “On a scale of 1 to 10, how likely are you to recommend our product/service to a friend?” Even if I really like the product or service, my answer is always 1, because I rarely give friends advice on what to buy. (If they end up being unhappy about their purchase, I’ll feel partly responsible.) I once admitted this in a Facebook post, and one of the commenters got very upset about it. “You just fucked up their performance statistics!” she said, but all I was doing was honestly completing the survey. If the business wants me to give them a 9 or 10, they’ll have to ask a better question.


Twist and Shout

During the 1964 presidential campaign season, my father took me to an outdoor event where the Democratic candidate, Lyndon Johnson, was supposed to make a personal appearance. We milled around as various people made speeches, punctuated with announcements that Johnson would be arriving at any time now. I’m not sure that anyone was listening to the speeches; we had come only to see the man himself. After a couple of hours and no sign of Johnson, we drove home, tired and disappointed. This was my first experience of what I suppose would be called a political rally.

I felt bad for the people at the podium. Their job, I assume, was to get the crowd fired up, but their oratory just melded into an indistinguishable drone. Even if LBJ had made an appearance, it’s hard to imagine the spectators becoming a demonstrative mob. I’m guessing that most of the people there, like my father and me, were just normal, suburban Long Islanders who were curious to get a glimpse of a presidential candidate. We already knew that we were voting Democratic (at least those who, unlike me, were old enough to vote), and there was no need to get excited about it.

Ever since then, I’ve wondered about the crowds who we always hear cheering, shouting, and applauding at televised political events. In my experience, backing a candidate in an election is always a matter of choosing the best from an imperfect bunch and crossing your fingers that enough other people will make the same choice. It’s business. I have no more urge to root wildly for a politician than I do for the pumpkin that I select at a Halloween pumpkin patch.

It’s probably worth noting that I have an aversion to having my emotions toyed with. I recently heard Eddie Muller (the acknowledged maven of film noir) say on the Turner Classic Movies channel that he resents it when a movie tries to evoke tears. “Once I realize they’re manipulating me to cry,” he said, “I will not do it.” I would expand that sentiment to any sort of emotional manipulation. Nobody can tell me what to feel and how to express that feeling.

The one time I participated in a political demonstration was in the autumn of 2002, after the US — having already gone to war in Afghanistan — was preparing for an invasion of Iraq, all in retaliation for the September 11 attacks. The idea of responding to violence with more violence was abhorrent to me, and I was moved to attend a peace march organized by a group called Not In Our Name. I marched silently through the streets of San Francisco, hoping that my physical presence, along with the presence of thousands of other people, would influence our leaders in Washington to rethink their plans. (Spoiler alert: It didn’t.)

Having had no experience with this sort of event, I wasn’t prepared for the whole parade of us to end up gathered in a plaza, listening to overamplified speeches by people expressing their grievances — not only about the impending war, but about the sorry state of the world in general. I don’t know what the point of this was, since our participation in the march should have been evidence that we already felt the same way. Clearly, we were being baited; we were supposed to respond by vocally expressing our righteous anger. For some people, I guess that joining in the call-and-response chants was cathartic; for me, it was a sign that it was time to go home.

My discomfort with this sort of thing goes beyond the political. I’ve always been annoyed when the performers at a large concert delay their entrance until a warmup announcer has prodded the audience into explosive, but artificial, enthusiasm. “Are you ready to have a good time?” the announcer will say. My natural answer would be, “What do you think? Would I have paid this much money — not to mention those outrageous Ticketmaster fees — and come all of this way to be scanned by metal detectors and herded into a densely packed space, if my intention were not to have a good time?” Being a man of few words, I would be willing to condense all of that into a simple “Yes.” But instead, I find that the audience has been conditioned to respond with a roof-rattling cheer.

Of course, that’s not enough for the warmup announcer, whose eardrums have apparently been severely damaged by the effusiveness of previous crowds. “I can’t he-e-a-r you!” he’ll shout.

“I’m sorry about your hearing loss, but take my word for it; I’m definitely ready,” I’m prepared to say. But the rest of the audience responds with an even louder cheer, accompanied by foot stomps and whistles. Not until the sound level reaches a prescribed point on the decibel meter will the announcer deign to let the performance begin.

Let me tell you: I was on stage many times at one point in my life, and I never felt the need to have someone tell the audience how enthusiastically to applaud. I always subscribed to the idea that it’s my job, as the performer, to earn the applause. If the audience response is tepid, I know that it’s not because they’ve been insufficiently manipulated; it means that I haven’t been working hard enough. Even then, if they prefer to sit quietly and think to themselves, “I’m enjoying this,” I have no legitimate reason to object.


After a Fashion

The good shirts

The community synagogue was the center of my parents’ social life in the 1960s and ’70s, and at least once a month they would attend one of its fundraising events — casino nights, galas, auctions, and rummage sales. Unfamiliar with the word “fund,” I construed that my parents were going to “fun raisers,” and I lamented that I wasn’t old enough to participate in raising the fun.

The event I (vicariously) enjoyed the most was the annual Journal Dinner, a formal-dress affair at which each participant received a copy of the “journal.” The journal was a magazine-sized publication in which members of the congregation could buy ads — a single line, an eighth of a page, a quarter of a page, and so on up the affluence scale, with the most expensive being a full-page ad printed on metallic paper. Each ad consisted of the name of the donor (or, more often, the donating couple) in bold type, optionally accompanied by “Best wishes,” “In honor of…,” or “In memory of….” That was it — the journal had no other content besides the ads, arranged in increasing order of extravagance. I loved leafing through the journal the morning after the dinner and discovering who had bought what. I was perpetually embarrassed that my parents could only afford an eighth-page ad, but I loved getting to the end of the journal where the rich people were gathered, and seeing each donor’s name tastefully emblazoned in the center of a gleaming silver or gold page.

Another synagogue fundraising event that my parents (or, more likely, just my mother) attended annually was the fashion show. Unlike the Journal Dinner — which I understood as a clever means for members of the congregation to flaunt their wealth by buying a bigger ad than their neighbors did — the idea of a fashion show made no sense to me. When I asked my mother to describe a fashion show, she said that it was an event where models dress up in fancy clothing and an audience pays to look at them. That can’t be right, I thought to myself. People pay to watch other people wear clothes? Don’t we do that every day for free?

Eventually I learned that there was such a thing as styles of clothing, that they changed periodically, and that people were always eager to find out what the new styles would be. Wearing the most fashionable clothes was somehow equivalent to buying a full-page ad in the journal. I remember my mother joking about how every few years, the fashion designers would proclaim, “Up with the hemlines!” and women would run out to buy shorter skirts and dresses; then, a few years later, the designers would proclaim, “Down with the hemlines!” and women would run out to buy longer ones. At first I thought that these proclamations were rules that women were somehow required to follow; it wasn’t until later that I realized that women (and men, with regard to the widths of neckties and lapels) were doing so voluntarily.

My response to this newfound understanding was to refuse to wear anything simply because it was in style. What was the point of buying something that was fashionable now, if it’s going to look silly in a couple of years? When bell-bottoms came along in the late 1960s, I insisted on continuing to wear traditional pants. Same with designer jeans in the ’70s. I remember my father treating me to an appointment at a then-novel “men’s salon” (as opposed to a plain old barber shop) in preparation for my sister’s Bat Mitzvah, and watching in shock as the hairdresser, after meticulously cutting and blow-drying my hair, went in with his fingers and deliberately mussed it up. “What are you doing?!” I said. He replied, “That’s the style now. You don’t want to appear too careful.” I would have none of it. If he was going to demand an outrageous amount of money to cut my hair, the least he could do was comb it neatly.

Because my aversion to following the dictates of fashion seems so common-sensical to me, I tend to forget that other people don’t share that attitude. When I was teaching digital-arts classes, students would often ask me to teach them to recreate a particular look or technique that they’d recently seen online. “Why do you want to do that?” I would say. “It’s just a fad.” It didn’t occur to me until later that if these students were going to be graphic designers, they were supposed to latch onto design trends.

Of course, I understand and appreciate that styles evolve over time. If they didn’t, the music world would never have had ragtime, jazz, rock, and rap, and the art world would never have had impressionism, expressionism, and modernism. One thing bothers me, though: In most realms, the emergence of a new style doesn’t necessitate the disappearance of an old one — the old and new can coexist. Symphony orchestras still play the music of Bach and Mozart alongside Thomas Adès and Caroline Shaw. The walls of art galleries still display figurative art along with abstract works. But when a new style of clothing comes along, nobody continues to make the old ones.

For years, I routinely wore long-sleeved cotton shirts with vertical stripes in dark, saturated colors. (See the photo at the top of this post.) They’re really the only shirts I feel comfortable in, and they used to be ubiquitous — I could easily pick up an armload of nice shirts from the tables at Costco. Then, about ten years ago, they disappeared. Dress shirts now are available only in shades of pink, purple, or blue, and in solids or checks — never stripes. The kind of shirts I like are so out of date that I can’t even find them in thrift shops anymore. To me, they seemed standard and timeless. If I’d known at the time that they were only a temporary fad, I would have hoarded a lifetime supply.

If anyone sees shirts like these presented as a new look in a fashion show, please let me know.
