Take Me Out From the Ball Game

Despite my complete lack of interest in sports, social circumstances have required me, every ten years or so, to attend a major-league baseball game. This isn’t as horrible as it sounds, because — thanks to my dad dragging me to Mets games when I was a kid — I at least understand the rules of baseball. Football, by contrast, is a complete mystery to me. It appears to consist almost entirely of men piling on top of each other, with the piles occasionally migrating toward one goalpost or the other.

As frequent commenter John Ozment has pointed out, unfamiliarity with the game was a liability in childhood phys ed classes. The gym teacher would never explain how to play football; it was just assumed that everybody knew. We were just told to go out on the field — shirts vs. skins — and play it.

Even if I’d had some insight into the game, I completely lacked the skills to do anything about it, so I was generally assigned to the position of linebacker. My teammates would patiently show me how to fold my arms in front of me, and then explain that the players from the other team were going to run toward me, and that my job was to keep them from getting through the line. That, of course, was a crazy idea. It was clear to me that if a determined, physically fit body was charging at me, there was no possible way I could impede its progress. So when said body was in fact hurtling toward me, I did the sensible thing and stepped out of the way. I have no memory of what sorts of things would happen after that, although I assume that they involved people piling on top of one another.

Baseball is a different story. Although I lack the ability to throw, catch, or hit a ball, I at least understand what it means when other people do it. So when I make my decennial visits to a major-league ballpark, I’m at least theoretically equipped to cheer and hiss at the appropriate times. What I wasn’t prepared for was the crowd’s behavior at my most recent Oakland A’s game. (This was well over ten years ago — I’m long overdue for my next baseball experience.) When the members of the opposing team made their entrances, each introduced by name, the Oakland fans booed them. Not because of anything they’d done — the game hadn’t started yet — but simply because they belonged to a rival team.

I was appalled, as I explained to a friend later. “I thought baseball was supposed to be about good sportsmanship!” I said. “Aren’t the players on the other team professionals, deserving of respect? And when people from somewhere else visit your city, aren’t you supposed to make them feel welcome?” My friend looked at me as if I were a space alien in a human-skin suit.

But it used to be about good sportsmanship, didn’t it? I don’t remember any Mets fans booing the teams who visited Shea Stadium in the 1960s. For that matter, I don’t remember such a thing happening when I made my first visit to the Oakland Coliseum thirty-something years ago. The only explanation I can think of is that baseball players didn’t make as much money back then, so maybe there was more of a sense that they were people like us, who could be our friends or neighbors.

This is all brought to mind by an article I recently read in the New Yorker about a game called pickleball, which I’d heard of but knew nothing about. According to the article, pickleball started as a tennis-like game that anyone — children, adults, senior citizens in retirement communities — could play and win, even in combination with each other. It was suffused with good humor and community spirit. But in recent years, pickleball has become professionalized, with official leagues and big-money contracts. There’s a growing gap — not just in skill level, but in attitude — between the professionals and the amateurs, and between members of the two national leagues. Pickleball isn’t just for fun anymore; it’s serious.

Is this sort of devolution inevitable? Outside of the sports world, the closest analogue I can think of is the World Wide Web. The invention of the website and the browser brought the internet — previously reserved for nerds and academics — to everyone. The online world was a shared space, where anyone from goofy kids to specialized scholars could set up a site, and where everyone’s site was equally available to visit. A good website back then was considered to be one that had lots of links, giving the reader plenty of opportunities to encounter things they never would have come across otherwise. (The expression “web surfing,” now considered quaint, referred to the addictive practice of jumping around the web from one site to another, following wherever the ever-inviting links led you.)

That vision of the internet is gone. Sometime around the turn of the century, the model of a good website was no longer one with lots of jumping-off points, but one that was “sticky” — one that kept the visitor on your own site for as long as possible. How else could you make money? Sharing was out; ads (measured by eyeballs) and paywalls were in. Anyone who didn’t have a plan to monetize their site couldn’t be taken seriously. (That awful word “monetize” originally meant to convert something into money, as in creating a currency; the present meaning of “turning something free into something that earns a profit” is entirely a recent invention.)

The one thing that all of these examples have in common is the corrupting influence of money. As much as I hated phys ed classes, I can appreciate that they weren’t intended to train us for careers as professional athletes; they were just about play for its own sake.


Long Lived the Queen

Since most of my blog posts are not time-sensitive, dealing as they generally do with events that happened 40 or 50 years ago, I’ve never been in a hurry to make them public. Some of them were written weeks or even months before you see them, giving me time to rethink them and tweak them and sometimes even throw them away.

This is one of the rare posts that are written on the same day they are published. We’re in the UK, you see, and the queen is dead.

Debra and I arrived in London a week ago for a two-month stay. We settled into our basement flat in West Kensington and strolled over to North End Road, the nearby shopping street, to check out the neighborhood. I’m always self-conscious when I first arrive in a foreign country, feeling like everything I’m doing is wrong. Am I wearing the wrong clothing? Am I talking too loudly? Am I supposed to walk on the left side rather than the right? Suddenly an older woman stopped in the middle of the sidewalk and stared right at me. I was about to apologize for whatever I’d done wrong, when she gasped, “The Queen has died!”

I don’t know what one says in a situation like that. What I said was, “I’m sorry,” which is what you say when someone tells you that their grandmother has passed away. Hopefully my American accent made my response seem less inappropriate.

It’s not clear how the woman had gotten the news at that moment — was she in the middle of a phone call? — but clearly the queen’s death was not yet common knowledge. Debra and I looked at each other. “We’d better get some groceries, quick, before the news spreads and the whole city shuts down,” I said. We ran into a convenience store and bought a few prepackaged meals, then went next door and snagged some dinner from a fish-and-chips shop that was about to close. That was our first day in London.

As it turned out, the city did not shut down. I’m sure that plenty of people watched the 24-hour coverage on the BBC; many others gathered outside Buckingham Palace, despite the fact that no members of the royal family were inside. But life remained surprisingly normal in the subsequent days — the pubs and the theaters remained open, the stores engaged in business as usual, and young people continued to promenade along the south bank of the Thames.

It was only when you talked to Britons — particularly older ones — that you found out the truth. Life was normal on the surface, but not so in people’s hearts. People told us that they felt cast adrift, that the world suddenly felt unreal. “I’m not especially in favor of the monarchy,” was a common comment, “but even so, she was a source of stability and continuity. She’s been the Queen all my life!” King Charles III feels like a barely adequate replacement.

I can’t help thinking of the afternoon of November 22, 1963, when John F. Kennedy was assassinated. Students were dismissed from school early, and I arrived home to find my mother in tears, sitting at the kitchen table surrounded by crumpled tissues. It’s difficult to imagine anyone today having such an emotional reaction to the death of a national leader, but the connection between Queen Elizabeth and her subjects seems to come close.

I don’t mean to equate the two events. JFK’s death was sudden, shocking, and horrifying, while the queen’s impending death has been anticipated for years. Her death was natural; his was not. But the one thing that both deaths seem to have in common is that for the citizens whose leader had been lost, the world never felt the same afterward.

In the case of Kennedy, America permanently lost its innocence — it was as if Adam and Eve had just eaten the fruit from the Tree of Knowledge and suddenly realized that they were naked. JFK had been a symbol of youth, energy, and optimism, of the best times that were yet to come, and now that vision of the future was exposed as an illusion. Even though I was a child, that shock of recognition felt very real to me.

Queen Elizabeth, for her part, was the British people’s last connection to a long-mythologized past — a time when Britain was at the center of an empire and a leader of the world, a small but noble nation that stood bravely against the Nazis in World War II, a symbol of the superiority of Western civilization. That whitewashed characterization of the UK’s global role is no longer accepted intellectually, but it has always remained potent emotionally. With the queen’s passing, the last tether of the present to the past has given way.

The queen’s funeral is scheduled for next Monday, and on that day the city — and the country — really will shut down, as I remember happening in the United States during the funeral of JFK. Just as my eyes were glued to the small, flickering screen of our black-and-white TV in 1963, I’ll be watching the ceremony intently — albeit this time on a large, bright, flat screen in vivid color. The first broadcast was about the death of the future; the new one will be about the death of the past.


Rock of Ages

Nothing much ever happened in the suburban housing tract where I grew up. About the only source of stimulation was the jingling Good Humor truck that occasionally made surprise appearances on summer days. I was part of the generation of “free-range children” who were sent outside in the afternoon and told to return by dinnertime, and pretty much did what we wanted in between. We could go anywhere our bicycles could take us, unsupervised. If there was an unoccupied construction site nearby, we’d be climbing over the equipment and sifting through the dirt. If there was a fire or auto accident, we’d be there in a matter of minutes, watching the crews handle the emergency.

Most of the time, though, there was nothing special to do, so we’d be hanging out in front of our houses, playing street games or shooting the breeze. It was always during one of those idle times that the street pavers came around. The first sign was always a distant rumble accompanied by an acrid smell. Eventually the truck itself would appear, spewing black smoke as it crawled up the street and laid down a gooey layer of tar. Another truck would follow not far behind, depositing a heavy layer of gravel atop the hot tar.

As I remember it, there was no warning that the paving was going to happen. Whatever cars were parked on the street remained there as the trucks did their work. (I’m not sure how those curbside shoulders were maintained — perhaps since they didn’t host moving traffic, they didn’t need frequent repaving.) The tandem trucks, never stopping, lumbered into the distance, leaving behind a sea of loose gravel.

From that point on, any car that drove on our street would be accompanied by a loud crackling and pinging as the gravel flew out from under the tires and bounced against the undercarriage. No one seemed to take any notice. It would take a few weeks for the gravel to fully sink down into the tar, restoring the appearance of a smooth road surface.

Two things strike me. The first is just surprise that our streets were paved with tar and gravel. Today, even remote country roads are paved with asphalt, as are the streets in that neighborhood where I grew up. I liked the gravel. It could be picked up, played with, and used in school projects. When I made a diorama showing how the ancient Egyptians carried stone on barges to build the pyramids, the part of the stones was played by gravel from my street.

(As an aside, I remember that we referred to the bits of gravel as “pebbles.” It was only much later in adulthood that I found out that pebbles technically are stones that are worn smooth by the action of water. If they’re not pebbles, I don’t know what those individual pieces of gravel are supposed to be called.)

The other surprising thing is how routine the street paving appears to have been. It happened every four years, with no notice and no fuss. As I said, we neighborhood kids always seemed to be outside when it happened, and although it was fun to watch, it also felt uneventful. Every day, the fire station horn sounded at noon; every week, the newspaper delivery boy came to collect his fee; and every four years, the road got repaved. It was simply the rhythm of life in the Long Island suburbs.

I don’t know what it’s like to be a child in the current era, but my sense is that there’s no such sense of steady and dependable rhythm. Family life and the school environment have a tendency to be disturbing and unpredictable, and for me, the assurance that there was an underlying order — even if it revealed itself only once every four years — was somehow comforting. I feel for the kids who no longer even have that small amount of comfort.

As I write this, a one-block segment of our street in Oakland is having a cable laid under it, and the work seems endless — scheduled to last ten days, with constant noise and no street parking. I like to imagine that a plow-like truck could travel up the street, carving a furrow in the roadbed, and another truck could follow behind, laying the cable and filling the furrow with blacktop. I guess we’d need a third truck to tamp down the blacktop, but still, the whole process ought to take about twenty minutes, no? Since kids don’t play outside anymore, no one would even notice.


Dead to Rights

When my mother was told, eight years ago, that nothing more could be done to treat her pancreatic cancer, she was undaunted. She was not about to let something as important as her death escape her control. She somehow managed to enter hospice over Labor Day weekend, making it as convenient as possible for all of the out-of-town relatives to fly to her Florida hospital room. She told us exactly who should cater her shiva, and instructed us to order food for 75 people. (Taking into account my mother’s popularity, we instead ordered food for 100 people, and ended up with precisely 25 people’s worth of food left over.)

Most important, she had long since arranged and paid for her funeral and burial. (She was proud of having nabbed “waterfront property” for her gravesite, which was her half-serious way to describe its location next to one of the cemetery’s small ponds.) All we, her offspring, had to do was go to the funeral home and sign some papers.

There was one small omission in her planning. “Did she belong to a synagogue?” the funeral director asked us. The answer was no — her second husband, Eddy, had never been a fan of attending services. “Well, we’ll need to get a rabbi to officiate at the funeral,” the director said. “What kind of rabbi do you want?”

I was not prepared for that question. It hadn’t occurred to me that rabbis came in kinds. For lack of a more sophisticated response, I said, “A rabbi with a sense of humor?” That turned out to be the perfect answer. “Ah, I have just the right person for you,” the funeral director said, and the rabbi we were matched with did turn out to be an ideal choice.

What strikes me now is how the funeral director phrased the question. She did not ask — as she well might have — “What kind of rabbi would your mother have wanted?” Clearly, the rabbi’s primary function was to make us, the mourners, feel comforted. My mother’s preferences, whatever they might have been, did not need to be taken into account. She was, after all, dead.

The reason I need to state this so bluntly is that people’s condition of being deceased rarely seems to get in the way of their wishes being carried out. I’m puzzled by the deference that’s given to the feelings of someone who is no longer equipped to have any.

I’m happy that my mother planned her funeral in advance — not because that allowed her to have the funeral she wanted, but because doing so took the burden of arranging it off her survivors. I’m grateful that she made a will — not because it means that her property was distributed in a way she would approve of, but because it relieved her survivors of having to squabble over who was entitled to what. For some reason, actions that I consider altruistic — done as a favor to the next generation — are routinely treated as if their sole purpose is to benefit the deceased. This makes no sense to me, because short of resurrection, nothing can be done to benefit the deceased.

These thoughts come to mind because of an article that I recently read in The Guardian, about the man who pretty much invented the idea of dead celebrities’ likenesses being owned by their estates. It had previously been established that celebrities were legally entitled to “publicity rights” — the right to decide who got to make use of their names and faces, and for what purposes. This made perfect sense, since living celebrities might object to being portrayed as endorsing a cause, or a product, that they didn’t in fact support. Secondarily, it gave those celebrities the sole right to profit — or to license others to profit — from their hard-won fame.

In the early 1980s, a lawyer named Roger Richman promoted the idea that since publicity rights constituted a financial asset, they could be inherited after a celebrity’s death, just like any other asset. He became well known for his aggressive representation of the estate of Albert Einstein, immediately suing anyone who put Einstein’s face on a T-shirt or otherwise used his image for a purpose not authorized by Einstein’s estate. His litigiousness is estimated to have earned $250 million for the Hebrew University, which currently owns Einstein’s publicity rights.

But publicly, this financial arrangement is portrayed as being for Einstein’s benefit — protecting his image from being sullied by association with ideas or organizations that he would not have approved of in life. For example, Einstein’s persona may not be used to promote tobacco, alcohol, or gambling (which is interesting, since he is well known for his habitual pipe-smoking).

It’s quite possible that Einstein’s heirs do feel a moral duty to maintain the purity of his reputation, apart from whatever financial gain that continued purity brings them. They’re certainly entitled to hold that belief, if it brings them comfort. But we need to dispense with the fiction that this protection is somehow owed to Einstein himself, because it’s “what he would have wanted.” Einstein died 67 years ago, and any wants he might have had died along with him. There are a number of things that dead people cannot do, and holding opinions is one of them.


The Olive

This, to my amazement, is my hundredth blog post. I’ve been posting weekly, almost without interruption, for nearly two years. So with your indulgence, I’d like to take a step back and reflect on the experience.

This project began as a way to keep myself occupied and motivated during the pandemic. Since I wasn’t having any meaningful experiences in the present, it seemed like an opportune time to delve into the past, and to reexamine some questions that I’d been wrestling with all my life.

Looking back on what I’ve written, I’m surprised by what I chose to write about. Most of the incidents I’ve recounted come from early in my life, from my childhood through my mid-20s. Apparently nothing that’s happened since then carries as much emotional weight as the experiences I had as I was growing into an adult. I’ve said little or nothing about relationships I’ve had, places I’ve visited, or cultural events I’ve been involved with. Emotional injuries, injustice, and death seem to be recurring themes, despite my initial intention to give these essays a lightly humorous tone. As someone who had always considered himself entirely secular, I’m startled by how many times I’ve made reference to God or religion.

I mentioned in my first post that I’d attempted twenty years ago to write a book of personal essays. The book never got finished, because I found that much of what I’d written on any given day sounded foolish or inconsequential on the following day. I think I fell into that trap because of my training as a philosophy major: I felt that every essay had to take the form of an argument that led to a meaningful conclusion. I realize now that life doesn’t build to conclusions — it just happens — and so it’s pointless to try to present it as if it does.

You’ve probably noticed that most of my posts have weak endings, or lack any ending at all. That’s because I’ve taken advantage of the looseness of the blogging format to avoid having to shoehorn my thoughts into a formal structure. I just say what I want to say, and when I find myself straying into bullshit territory — usually, it turns out, after about 800 words — I stop. This simple strategy has been tremendously freeing. For the first time, I’ve experienced writing as a pleasure rather than a trial. I’m grateful to finally know what that feels like.

At the same time, this experience has forced me to face my limitations as a writer. The first surprise was that I have limitations. Writing was always my most reliable skill. I was the student who was able to write an elegant, convincing book report on a book I hadn’t read. I wrote such persuasive essays on my college applications that I was able to get into Princeton despite my spotty high school record. As an adult, I always earned my living at least partly by writing for hire. I could write about almost anything and make it sound like I knew what I was talking about.

But it’s clear now how limited my range as a writer is. The purpose of my writing was always to explain, to instruct, to convince. I don’t write poetry; I don’t write fiction. My prose may be polished, but there’s nothing beneath the surface — it means what it says, no more and no less. In the case of these blog posts, that literalness usually takes the form of here’s something that happened to me; here’s how I felt about it; here’s something else that it reminds me of.

A real artist can start with the particular and transform it into something universal. It’s clear to me that I lack that transformative power — not just in writing, but in other realms as well. The photo-illustrations that accompany these posts are, I hope, witty, but they’re also often literal interpretations of the text. The same can be said of my other visual art projects. (My pieces about hands are just about hands.)

Although I’ve retired from teaching, I realize now that this tendency toward literalness pervaded my teaching as well. My skill as a teacher always relied on my ability to explain things: I can take a complicated subject and present it in a way that’s clear, organized, and easily digestible. But a real teacher has the ability to take those explanations and transform them into something more valuable for the student: an inspiration, an identity, a mission. Those things have always been beyond me.

On the positive side, writing this blog has given me a sense of purpose in my retirement. I’m surprised at how much I have to say that feels like it’s worth saying, and how much my perception of things is different from anything I’ve read elsewhere. It’s my hope that my idiosyncratic observations about my own life will encourage you to see your life differently.

Writing this blog has been like making olive oil. When the olives are first crushed, the juice comes running out — that’s the stuff we call “extra virgin.” It was like that for me when I first started the blog: I’d sit down to write and the words would flow. But then the olive stops giving up its oil so readily, and the producers have to use more invasive methods, such as heat and chemicals, to extract the rest. In my case, it now takes more effort to find things to write about that don’t feel self-indulgent, obvious, or repetitive. But I think there’s still some flavorful oil left in this olive, and — as long as you keep reading — I’ll keep squeezing until there’s nothing left but the pit.
