Good to Go

One year for my birthday, Debra bought me an hour-long ride in a small, private airplane — something I’d always wanted to experience. Debra sat in the back, and I sat up front with the pilot, grinning widely as I watched familiar landmarks pass below us.

About halfway into the ride, the pilot took his hands off the controls and turned to me. “Why don’t you fly for a while?” he said.

Eyes wide, I immediately gave the only reasonable answer: “NO!” Just the thought of it was crazy: There are two people sitting here; one knows how to fly and the other doesn’t; and the one who doesn’t know how to fly should pilot the plane?

I’m always willing to try things if the consequences of screwing up are small. Mixing two unfamiliar ingredients in a recipe — sure, why not? Installing a new hard drive in a computer — my files are backed up, so what do I have to lose? But in any situation whose outcome could possibly be life-altering (or, in the case of flying a plane, life-depriving), I’m content to let someone with demonstrated skills take the lead.

That principle extends to crossing the street. Although people get frustrated when I stop at a corner and watch for the “walk” sign before stepping off the curb, I know that my senses of a car’s speed and distance are fallible — so why would I trust my judgment in deciding when it’s safe to walk? Given the high stakes, I’d prefer to let the engineer-designed traffic-control system — which has the added bonus of actually causing cars to stop — make that determination.

The idea that not everyone is a competent jaywalker seems not to have occurred to most people. It’s one of those things that all city-dwellers are expected to know how to do. But where is that skill supposed to come from? Are people supposed to cross the street in incrementally more hazardous situations and learn from the incidents in which they avoid being hit? No — I’d venture to say that crossing a busy street safely is something that most people just have the knack for; it’s not a skill they’ve acquired by trial and error. And if it’s something that one doesn’t have the knack for, it’s good to recognize that early.

Probably the most common thing that people are expected to be good at is driving. All of us who spend time on freeways have had occasion to criticize others for being bad drivers. But what do we want those inept drivers to do about it? Staying off the road isn’t an option; many people are forced by practical or economic circumstances to drive. Certainly they can discipline themselves to stay alert and be careful, but care and alertness aren’t always sufficient for reacting effectively to dangerous situations.

I often think that the best drivers are the ones we consider the worst — those who do whatever they can to pass everyone else on the road. Those drivers certainly need to learn courtesy and respect, and I agree that such things can be learned. But the actual skills that they have — the ability to weave effortlessly from one lane to another, to precisely judge the speed and angle they need to insert themselves between two moving cars — are most likely not things they’ve learned through trial and error. They just happen to be naturally good at it. If only those skills could be used for good instead of evil!

There are plenty of other activities at which everyone is groundlessly expected to be proficient. One, surprisingly, is sex. The advice columnist and social commentator Dan Savage has proposed a sexual standard that everyone is supposed to meet: to be GGG, or “good, giving, and game.” But where, exactly, is the ability to be “good” supposed to come from? Even if we assume that sexual skill can be acquired through practice, those who are less naturally good in bed will clearly have fewer opportunities to practice. It’s a perfect vicious cycle.

What’s true of sex tends to be true of other social interactions. Take dancing: One can attend dancing classes and learn patterns of steps, but not everyone gets to the point where they can execute those steps with ease and grace. And with the freeform style of dancing that’s been prevalent since the 1960s, even learning standard steps isn’t going to be of any help. It’s often said that the secret of social dancing is just to “go out on the dance floor and show ’em what you’ve got.” But that advice assumes that you’ve “got” something — and it doesn’t offer any guidance about where you’re supposed to get it.

I can frame the problem differently by referencing something I know I’m great at: audio editing, specifically music editing. In the earlier days of recording technology, before audio production could be done on a desktop computer, I would work closely with a sound engineer in an expensive recording studio. I’d direct our recording and editing sessions, but only he was allowed to touch the equipment. The exception was when a piece of music needed to be fitted to a specific length of time. As we listened to the playback, I would be the one to punch the button that instantly stopped the tape deck. “Cut here!” I would say. I knew precisely what needed to be deleted, and exactly where the cuts had to be made, to make the music sound seamless. It’s not something I’d learned; it was just something I was able to do. There was no way I could teach it to the engineer.

Imagine if music editing were something we had to do every day, like crossing the street or driving a car. “Oh, jeez,” I might say, listening to somebody’s clumsy cut. “Who let that clown use a sound-editing program?” Because I was good at what was considered a natural, routine task, I’d assume that everyone else was, too. It would probably not occur to me to empathize with the person who was required to do that edit, but just didn’t have the knack for it.


Power Lines

Photo from The Daily Princetonian, April 5, 1978

In the spring of my junior year at Princeton, a newly formed student group called the People’s Front for the Liberation of South Africa staged a demonstration outside a meeting of the university’s board of trustees. Their demand was that the university help bring an end to the apartheid regime in South Africa by divesting from corporations doing business in that country. The trustees refused, and by the following spring, noisy lunchtime demonstrations in front of the school’s administration building had become a daily event.

The trustees’ position was that divestiture would have a severe impact on the university’s finances while doing little to advance the anti-apartheid cause. They maintained that they could do more good by using their influence as stockholders to pressure the corporations to sever their South African ties.

The demonstrators rejected that argument as mere self-interested rationalization. That may have been so, but I wondered how the demonstrators could be so confident of their own rightness. We were, after all, students — people who presumably were attending this university for the opportunity to learn from people who were wiser and more experienced than we were. If we believed that we knew better, what reason did we have to pay tuition? As someone who knew nothing about corporate investment, I just didn’t feel qualified to take sides in this dispute.

Beyond that, as a philosophy major, I was convinced of the primacy of rational argument. Ideally, if two people have conflicting beliefs, the one who can make a valid argument for his or her belief must prevail, because the laws of logic are unassailable. Unfortunately, as I eventually came to recognize (and as I explored in “Modus Ponens”), rational argument rarely succeeds in real life, because every argument must begin with basic premises accepted by both sides. In a world where people have not only vastly different values but conflicting sets of facts, there’s no set of agreed-upon premises that a logical argument can be based on. In the South African example, both sides could claim — truthfully — to be anti-apartheid, but one side believed in the political effectiveness of free-market capitalism while the other did not. Under those circumstances, there was no way one side could convince the other.

So what do people do when they can’t settle a dispute by means of argument? They do what those Princeton students did: They exercise power. If one side can’t persuade the other of the rightness of their position, they do what they can to force their opponent to yield. In the case of the student activists, the repeated demonstrations — culminating in a sit-in in the administration building — brought the weight of public opinion down onto the university, pressuring the trustees to do what was necessary to protect the school’s reputation. In more extreme cases, the exercise of power may take the form of physical force, property damage, or potentially lethal violence.

My problem with the exercise of power is that it is essentially amoral. Either side can use it, and either side can prevail, regardless of the rightness of their position.

Power was viewed differently in pre-Enlightenment times. When a fight took place between two individuals or two armies, the outcome was presumed to reflect God’s plan. Whichever side was victorious was not just stronger or luckier; they were — by definition — demonstrably right, in that their position must also have been God’s position.

I don’t think anyone believes that anymore. We acknowledge that the outcome of a battle (whether literal or figurative) is arbitrary, and that the difference between victory and defeat is strictly a matter of whose strategy and tactics are more effective. Sometimes the good guys win, sometimes the bad guys win, and there’s not much we can do to change that.

As much as I distrust the use of power to settle conflicts, I recognize that often it’s the only option. In the case of racial justice, Black people — whether in the U.S., in South Africa, or elsewhere in the world — would not have been able to overcome oppression without active resistance in the form of marches, civil disobedience, and sometimes violent clashes with law enforcement. We can all agree with Martin Luther King’s famous observation that “a riot is the language of the unheard.” And yet, the people who stormed the U.S. Capitol in the wake of Donald Trump’s electoral defeat also considered themselves to be the unheard.

A conflict can be considered to have ended only if one side is able to convince the other of the rightness of their belief. Otherwise, no dispute is ever settled; it just awaits the slow and unpredictable shifts in power from one side to the other.


Home Game

As soon as I was old enough to learn, my father taught me to play chess. He was not a chess player himself, but he knew the basic rules of the game, and he thought it was something I ought to know how to do.

Chess was way more interesting than checkers. I loved how each piece had its own way of navigating the board, and how the game’s idiosyncratic choreography led to unexpected situations that I had to improvise my way out of. After playing a few introductory games with my father, I began to play with my friend Carl, who lived across the street.

Carl was as new to chess as I was, so our games were played just for their entertainment value, mostly as a way to pass the time on rainy days. In a sense, I viewed chess the way I’d later view a game of Twister: The outcome didn’t matter so much as what sorts of interactions happened on the board.

Then, one day, everything changed. Shortly after our game began, Carl’s rook advanced inexorably toward me, and when it got far enough into my territory, it began to knock off my pieces, one by one. The game ended quickly, before I’d had much of a chance to do anything. The next game went the same way. Obviously, Carl had been studying.

What I discovered that day was that my father hadn’t really taught me chess. He’d taught me the rules of chess, but he’d left out the main part of the game, which was analyzing your opponent’s weaknesses, predicting how each move would play out down the line, and working out a strategy to limit your opponent’s available defenses against your attacks. In other words, it was about ruthlessly driving your opponent to defeat.

This is supposed to be the part of the story where I vow to learn all I can about chess strategy so I can exact my revenge on Carl, and go on to vanquish much better players. But in truth, that idea never occurred to me. I’d never seen the point of competitiveness. Sure, losing felt bad, but winning meant making my opponent feel bad, and where was the pleasure in that? I’d always thought that the idea of one person winning and one person losing was just to make sure that games had a way to end. If playing chess meant investing time, work, and emotional energy into defeating the other person, I didn’t see the point. It made much more sense just to quit playing chess.

While my father had been the one who introduced me to chess, it was my mother who taught me to play Scrabble. As with my father and chess, my mother was not a Scrabble player — mah jongg with “the ladies” was her game of choice — but she thought that learning Scrabble would encourage my interest in language, which was something that she shared. The Scrabble board was a place where I could show off the breadth of my vocabulary and engage in creative problem-solving, so playing the game came pretty naturally to me. My mother usually won, but that seemed only fair, since she was the one who did the Sunday New York Times crossword puzzle every week. Part of the fun was watching how elegantly she played the game.

It was not until years later that I discovered that I had been as wrong about Scrabble as about chess. For real players, Scrabble was not about vocabulary at all. Playing it well required memorizing lists of words, but it wasn’t necessary to know what the words meant or how to use them in a sentence. For purposes of the game, they were merely sequences of letters, as arbitrary as the winning tile combinations on my mother’s mah jongg card.

Worse, Scrabble was as much about playing aggressively as chess had been. It wasn’t enough to make good use of the letters you’d drawn; you were supposed to keep track of which letters your opponents were likely to have, and prevent them from laying down the ones with higher point values. In fact, you were supposed, as much as possible, to prevent them from putting down any letters at all. Whenever I dared to play Scrabble as an adult, I was berated by my opponents for making it too easy for them. “Look!” they would say disparagingly. “You just opened up this whole section of the board for me!”

I didn’t get it. I thought I was doing a good thing. I always came up with my best Scrabble words when I had numerous options as to where to put my letters, so why wouldn’t I want to give other players the same opportunity? To do otherwise just felt mean-spirited.

Now that I think about it, I guess I’m just uncomfortable with the whole idea of strategy. Strategy has its places — for example, I try to load the dishwasher strategically, so that I can keep adding dishes throughout the day without having to rearrange anything — but in interpersonal affairs, it feels cynical. Strategizing means trying to outsmart other people, to take advantage of their blind spots, rather than aiming to be generous toward them. It’s certainly not an attractive part of human nature.

I understand that in the real world — particularly in business and politics — it’s often necessary to act strategically. People whose interests are different from yours are going to try to outmaneuver you, and they’ll do whatever they can to find an edge. You may need to do the same in your own defense. Games of chess, from what I understand, were long considered a training ground for military strategists, allowing them to cultivate skills that would aid them on the battlefield. Napoleon, for example, was known to be an avid chess player. But while I understand the necessity of learning those skills, I can’t find a way to experience them as a source of pleasure — especially among friends. We may live in a world of winners and losers, but must we recreate that world in microcosm on a game board? I don’t want to spend my leisure time plotting against people; I want to find ways to share with them.


On the Money

I thought that paying off the mortgage on our house would be a big deal. When I went to the bank to make the final payment, I half expected balloons to fall from the ceiling and a marching band to file through the bank lobby. But no; the teller just gave me a receipt and wished me a nice day.

Our financial adviser told us that having an unmortgaged house was a terrible thing. “Take out another mortgage,” he said, “and invest the money you get! You’ll earn enough of a return on your investment to cover the mortgage payments, and still have money left to reinvest. Plus you’ll get a tax break.”

What he said made sense, and Debra agreed. But I couldn’t imagine going through with a plan like that. There was such a sense of security in finally owning our home. It was ours, and nobody could take it away from us. We didn’t owe money to anybody, and never would again. For me, that feeling was worth more than any amount of income.

I’ve always been told that emotion shouldn’t enter into financial planning, that the cold numbers are all that matter. I respect the people who can abide by that rule, but I just can’t.

From the time that I was old enough to understand money, my parents gave me an allowance of 25 cents per week. Instead of blowing it on candy, I did my best to save it up for a time when I might want to buy something more expensive.

I was proud of my ability to conserve my money. Sometimes at night I would pour my accumulated quarters out of the wooden box I kept them in. I’d spread them out on my bed, gaze at them, handle them, and stack them. My mother once caught me in the act of coin-fondling and demanded that I stop it, saying, “Only misers do that.” I wasn’t clear on what was wrong with being a miser — wasn’t that just another word for frugality? — but, in any case, I promised to stop taking pleasure in my wealth.

Nevertheless, my mother came back and asked me to hand over my cash so she could put it in the bank. “It will be safe there,” she said. “And you can get it back whenever you need it.”

I reluctantly gave her the fifteen dollars I’d saved, and that was the last I saw of it. When I did eventually ask to get the money back so I could buy something, my mother denied any memory of ever having taken it. I don’t think that she deliberately intended to cheat me; I’m sure that she honestly didn’t remember. But from that time on, I fiercely guarded my access to my savings. When I opened my own bank account, I made sure that I held the bankbook and could withdraw my money whenever I wanted to.

We were not a well-to-do family. My parents never talked about their finances in front of me, but I often heard their late-night arguments about money. One night, they were particularly distraught about having lost most of their savings in mutual funds. I had no idea what a mutual fund was, but I vowed never to repeat my parents’ mistake by putting my money in one.

When I was in college, I worked three part-time jobs and carefully controlled my spending. As an adult, I routinely put part of my paycheck into a savings account. I refused to go into debt. I never took out a car loan; I would save up first and then buy the car.

When I married Debra, we closed our individual bank accounts and put all of our savings into a joint account. I had a separate account for my freelance business, which I managed as conservatively as possible. It never even occurred to me to spend money I didn’t have. I’m always amazed when I hear about people who take out business loans for office space or equipment, with the expectation of making enough money to repay the lender. How could they possibly take a risk like that? My office was in my home, and my purchases were strictly pay-as-you-go.

Eventually, as our savings grew, Debra gently suggested that we consider investing our money rather than keeping it in a bank. I reluctantly agreed, and we went to see a financial adviser who had been recommended to us. He proposed putting our money into mutual funds. “No!” I said, remembering my childhood. “No mutual funds!”

I have friends who play the stock market, buying shares in companies that they think are undervalued and selling them when they appreciate. I don’t see how anyone could do that. The idea that I might have insight into a business’s financial prospects that other investors don’t, and that I would then bet money on that supposed insight, is inconceivable to me.

Today, our retirement savings are invested in stocks, bonds, and — yes — mutual funds, but I have nothing to do with it. Someone else does the investing for us, and Debra keeps track of the numbers. The thought that our money could vanish overnight is immensely frightening to me, so I’m happy to leave the management of it to people other than myself. At least we own our home, and no one can take it away from us. Knowing that makes me feel safe and independent, like having a pile of quarters in a wooden box.


Only Just

Most people presumably begin to think about morality and justice when some profound injury occurs, either to them or to someone they care about. For me, it started with something much more mundane: the Academy Awards.

Although I’ve always been a lover of classic movies, I never understood people’s emotional investment in the Academy Awards. Why does anybody care who wins? Unless you work in the higher levels of the film industry (in which case you’re probably not reading this blog), you’re not personally acquainted with any of the nominees. You have no financial stake in the studios that may benefit from the increased ticket sales that accompany a win. Most importantly, if you’ve seen any of the nominated films, you already know what you think of them, so the fact that one wins an Oscar — or doesn’t — isn’t going to affect your assessment.

At first I guessed that it was just a matter of validation. If you think a film is great, it feels good to know that other people — particularly the presumed experts — share your opinion. If a film you hated wins the award, you have the satisfaction of being able to look down on those idiots in Hollywood who don’t recognize rubbish when they see it.

But that theory doesn’t go far enough. I began to notice that much of people’s agreement or disagreement with the Academy’s decisions isn’t based on comparative rankings of films and the people who make them. Instead, their reaction seems to have a moral component — that such-and-such an actor or director deserves the award, based on who they are and what they’ve done. In other words, it seems to be a matter not of taste, but of perceived fairness. In matters both significant and insignificant, we have a primal need to feel that justice has been done.

Justice, for me, has always been a tricky concept. When a human being has been killed, we call for justice to be done on their behalf. Strictly speaking, however, that’s not possible: The only real justice would be for the victim to be given back the life that was wrongly taken, but we don’t have that ability. So when people talk about justice in a case like this, what they usually mean is that the killer should be punished. But what does the punishment actually accomplish?

As I already mentioned, it doesn’t change the actual situation; the victim remains dead. Incarceration is unlikely to prevent the killer from killing more people, since most murders result from a unique set of circumstances that are unlikely to recur. Punishment of any sort may perhaps deter other people from killing, but the effectiveness of that deterrence is questionable — after all, this killer wasn’t deterred by the fact that others have been punished for similar crimes.

No, when we want to see a killer punished, it’s not about the punishment having any practical value. It’s about some innate feeling of rightness — the knowledge that a wrong has been done, and that wrong has to be compensated for somehow. The wrongdoer has to suffer in order to bring the moral universe back into balance.

It’s a sense that exists in all of us. As children, when we’re scolded or penalized for something we know we didn’t do, our immediate reaction is, “That’s not fair!” And the usual parental retort — “Life isn’t fair” — doesn’t fix the hurt. If life truly isn’t fair, then something is fundamentally wrong. It’s supposed to be fair.

Where does that innate sense of justice come from? Why is it so strong that we don’t take it to be merely an abstract concept, a way to interpret the world, but something essential about the world? I can’t think of anything else — with the possible exception of love — that so powerfully feels like it pre-exists us.

This is generally where religious faith enters the picture. For many people, the objective and essential rightness or wrongness of things is something established by God. As someone who doesn’t accept the existence of what I call the God Guy — the anthropomorphic figure who has ideas, feelings, and opinions about what each of us ought to be doing — I was always dismissive of this view. But the older I get, the more I have to believe that this sense of justice is not just something housed in our brains. If our sense of what’s fair is only a figment of our neurons, then there’s no such thing as justice; there are only ideas about justice.

Much about my upbringing left me with a dislike for — and often downright antipathy toward — anything religious. For reasons I’ve alluded to elsewhere, those feelings gradually faded, and I began to recognize that our existence incorporates more than can be sensed or analyzed. I can’t say that my spiritual side is very broad or deep, but the one thing that feels undeniable to me is that there is a sense of rightness woven into the universe. We can go with it or against it, just as we can swim with the current or against the current, but either way, it’s there. If you want to call that rightness God, I have no problem with that.

I’ve long admired the distinction the Quakers make between God’s will and self-will, and their view that the only way to give the former precedence over the latter is to cultivate a quiet mind. Learning to engage in a meditative practice has given me the chance to separate myself from my own self-importance, even for just a moment, and to feel which way that current of rightness flows. Any time I think I know what’s just — whether so-and-so should have won the Best Director award, or whether so-and-so ought to be locked up for life — I have to recognize that it’s likely just my ego talking, and that I need to open myself to the wisdom that lies outside of me. That recognition has value in itself, even if it usually doesn’t leave me with the answers that I crave.
