Mental Notes

Our household recently had a dinner-table conversation about the so-called Mandela effect, which refers to the sharing of a vivid — but false — memory by a large group of people. It gets its name from the many Baby Boomers who (it is said) have a distinct memory of Nelson Mandela’s dying in prison in the 1980s, when in fact he lived well into the 21st century. Other frequently cited examples are people’s remembering the children’s-book character Curious George as having a tail, or the “Mr. Moneybags” character in the Monopoly game as sporting a monocle. (As it turns out, neither is true.)

If you read my post “Fair Minded,” you know that I acknowledge the existence of invented memories, having experienced them myself. But when it comes to the Mandela effect, I’m skeptical of the idea that there are false memories that are culturally shared. For one thing, I’d assume that anyone who can recognize Nelson Mandela’s name would have noticed that he served as the first post-apartheid president of South Africa. (In passing, I should add that I’m irritated at how Mandela’s name has been trivialized by being connected to a piece of pop psychology.)

I’d also claim that most of the so-called false memories that are offered as examples aren’t memories at all, but simply mental images that get formed when the topic arises. For example, if you were to ask me, “Does Curious George have a tail?” I would probably think to myself, “Well, he’s a monkey, so I assume he has a tail,” and then I would immediately picture him that way. But that’s different from actually remembering him as having a tail.

That said, I just came across what appeared to be an instance of the Mandela effect. While browsing around YouTube, I discovered some clips from a 1929 Technicolor musical called “Gold Diggers of Broadway.” Most of the film has been lost, but a few fragments survive, one of which features a tenor named Nick Lucas singing the (then-new) song “Tiptoe Through the Tulips.” Lucas’s rendition of the song was pretty straightforward, until he got to the bridge — the part that starts with “Knee deep in flowers we’ll stray.” It sounded odd at first, and when he got to the next line, “We’ll keep the showers away,” it just sounded wrong. The rhythm was off. At first I thought that the film editor had accidentally cut in an extra few beats, but no — when the chorus joined in, they sang it the same way. I was perturbed: This is not how the song is supposed to go.

In my memory, the bridge went like this:

But in the film, it went this way:

Notice not only the difference in rhythm, but also how “deep” goes down only a seventh instead of a full octave. I checked the original 1929 sheet music — which, fortunately, is accessible online — and evidently that’s the way the song was written. So why did I remember it differently? And why did all of the cover versions on YouTube sing it my way instead of Nick Lucas’s way?

I was ready to attribute all this to a Mandelaesque mass delusion when I realized that I’d forgotten one important thing: When I and others of my generation learned the song, we didn’t learn it from “Gold Diggers of Broadway” — we learned it from Herbert Khaury, aka Tiny Tim. And as I discovered when I checked his classic 1968 recording, Tiny Tim sang it like this (which, apart from a syncopated flourish at the end, is just the way I remembered it):

Tiny Tim was an avid music historian, and he actually knew Nick Lucas — he insisted that Johnny Carson book Lucas on the infamous show in which Tim married Miss Vicki — so it’s not clear why he felt the need to alter the bridge from the way the song was originally sung. In any case, it’s clear that the Mandela effect doesn’t apply here. I don’t have a false memory of how the song went in 1929; I have a true memory of how the song has gone since 1968.


Ruling Out

Among the people who know me, I’m infamous for following rules. It’s not that I naturally defer to authority — I really don’t like being told what to do — but I can usually see why a rule is in place, and therefore believe that it’s a good idea to follow it.

I had a friend in college who used to pocket fruit in the dining hall to eat later. I reproached him for doing that, since the rule was that the buffet-style food was only to be eaten on the premises.

“What’s the harm?” he said. “Why does it matter whether I eat it now or later?”

“Because if everybody took fruit away with them, the dining hall would run out of fruit. Or they’d have to keep buying extra fruit, which means that the cost of meals would go up.”

“But everybody doesn’t do it,” he said.

“That’s because there’s a rule,” I said. “Besides, what if everybody said, ‘It’s fine if I do it, because nobody else is doing it’? That would result in everybody doing it.”

I never convinced him. He continued to pocket the fruit, and nothing bad happened as a result.

Like the rule against taking food out of the dining hall, most rules exist to remind us that we live in a community. Doing something that benefits me may have adverse repercussions for other people. We tend not to think about those repercussions, which is why effective rules and laws have to include consequences that apply directly to us.

When you think about it, the penalties for most rule-breaking are entirely artificial. If I drive too fast, the real consequence is that I’m putting other people in danger. But that’s too abstract to stop most people (including me, sometimes) from doing it. So the law has to provide a penalty that’s specific to me: If I’m caught speeding, I have to pay a fine. There’s no inherent connection between my rule-breaking and my having to pay a fine; it’s not as if my parting with my money in any way reduces the danger that I created for other people. And yet we tend to accept this cause-and-effect as perfectly natural.

I have a problem with this when I’m the one making the rules. If I’m teaching a course, I want my students to complete assignments on time, so I impose a penalty for turning work in late. In part, that’s because late work causes real problems for me: I have to make time to grade each straggling assignment, when it would have been much more efficient to grade all of them at once.

But what if, as sometimes happens, I’ve procrastinated on grading for a couple of days? That means that if a student turns in an assignment a day or two late, it really doesn’t matter; I can still grade it along with all the others. Yet I still have to deduct points for lateness. I can see why it’s important to do this — I need to be consistent in enforcing rules; I need to teach students that there are consequences for missing deadlines — but penalizing the student in a case like this still bothers my conscience. It feels wrong to create artificial consequences for an action that has no actual consequences.

Looking at my feelings in that situation makes me realize that there’s another aspect to rules and rule-breaking that has nothing to do with consequences: plain old morality.

I’ve never cheated on a test — not out of fear of getting caught, but because I know it’s wrong. I’m simply not the kind of person who would cheat, even if it were possible to do so with no risk of punishment.

Where does this sense of right and wrong come from? As I mentioned in “Only Just,” that question is the only thing that leads me to believe in anything like a deity. But regardless of where it originates, I don’t think it’s something that can be taught, at least not past early childhood.

If I’m trying to persuade my students that plagiarism is a bad thing, it doesn’t help to tell them that it’s just wrong. If they don’t believe that already, nothing I say is going to give them that belief. The only thing I can do is talk about the consequences. And the only consequences I can talk about are artificial ones — failing the course, possible expulsion from school — that are in no way direct results of the act of plagiarism itself. They’re just things that we invented to discourage students from committing an act that we believe to be immoral.

Ideally, doing something wrong would have its own inherent consequences. I guess that’s what makes people believe in karma, or “what goes around comes around.” But those beliefs seem to be used more in judging other people’s actions than in guiding our own.

For practical purposes, the only way our society seems to have of limiting people’s behavior is to make rules and provide artificial consequences for breaking them: fines, or imprisonment, or execution, or — for some people — punishment after death (such as going to hell). How strange it is that the concept of right and wrong has such a prominent place in our culture, and so little power to actually change anything!


On the Money

I thought that paying off the mortgage on our house would be a big deal. When I went to the bank to make the final payment, I half expected balloons to fall from the ceiling and a marching band to file through the bank lobby. But no; the teller just gave me a receipt and wished me a nice day.

Our financial adviser told us that having an unmortgaged house was a terrible thing. “Take out another mortgage,” he said, “and invest the money you get! You’ll earn enough of a return on your investment to cover the mortgage payments, and still have money left to reinvest. Plus you’ll get a tax break.”

What he said made sense, and Debra agreed. But I couldn’t imagine going through with a plan like that. There was such a sense of security in finally owning our home. It was ours, and nobody could take it away from us. We didn’t owe money to anybody, and never would again. For me, that feeling was worth more than any amount of income.

I’ve always been told that emotion shouldn’t enter into financial planning, that the cold numbers are all that matter. I respect the people who can abide by that rule, but I just can’t.

From the time that I was old enough to understand money, my parents gave me an allowance of 25 cents per week. Instead of blowing it on candy, I did my best to save it up for a time when I might want to buy something more expensive.

I was proud of my ability to conserve my money. Sometimes at night I would pour my accumulated quarters out of the wooden box I kept them in. I’d spread them out on my bed, gaze at them, handle them, and stack them. My mother once caught me in the act of coin-fondling and demanded that I stop it, saying, “Only misers do that.” I wasn’t clear on what was wrong with being a miser — wasn’t that just another word for frugality? — but, in any case, I promised to stop taking pleasure in my wealth.

Nevertheless, my mother came back and asked me to hand over my cash so she could put it in the bank. “It will be safe there,” she said. “And you can get it back whenever you need it.”

I reluctantly gave her the fifteen dollars I’d saved, and that was the last I saw of it. When I did eventually ask to get the money back so I could buy something, my mother denied any memory of ever having taken it. I don’t think that she deliberately intended to cheat me; I’m sure that she honestly didn’t remember. But from that time on, I fiercely guarded my access to my savings. When I opened my own bank account, I made sure that I held the bankbook and could withdraw my money whenever I wanted to.

We were not a well-to-do family. My parents never talked about their finances in front of me, but I often heard their late-night arguments about money. One night, they were particularly distraught about having lost most of their savings in mutual funds. I had no idea what a mutual fund was, but I vowed never to repeat my parents’ mistake by putting my money in one.

When I was in college, I worked three part-time jobs and carefully controlled my spending. As an adult, I routinely put part of my paycheck into a savings account. I refused to go into debt. I never took out a car loan; I would save up first and then buy the car.

When I married Debra, we closed our individual bank accounts and put all of our savings into a joint account. I had a separate account for my freelance business, which I managed as conservatively as possible. It never even occurred to me to spend money I didn’t have. I’m always amazed when I hear about people who take out business loans for office space or equipment, with the expectation of making enough money to repay the lender. How could they possibly take a risk like that? My office was in my home, and my purchases were strictly pay-as-you-go.

Eventually, as our savings grew, Debra gently suggested that we consider investing our money rather than keeping it in a bank. I reluctantly agreed, and we went to see a financial adviser who had been recommended to us. He proposed putting our money into mutual funds. “No!” I said, remembering my childhood. “No mutual funds!”

I have friends who play the stock market, buying shares in companies that they think are undervalued and selling them when they appreciate. I don’t see how anyone could do that. The idea that I might have insight into a business’s financial prospects that other investors don’t, and that I would then bet money on that supposed insight, is inconceivable to me.

Today, our retirement savings are invested in stocks, bonds, and — yes — mutual funds, but I have nothing to do with it. Someone else does the investing for us, and Debra keeps track of the numbers. The thought that our money could vanish overnight is immensely frightening to me, so I’m happy to leave the management of it to people other than myself. At least we own our home, and no one can take it away from us. Knowing that makes me feel safe and independent, like having a pile of quarters in a wooden box.


Only Just

Most people presumably begin to think about morality and justice when some profound injury occurs, either to them or to someone they care about. For me, it started with something much more mundane: the Academy Awards.

Although I’ve always been a lover of classic movies, I never understood people’s emotional investment in the Academy Awards. Why does anybody care who wins? Unless you work in the higher levels of the film industry (in which case you’re probably not reading this blog), you’re not personally acquainted with any of the nominees. You have no financial stake in the studios that may benefit from the increased ticket sales that accompany a win. Most importantly, if you’ve seen any of the nominated films, you already know what you think of them, so the fact that one of them wins an Oscar — or doesn’t — isn’t going to affect your assessment.

At first I guessed that it was just a matter of validation. If you thought a film was great, it feels good to know that other people — particularly the presumed experts — share your opinion. If a film you hated wins the award, you have the satisfaction of being able to look down on those idiots in Hollywood who don’t recognize rubbish when they see it.

But that theory doesn’t go far enough. I began to notice that much of people’s agreement or disagreement with the Academy’s decisions isn’t based on comparative rankings of films and the people who make them. Instead, their reaction seems to have a moral component — that such-and-such an actor or director deserves the award, based on who they are and what they’ve done. In other words, it seems to be a matter not of taste, but of perceived fairness. In matters both significant and insignificant, we have a primal need to feel that justice has been done.

Justice, for me, has always been a tricky concept. When a human being has been killed, we call for justice to be done on their behalf. Strictly speaking, however, that’s not possible: The only real justice would be for the victim to be given back the life that was wrongly taken, but we don’t have that ability. So when people talk about justice in a case like this, what they usually mean is that the killer should be punished. But what does the punishment actually accomplish?

As I already mentioned, it doesn’t change the actual situation; the victim remains dead. Incarceration is unlikely to prevent the killer from killing more people, since most murders result from a unique set of circumstances that are unlikely to recur. Punishment of any sort may perhaps deter other people from killing, but the effectiveness of that deterrence is questionable — after all, this killer wasn’t deterred by the fact that others have been punished for similar crimes.

No, when we want to see a killer punished, it’s not about the punishment having any practical value. It’s about some innate feeling of rightness — the knowledge that a wrong has been done, and that wrong has to be compensated for somehow. The wrongdoer has to suffer in order to bring the moral universe back into balance.

It’s a sense that exists in all of us. As children, when we’re scolded or penalized for something we know we didn’t do, our immediate reaction is, “That’s not fair!” And the usual parental retort — “Life isn’t fair” — doesn’t fix the hurt. If life truly isn’t fair, then something is fundamentally wrong. It’s supposed to be fair.

Where does that innate sense of justice come from? Why is it so strong that we don’t take it to be merely an abstract concept, a way to interpret the world, but something essential about the world? I can’t think of anything else — with the possible exception of love — that so powerfully feels like it pre-exists us.

This is generally where religious faith enters the picture. For many people, the objective and essential rightness or wrongness of things is something established by God. As someone who doesn’t accept the existence of what I call the God Guy — the anthropomorphic figure who has ideas, feelings, and opinions about what each of us ought to be doing — I was always dismissive of this view. But the older I get, the more I have to believe that this sense of justice is not just something housed in our brains. If our sense of what’s fair is only a figment of our neurons, then there’s no such thing as justice; there are only ideas about justice.

Much about my upbringing left me with a dislike for — and often downright antipathy toward — anything religious. For reasons I’ve alluded to elsewhere, those feelings gradually faded, and I began to recognize that our existence incorporates more than can be sensed or analyzed. I can’t say that my spiritual side is very broad or deep, but the one thing that feels undeniable to me is that there is a sense of rightness woven into the universe. We can go with it or against it, just as we can swim with the current or against the current, but either way, it’s there. If you want to call that rightness God, I have no problem with that.

I’ve long admired the distinction the Quakers make between God’s will and self-will, and their view that the only way to give the former precedence over the latter is to cultivate a quiet mind. Learning to engage in a meditative practice has given me the chance to separate myself from my own self-importance, even for just a moment, and to feel which way that current of rightness flows. Any time I think I know what’s just — whether so-and-so should have won the Best Director award, or whether so-and-so ought to be locked up for life — I have to recognize that it’s likely just my ego talking, and that I need to open myself to the wisdom that lies outside of me. That recognition has value in itself, even if it usually doesn’t leave me with the answers that I crave.


Cats as Cats Can

Timmy, our fluffy orange Maine Coon mix, is an extortionist. When I sit down to have lunch, he’ll jump up on the table, saunter over to my plate, and say, “Nice sandwich you’ve got there. It would be a shame if anything happened to it.” Then, to show that he means business, he’ll poke it with his nose. I have to pay him off with a bowl of kibble if I want to have any peace.

Mary Beth, our gray-brown tabby, is a discriminating shopper. She’ll jump onto my crowded desk, stroll around examining the merchandise, find the object she wants to claim — perhaps a scrap of paper or a thumb drive — and then carry it off with the satisfaction of someone who has found a valuable antique in a flea market.

If you find these cameos charming, then you’re clearly an ardent cat person. If you don’t, I can’t blame you. I’m a cat person, but the only cats I’m really interested in are my own. Other people’s cats are just cats. Sure, they may do something adorable in a Facebook photo, like snuggling up in a blanket or chasing a toy, but that’s just generic cuteness, a defining aspect of felinity. My cats have multilayered personalities and complex psychological profiles. They’re also prettier than anyone else’s cats.

Jon Carroll used to write a daily column for the San Francisco Chronicle in which he’d interweave stories of his daily life with unique insights into politics and culture. He had a devoted following, myself among them. (I’ve recently come to realize that I’ve unconsciously been emulating Carroll in my approach to writing this blog.) Occasionally, he’d choose to write about the latest doings of his cats, Archie and Bucket (and later, Pancho). He didn’t necessarily have anything profound to say about them; he just thought that the activities of his cats were fascinating and assumed that other people would, too. It turned out that they didn’t, or at least a vocal minority of his readership didn’t. He got so many complaints about his Archie and Bucket stories that he ended up having to preface each such column with a disclaimer like “This is a cat column. If you have an objection to reading about cats, stop here.”

Several of my earlier blog posts have begun with stories about my cats, but the cats later turned out to be metaphors for something else. Taking a page from Jon Carroll, I should warn you that there are no metaphors coming up; I’m really just writing about cats this time.

I wasn’t always a cat person. I grew up without pets, because my mother considered animals — particularly cats — to be nasty and filthy. (When my parents came to visit Debra and me after we’d adopted Brook, our first cat, my mother said, “I can’t understand why you’d allow wildlife in your house.”) The first extended exposure I had to a domestic animal was the summer after I graduated from college, when some friends and I sublet a small house that came with a cat named Motley. Motley was a mostly-outdoor cat who would drop by only now and then to pick up his mail, but when he was around, he tended to act as if he owned the place (which, in a sense, he did). I would sometimes wake up in the morning to find him standing on me, and I was surprised to find out that I liked the pressure of his little padded feet on my belly.

Like most other recent graduates, I moved around for a few years from house to house and apartment to apartment, and I gradually got to know other cats along the way. Although I was never one to set long-term goals, I did develop a fantasy of the ideal domestic life: feeling securely settled enough in a place to get a piano and a cat.

That time came years later, shortly after Debra and I moved from New Jersey to California and rented a house in a friendly Oakland neighborhood. Debra, who had even less experience with animals than I had, was hesitant, but we decided to adopt Brook, a month-old kitten — because who doesn’t love kittens? — and assume that Debra would grow attached to her by the time she became a cat. The plan worked, and now Debra is a cat maven who volunteers at three animal shelters. (We eventually got a piano as well.)

As for me, slightly more than twenty years ago, I started studying a form of therapeutic bodywork called Breema. One of Breema’s fundamental principles is that the recipient’s body will be relaxed and comfortable only if the practitioner’s body is relaxed and comfortable. If I’m touching someone’s body with the intention of making them feel better, simply having that intention works against my aim of being comfortable. As a Breema practitioner, the only way I can help someone is by not trying to help them. That was a difficult lesson to learn.

The breakthrough came when I realized how much I could learn from Brook, our cat. Brookie not only slept with me, but was in physical contact with me most of the time — draped over my shoulders when I was sitting at my desk, curled on my lap when I was watching TV, kneading my belly when I was lying down. I came to realize that she was always doing Breema. She was never trying to relax me; she was making herself comfortable, letting her body adapt to mine, and the result was naturally soothing to me. Now, whenever I do Breema, I try to remember how a cat would do it.

Timmy, the cat for whom I currently serve as a bed, likes to drape himself over my left leg with his head resting on my thigh. Each of us sleeps better when the other is there. Nearly every afternoon while I’m working at my desk, Timmy comes over, bumps his head against my shin, and trots toward the bedroom. That’s his sign that it’s time for us to take a nap together. If I ignore him and keep working, he’ll come back and bump me with his head again. Eventually, I’ll give in and follow him to the bedroom. It’s hard to say no to taking a nap, especially with a warm cat draped snugly over your leg.
