To a Degree

My brilliant friend Lisa Rothman — entertainer, entrepreneur, and corporate trainer — recently left a comment that got me thinking. (If you find Lisa’s name familiar, it’s probably because she leaves thoughtful comments on pretty much everything I post here.) “I would love to live in a society,” she said, “[that’s] structured around people being able to spend all their time doing the things that they are really good at and enjoy doing.”

I would, too. I’m not entirely sure how that would work, given some serious obstacles:

  • The things that people enjoy doing are not necessarily the things that they’re good at, and vice versa.
  • There is likely to be much disagreement about what it means to be good at something, and who is and isn’t.
  • The things that people enjoy doing don’t necessarily benefit anyone other than themselves, and in some cases might even cause harm.
  • It’s not at all certain that the things people are good at, or the things they like to do, are evenly distributed enough to ensure that all necessary tasks get done. (How many people like cleaning toilets?)

Nevertheless, I do think that principle could be applied more often than it is. One place to start might be an area where I have some experience: higher education.

Community colleges, such as the one at which I was a faculty member, have been put under great pressure to model themselves after fast-food corporations. Just as McDonald’s enforces uniformity among its retail outlets, making sure that a hamburger sold at a McDonald’s in Kansas City tastes exactly like a hamburger sold at a McDonald’s in Miami, community colleges are supposed to require uniformity in their curriculum and in the people who teach it. A student who takes English 101 from one instructor is supposed to have exactly the same experience, and (measurably!) learn exactly the same things, as a student who takes the same course from another instructor. The course outline has been rigorously developed and approved by a committee, and it must be adhered to.

Naturally, there are good reasons for this demand for uniformity. Students who enroll in an advanced course must all be presumed to have received equal preparation in their earlier courses. Transfer students must have taken courses at their first school that are aligned with those at their next school. Students who graduate with the same degree must possess the same knowledge and skills, or else the degree has no meaning.

But it’s clear that something precious is being lost here: the value of each individual teacher’s life experience. In Lisa’s words, every instructor has something different that “they are really good at and enjoy doing.” Wouldn’t the students benefit if the teacher were free to teach that? For example, I’m no expert in desktop printing, because nearly all of my Photoshop work is intended for use online, but I have to teach it anyway because it’s part of the curriculum. I have lots of experience in preparing graphics for video, but I can’t share that experience because it’s not included in the official course outline. Because of my background in theater, I enjoy creating projections for stage productions, but that’s obviously not thought of as a mainstream activity.

What makes this particularly sad is that the specific topics that are being taught don’t matter much. I’ve always told my students, “Everything you do here is going to be obsolete in ten years.” What they really need to learn is how to learn — how to be curious, how to think critically, how to take responsibility for and pride in their work, how to start with fundamental principles and apply them to new situations. The actual subject matter of the course is just a vehicle for passing on those skills. So why can’t the curriculum flow naturally from where the instructor’s heart is?

When I think of teachers who have influenced me in the past, I rarely remember anything specific that they taught me — instead, I remember what sort of people they were, what they valued, and how they approached life. My education came simply from being in the room with them. Wouldn’t it be nice if, instead of taking English 101, a college student could take a course called “A Semester with Professor So-and-So”?

Touched

There’s no better place to hold a concert, in these twilight days of COVID, than a barn. This particular barn was spacious, high-ceilinged, and well ventilated, with fragrant bales of hay serving as benches and planks on sawhorses passing as tables. Most of the men wore flannel shirts, bushy beards, and ponytails; the women wore western-style hats and boots. Although this was in San Gregorio, only about 60 miles from Oakland, I was apparently the only city dweller there, feeling a bit out of place in my crewneck sweater and striped dress shirt.

I was there because two of my favorite touring musicians, Nathan Rivera and Jessie Andra Smith — who perform together as Nathan & Jessie — were among the acts on the bill. We know each other from the times they’ve performed in my living room as part of my long-running house concert series, but they — along with my house concerts and live music in general — had been idle for the past year and a half, and I really wanted to hear them play again.

What I didn’t expect was the greeting I got outside the barn, first from Nathan and later from Jessie. Each of them looked at me in wide-eyed surprise, and then — following one of those awkward, late-COVID-era hesitations in which one person, with arms flung open, has to pause and ask the other, “May I…?” — gave me a warm, tight hug.

Hugs! It had seemed for a while that hugs would never return, having been replaced by sorry fist-touches and elbow-bumps. I remember that back before the lockdown began, when we were all learning the new rules, I said I would never give up hugging — until the moment I realized that it wasn’t just my decision, that giving up hugs was something we all had to do to protect each other. I remember that when, months later, my wife Debra and I came to an agreement with our friend Amy that the three of us would become a “pod,” the first thing each of us did was hug Amy — an experience that felt strange, oddly foreign, and enormously satisfying.

Even in ordinary times, there’s too little opportunity for physical contact among people. I’ve always been unusually sensitive to touch, to the extent that for every person whose hand I’ve ever grasped, I can remember exactly what their hand felt like. (“You must be some kind of savant,” Amy said, when I mentioned this to her.) I get more of a sense of connection from one moment of contact than I do from hours of conversation. And yet, conversation is pretty much the only avenue our society offers toward bonding with most of the people around us.

During the concert in the barn, a trio of very happy dogs kept darting in and out, unable to decide whether they preferred romping in the field or socializing with us humans. One dog in particular, whenever she came inside, would make the rounds of the hay-bale benches, delightedly accepting strokes and pets from one audience member and then eagerly moving on to the next. Another dog sat contentedly among the standees in the back, waiting for people to come to him and scratch under his chin.

I kept thinking, “What do these dogs know that the rest of us don’t?”

I was reminded of another dog that Debra and I met during a visit to a dairy farm — an old dog who had retired from her farm duties and hung out on the front porch, watching the action. When any of us would approach her, she would simply roll over and expose her belly, as if to say, “You know what to do!” As a retired person myself, I would love the opportunity to do whatever the human equivalent is of accepting joyfully offered belly rubs.

As it turned out, I didn’t get to hear much of Nathan and Jessie’s performance; they had been moved to the last spot on the bill, and I, having poor night vision and not being eager to navigate narrow, winding roads in the dark, needed to leave before sunset. But even though I didn’t get the music I was hoping for, I was very happy I’d come, because I’d gotten something equally valuable: a reminder of the immense pleasure contained in a spontaneous, simple, and heartfelt hug. May we all experience more of them!

Greater Than

Over the centuries, philosophers have attempted to construct irrefutable arguments that prove the existence of God. These arguments have been sorted into various categories: the teleological argument, the cosmological argument, the argument from design, and so on.

My favorite of these attempted proofs is the so-called ontological argument. It essentially goes like this:

  • God is the greatest of all possible beings.
  • A being that exists is greater than a being that doesn’t exist.
  • Therefore, God must exist.

I’ve always loved this argument because it feels like a magic trick: It elegantly and instantly performs a transformation that feels impossible. You know that there’s something shady going on behind the scenes, but you can’t quite figure out what it is.

Well, one thing that makes the trick work is some cleverly camouflaged circular reasoning. If you think about it, the only logical way to find the greatest of all possible beings is to make an inventory of all possible beings and rank them according to greatness. This task is made easier by limiting the inventory to beings that exist, as per the argument’s second premise.

Assuming that one can come up with criteria (beyond mere existence) for evaluating greatness, we just have to look at the scores and see who comes out as #1 in the ranking. If the deity of the Bible actually exists, he or she would be a shoo-in to take the top spot. If not, the top spot would go to some other being. (Who knows? A gas cloud at the edge of the Milky Way? A tree in Pittsburgh?) In other words, the ontological argument for the existence of God only works if you first assume the existence of God.

But it’s also worth taking a look at the second premise. Is it really true that something that exists is greater than something that doesn’t? I’d maintain that there are beings — fleas, or coronaviruses, or Mitch McConnell — whose nonexistence would make the universe better off. There are certainly much worse possible things — such as a godlike, all-powerful but malevolent entity — that are the greater for not existing.

This thought comes up often when I hear my creative friends — writers and musicians and painters — talk about how, because they are artists, whatever they produce has value, and how they’re not being fairly compensated for the value of their work. Even apart from financial considerations, they often insist that the mere act of producing something has value. (One Facebook friend recently posted, “I am claiming my space as a creator.”)

I certainly like to believe that anything I bring into existence is greater than something that doesn’t exist. My believing that, however, doesn’t necessarily make it true. I’m a lover of live music, and one of the reasons I host house concerts is to give talented musicians an opportunity to be paid for their work. But I’ve also heard music that’s so badly performed that it makes me wince, in which case I’d say that its existence has negative value. Fair compensation in that case would be for the musicians to pay me to keep listening.

If something I create has value to me, that’s great. But if I’m to call myself an artist, what I create has to have value to others, and I’m in no position to judge whether that’s the case.

When I was a freelance writer/producer/editor/designer, I always had misgivings about taking my clients’ money. If they were going to pay me, I wanted it to be because they were so pleased with my work that they actively wanted to pay me — not just because we had a contract. Of course I always billed the client for the amount we’d agreed to, because I wasn’t in a position not to do so. But I always felt lucky to get the money, rather than feeling entitled to it. Work doesn’t acquire value simply by virtue of existing; it can only have value if it fills a need that would otherwise have gone unmet.

A Storied History

On a warm, late-spring night in 1977, I made a spur-of-the-moment decision to go see a new movie called “Star Wars.” I wasn’t a huge fan of science-fiction movies, but I’d heard vague rumors that this one was good. I walked into the theater having no idea what I was about to see.

Going to the movies was no big deal; I’d been doing it all my life. The first film my parents ever took me to was a Disney-produced family comedy called “Bon Voyage.” Being six years old and unfamiliar with the French language, I misheard the title as “Googly Eyes.” I remember nothing about the movie other than getting bored halfway through, and being disappointed that no one on the screen had googly eyes.

As I got older, I gradually learned what I liked and what I didn’t. I didn’t like Westerns, or war movies, or action films, or anything where people did bad things to other people. I didn’t like mysteries or other movies that depended heavily on plotting — I could never follow complicated plots (and still can’t). What I loved were films that took me to a place or time that I could never have imagined on my own: foreign films that immersed me in unfamiliar cultures, historical films that made the distant past feel present, animated films where animals talked and people effortlessly did impossible things. I also loved films that allowed me to spend time with strong, compelling, charismatic characters (or, as in classic films from the 1930s and 40s, actors such as Humphrey Bogart or Katharine Hepburn who were pretty much indistinguishable from their characters). I didn’t care what the characters in the movie did; I just wanted the experience of being with those people in that time and place.

“Star Wars” — which, at the time I saw it, had not yet received the subtitle “Episode IV: A New Hope” — had all of those elements. Its long-ago, far-away galaxy felt real and tangible, not least because of its brilliant use of sound (the industrial hum of the light sabers, the adorable bleep-bloop language of R2-D2, the labored sucking sounds of Darth Vader’s breathing). The view from the cockpit of the Millennium Falcon as it goes into hyperdrive literally took my breath away. The character of Han Solo was as good as any special effect, and I would have enjoyed the movie if it had just been Han making wisecracks. I left the movie feeling dazzled and lightheaded. I got into my little Volkswagen Beetle and tore down New Jersey’s Route 1 as if I were piloting a TIE fighter.

So, naturally, when “The Empire Strikes Back” came out three years later, I rushed out to see it. Many critics considered it superior to the original “Star Wars,” since it was scripted by a better screenwriter than George Lucas, and directed by (some would argue) a better director. But I found it surprisingly disappointing. Revisiting the Star Wars universe didn’t provide the same visceral thrill that the first film had, and being dipped in a carbonite fondue put Han Solo out of commission for too long a stretch. Instead, the film’s emphasis was on expanding the mythology that had first been laid out in “Star Wars,” which I had barely paid attention to. Suddenly I was supposed to care about imperial machinations and rebel alliances and who was whose father. I simply wasn’t interested.

Alfred Hitchcock is credited with popularizing the concept of the MacGuffin — the thing that the characters in a film care about but that the audience doesn’t. The MacGuffin is just there to set the plot in motion and to give the characters a reason to interact. An obvious example (from a non-Hitchcock film) is the Maltese Falcon in “The Maltese Falcon.” We in the audience have no emotional investment in the bird; we just want to see Humphrey Bogart match wits with Sydney Greenstreet, and eventually tell Mary Astor that she’ll be taking the fall.

My problem with the Star Wars saga is that I’m supposed to care about the MacGuffin, and I don’t. I’m obviously in the minority, though — people can spend hours debating the finer points of Star Wars canon with real passion. The popularity of franchises such as Lord of the Rings and Harry Potter shows that vast audiences have become engaged in those worlds’ mythology.

I don’t get it. When I had to learn about real mythology in school — the Greek tales of vengeful gods and flawed humans — I felt a similar lack of interest. Why spend time studying stories? I was willing to acknowledge that because these particular stories had come down to us from thousands of years ago, some familiarity with them was necessary to an understanding of Western culture. The same could be said of the stories in the Bible. But studying those things is demanding work, whereas Star Wars is supposed to be entertainment.

Given my aversion to mythology, you might be surprised to learn that I’m a long-time fan of “Doctor Who,” which has accrued a TARDIS-load of mythological baggage in the more than fifty years that it’s been on the air. But I have to confess that I can rarely follow the plots. I have no interest in the Time War or the Key of Rassilon; I just enjoy traveling through time and space in the splendid company of the Doctor. Wouldn’t you?

Good to Go

One year for my birthday, Debra bought me an hour-long ride in a small, private airplane — something I’d always wanted to experience. Debra sat in the back, and I sat up front with the pilot, grinning widely as I watched familiar landmarks pass below us.

About halfway into the ride, the pilot took his hands off the controls and turned to me. “Why don’t you fly for a while?” he said.

Eyes wide, I immediately gave the only reasonable answer: “NO!” Just the thought of it was crazy: There are two people sitting here; one knows how to fly and the other doesn’t; and the one who doesn’t know how to fly should pilot the plane?

I’m always willing to try things if the consequences of screwing up are small. Mixing two unfamiliar ingredients in a recipe — sure, why not? Installing a new hard drive in a computer — my files are backed up, so what do I have to lose? But in any situation whose outcome could possibly be life-altering (or, in the case of flying a plane, life-depriving), I’m content to let someone with demonstrated skills take the lead.

That principle extends to crossing the street. Although people get frustrated when I stop at a corner and watch for the “walk” sign before stepping off the curb, I know that my senses of a car’s speed and distance are fallible — so why would I trust my judgment in deciding when it’s safe to walk? Given the high stakes, I’d prefer to let the engineer-designed traffic-control system — which has the added bonus of actually causing cars to stop — make that determination.

The idea that not everyone is a competent jaywalker seems not to have occurred to most people. It’s one of those things that all city-dwellers are expected to know how to do. But where is that skill supposed to come from? Are people supposed to cross the street in incrementally more hazardous situations and learn from the incidents in which they avoid being hit? No — I’d venture to say that crossing a busy street safely is something that most people just have the knack for; it’s not a skill they’ve acquired by trial and error. And if it’s something that one doesn’t have the knack for, it’s good to recognize that early.

Probably the most common thing that people are expected to be good at is driving. All of us who spend time on freeways have had occasion to criticize others for being bad drivers. But what do we want those inept drivers to do about it? Staying off the road isn’t an option; many people are forced by practical or economic circumstances to drive. Certainly they can discipline themselves to stay alert and be careful, but care and alertness aren’t always sufficient for reacting effectively to dangerous situations.

I often think that the best drivers are the ones we consider the worst — those who do whatever they can to pass everyone else on the road. Those drivers certainly need to learn courtesy and respect, and I agree that such things can be learned. But the actual skills that they have — the ability to weave effortlessly from one lane to another, to precisely judge the speed and angle they need to insert themselves between two moving cars — are most likely not things they’ve learned through trial and error. They just happen to be naturally good at it. If only those skills could be used for good instead of evil!

There are plenty of other activities at which everyone is groundlessly expected to be proficient. One, surprisingly, is sex. The advice columnist and social commentator Dan Savage has proposed a sexual standard that everyone is supposed to meet: to be GGG, or “good, giving, and game.” But where, exactly, is the ability to be “good” supposed to come from? Even if we assume that sexual skill can be acquired through practice, those who are less naturally good in bed will clearly have fewer opportunities to practice. It’s a perfect vicious cycle.

What’s true of sex tends to be true of other social interactions. Take dancing: One can attend dancing classes and learn patterns of steps, but not everyone gets to the point where they can execute those steps with ease and grace. And with the freeform style of dancing that’s been prevalent since the 1960s, even learning standard steps isn’t going to be of any help. It’s often said that the secret of social dancing is just to “go out on the dance floor and show ’em what you’ve got.” But that advice assumes that you’ve “got” something — and it doesn’t offer any guidance about where you’re supposed to get it.

I can frame the problem differently by referencing something I know I’m great at: audio editing, specifically music editing. In the earlier days of recording technology, before audio production could be done on a desktop computer, I would work closely with a sound engineer in an expensive recording studio. I’d direct our recording and editing sessions, but only he was allowed to touch the equipment. The exception was when a piece of music needed to be fitted to a specific length of time. As we listened to the playback, I would be the one to punch the button that instantly stopped the tape deck. “Cut here!” I would say. I knew precisely what needed to be deleted, and exactly where the cuts had to be made, to make the music sound seamless. It’s not something I’d learned; it was just something I was able to do. There was no way I could teach it to the engineer.

Imagine if music editing were something we had to do every day, like crossing the street or driving a car. “Oh, jeez,” I might say, listening to somebody’s clumsy cut. “Who let that clown use a sound-editing program?” Because I was good at what was considered a natural, routine task, I’d assume that everyone else was, too. It would probably not occur to me to empathize with the person who was required to do that edit, but just didn’t have the knack for it.
