To a Degree

My brilliant friend Lisa Rothman — entertainer, entrepreneur, and corporate trainer — recently left a comment that got me thinking. (If you find Lisa’s name familiar, it’s probably because she leaves thoughtful comments on pretty much everything I post here.) “I would love to live in a society,” she said, “[that’s] structured around people being able to spend all their time doing the things that they are really good at and enjoy doing.”

I would, too. I’m not entirely sure how that would work, given some serious obstacles:

  • The things that people enjoy doing are not necessarily the things that they’re good at, and vice versa.
  • There is likely to be much disagreement about what it means to be good at something, and who is and isn’t.
  • The things that people enjoy doing don’t necessarily benefit anyone other than themselves, and in some cases might even cause harm.
  • It’s not at all certain that the things people are good at, or the things they like to do, are evenly distributed enough to ensure that all necessary tasks get done. (How many people like cleaning toilets?)

Nevertheless, I do think that principle could be applied more often than it is. One place to start might be an area where I have some experience: higher education.

Community colleges, such as the one at which I was a faculty member, have been put under great pressure to model themselves after fast-food corporations. Just as McDonald’s enforces uniformity among its retail outlets, making sure that a hamburger sold at a McDonald’s in Kansas City tastes exactly like a hamburger sold at a McDonald’s in Miami, community colleges are supposed to require uniformity in their curriculum and in the people who teach it. A student who takes English 101 from one instructor is supposed to have exactly the same experience, and (measurably!) learn exactly the same things, as a student who takes the same course from another instructor. The course outline has been rigorously developed and approved by a committee, and it must be adhered to.

Naturally, there are good reasons for this demand for uniformity. Students who enroll in an advanced course must all be presumed to have received equal preparation in their earlier courses. Transfer students must have taken courses at their first school that are aligned with those at their next school. Students who graduate with the same degree must possess the same knowledge and skills, or else the degree has no meaning.

But it’s clear that something precious is being lost here: the value of each individual teacher’s life experience. In Lisa’s words, every instructor has something different that “they are really good at and enjoy doing.” Wouldn’t the students benefit if the teacher were free to teach that? For example, I’m no expert in desktop printing, because nearly all of my Photoshop work is intended for use online, but I have to teach it anyway because it’s part of the curriculum. I have lots of experience in preparing graphics for video, but I can’t share that experience because it’s not included in the official course outline. Because of my background in theater, I enjoy creating projections for stage productions, but that’s obviously not thought of as a mainstream activity.

What makes this particularly sad is that the specific topics that are being taught don’t matter much. I’ve always told my students, “Everything you do here is going to be obsolete in ten years.” What they really need to learn is how to learn — how to be curious, how to think critically, how to take responsibility for and pride in their work, how to start with fundamental principles and apply them to new situations. The actual subject matter of the course is just a vehicle for passing on those skills. So why can’t the curriculum flow naturally from where the instructor’s heart is?

When I think of teachers who have influenced me in the past, I rarely remember anything specific that they taught me — instead, I remember what sort of people they were, what they valued, and how they approached life. My education came simply from being in the room with them. Wouldn’t it be nice if, instead of taking English 101, a college student could take a course called “A Semester with Professor So-and-So”?


Practice to Deceive

Like many boys of my generation, I became interested in magic during my preteen years. Partly it was because performing magic tricks was a way to get attention and recognition, which was pretty much my full-time job. Beyond that, though, I think wanting to learn magic was a response to the powerlessness that came with being a child. Knowing secrets that adults didn’t know, being able to baffle and mystify them, would in a small way give me power over them.

The library had plenty of how-to magic books for kids, and from those books I learned to build a few practical illusions, such as an apparently empty box that would become full of knickknacks, or a sheet of newspaper that could be torn up and become whole again. But I found that no adults were particularly mystified by them: It was clear to everyone that the trick was purely mechanical, even if the exact details of the mechanism weren’t obvious. (In one case, I demonstrated to my uncle a crank-and-roller device that would convert a one-dollar bill to a five-dollar bill; he grabbed the resulting five-dollar bill and tore it up. “What’s the matter?” he said mockingly when I broke into tears. “Can’t you make more of them?”)

Eventually, I discovered — as with so many other things — that I’d gotten magic all wrong. Knowing the secrets of how tricks were done was only a small part of it. The real power of magic lay in being able to do things: feints, sleight of hand, misdirection. These were physical skills that needed to be mastered through intensive practice. That realization marked the end of my ambition to become a magician. The idea that anyone could acquire skills like those was a form of magic that was beyond my comprehension.

When it comes to physical skills, the adage that “practice makes perfect” has rarely applied to me. In my freshman year of college, I chose tennis for my required phys-ed course. The first step in learning tennis was being able to repeatedly hit a ball against a wall. I spent the entire semester fruitlessly trying to hit the ball as it bounced back at me, never reaching the level where I could play against another person. In my 20s, inspired by my college roommate Jay, I decided to take up juggling. Despite dedicated daily practice, I never got past the first step of being able to toss a single beanbag from hand to hand. Later, I rehearsed a relatively simple piano accompaniment for weeks on end in preparation for a performance, but never got to the point where I could play it without mistakes.

Even when skills are not explicitly body-based, I’m not sure that the length of time spent exercising them has much of an impact. We tend to think of knowledge and skills as going hand-in-hand, but I, for one, experience them quite differently. I certainly know more than I did when I was in high school, but I wouldn’t say that my skill level — my ability to apply what I know — has changed much over the past fifty years. The things that I’m good at, such as problem-solving, writing, and doing creative work, I’ve been good at right from the start. The things that I’m not so good at, such as big-picture thinking (see “The Freeway Problem”) and social interaction, I haven’t gotten much better at despite a lifetime of trying to do so.

My experience of skills being relatively static has led to one of my big deficiencies as a teacher. I’m very good at explaining things — the knowledge part — but not good at enabling students to get better at what they’re doing. If, despite repeatedly getting detailed corrective feedback on their work, a student remains unable to tell a well-composed photograph from an ill-composed one, or a well-timed video edit from a random one, my internal reaction is, “Well, it appears that you don’t have the knack for this. Maybe you should try something else.” Of course, I would never say that directly to the student. Doing so would go directly against what we’re supposed to do, which is to encourage the student to persist no matter what.

On the other hand, despite my instincts to the contrary, I must admit that there is something to be said for persistence. I have known actors and designers whose skills — in my judgment, at least — are mediocre, yet who have continued to believe in their own excellence. Somehow, their self-regard rubs off on others, and their careers continue to advance. (I can also think of a recent president who falls into this category.) Their skills don’t improve, but it turns out not to matter.


Parent-Teacher Association

It was a Monday morning in spring, the day after Daylight Saving Time had taken effect, and my first-grade teacher asked whether anyone had stories about how the resetting of the clocks had affected them. Nobody raised their hand, so I raised mine. “I lost an hour of sleep!” I said.

“Mark, everybody lost an hour of sleep,” said the teacher, patronizingly. “Does anybody have any real stories?”

This sounds like a very minor exchange, but I still feel its sting nearly six decades later. The teacher had wanted to start a discussion, and I thought I was helping. My loss of sleep was the only story I had, and it had the added benefit of showing that I understood the basic premise of Daylight Saving Time. But I was shot down, summarily dismissed. It’s the first time I remember feeling that school was not a place where I’d find help and encouragement, but a place where I’d be judged.

Of course, I’d clearly been judged all along, but only positively. I was diligent and eager to please — partly because I enjoyed learning, but also out of self-preservation. At the beginning of the school year, my teacher had called my mother to advise her that I was holding my pencil incorrectly, and my mother had reacted furiously. “How does it make me look if you don’t know how to hold a pencil?” she yelled. So it was clear to me that I’d be in real trouble if any further negative reports came from my school.

Teachers have always judged students; it’s unfortunately part of their job. Grading students’ work has always been my least favorite part of teaching college courses, because it puts me in a position of authority that feels unjust. My stance as a teacher has never been “I am here to educate you,” but rather, “I have a lifetime of experience with this subject matter, and I’m eager to share it with you, but only to the extent that you think it will help you. I don’t know everything, and I may be wrong sometimes.” I always give students extensive written feedback on their assignments, but I try to make clear that this feedback is only my opinion, and they can take it for what it’s worth.

The odd thing is that students never seem to get that message. I remember, early in my teaching career, coming back to my class after a ten-minute break and saying, “I want to apologize. In teaching the lesson before the break, I was talking to you as if I’m somebody special and superior. I’m really not.”

The students just stared at me. “But you’re the teacher,” one of them said, and the rest nodded. They actually seemed embarrassed.

I’m sure this is because, like me, they came out of a system in which the teacher is presumed to be someone important and authoritative, someone whose opinions and judgments have real significance. And teachers have to play that role, because otherwise the grades that they give — grades that have the potential to shape a student’s future — would have no legitimacy.

The problem is that, as every student knows, teachers’ judgments are often wrong. I felt the system’s unfairness frequently as a student — partly because I was unusually sensitive, but partly because whatever conclusions a teacher reached about me were echoed and amplified by my parents.

When my second-grade teacher assigned us to draw a self-portrait, I took it as a challenge. I had never thought of myself as looking any particular way — I was just a generic boy, with eyes and a nose and a mouth no different from anyone else’s. But if this drawing was specifically supposed to represent me, I had to find out what was unique about my appearance. I spent a long time staring into the mirror, pencil-sketching every line on my face and every imperfection I could find. The result, given my second-grade-level drawing skills, must have looked like a wrinkled old man. But it represented my best attempt to do what had been asked of me.

To my shame, the teacher sent the drawing home with me to show my parents, with a big X at the top and “I want Mark” written in red pen. My mother was incensed that I had failed such a simple assignment, and sent me to my room to do it again. What I came out with was what had apparently been expected of me all along: a colorful crayon drawing with a generic round head, scribbled brown hair, dots for eyes, and a curved, smiling mouth. That drawing got an A from my teacher, and a gold star.

If anybody asks me how I came to be so cynical about education, that’s the reason.


Language Lessens

A while back, I began a blog post called “Sound Barrier” with this sentence:

The first Broadway show I ever saw was “Hello, Dolly!,” which had recently been recast with Pearl Bailey and Cab Calloway in the lead roles.

My wife Debra, who vets everything I write before I post it (partly to catch typographical errors, but mostly to make sure I don’t say anything inappropriate), flagged that sentence. “You can’t follow an exclamation mark with a comma,” she said.

“But the exclamation mark is part of the title of the show,” I said. “It’s not punctuating the sentence.”

“It’s still not right,” she said.

I came away grumbling. I had to admit that it did look funny, but I didn’t want to have to rewrite the sentence. A few days later, I happened to pick up the November 30 issue of The New Yorker, and found the following sentence in an article about William Faulkner:

In these books, no Southerner is spared the torturous influence of the war, whether he flees the region, as Quentin Compson does, in “The Sound and the Fury,” or whether, like Rosa Coldfield, in “Absalom, Absalom!,” she stays.

“The New Yorker did it!” I said. “They put a comma after ‘Absalom, Absalom!’ ” That definitively settled the argument. To borrow a formula from Richard Nixon, if the New Yorker does it, it’s not illegal.

The fact that Debra and I can quibble about the finer points of grammar and punctuation — but not about much else — is one of the delights of our relationship. Mostly, our complaints are not with each other, but about errors we find in other publications: things like the use of “literally” to mean “figuratively,” or the misuse of an apostrophe to form a plural.

Lately, though, we’ve been feeling like members of a rapidly shrinking minority. When she gripes about someone who used “unique” to mean something other than “the only one of its kind,” I have to tell her, “That battle’s been lost.” Meanwhile, I go on fighting for even more hopeless causes. When I complain about the use of “as such” to mean “therefore,” or insist on use of the subjunctive mood to describe a hypothetical event, my Facebook friends invariably tell me that it’s time to give up.

These issues are of more than theoretical importance, because I don’t know how critical I should be of my students’ writing when I teach college courses. I’m not an English teacher, so enforcing the rules of written language is not strictly my job. At the same time, I caution my students that no matter how good they are at what they do, no one will take them seriously if they can’t communicate well about what they do. If they want the respect of their employers, clients, and peers, they need to use proper grammar, punctuation, and spelling.

However, I’m not sure that this is true anymore. When I look at the memos that come from several college administrators, or the classroom materials that are written by some of my fellow instructors, the quality of their writing is not much better than that of my students. Nevertheless, those people have managed to rise to positions of authority. Maybe we’re at the point where not many people pay attention to the old rules. If the people who will be hiring my students don’t know much about spelling or grammar, why should my students have to?

I’m also not convinced that students can internalize the rules of grammar and punctuation if they haven’t grown up reading books that follow those rules. My childhood was kind of unusual in that much of the reading material in our house had been picked up at rummage sales. We had an encyclopedia that had been published in 1912, and a series of fairy tale collections (“The Red Fairy Book,” “The Blue Fairy Book,” and so on) that Andrew Lang had compiled in the 1890s. As a result, from the time I learned to write, my writing had sort of a Victorian style — formal and somewhat distant, with lots of polysyllabic words and compound sentences. (Come to think of it, that pretty well describes my writing style even now.) I don’t see how students who grow up reading tweets and websites can develop a sense of what formal language is supposed to sound like.

So maybe it really is time to give up on preserving arbitrary rules, and just focus on clear writing that communicates clear thinking. After all, when we see a sign that says “Vegetable’s for sale,” we still know what it means, despite the unneeded apostrophe. If someone says, “Tell me if you agree,” we understand that they want us to let them know whether we agree, not to notify them only in the event of our agreement. So far as spelling goes, William Shakespeare famously spelled his own name in several different ways, and yet still seemed to do OK for himself.

In talking to students, I’ve always compared language to clothing. Just as the practical purpose of clothing is to keep us warm, the practical purpose of language is to communicate. But clothing goes far beyond that basic function. What we choose to wear, and how suitable our wardrobe is to the place where we wear it, is how we tell people what we want them to think of us. Similarly, the style of language that we use, and its suitability to the environment we’re in, necessarily affects people’s assessment of our character.

I think that’s still true. But just as the rules about formal attire have relaxed greatly over the past few generations without any great harm to society, I suspect that the rules of formal language might need to be relaxed as well.


Blind Spots

My fifth-grade teacher was teaching us about the Panama Canal, and how it connects the world’s two great oceans with an elaborate series of locks. He described how a ship would enter the first lock, the gates would close behind it, the lock would fill up with millions of gallons of water to lift the ship up to a higher level, and another set of gates in front of the ship would open to let the ship pass to the next lock. It was all very impressive, but there was something missing from his explanation.

“Why are the locks there?” I asked. “Why are they needed?”

The teacher seemed never to have thought about this before. He paused for a moment and said, “Probably because the water level in the Pacific Ocean is higher than the Atlantic Ocean.” Satisfied with himself, he went on with the lesson.

His answer was patently absurd. Anyone who looked at a map could see that the two supposedly separate oceans were in fact different portions of a single body of water, and therefore couldn’t have different levels. But by that time, I’d learned from hard experience that it never pays to correct the teacher.1

For years afterward, I recalled that exchange with a bit of resentment and a big dollop of smugness. Why couldn’t he just have confessed that he didn’t know, instead of making up an answer on the spot? When I eventually became a teacher, I was always ready to admit when I was stumped by a student’s question — and in fact, I took it as an opportunity to model problem-solving behavior. “I don’t know,” I’d say to the student who asked the question. “Why don’t we find out?” And then everyone could watch my screen as I went to Google to investigate.

(Sometimes it was useful to say “I don’t know” even if I did know. If I was demonstrating how to use a piece of software — Photoshop, for example — and a student would ask a “what if?” question such as “What happens if I use the eraser on a type layer?” I’d answer, “I don’t know; let’s all try it and see!” hoping that the students would realize that they could easily answer such questions on their own.)

So it was easy to look back at my fifth-grade teacher and feel superior. But the longer I went on teaching, the more I realized that I had frequently been guilty of saying things that were absolutely wrong. I had told students that skin tone contained more blue than green (in reality, the opposite is true); that Tim Berners-Lee, at the time he invented the World Wide Web, was a physicist (he had a BA in physics, but that’s about it); that sans-serif characters weren’t used in ancient Rome (they were); that there’s no such thing as half a pixel (there sort of is), and many other things that I can’t remember now because, well, they were wrong.

Obviously, when I taught these “facts” to classes, I thought at the time that they were correct. Why I thought they were correct, I can’t say. But no matter where my supposed knowledge came from, I had clearly fallen victim to that age-old philosophical conundrum, “You don’t know what you don’t know.” (Or, in the famous words of Donald Rumsfeld, “There are unknown unknowns.”) I’m good at problem-solving, but I’ve never figured a way out of that one.

Back when I worked in educational publishing — before the days of computerized layout and spell-checking — I was preparing to send a workbook to be printed. This was the final step in a long process for which I was responsible, which included multiple passes of rewriting and editing, getting type from a compositor, proofreading the typeset copy and making corrections, and finally having a designer slice up the type and adhere it to layout boards, resulting in “mechanicals” that the printer would photograph to make plates.

As I was about to package up the mechanicals, a colleague of mine — an experienced editor who could spot a mistake across a room — happened to be walking by. She scowled at me and said, “You have a spelling error.”

“What?! Where?” I said. It was unlikely that a typo would have made it this far through the process, and at this stage it would be an expensive thing to fix.

“ ‘Ophthalmologist’ is misspelled,” she said.

“No it isn’t,” I said. “I know people think it’s ‘opthamologist,’ but I made sure the ‘l’ is in there — ‘opthalmologist.’ ”

“Yes, but there’s an ‘h’ missing,” she said. “It’s not ‘opthalmologist,’ it’s ‘ophthalmologist.’ ”

I turned pale. “Are you sure?” I asked, knowing as soon as it came out of my mouth that it was a stupid question.

“If you didn’t know how to spell it, why didn’t you look it up?” she said, glaring.

“But I did know how to spell it,” I said — meaning, of course, that I thought I knew how to spell it. How was I supposed to know that I didn’t? Since that time, I’ve been aware that each of us is a storehouse of snafus that are waiting to happen. We can hear the ticking of the time bomb, but we have no way to know where the bomb is and when it’s due to go off.

