Change of Address

The first class I attended in the spring of my sophomore year at Princeton was for a course called Philosophy 201, Introduction to Logic. Because this course was required for philosophy majors, the lecture hall was packed. The professor, who had taught this course many times before, mounted the podium and looked out at the crowd of students. “I don’t know what you’re all doing here!” he said. “If you were logical, you would have elected one person to come and take notes and share them with the rest of you.”

He was joking, of course. But his comment opened my eyes to the strangeness of the way we were being educated: The professor draws on his knowledge of a topic to prepare a lecture (or, in this case, dusts off a canned lecture he’d already prepared years before). Referring to his notes, he recites this lecture to a room full of students who listen to his words and summarize them in a new set of notes. The students then go back to their dorm rooms and, by reviewing their notes, try to reconstruct the original body of knowledge that the professor had encapsulated in a lecture.

Not only does this process appear crazily inefficient; it is also virtually guaranteed to distort the message, as in a game of telephone. The idea that I, as a student, would walk away with the same understanding that the professor had when he prepared the lecture seemed improbable at best. We do know that writing things down with a pen and paper reinforces learning, but we have no way of knowing whether what the student is writing is accurate, or whether notes that seem clear while listening to the lecture will make equal sense when the student reviews them later on.

And yet, the custom of lecturing in the classroom lives on. Technically, it would be more efficient for a professor to write down what he or she wants to communicate (as most of them have, in articles and books), and have students read the written material. Students might even be required to copy out particularly important passages by hand, as they often were in centuries past. But we all know instinctively that such a process wouldn’t work.

Let’s acknowledge the obvious objection that a teacher who communicated solely through writing would disadvantage those students whose primary learning modality is auditory. (This issue wouldn’t even have come up when I was at Princeton, since the concept of learning modalities hadn’t yet been developed in the 1970s.) Regardless, I think the importance of lecturing goes much deeper than that.

The ritual of having a respected person get up and speak in a dedicated space has its own significance, apart from what the person says. It invites attention. It lends weight and importance to the message, and flavors it with the history and personality of the speaker, in a way that written material does not. It allows the speaker’s delivery to be influenced by audience response, whether overt (through laughter or applause) or tacit (through body language or even the atmosphere in the room), thereby creating a feedback loop. Even when the speaker writes on the blackboard or whiteboard, the words have an immediacy that they don’t have in a printed book. These elements contribute to learning beyond what language itself can communicate.

These thoughts come to mind because it’s the time of year when the president of the United States delivers the annual State of the Union address. I’ve rarely bothered to watch the live event — it’s enough for me to read the press accounts afterwards of what the president has said. The delivery of the address is just a formality, since the script has been so carefully engineered that any vestige of spontaneity is lost, and it’s been distributed to the press even before the president gets up to speak. I never saw the point of watching the president’s mouth move as he recites the scrolling words on the teleprompter.

But it occurs to me now that the way the speech is staged — in the august halls of Congress, with the most powerful figures in the federal government present and all of the media watching — is what gives the address significance beyond the written words that constitute it. Even if I don’t watch the speech itself, it’s still the gravity of the occasion of its delivery that makes the words feel important when I read them later in the news reports.

It’s astonishing how different a lecture or a speech is from, say, a series of tweets, even if the content is the same.


Resolution

My fourth-grade science teacher, Mr. Watt, was the first person I’d ever heard talk about the scientific method. He told us that when a scientific question needed to be answered, the only reliable way to answer it was through firsthand observation carried out in a controlled manner — in other words, an experiment — whose results are recorded in a lab report.

A proper lab report, Mr. Watt told us, had five parts:

  1. Question: The question that the experiment is intended to answer
  2. Hypothesis: A statement of what the outcome of the experiment was expected to be
  3. Method: A description of how the experiment was conducted
  4. Results: The data generated by the experiment
  5. Conclusion: The answer to the initial question, based on the experiment’s results

This made sense, and seemed quite elegant, except for one annoying thing: the hypothesis. What good did it do to guess at what the results would be before the experiment was conducted? What possible bearing could my prediction have on the experiment’s conclusion? I found that I actually enjoyed writing lab reports, except for the part where I had to arbitrarily decide how I expected the experiment to turn out. That felt completely unscientific.

What brought this to mind, oddly enough, is the presidential debates. By now, everyone realizes that these “debates” aren’t debates at all, but — at best — merely joint press conferences. A real, formal debate has a single question to be decided, a series of well-structured arguments made by each side, and an opportunity for each side to rebut the other’s arguments. So far as I know, there hasn’t been a genuine debate between presidential candidates since Lincoln debated Douglas in 1858.

But thinking about debates reminded me of something I’d always wondered about: Why is the topic of a debate traditionally expressed in the form of a resolution? For example, rather than address the question “Should there be a nationwide requirement to wear masks for the duration of the COVID-19 pandemic?”, a formal debate would address the statement “Resolved [or “Be it resolved”] that there should be a nationwide requirement to wear masks for the duration of the COVID-19 pandemic.” This never made sense to me. Why is the matter considered to be resolved before the debate takes place? And if the resolution were expressed in the negative (“Resolved that there should not be a nationwide requirement…”), how would the debate be any different?

It only just occurred to me that this is the same question I had about the hypothesis in a lab experiment. In each case, why is it considered necessary to predict the outcome in advance?

And in thinking about it, I realized that Mr. Watt had it wrong. (Or, equally likely, I misunderstood what Mr. Watt was telling us. I was only in fourth grade, for heaven’s sake.) A hypothesis isn’t a prediction, or a guess, as to what the outcome of the experiment is likely to be. It’s simply a statement that the experiment is designed to prove or disprove. Similarly, the resolution that initiates a debate isn’t intended to represent the predicted outcome; it’s just a proposition that one side can argue in support of and the other can argue against. The reason this type of formulation is necessary is that unlike a question, to which one can always hedge an answer (“Well, it depends on how you look at it…”), a hypothesis follows an ironclad rule of logic: Either a statement is true or its negation is true; it’s impossible for both to be true at the same time.

As far as I know, this is also why courtroom trials are structured to decide whether a defendant is “guilty” or “not guilty,” rather than “guilty” or “innocent.” In the former case, a jury must decide between two mutually exclusive conditions. In the latter, it would be possible for a jury to decide that the defendant isn’t strictly guilty, but isn’t quite innocent either.

For my friends who are scientists or lawyers, this principle is most likely in the “Duh!” category, but for me, it was an eye-opening realization. Be it resolved, for me at least, that stating a hypothesis makes sense.


No Shit

A childhood friend of mine, in her comment on my first blog post, expressed surprise at my having written the word “bullshit,” since she had never heard me use such language back in the day. She’s right: Throughout my childhood and adolescence, my mouth remained scrupulously clean. It’s not as if I didn’t know the words. It’s just that — and I’m not entirely sure how to put this — I never felt like those words were mine to use. I’ve never easily become part of a group or a culture, and my stance has often been that of an anthropologist, observing others without sensing any relationship between their lives and my own.

As a child, I knew that other boys my age were playing sports, flipping baseball cards, and watching espionage dramas like “The Man from U.N.C.L.E.” and “I Spy.” In high school, I knew that fellow students were drinking beer, using drugs, and having sex. It’s not that I didn’t want to do those things, or even that I did want to do those things and had some reason not to. It simply never occurred to me that those activities had anything to do with me. They were just things that people did. Swearing fell into that category.

That sense of remove, by the way, extended beyond the world around me; it applied to literature as well. I devoured books from the time I learned to read, but each book took place within its own world, which I had no reason to believe had any relationship to my own. I remember, early on, reading a book in which two friends are sent by their families to military school, and are thrilled to discover that they’ve both been assigned to reside in Lockhammer Hall. “Okay,” I remember thinking, “this book takes place in a world where people live at their school and sleep in a hallway.” Having accepted that premise, I could go on with the story. It was not until my freshman year of college that I realized that there really were people who went to boarding school, that it wasn’t just something that happened in novels.

That’s why I found it particularly irritating when exams in high school English classes included a question along the lines of “What did you learn about life from reading this book?” The obvious answer was “I didn’t learn anything about life — the things that happened in the book were made up!” (I soon discovered that this wasn’t an acceptable answer, so I had to learn to, um, bullshit my way through.)

As I got older, although I never really developed the capacity to feel like I was part of a culture (or part of the world of a book), I did start learning to adopt other people’s customs — to engage in cultural appropriation, if you will. Thanks to some good friends in college, I learned to appreciate the joys of alcohol and marijuana, I was initiated into the world of sexual relationships, and I tentatively adopted a few mild swear words. I began to feel almost like a normal person.

Like the friend whose comment inspired this essay, people were sometimes surprised at the ways college had changed me. I vividly remember being home on a break, accidentally dropping some things that I was carrying, and exclaiming, “Oh, shit!” My mother emerged from the next room, her eyes wide and her jaw dropping.

“Thank God!” she said. That was not the reaction I had expected. “All these years,” she continued, “I’ve been watching what I say, being careful not to swear, because I didn’t want to corrupt you. Now that you’ve picked it up somewhere else, I can relax!” Language became much looser at family gatherings after that.

I have to admit, though, that I still have never become comfortable with the word “fuck.” (I cringe even to have written it here.) It’s such a versatile and expressive word, and I admire people who can wield it skillfully, but it’s just never felt natural coming out of my own mouth. Feel free to use it in my presence, though — I’m already as corrupted as I’m going to get.
