Pleasure

I was about twelve years old, and I was having a conversation with my mother as she was driving me somewhere. I wish I could remember what we were talking about, but in any case, the subject of sex came up.

“It’s OK; I already know all about that,” I said. As indeed I should have, since my parents had a year earlier given me a pamphlet — it was probably called “Your Changing Body,” or something like that — that explained it all. Learning about heterosexual sex was a revelation about human anatomy, similar to when plate tectonics explained why the shapes of South America and Africa fit together so well.

Granted, there were some things in the pamphlet that remained mysterious. For example, it assured me that if I had the urge to rub or stroke my penis, doing so would be perfectly normal. It was unsurprising to be told that such behavior was harmless, since rubbing or stroking anything was unlikely to cause damage, but — really? Why in the world would I ever want to rub my penis?

Another perfectly normal event it described was a “wet dream,” in which my penis would stiffen and emit liquid in the middle of the night. Except for the stiffening part, this sounded an awful lot like wetting the bed, which was definitely not normal behavior — so I wondered why the stiffening part made it OK.

In any case, I readily absorbed all of the information in the pamphlet and assumed that it would all make sense when the time came. Which is why, when the subject of sex came up, I assured my mother that I already had it covered.

“No, you don’t,” she said. “There’s so much you don’t know!”

I looked at her quizzically. The biology of it seemed pretty straightforward: When the time came when they wanted to have a child, the man and woman — most likely consulting a set of instructions to make sure they did it right — would simply insert Tab A into Slot B, and nature would take care of the rest.

“Listen,” she said. “God wanted people to have lots of children, so he made sex feel good. Really, really good. Most of what people do in life is about getting a chance to have that good feeling.”

I’m pretty sure that the conversation ended there, as we’d reached our destination. But the unexpectedly emotional tone with which she’d imparted that information made it hit me especially strongly. Most of what people do in life…?! Her comments that day gave me a new and powerful way to look at and understand human nature.

I thought of this conversation years later, when I was working as a freelance writer of educational and training materials. A client was putting together an elementary-school curriculum about drug abuse, and wanted me to write the script for an introductory video.

There was no substance-abuse education when I was in elementary school, but there was when I was in high school, and it was awful. We had to memorize each commonly abused drug and its effects: Heroin use leads to insomnia, impaired coordination, and slowed breathing. LSD brings on paranoia and hallucinations. Methamphetamine causes anxiety and hypertension. By the time we got through the whole list, I thought, “Why would any sane person want to take these drugs?” Which I guess was the point.

At the same time, I knew that plenty of people do take these drugs, and they must have a reason. Clearly there must be a pleasurable aspect of the experience. The fact that nobody was telling us about the positive side meant that we were being fed propaganda, and that nothing we were being taught could be trusted. What’s the point of going to school if you’re going to be lied to?

I wanted to be honest with kids. My video script had a bunch of neighborhood kids talking about their experiences with and feelings about illegal drugs, including what made these drugs attractive. In the end, of course, they would conclude that the downsides of drug use outweighed the upsides. Hopefully, the audience would reach the same conclusion.

My client would have none of it. Any mention of the fact that abused drugs can make a person feel better — if only temporarily — was taboo. I ended up having to write the traditional gloomy script, which I knew that students would have no reason to pay attention to.

So my question is: Why did nobody tell us about pleasure? Why couldn’t the writers of pamphlets and textbooks admit that sex (or masturbation) can be pleasurable, or that recreational use of drugs can be pleasurable? Part of understanding how the world works is understanding people’s motivations. I’m not a parent, so I don’t know whether things are different now, but I can say unreservedly that once I understood why people make the choices they do, I was much better equipped to make my own informed choices.

Resolution

My fourth-grade science teacher, Mr. Watt, was the first person I’d ever heard talk about the scientific method. He told us that when a scientific question needed to be answered, the only reliable way to answer it was through firsthand observation carried out in a controlled manner — in other words, an experiment — whose results would then be recorded in a lab report.

A proper lab report, Mr. Watt told us, had five parts:

  1. Question: The question that the experiment is intended to answer
  2. Hypothesis: A statement of what the outcome of the experiment was expected to be
  3. Method: A description of how the experiment was conducted
  4. Results: The data generated by the experiment
  5. Conclusion: The answer to the initial question, based on the experiment’s results

This made sense, and seemed quite elegant, except for one annoying thing: the hypothesis. What good did it do to guess at what the results would be before the experiment was conducted? What possible bearing could my prediction have on the experiment’s conclusion? I found that I actually enjoyed writing lab reports, except for the part where I had to arbitrarily decide how I expected the experiment to turn out. That felt completely unscientific.

What brought this to mind, oddly enough, was the presidential debates. By now, everyone realizes that these “debates” aren’t debates at all, but — at best — merely joint press conferences. A real, formal debate has a single question to be decided, a series of well-structured arguments made by each side, and an opportunity for each side to rebut the other’s arguments. So far as I know, there hasn’t been a genuine debate between presidential candidates since Lincoln debated Douglas in 1858.

But thinking about debates reminded me of something I’d always wondered about: Why is the topic of a debate traditionally expressed in the form of a resolution? For example, rather than address the question “Should there be a nationwide requirement to wear masks for the duration of the COVID-19 pandemic?” a formal debate would address the statement “Resolved [or “Be it resolved”] that there should be a nationwide requirement to wear masks for the duration of the COVID-19 pandemic.” This never made sense to me. Why is the matter considered to be resolved before the debate takes place? And if the resolution were expressed in the negative (“Resolved that there should not be a nationwide requirement…”), how would the debate be any different?

It only just occurred to me that this is the same question I had about the hypothesis in a lab experiment. In each case, why is it considered necessary to predict the outcome in advance?

And in thinking about it, I realized that Mr. Watt had it wrong. (Or, equally likely, I misunderstood what Mr. Watt was telling us. I was only in fourth grade, for heaven’s sake.) A hypothesis isn’t a prediction, or a guess, as to what the outcome of the experiment is likely to be. It’s simply a statement that the experiment is designed to prove or disprove. Similarly, the resolution that initiates a debate isn’t intended to represent the predicted outcome; it’s just a proposition that one side can argue in support of and the other can argue against. The reason this type of formulation is necessary is that unlike a question, to which one can always hedge an answer (“Well, it depends on how you look at it…”), a hypothesis follows an ironclad rule of logic: Either a statement is true or its negation is true; it’s impossible for both to be true at the same time.

As far as I know, this is also why courtroom trials are structured to decide whether a defendant is “guilty” or “not guilty,” rather than “guilty” or “innocent.” In the former case, a jury must decide between two mutually exclusive conditions. In the latter, it would be possible for a jury to decide that the defendant isn’t strictly guilty, but isn’t quite innocent either.

For my friends who are scientists or lawyers, this principle is most likely in the “Duh!” category, but for me, it was an eye-opening realization. Be it resolved, for me at least, that stating a hypothesis makes sense.

Ars Gratia Occupatio

Oddly for someone whose work and hobbies always revolved around creative endeavors, I never thought much about art. I grew up drawing, painting, writing, and making music, but it wouldn’t have occurred to me to call any of those things “art”; they were basically ways to get approval and attention. In high school, college, and young adulthood, I was an actor, director, playwright, and mime, but I saw those as means of entertainment (for the audience) and emotional development (for me). In my twenty years as a freelancer, I did scriptwriting, graphic design, animation, and video, but that was just work I did to make a living. If you’d asked me what all these things had in common, I would have said that I was simply making use of skills that I was lucky enough to have.

That all changed in 2003, when I was hired by Chabot College to lead a new Digital Media program, teaching students how to use creative software such as Photoshop, Illustrator, Flash, and Dreamweaver. I became a full-time faculty member in a division that was known at the time as Fine Arts, and my colleagues were painters, sculptors, illustrators, and photographers. The visual arts faculty didn’t know what to make of me; they thought of me as “the computer guy.” (Of course, when I got to know people in the Computer Science department, they thought of me as an art guy.) Having had no professional training in either computers or the arts, I just made things up as I went along.

Toward the end of my second year at Chabot, it was announced that there would be a faculty art show in the division’s recently opened art gallery. Assuming that it had nothing to do with me, I paid no attention — until I received official word that as a member of the Fine Arts faculty, I was expected to participate. This threw me into a panic. “I’m not an artist!” I said. I didn’t know what I could possibly do that would be considered art.

“So, what is art?” I asked my friend, the art history professor.

“Generally, art is anything that’s made by an artist,” she said. We both agreed that wasn’t very helpful in my case.

The division dean gave me more practical advice. “Just do whatever you normally do, and call it art,” he said. So, since most of my recent career experience had been in video production, I made a video, which ended up being displayed on a computer monitor in the art gallery. People liked it. (In case you’re curious, it’s been preserved on YouTube, at https://youtu.be/Zrpje8NpdqE.)

Making the video was a strange experience, because every video I’d previously made had been an educational or training program for a paying client. This one was being made for no reason at all. Based on this experience, I formulated a functional definition for myself: Art is anything I make that has no practical purpose.

That definition has served me well over the years, as I’ve continued to make visual images and videos with no practical value. I still hesitate to call them art, though. Real art, I think, has an emotional impact — it makes you want to look at it, and then leaves you changed in some way afterward. I have no reason to believe, or even any way to know, whether the things I make have that effect or not. So for lack of a better term, I refer to them as “art projects.”

(I have to admit that I have an underlying wariness of people who call themselves artists. That seems a bit self-aggrandizing. I’m more comfortable when people describe the activities that they actually do: “I’m a painter” or “I’m a dancer” or “I’m a musician.” Then it can be left to other people to decide whether those paintings or dances or musical performances qualify as art.)

I’ve retired from my tenured faculty position at Chabot College. I still teach an occasional course there as an adjunct instructor, but I feel less and less comfortable doing so. I always thought of myself as teaching a set of skills that the students could apply in any way they wanted — they could use them to do work for employers or clients, for example, or they could make art. The person who took my place as head of the Digital Media program has a different view; she’s very insistent that “these are art classes.” If, as my art history professor friend said, art is something made by an artist, then I have much more to learn before I can teach. Or I can just emulate Miss Bliss, the preschool teacher in Richard Thompson’s comic strip “Cul de Sac.” As her four-year-old students begin to go wild with glitter and glue, she cautions them, “Remember, creativity plus neatness equals art.” That’s my favorite definition by far.

Professionalism

Sometime in 1985, I got a call from a friend. “I just got one of those new Macintosh computers,” he said.

“I played with one for a couple of hours,” I said. “They’re fun.”

“Well, I was thinking,” he said. “You know how they come with those different typefaces? I thought I might offer typesetting services to people, and make some extra money that way. You have a background in publishing — do you think that would work?”

I chuckled, trying not to sound patronizing. “I’m sorry,” I said, “but nobody’s going to accept Macintosh output as camera-ready repro. Those bitmapped typefaces are clunky and amateurish, and the resolution is way too low. Besides, there’s a lot more to typesetting than just typing words on a line. There are subtleties of leading, tracking, and kerning that no computer can handle by itself. You need years of experience to be a good typesetter.”

“Oh, well,” he said unhappily. “The type looks fine to me, but I guess you know what you’re talking about.”

In my defense, I should note that in 1985, the Macintosh was still basically a toy. The introduction of laser printers and PostScript fonts was still a year away. Typefaces on the Mac were designed to be printed out on the ImageWriter, a dot-matrix printer. They looked better than previous dot-matrix output, but definitely could never be mistaken for the clean, elegant type that we were accustomed to seeing in books and magazines.

I was astonished, therefore, when printed publications using Mac-generated type began arriving in the mail, no more than a month or two after I’d told my friend what a silly idea that was. I had clearly been wrong in thinking that the years of experience and the critical eye that professional compositors brought to their craft were something that people valued.

Cut to ten years later, when my wife and I had a successful business producing educational and training videos for businesses and nonprofits. I’d sent a proposal and a demo reel to a prospective client who’d seemed pretty interested in hiring us. The client responded by sending us a sample video he’d received from another producer. “I still like your work,” he said, “but this guy is offering to do the job for less than half of what you’re charging. How is that possible?”

“Your guy isn’t using professional equipment,” I said after viewing the video. “He shot this with a consumer camcorder with a built-in camera mic. The image isn’t as sharp and clear as it ought to be, and the audio isn’t clean. He shot it in natural room light instead of using studio-quality lighting. He seems to have done it himself instead of using a crew. If you’re satisfied with that level of quality, then go with him. I certainly can’t match his price.”

Once again, I assumed that the marks of professionalism were important, and once again, I was wrong. The client accepted the other producer’s bid. And at that point, I began to wonder whether my priorities were wrong. The fuzzy, Mac-generated type had communicated the same information that traditionally set type would have. And the video shot with the consumer camcorder was every bit as educational as what I was shooting with my professional crew and equipment.

I started finding ways to integrate desktop technology into my production workflow, and my clients apparently never noticed the loss of professional polish. Today, of course, the tremendously increased power of desktop computers and software, along with parallel advances in cameras and lighting, makes the quality of digital video so remarkably high that it’s hard to remember a time when anyone had to be concerned about a trade-off. But I’ve continued to wonder what other indicators of professionalism are ready to fall by the wayside. Since the beginning of the COVID-19 pandemic, much of what we watch and listen to on TV, radio, and podcasts has been produced at home, often with consumer-grade equipment, and audiences don’t seem to mind. Professionals of every stripe are allowing themselves to be seen onscreen in casual dress, with gray roots and grown-out hair, and the quality of their work clearly hasn’t suffered. Maybe it’s time, once again, to put our emphasis on the inherent value of what people do and to worry less about appearances.

No Shit

A childhood friend of mine, in her comment on my first blog post, expressed surprise at my having written the word “bullshit,” since she had never heard me use such language back in the day. She’s right: Throughout my childhood and adolescence, my mouth remained scrupulously clean. It’s not as if I didn’t know the words. It’s just that — and I’m not entirely sure how to put this — I never felt like those words were mine to use. I’ve never easily become part of a group or a culture, and my stance has often been that of an anthropologist, observing others without sensing any relationship between their lives and my own.

As a child, I knew that other boys my age were playing sports, flipping baseball cards, and watching espionage dramas like “The Man from U.N.C.L.E.” and “I Spy.” In high school, I knew that fellow students were drinking beer, using drugs, and having sex. It’s not that I didn’t want to do those things, or even that I did want to do those things and had some reason not to. It simply never occurred to me that those activities had anything to do with me. They were just things that people did. Swearing fell into that category.

That sense of remove, by the way, extended beyond the world around me; it applied to literature as well. I devoured books from the time I learned to read, but each book took place within its own world, which I had no reason to believe had any relationship to my own. I remember, early on, reading a book in which two friends are sent by their families to military school, and are thrilled to discover that they’ve both been assigned to reside in Lockhammer Hall. “Okay,” I remember thinking, “this book takes place in a world where people live at their school and sleep in a hallway.” Having accepted that premise, I could go on with the story. It was not until my freshman year of college that I realized that there really were people who went to boarding school, that it wasn’t just something that happened in novels.

That’s why I found it particularly irritating when exams in high school English classes included a question along the lines of “What did you learn about life from reading this book?” The obvious answer was “I didn’t learn anything about life — the things that happened in the book were made up!” (I soon discovered that this wasn’t an acceptable answer, so I had to learn to, um, bullshit my way through.)

As I got older, although I never really developed the capacity to feel like I was part of a culture (or part of the world of a book), I did start learning to adopt other people’s customs — to engage in cultural appropriation, if you will. Thanks to some good friends in college, I learned to appreciate the joys of alcohol and marijuana, I was initiated into the world of sexual relationships, and I tentatively adopted a few mild swear words. I began to feel almost like a normal person.

Like the friend whose comment inspired this essay, people were sometimes surprised at the ways college had changed me. I vividly remember being home on a break, accidentally dropping some things that I was carrying, and exclaiming, “Oh, shit!” My mother emerged from the next room, her eyes wide and her jaw dropping.

“Thank God!” she said. That was not the reaction I had expected. “All these years,” she continued, “I’ve been watching what I say, being careful not to swear, because I didn’t want to corrupt you. Now that you’ve picked it up somewhere else, I can relax!” Language became much looser at family gatherings after that.

I have to admit, though, that I still have never become comfortable with the word “fuck.” (I cringe even to have written it here.) It’s such a versatile and expressive word, and I admire people who can wield it skillfully, but it’s just never felt natural coming out of my own mouth. Feel free to use it in my presence, though — I’m already as corrupted as I’m going to get.
