Monkeys, Apes, and Lunatics (1)

(Part one of two)

One morning when I was in the second grade, I was unexpectedly excused from class and introduced to a school psychologist, who asked me to walk with him to his office. I still have a vivid memory of that walk, during which the psychologist first made some small talk (to the extent that one can make small talk with a seven-year-old) and finally brought up the reason for our meeting.

“Who do you think is the smartest person in your class?” he asked.

“Me!” I said. (Modesty was a concept to which I’d not yet been introduced.)

The psychologist seemed to agree. He told me that there existed a special program for the school district’s most gifted students, and his task that day was to decide whether I would be placed in that program beginning in third grade. In his office, he interviewed me and administered an IQ test, which I breezed through. (I was always a good taker of standardized tests.) I was then returned to my second-grade classroom, where I resumed my role as an ordinary student while inwardly picturing myself as a ruler of the world.

The special program was called “More Able Learners,” usually referred to as MAL (pronounced as the initials M–A–L, not as the acronym “Mal,” although its association with words such as “maladjustment” and “malpractice” should have been a warning sign). It was still in an experimental stage, having been launched, if I remember correctly, just a couple of years before.

The Farmingdale, Long Island, school district already had a tracking system — an arrangement that’s increasingly controversial today, but was standard educational practice when I entered school in the 1960s. Students were placed in separate tracks, with different curriculums and often different teachers, according to their assessed level of academic potential. I don’t know how these tracks were identified in elementary schools, but by the time a New York student reached high school, we knew our assigned track by the label HR (for “Honor Regents”), R (for just plain “Regents”), or G (for “General Education”). Under that conventional system, I presumably would have been placed in the HR track.

The new MAL program — to which I was admitted shortly after my session with the psychologist — was different. Unlike the traditional system, in which students followed a track within their neighborhood school, all MAL students in a given grade were gathered into a single classroom in one designated school, regardless of where they lived. This same group of 20-odd students would remain together for the rest of our public-school careers, being taught advanced subjects by specially trained teachers, progressing from grade to grade in our own protected bubble.

From the time I met my MAL classmates at the start of third grade, we were continually reminded that we were special. Not only would we move through conventional academic subjects at an accelerated pace — for example, beginning foreign-language classes in third grade, algebra in sixth, biology in seventh — but we would also be exposed to subjects that other students were not, such as an intensive study of the “Great Books.” We were considered to be ideal guinea pigs for emerging instructional techniques such as programmed learning, in which specially devised workbooks (with the answers included) allowed individual students to master a subject at their own pace. Our teachers had an unusual degree of autonomy, and exercised a good deal of creativity, in determining how to use our time in the classroom.

MAL classes were the perfect environment for a curious and inventive student like me. Within the first two years, I contributed editorial cartoons to a class-produced newspaper, built and performed with a ventriloquist’s dummy whose head was made from a shoebox, wrote an award-winning civics essay, and composed a short piece of music that our music teacher later played on the piano. Taking full advantage of our proximity to New York City, our teachers took us on field trips to the Metropolitan Museum of Art and the Museum of Natural History, to a swanky French restaurant (where we, of course, ordered our lunch in French), and on a Circle Line boat tour around Manhattan. We discussed world events such as the move of Brazil’s capital to the custom-built city of Brasilia, the struggle against apartheid in South Africa, and the war between India and Pakistan over Kashmir.

But the academic and intellectual benefits of MAL came at a price. Because each of us was being bused to a school in another part of the district, we lost our connection with the other kids in our own neighborhoods, and we spent our weekdays in a building whose other occupants were strangers to us. In those situations when we had to mix with non-MAL students — in phys ed classes, at lunch, or at recess — relations were icy. A few socially adept classmates managed to make friends, but the rest of us lived as misfits. We were thought to be stuck-up, weak, nerdy — in other words, different, which among young students constitutes a social death sentence. The joke among the normally tracked students was that the initials M–A–L stood for “monkeys, apes, and lunatics.”

(To be continued in part 2)


Dear Me

An odd thing I’ve noticed, after consuming a steady diet of films from the 1930s, is that people in those movies often call each other “darling.” I’m not talking just about spouses or romantic couples. Grown children call their parents and grandparents “darling.” Siblings call each other “darling” (which, to me at least, feels kind of creepy). Even good friends can call each other “darling.” Outside of the romantic context, the “darling”-ing seems to confine itself to women. I don’t recall any instances of Army buddies calling each other “darling,” but I may just not have gotten to those movies yet.

I seem to remember my parents calling each other “darling” once or twice, but only in an ironic context. By now, “darling” has apparently died out entirely, replaced by “honey” or “sweetie.” (I’m not sure where “dear” fits into the chronology. Does anyone these days call anyone else “dear”?)

The whole culture of terms of endearment is mysterious to me, because I’ve never been able to use them. I’m not sure at what point in my life I’m supposed to have acquired the habit. Children don’t call each other “honey,” so what impels them to start saying it when they get older? And how do they decide which name to use? I’ve been called “honey” or “sweetie” by servers in diners, but I don’t think I’ve ever been called both by the same person.

For me, the only natural way to address someone is by the name they were introduced to me as. In nearly 35 years of marriage, the only thing I’ve ever called Debra is “Debra.” (She went through a “Debbie” phase in high school, so friends from that era are grandfathered in, but she made very clear when she met me that no one else is permitted to call her Debbie.) There are occasions when I’ll perceive that other people are addressing someone differently than I’ve been — for example, that everyone but me is calling Elizabeth “Liz” — in which case I’ll ask Elizabeth, “Would you rather be called ‘Liz’?” And if she says yes, then “Liz” is what she becomes. There’s no going back.

I genuinely can’t emulate the casualness with which most people seem to use pet names and nicknames. It’s not that I haven’t tried. Years ago, I noticed that my girlfriend Marcia was occasionally called “Marsh” by her family and friends, but only in the most informal circumstances — as in “Hey, Marsh, look at this!” I decided that I would try to do the same, and even told her of my intention. But execution proved to be difficult. Every time I was about to address her, I had to ask myself, “Which is more appropriate for this occasion — ‘Marcia’ or ‘Marsh’?” Whenever I concluded that “Marsh” was the way to go, it came out of my mouth sounding so stilted and rehearsed that I was already cringing by the time I got to “sh.” After a few such attempts, Marcia and I both agreed that I should stop.

What’s especially strange is that I have no such problem when it comes to animals. Debra’s and my first cat, Brook, was almost never addressed as “Brook.” I could call her “Brookie,” “Brookface,” “Brooklyn,” “Broccoli,” or pretty much anything else without the need to reflect in advance. When I meet a cat or dog on the street, I hear myself saying, “Hello, Cuteface!” or “Hey, Beautiful!” as I drop to my knees to administer pets. Maybe it’s because the furry creature has no idea what I’m saying, so it’s impossible to come out with anything inappropriate.

Speaking of animals, another thing I’ve noticed in old movies is that people’s pets never have names that traditionally belong to humans. Nick and Nora Charles’s dog is Asta, Dorothy Gale’s dog is Toto, Susan Vance’s leopard is Baby, Gillian Holroyd’s cat (in “Bell, Book and Candle”) is Pyewacket, Roy Rogers’s horse is Trigger, and Friendless’s cow, in Buster Keaton’s “Go West,” is Brown Eyes. (Tom Mix had a horse named Tony, but that’s a rare exception.)

I’m not sure when it became fashionable to give people’s names to pets — the first instance I can remember is in the early 1980s, when David Letterman talked about his dog, Bob — but it’s time for that trend to pass. I have had cats named Brook, Timmy, and Mary Beth, so I’m as guilty as anyone, but I’ve since decided that giving animals human names is lazy and no longer cute. At the time I write this, there are adoptable cats at Oakland Animal Services named Beanie, Lentil, Clover, Hiccup, Rascal, and Acorn. I hereby advocate for more imaginative names like those.


Twist and Shout

During the 1964 presidential campaign season, my father took me to an outdoor event where the Democratic candidate, Lyndon Johnson, was supposed to make a personal appearance. We milled around as various people made speeches, punctuated with announcements that Johnson would be arriving at any time now. I’m not sure that anyone was listening to the speeches; we had come only to see the man himself. After a couple of hours and no sign of Johnson, we drove home, tired and disappointed. This was my first experience of what I suppose would be called a political rally.

I felt bad for the people at the podium. Their job, I assume, was to get the crowd fired up, but their oratory just melded into an indistinguishable drone. Even if LBJ had made an appearance, it’s hard to imagine the spectators becoming a demonstrative mob. I’m guessing that most of the people there, like my father and me, were just normal, suburban Long Islanders who were curious to get a glimpse of a presidential candidate. We already knew that we were voting Democratic (at least those who, unlike me, were old enough to vote), and there was no need to get excited about it.

Ever since then, I’ve wondered about the crowds who we always hear cheering, shouting, and applauding at televised political events. In my experience, backing a candidate in an election is always a matter of choosing the best from an imperfect bunch and crossing your fingers that enough other people will make the same choice. It’s business. I have no more urge to root wildly for a politician than I do for the pumpkin that I select at a Halloween pumpkin patch.

It’s probably worth noting that I have an aversion to having my emotions toyed with. I recently heard Eddie Muller (the acknowledged maven of film noir) say on the Turner Classic Movies channel that he resents it when a movie tries to evoke tears. “Once I realize they’re manipulating me to cry,” he said, “I will not do it.” I would expand that sentiment to any sort of emotional manipulation. Nobody can tell me what to feel and how to express that feeling.

The one time I participated in a political demonstration was in the autumn of 2002, when the US — having already gone to war in Afghanistan — was preparing for an invasion of Iraq, all in retaliation for the September 11 attacks. The idea of responding to violence with more violence was abhorrent to me, and I was moved to attend a peace march organized by a group called Not In Our Name. I marched silently through the streets of San Francisco, hoping that my physical presence, along with the presence of thousands of other people, would influence our leaders in Washington to rethink their plans. (Spoiler alert: It didn’t.)

Having had no experience with this sort of event, I wasn’t prepared for the whole parade of us to end up gathered in a plaza, listening to overamplified speeches by people expressing their grievances — not only about the impending war, but about the sorry state of the world in general. I don’t know what the point of this was, since our participation in the march should have been evidence that we already felt the same way. Clearly, we were being baited; we were supposed to respond by vocally expressing our righteous anger. For some people, I guess that joining in the call-and-response chants was cathartic; for me, it was a sign that it was time to go home.

My discomfort with this sort of thing goes beyond the political. I’ve always been annoyed when the performers at a large concert delay their entrance until a warmup announcer has prodded the audience into explosive, but artificial, enthusiasm. “Are you ready to have a good time?” the announcer will say. My natural answer would be, “What do you think? Would I have paid this much money — not to mention those outrageous Ticketmaster fees — and come all of this way to be scanned by metal detectors and herded into a densely packed space, if my intention were not to have a good time?” Being a man of few words, I would be willing to condense all of that into a simple “Yes.” But instead, I find that the audience has been conditioned to respond with a roof-rattling cheer.

Of course, that’s not enough for the warmup announcer, whose eardrums have apparently been severely damaged by the effusiveness of previous crowds. “I can’t he-e-a-r you!” he’ll shout.

“I’m sorry about your hearing loss, but take my word for it; I’m definitely ready,” I’m prepared to say. But the rest of the audience responds with an even louder cheer, accompanied by foot stomps and whistles. Not until the sound level reaches a prescribed point on the decibel meter will the announcer deign to let the performance begin.

Let me tell you: I was on stage many times at one point in my life, and I never felt the need to have someone tell the audience how enthusiastically to applaud. I always subscribed to the idea that it’s my job, as the performer, to earn the applause. If the audience response is tepid, I know that it’s not because they’ve been insufficiently manipulated; it means that I haven’t been working hard enough. Even then, if they prefer to sit quietly and think to themselves, “I’m enjoying this,” I have no legitimate reason to object.


After a Fashion

The good shirts

The community synagogue was the center of my parents’ social life in the 1960s and ’70s, and at least once a month they would attend one of its fundraising events — casino nights, galas, auctions, and rummage sales. Unfamiliar with the word “fund,” I assumed that my parents were going to “fun raisers,” and I lamented that I wasn’t old enough to participate in raising the fun.

The event I (vicariously) enjoyed the most was the annual Journal Dinner, a formal-dress affair at which each participant received a copy of the “journal.” The journal was a magazine-sized publication in which members of the congregation could buy ads — a single line, an eighth of a page, a quarter of a page, and so on up the affluence scale, with the most expensive being a full-page ad printed on metallic paper. Each ad consisted of the name of the donor (or, more often, the donating couple) in bold type, optionally accompanied by “Best wishes,” “In honor of…,” or “In memory of….” That was it — the journal had no other content besides the ads, arranged in increasing order of extravagance. I loved leafing through the journal the morning after the dinner and discovering who had bought what. I was perpetually embarrassed that my parents could only afford an eighth-page ad, but I loved getting to the end of the journal where the rich people were gathered, and seeing each donor’s name tastefully emblazoned in the center of a gleaming silver or gold page.

Another synagogue fundraising event that my parents (or, more likely, just my mother) attended annually was the fashion show. Unlike the Journal Dinner — which I understood as a clever means for members of the congregation to flaunt their wealth by buying a bigger ad than their neighbors did — the idea of a fashion show made no sense to me. When I asked my mother to describe a fashion show, she said that it was an event where models dress up in fancy clothing and an audience pays to look at them. That can’t be right, I thought to myself. People pay to watch other people wear clothes? Don’t we do that every day for free?

Eventually I learned that there was such a thing as styles of clothing, that they changed periodically, and that people were always eager to find out what the new styles would be. Wearing the most fashionable clothes was somehow equivalent to buying a full-page ad in the journal. I remember my mother joking about how every few years, the fashion designers would proclaim, “Up with the hemlines!” and women would run out to buy shorter skirts and dresses; then, a few years later, the designers would proclaim, “Down with the hemlines!” and women would run out to buy longer ones. At first I thought that these proclamations were rules that women were somehow required to follow; it wasn’t until later that I realized that women (and men, with regard to the widths of neckties and lapels) were doing so voluntarily.

My response to this newfound understanding was to refuse to wear anything simply because it was in style. What was the point of buying something that was fashionable now, if it’s going to look silly in a couple of years? When bell-bottoms came along in the late 1960s, I insisted on continuing to wear traditional pants. Same with designer jeans in the ’70s. I remember my father treating me to an appointment at a then-novel “men’s salon” (as opposed to a plain old barber shop) in preparation for my sister’s Bat Mitzvah, and watching in shock as the hairdresser, after meticulously cutting and blow-drying my hair, went in with his fingers and deliberately mussed it up. “What are you doing?!” I said. He replied, “That’s the style now. You don’t want to appear too careful.” I would have none of it. If he was going to demand an outrageous amount of money to cut my hair, the least he could do was comb it neatly.

Because my aversion to following the dictates of fashion seems so commonsensical to me, I tend to forget that other people don’t share that attitude. When I was teaching digital-arts classes, students would often ask me to teach them to recreate a particular look or technique that they’d recently seen online. “Why do you want to do that?” I would say. “It’s just a fad.” It didn’t occur to me until later that if these students were going to be graphic designers, they were supposed to latch onto design trends.

Of course, I understand and appreciate that styles evolve over time. If they didn’t, the music world would never have had ragtime, jazz, rock, and rap, and the art world would never have had impressionism, expressionism, and modernism. One thing bothers me, though: In most realms, the emergence of a new style doesn’t necessitate the disappearance of an old one — the old and new can coexist. Symphony orchestras still play the music of Bach and Mozart alongside Thomas Adès and Caroline Shaw. The walls of art galleries still display figurative art along with abstract works. But when a new style of clothing comes along, nobody continues to make the old ones.

For years, I routinely wore long-sleeved cotton shirts with vertical stripes in dark, saturated colors. (See the photo at the top of this post.) They’re really the only shirts I feel comfortable in, and they used to be ubiquitous — I could easily pick up an armload of nice shirts from the tables at Costco. Then, about ten years ago, they disappeared. Dress shirts now are available only in shades of pink, purple, or blue, and in solids or checks — never stripes. The kind of shirts I like are so out of date that I can’t even find them in thrift shops anymore. To me, they seemed standard and timeless. If I’d known at the time that they were only a temporary fad, I would have hoarded a lifetime supply.

If anyone sees shirts like these presented as a new look in a fashion show, please let me know.


Say Uncle

A book review in the current issue of The Atlantic mentions that the Great Depression “shrunk international trade by two-thirds from 1929 to 1932.” This probably would not have made much of an impression on me, except that I’d recently seen an article in the Financial Times about the discovery of Ernest Shackleton’s wrecked ship, which noted that the ship was found “roughly four nautical miles from the position originally recorded by Shackleton’s crew before it sunk in November 1915.”

Shrunk? Sunk? Whatever happened to shrank and sank?

I may be a bit sensitive on this subject, since I was once scolded by a teacher for using snuck in a sentence. Today, if online authorities are to be believed, snuck is considered as acceptable as, and perhaps even preferable to, sneaked. Part of me wants to track down that teacher and get her to apologize.

I understand that language constantly changes, but I find the shift from sank to sunk and shrank to shrunk — not to mention stank to stunk — especially puzzling. Why the vowel change?

Could it be because the “u” sound in sunk is easier to say than the “a” sound in sank? Pronouncing that “a” vowel actually does require more muscular effort — try it yourself — but that seems like an unlikely explanation, since we still say “I sang” rather than “I sung” and “I drank” rather than “I drunk,” and we all still have ankles instead of uncles. (Well, most of us have ankles and uncles, but you get the point.)

My sense is that saying “it sank” or “it shrank” sounds prissy and affected, kind of like saying “It is I” rather than “It’s me.” If I were the sort of person who described things by analogizing them to unpleasant odors, I’d probably feel much more comfortable saying “It stunk” rather than “It stank.” Still, that doesn’t account for where that air of affectation came from, and why we’d say “The beer she drank stunk” rather than “The beer she drunk stunk.”

Of course, we’re dealing here with a language in which the past tense of blink is blinked and the past tense of think is thought — in other words, a language in which nothing makes any sense. In such an environment, having the past tense of drink be drank while the past tense of slink is slunk is hardly worth remarking on.

I think that what’s really bothering me is the simple fact of being old. Where I might have to wrestle with whether shrank or shrunk is more acceptable to my readership, the copy editor at The Atlantic (who, I’m guessing, is much younger than I am) has probably never heard of shrank, and therefore has no problem.

There was a time, around 400 years ago, when a writer using the second-person singular pronoun might have to decide whether to stay with thou or go with the newer, hipper you. People at the time had strong feelings about the issue, but now it’s something we no longer have to think about, since you won out handily. I’m guessing that in another couple of generations, language questions that we’re fretting about — such as whether it’s OK to use they as a singular pronoun — will be similarly settled, and nobody will give a sentence like “They looked at themself in the mirror” a second thought.

In the meantime, living through transitions is disturbing. I’m irritated when I see “The economy shrunk,” and even more so when I see “They looked at themself.” Unlike the vowel shift from “a” to “u,” the change in the use of they is one I totally understand and support — but that doesn’t make me any less upset when I hear it spoken or see it in writing. Emotional reactions are emotional reactions, and unlike language, they don’t tend to change easily.
