Impossible

Many years ago, I met a woman on a train who told me that she worked in a plastics factory. When I asked her what the factory made, she said that one half of it manufactured the little plastic cups that come with NyQuil bottles, and the other half manufactured dolls’ eyes.[1]

Naturally, I’ve always remembered that conversation — who could forget the juxtaposition of NyQuil cups and dolls’ eyes? But I remember it also because it started me thinking about the sheer number of things that exist in the world, and where they must come from. Every product must be manufactured in a factory somewhere, and in many of those cases (as with NyQuil cups and dolls’ eyes), the parts that make up those products must be made somewhere else. Add to that the number of factories that must make the machinery that makes the parts in the aforementioned factories, and the factories that make the parts for those machines. In sum, for the world to have enough factories to turn out the hundreds of millions of different products that are used in homes and businesses is, quite plainly, impossible. And yet — as is hardly necessary to point out — the world does. Clearly, I’m not the best judge of what’s possible and what’s not.

I must confess that I have a long history of limited imagination. My first computer came with a rudimentary spell-checker that would go through a document and check it against a limited list of frequently misspelled words. I remember thinking that, given enough floppy disks, it was theoretically possible for the spell-checker to contain a full dictionary, possibly even with definitions. But of course I knew that would never happen, because who was going to type an entire unabridged dictionary into a computer?

A few years later, my friend Brad, an early tech visionary, showed me a model he had come up with for software that would enable numerous members of a team to work together on a project. While I admired the model’s elegance, I told him that it would never work in the real world, because human nature made it impossible for more than a couple of people to collaborate on anything.

My then-girlfriend Marcia, who at the time worked for the pioneering spreadsheet company Lotus, gave me a chance to try out one of the first-generation Macintosh computers. I concluded that its visually oriented, mouse-based interface made it an expensive toy that would never be useful for any serious work. Who would have the patience to keep looking for things to click on, instead of simply telling the computer what to do?

There’s an old myth that the laws of aerodynamics make it impossible for bees to fly, and yet they do so anyway. If that were true (which it isn’t), the lesson should be that there’s something wrong with the commonly accepted laws of aerodynamics — not that there’s something miraculous about the bee. But I was the kind of person who, when something I considered impossible actually came about, refused to accept that my conception of the universe might be wrong.

Surprisingly — or perhaps not surprisingly, given my tendency to learn kinesthetically — my first opportunity to see beyond that blinkered worldview came not in a computer lab, but at a rock-climbing gym. Although I’ve always resisted doing anything athletic, a friend named Bonnie, who was as smart and analytical as anyone I’ve ever met, convinced me that making my way up a climbing wall would be as much a mental challenge as a physical one. And sure enough, as I learned to navigate the route upward, I found that strategizing about where next to place my hand or my foot paid off each time I found a secure hold.

Except for the time when I didn’t. I was midway up the wall and I couldn’t find anything to grab or step onto that would allow me to move ahead. After long consideration, I had to admit defeat. “This is impossible,” I said to Bonnie.

“Don’t think; just go for it!” was her unexpected response. Given that Bonnie was gripping a rope that assured my safety, I had nothing to lose but my dignity, so I did something I could never have imagined doing: I leapt upward and trusted that my body would find some way to hold on. And it did!

I still don’t know how I was able to latch onto that wall, but it was suddenly clear that my belief that something was impossible didn’t make it so. That’s a lesson that has stayed with me ever since. The expression “leap of faith” holds a very concrete meaning for me.

Of course, I don’t know for sure that if I’d worked at Google when someone said, “Let’s map every street in the world and make those maps available for free on everybody’s phone,” I wouldn’t have used the I-word. But I like to think that some small piece of my imagination would have marveled at the possibility.


[1] Seeing how this looks when written down, I wouldn’t blame you for thinking that she was putting me on. But I swear that she was entirely earnest.


Because I Said So

Our fourth-grade teacher must have been friends with another fourth-grade teacher in another town. That’s the only explanation I can think of for why each of us was assigned a penpal in the other teacher’s class. My penpal was named Paul, and I remember nothing about him. I wrote to him because I was supposed to, and he wrote back to me because he was supposed to.

Toward the end of the school year, our teachers arranged a special treat: The penpals would get to meet! In preparation for the grand event, we were assigned to prepare a lunch box labeled with our penpal’s name, and to decorate it with complimentary adjectives starting with their first initial. Naturally, I hit the dictionary in order to come up with as long a list of “P” words as I could: patient, peaceful, perky, personable, perspicacious….

My teacher inspected the lunch box and told me that one of the adjectives — pathetic — would have to go.

“Why?” I said. “The dictionary says that pathetic means ‘deserving of pity.’ Why wouldn’t he deserve pity?” I imagined that if Paul broke his arm, I would say something like, “Poor Paul! It must really hurt,” to which my teacher would respond by snarling, “No! Don’t pity him! He doesn’t deserve it!”

I’d hate to be the teacher who had to explain to a fourth-grader the subtle difference between pity and compassion. Fortunately, I’m not that teacher — but then again, neither was my actual teacher, who engaged in her usual mode of problem-solving: “Don’t argue. Just get rid of the word pathetic.”

It’s understandable that in many situations, adults may lack the time, patience, or even the ability to explain sophisticated concepts to kids. That’s why every child eventually becomes resigned to hearing the all-purpose response, “Because I said so.” But I can still feel the sense of anger and helplessness that came from being deprived of an explanation.

When I was a third-grader, I won a contest by writing an essay about how great the American system of government was. (This was at a time when such sentiments were still expressed without irony or embarrassment.) I remember highlighting the idea that American citizens govern themselves by saying, “If the people want a road around Lake Whozit, the people get a road around Lake Whozit!” My prize for rhetorical gems like these was that I got to read the essay aloud to a school assembly.

Shortly before the public reading, my teacher told me that she needed to make a slight revision in the essay. It was in the section where I talked about the three branches of the federal government. I had written, “The Congress makes the laws, the Supreme Court makes sure the laws are constitutional, and the President carries out the laws.” She rearranged the sentences to say, “The Congress makes the laws and the President carries out the laws. The Supreme Court makes sure the laws are constitutional.”

That seemed like a crazy revision. First of all, her phrasing wasn’t nearly as elegant as mine. But more important, her rewrite seemed to say that the Supreme Court determines whether a law is constitutional after it has been carried out. To me, it was evident that the system couldn’t possibly work that way. Surely the president wouldn’t want to enforce a law before knowing whether it was constitutional. If things really worked in the backwards order that the teacher was suggesting, then it was a stupid system, and why would I want to boast about it in an essay?

When I told the teacher that she must be mistaken, she assured me that she wasn’t, and that I should read the essay as she had revised it. I did, but without nearly as much enthusiasm as I’d had for the line about Lake Whozit.

Nobody explained to me that judicial review is not spelled out in the Constitution, and that the Supreme Court rules on a law’s constitutionality only if someone challenges the law in court and the challenge works its way up through the appeals process. Again, I can understand why — that’s a pretty complicated thing to try to explain to an eight-year-old. But I was left with the embarrassment of having to read aloud an essay that I supposedly had written, but that I didn’t fully understand or stand behind.

I have no children of my own, and the classes I’ve taught have all been at the college level, so I’ve had fewer occasions than most adults to resort to saying “Because I said so.” I’ve still had to hear it, though — usually when I’ve asked a customer service representative why the company has done something unconscionable, and the representative has replied, “Because that’s our policy.” It still makes me as angry now as it did when I was a child.


Display Case

The Library Displays Handbook, published 30 years ago this month, was the first book to sport my name on the cover. I wrote the text, collaborated on the design, created the illustrations, and built a variety of large and small library displays that were reproduced in a color insert. (You can see a couple of samples at the bottom of this post.)

I did all this as a work-for-hire under contract to the H.W. Wilson Company — a publishing house that catered exclusively to libraries and librarians, best known in those days for the Readers’ Guide to Periodical Literature — so I never received any royalties, and have no idea how well the book sold. It’s apparently still available on Amazon, although it’s listed as “Temporarily out of stock,” and it’s accompanied by a single one-star review that says “Too old to be useful.”

For the most part, I can’t argue with that review. The section called “Computer-Generated Lettering” talks about the relative pros and cons of daisy-wheel, dot-matrix, and laser printers, and spends two dense pages expounding on the technical knowledge required to use a laser printer. (“Commands to a LaserJet must be expressed in the Hewlett-Packard Printer Control Language [HP PCL], while commands to a LaserWriter must be expressed in the PostScript Page Description Language [PostScript PDL].”) Later, the book notes that computer-generated text and graphics are generally limited to black and white, since any sort of color printer would be beyond the budget of most school or public libraries.

Other sections describe now-antiquated tools such as rub-on lettering, Kroy lettering machines, photomechanical transfers, and hot-wax machines for paste-up. Photocopying is described as a technology that only recently has become widespread and affordable. (“Self-service copy shops have become increasingly common in recent years; most communities have at least one, and some communities have dozens.”)

Surprisingly, however, much of the book is not dated at all. The whole first chapter introduces the elements and principles of design, which certainly haven’t changed since 1991. Many of the techniques and sample projects involve timeless tools and materials such as paper, scissors, paint, and glue. The Construction chapter demonstrates how to make a book stand out of a wire hanger, how to make a sturdy shelf out of cardboard, and how to make a concealed picture-hanger out of thumbtacks and cloth tape.

The only reason that I — a non-librarian — felt qualified to write such a book is that I’d spent my whole life practicing these sorts of techniques. By the time I was eight years old, I was able to cut letters and numbers freehand out of construction paper, or quickly make a hinged-lid box out of shirt cardboard. I routinely won poster contests in elementary and high school, and earned a nice supplemental income in college by making signs for the university’s food-service department. As a project director at a small educational publishing company, I often hired myself as a freelancer to create graphics and props for books and filmstrips. The library side of the project felt familiar as well; I’d spent much time in libraries (and had briefly dated a librarian), so I had a pretty good sense of what the handbook needed to cover.

But if so many of the techniques for hand-crafting library displays are timeless, why does the book feel — in the wise words of that Amazon reviewer — “too old to be useful”?

Part of it, I suppose, is how the nature of a public library has changed in 30 years. The handbook dates from a time when people still went to libraries to acquire printed books, which they found by thumbing through individually typed cards in the card catalog. They’d read newspaper articles on microfilm or microfiche, and magazine articles in actual issues of the magazines themselves, using hardbound indexes such as H.W. Wilson’s Readers’ Guide to find the topics they were looking for. In such a physical and tactile environment, posters and displays made out of cardboard, paint, glitter, and yarn didn’t feel out of place.

The nature of librarianship seems to have changed as well. The profession has become more specialized and technical — many library schools have rebranded themselves as schools of “information science” — to the point where creating hand-crafted displays would not meld easily with the responsibilities of the 21st-century librarian.

Mostly, though, children’s upbringing has changed. I’d guess that — like me — most people who grew up in the ’50s or ’60s had some experience making things out of paper, paint, and glue (not to mention pasta, popsicle sticks, and pipe cleaners). It was a common pastime at summer camps, at scout meetings, and in school art classes. Later, these materials remained familiar to us as adults, and we were able to make use of them when the occasion arose. Today’s young adults grew up in a largely digital environment, where the chief purpose of hands is to push a mouse and type on a keyboard.

Interestingly, in response to the increasing digitization of childhood, some elementary schools have begun to offer “maker spaces,” where students can use physical materials to create artistic or practical objects. Perhaps in the future, the Library Displays Handbook will be seen not as obsolete, but as ahead of its time.


Regrets, I’ve Had a Few

[Image: Detail from the third of the “Hunt of the Unicorn” tapestries, 1495–1505]

The question often comes up in late-night conversations with friends: Do you have any regrets about decisions you’ve made in your life? Even when taking into account the most unfortunate consequences I’ve faced for things I’ve done, my standard answer has always been, “No, since I don’t know what would have happened if I’d done otherwise. It might have been worse.”

That principle only goes so far, though. Although it allows me to feel comfortable with the general path my life has taken, it doesn’t eliminate the sting of tiny moments when, through selfishness or thoughtlessness or negligence, I’ve hurt someone else. How many times I’ve wished that I could hit the Stop button, rewind a bit, and redo the last few seconds!

My college roommate Jay once told me that I had “an overdeveloped sense of loss.” To the extent that’s true, it probably started on a spring day when I was a young child. My mother had never been a morning person — she was pretty much unapproachable until she’d had her coffee and put on her makeup. But on this one morning, she somehow woke up in a good mood. She emerged from her bedroom smiling, and remarked on how nice a day it was. She had a lightness that I’d never seen before.

I, meanwhile, had been privately working myself into a snit about some injustice I had suffered — something my sister had done to me, or some chore I’d been tasked with that I shouldn’t have had to do. Whatever the cause of my pique, seeing my mother so happy caused me to confront her with an aggrieved, whiny outburst. Her sunny aura vanished instantly, and she reverted to her usual morning grumpiness and irritation as she dealt with my complaint.

Immediately, I felt a huge wave of guilt and remorse. In killing her rare good mood, I felt like a murderer — like a hunter who had slaughtered a unicorn. But I was a child, not yet old enough to know how to back off and apologize, and so I continued to gripe and whine, even while seeing the damage I’d caused and knowing that it hadn’t been necessary.

I still grieve for that lost ray of sunshine. If “grieve” sounds like hyperbole, I have to assure you that it isn’t. Even though I’ve been able to forgive myself for the incident, the emotions associated with it are still fresh.

Another such moment occurred when I was in college. The feelings in this case are not as intense, but just as long-lasting.

It was a warm summer night, the kind in which the day’s oppressive humidity is relieved by a mild breeze, and clouds part to reveal the stars. Princeton has no classes during the summer, so the campus population was small: just a few grad students and those of us undergrads who had summer jobs. (I was working for the campus tour service.) There would be occasional evening activities at the graduate college, such as outdoor concerts and film showings. This was one of the latter — a showing of “The Treasure of the Sierra Madre,” a classic that I’d never seen and had long wanted to see.

I liked to arrive early to such events, to find a good seat and get settled in. I found myself sitting near a young woman whom I’d never seen before. She and I began to chat, and in the fresh embrace of the summer air, I immediately felt at ease. Think of how rare those occasions are when you meet somebody and instantly hit it off — no self-consciousness, no posing. This was one of those occasions: we felt each other’s warmth and delighted in each other’s openness. We didn’t talk once the film began to roll, but my enjoyment of it was acutely enhanced by having her nearby.

When the movie ended and we got up to go our separate ways, I wanted to tell her how much I enjoyed her company, and — perhaps — find a way to see her again. But how to find the words? “I’ve really enjoyed getting to know you, Carol,” I said.

“Carol?” she replied, her eyes narrowing. “My name is Susan.”

Actually, I don’t know whether her name was Susan; I just made that part up. That’s the point — I remembered her as Carol, but I had it wrong. And that was the end of our rapport; the door slammed shut. I watched her walk off into the night.

Who knows whether anything would have come of that chance meeting? Summer nights have strange effects on people, and we might not have fallen under the same spell if we had gotten together a second time. Maybe it’s best that the evening ended the way it did, with the pleasant memory of our brief time together.

But having called her by the wrong name in such a vulnerable atmosphere, I felt, and still feel, as if I had committed an act of violence. Not only had I insulted her, but I had negligently put a sudden end to a precious moment of connection. From such small acts come the greatest regrets.


Give Me a Break

Today is Giving Tuesday, at least according to the many emails I’ve been receiving from charitable organizations. The idea appears to be that we can compensate for the consumerist excesses of Black Friday and Cyber Monday by contributing to the public good on Tuesday.

The sentiment is admirable, but one thing about Giving Tuesday has always irked me. Black Friday got its name because stores promoted heavy discounts for the day after Thanksgiving, and so many shoppers turned out that the retailers’ balance sheets instantly went “into the black.” Cyber Monday came about because, after having their appetites whetted by Black Friday sales, shoppers continued to buy things online when they returned to the office on Monday. The following day, however, became Giving Tuesday only when some nonprofit organization declared it so. Instead of being descriptive of consumers’ actual behavior, Giving Tuesday is prescriptive — it’s a day when you’re supposed to give to worthy causes. And I’ve always been wary of any holiday that has “supposed to” embedded in its premise.

On Martin Luther King Jr. Day, you’re supposed to engage in volunteer activities. On Mother’s Day and Father’s Day, you’re supposed to express your gratitude to your respective parents. On Memorial Day, you’re supposed to honor the men and women who died in military service. Hell, even on Christmas, you’re supposed to be merry. (It wasn’t enough for Scrooge to give Bob Cratchit the day off from work; he was expected to be happy about it.)

In college anthropology classes, I learned the difference between two kinds of social norms: mores (pronounced “morays”) and folkways. Both are sets of rules that members of a society are expected to follow, but they differ in degree of significance. Violating a more — for example, by engaging in racist or sexist behavior — may result in serious sanctions or punishment. Violating a folkway — for example, by chewing with one’s mouth open — usually results only in chiding, shaming, or quiet disapproval.

The “supposed-tos” that are associated with holidays are folkways. I once attended a Memorial Day parade where a man was going from spectator to spectator, handing them cheap little paper poppies with wire stems in exchange for a contribution to the American Legion. When he came to me, I — having no need or desire for a fake poppy — politely said, “No, thank you.” He looked at me with undisguised hostility and said, “What do you mean, ‘No, thank you’?” He shook his box of contributed coins. “Disabled veterans!”

Clearly, what I had interpreted as an optional transaction was actually a folkway, and I was violating it. I dropped some coins in the box and accepted a poppy. I suppose that doing so constituted a charitable act, but there’s no joy in charitable giving when it’s done under coercion.

It would be nice if holidays could be relieved of their coercive aspect. Ideally, a holiday should constitute a reminder — People gave their lives for our country! Your parents sacrificed for your well-being! — and an opportunity to act on that reminder. But if I choose not to give on Giving Tuesday — because I didn’t buy anything on Black Friday or Cyber Monday, and because I make charitable contributions regularly — there should be no shame in that.

Oddly enough, this problem with holidays makes me think of the Pledge of Allegiance. Every morning, from the time I first started attending school, we students were expected to rise, face the American flag with our hands over our hearts, and recite the pledge. As a young child, I had no idea what “allegiance” meant (nor, for that matter, who the mysterious Richard Stans was), but I did it because that’s what we were required to do.

Of course, making the pledge was pointless from a practical point of view. First, a promise made under coercion is not an enforceable promise. Second, because I’m a US citizen, my allegiance to the United States is legally required whether I make the pledge or not. (If I’m found to be giving aid and comfort to an enemy, I’ll be considered a traitor either way.) Third, even if the pledge had some legal or moral significance, it wouldn’t have to be renewed every day — it would be presumed to remain in effect until I specifically renounced it.

Eventually those objections became moot. Around the time I entered high school, it came to light that the Supreme Court had, in 1943, ruled that schools could not require students to salute the flag. Naturally, then, I stopped reciting the pledge, as did almost all of my classmates. But I wondered why the Pledge of Allegiance was still considered meaningful enough that people chose to recite it voluntarily at public events.

What I came to realize is that people like to express their deepest convictions — whether it’s their loyalty to their country, their love of their mother, or their caring for people less fortunate — at a time when they can see other people doing the same thing. It gives us a sense of solidarity and community. Like publicly reciting the Pledge of Allegiance, holidays serve that purpose. We can — and hopefully do — offer our gratitude for life’s blessings many times throughout the year, but there’s something special about doing it on Thanksgiving. I suppose there could be something similarly special about giving to charity on Giving Tuesday.
