Video Effects

I learned about video production by spending several days a week at my local cable TV station. The studio was run by one paid professional, while the rest of the crew consisted of community volunteers like me. Most of our output consisted of community-access programs for which each of us got to play a variety of roles: camera, switcher, floor manager, chyron, audio mixer, even onscreen talent. (My stint there also led, indirectly, to my meeting my wife, but that’s a story for another time.)

Although most of our work took place in the studio, we were occasionally sent out into the field to shoot community events. One such event was an arts fair that was set up in the parking lot of a shopping center. For its part in the fair, some resourceful theater group thought to stage a two-person play that took place in a laundromat, in an actual laundromat. The actors were miked, and the audience stood outside to watch the action through the plate-glass storefront.

“Let’s get some footage of that play,” said the director (the guy who ran the studio).

I was operating the camera, which in that era was a bulky, shoulder-mounted unit connected to a large video recorder carried in a backpack. I uncomfortably shoved my way to the front of the crowd in order to get an unobstructed shot.

“No,” he said. “You have to get closer.”

“I can’t,” I said. “If I get any closer, I’ll interfere with the performance.”

“We’re TV,” he snapped. “That’s what we do!” He grabbed the camera and backpack and barged into the laundromat, getting close shots of the actors as they did their best to pretend that he wasn’t there.

That’s when I first became uncomfortable with video. Clearly, once a camera moved from the studio into the real world, it couldn’t help but alter the events it was recording — and sometimes, take precedence over the events it was recording. These misgivings stayed with me for the nearly twenty subsequent years that I spent as a video producer.

Since the videos I produced (with the collaboration of my aforementioned wife, Debra) were intended for education and training, we rarely had to document events in real time. Pretty much everything we shot was staged for the camera. Still, I couldn’t escape the feeling that we were dishonoring the people we were filming.

Because educational productions necessarily work on low budgets, we could rarely afford professional actors. All of the people we shot were playing versions of themselves. When we went into jails to shoot training videos for corrections personnel, we cast real inmates as inmates and real officers as officers. When we made a video for utility employees showing how the district responds to large-scale emergencies, we had actual workers and supervisors staffing a phony field operations center.

For the most part, nobody minded being put in front of the camera — particularly jail inmates, who relished the chance to get out of their daily routines. But I never felt good about asking a correctional officer to act out the process of disciplining an inmate, when that same officer and inmate have most likely had that interaction in real life. I was certainly not comfortable staging scenes of female employees being mistreated for use in a video about sexual harassment prevention, or people with disabilities encountering obstacles in a video explaining the Americans with Disabilities Act. The breaking point for me was when we were producing a public-awareness video for a homeless shelter, and I directed a real homeless family to act out their life on the street while we followed them with a camera.

All of these people were volunteers, and they knew what they were agreeing to do. The videos were intended for professional or public education, and therefore we could all rationalize that what we were doing was for a higher purpose. But that didn’t relieve me of the sense that I was demeaning real human beings by turning their lived experiences into fodder for the camera. That’s one of the reasons why I left the production field and went into teaching instead.

But let me end on a more positive note: There was one thing I loved about making these videos, which is that leading a video crew allowed me entrance into places where I never would have been otherwise. I got to put on a hard hat and orange vest and hang out with water-company workers in a ditch in the middle of a road. I got to spend time at a correctional boot camp, at a fiberglass factory in Kansas City, and behind the scenes at an advertising agency, a drug treatment center, and a credit-union bank. And on one memorable occasion, I got to stand on a rooftop and shout “Action!” to police officers down below who were about to stage a high-speed chase. I’d take experiences like those over sitting at a desk any day.


Growth Experience

The kittens at four weeks

My college roommate Jay, an Iowan, used to tell us that if you stand in a cornfield on a warm, quiet afternoon, you can hear the corn grow. I’m not sure how he knew this — he was from Council Bluffs, an honest-to-goodness city whose residents spend very little time in cornfields — but I have no reason to doubt him. A typical stalk of corn grows almost an inch per day, so it’s easy to imagine that those stiff, proliferating plant cells might make some sort of racket.

I think of that now because Debra and I — as we so often do — have been raising a litter of foster kittens. When they arrived from the city animal shelter, they were only a few days old. They looked like tiny mice, their eyes and ears still sealed, their legs wobbly. We had to bottle-feed them every three hours around the clock. We fretted about whether these fragile creatures would manage to survive without a mama cat to nurture them.

Well, it’s now two months later, and those mice have grown into healthy, affectionate, playful, and adorable kittens. They weigh more than five times what they did when they first arrived. Soon we’ll be returning them to the animal shelter, which will find a loving, permanent home for each of them. None of this is particularly remarkable; and yet I find myself wondering daily: How is this possible?

We used to watch these kittens latch onto a rubber nipple and suck down half a bottle of milk substitute without stopping for breath. Eventually they graduated to lapping up partly-solid gruel, and now they’re noisily feasting on juicy, brown pâté spooned straight out of a can. Somehow, those nondescript foodstuffs routinely turned into more kitten. Each kitten’s weight increased by half an ounce or an ounce a day; now the daily weight gain is sometimes as much as two ounces.

I realize that growth is a pretty universal function for living things; we’ve all experienced it ourselves. What makes it feel so miraculous in this case is that it happens so quickly — from a little mound of fluff that fits in the palm of the hand to a fully-formed animal, all in the space of a few weeks. Every time I look at or hold one of these kittens, I realize that it must be growing right now, right in front of me. Cells are multiplying, differentiating, turning into organs and limbs and fur, as I watch. You’d think that if I looked hard enough, just as if I were listening intently in a cornfield, I should be able to observe the process happening. And yet all I see is a kitten doing the normal things — eating, breathing, purring — with no detectable sign of the furious activity underneath.

When I was very young, I used to stare at the hour hand on our kitchen clock, trying to catch it in motion. It clearly had to be moving, since it was in a different position each time I returned to the clock, but it frustratingly always appeared to be standing still. My parents’ explanation that its movement was so slow as to be imperceptible by human eyes was something I refused to believe. It was as unappealing as the idea that the earth was too large for me to see its curvature.

We like to think that our senses allow us to see the world as it is, and perhaps they do — but only the small slice of the world that’s available to us. Just as there are entities that are too small or too large for us to see, there are events that are too fast or too slow, wavelengths that are too short or too long. All of these levels of being exist simultaneously, and there’s no reason to believe that our human perception of reality is any better or any “realer” than any of the others. And yet — for me, at least — the only things that feel true are the things I can experience firsthand. All of the rest is just concepts and abstractions.

Clearly I’m not the only one who has this problem. The world, as we now know, is gradually growing warmer, but at a rate that’s too slow for us to experience through our senses. We know that it’s happening, but not in a way that we can perceive directly, and therefore it’s easy and natural for us to dismiss it as not quite real. Just as we can’t see the cells of a kitten multiplying, we can’t see the molecules of carbon dioxide accumulating in the atmosphere. All we can see is what’s on a larger scale — wildfires, droughts, intense weather events — and ask ourselves, as when we see a kitten somehow getting larger: How is this possible?


The Melting Room

I wish I could remember how my family ended up in this nearly-empty restaurant one rainy night. Maybe we were coming back from a long car trip, and we were hungry and cranky, and my father pulled off the road at the first restaurant that looked family-friendly. The restaurant was attached to a shabby motel, and we had to walk through a hallway to get into it. Just before we turned the corner, I sensed something strange on my right. I glanced over and saw a plain wooden door with glued-on plastic letters that spelled out MELTING ROOM B.

I couldn’t concentrate on my meal. All I could think about was what could possibly be behind that door. What would a motel need to melt? And why would it need not one, but two or more rooms to do it in? Were the motel guests using or consuming something that came from the melting room? Was I? Did my school have melting rooms too, and I’d just never noticed them?

My questions became moot when, on our way out of the restaurant, I noticed another door with the same type of glued-on plastic letters. It said MEETING ROOM A.

As satisfying as it was to have the mystery solved, I was disappointed to have the melting room taken away from me. The thought of it had been tantalizing, as if something from a horror movie had suddenly appeared in the real world, and I was the hero who discovered that something isn’t right.

In fact, the melting room never really went away — it still exists in the parallel universe of my imagination. I wonder whether other people have a personal world populated by people and things that seem to have some kind of existence, even if not a physical one.

My parallel universe grew larger when I entered college. Each of us was given a book called the Freshman Herald, filled with head shots, hometowns, and birthdates of everyone in the incoming class. (This was back when “facebook” really meant a book with faces in it.) Like many other lonely first-year students, I leafed through the book often, looking at the people I might encounter and forming impressions of who they were. One face stuck with me in particular: a young woman with honest eyes and a kind smile, her head at an inviting tilt, her hair illuminated by sunlight falling through leaves. The fact of her being somewhere on campus made me feel all warm inside. I hoped that I would someday find out how it felt to be in her presence.

As it turned out, I didn’t meet her until two or three years later. She was a perfectly fine person — I liked her immediately — but she didn’t have the gentle aura and sincere smile of the girl in the picture. Her voice was wrong, her manner was wrong, her physical presence was wrong. Without the halo of sunlight, even her hair was wrong. I felt the same mix of emotions that I’d had on that night when I left the restaurant: relief that my curiosity was satisfied, but sadness that the person I’d been connecting with for so long was imaginary. In this case, it was more than sadness. It felt something like grief.

As with the melting room, though, I took solace in knowing that since the person in the photo had never really existed, she couldn’t be taken away. I still felt all warm inside when I looked at her picture. Even now, I think of her as two different people who happened to live in two different universes.

Living with this sort of cognitive dissonance becomes more complicated when morality enters the picture. Consider the case of Bill Cosby. I was seven years old when I first heard a recording of Cosby’s “Noah” routine, in which God informs Noah that he’s going to destroy the world, and that Noah therefore needs to build an ark. (Noah: “Right…. What’s an ark?”) I thought it was hilarious, especially given my already developing skepticism about religion.1 Although I never followed Cosby’s career very closely, I always enjoyed encountering him on TV — guest-hosting for Johnny Carson, repping Jello pudding, embodying the world’s most admirable father figure on “The Cosby Show” — and always felt like the world was a better place as a result of his being in it.

When it eventually came to light that he was, in fact, a despicable man with a long history of drugging and raping women, my warm feelings toward him naturally turned brutally cold. Yet the knowledge that Cosby was a monster can’t erase the many years in which I experienced him as a benign and genuinely funny presence. Is it OK that the benevolent Cosby still exists as a cherished inhabitant of one mental universe, while the vile Cosby casts his shadow over another? Can I still look back fondly on the brilliant early films of Woody Allen or the insightful comedy routines of Louis C.K. while acknowledging that those men never existed as I imagined them?

The universe where the melting room exists isn’t necessarily a better one. (The melting room itself remains pretty creepy.) And unlike the “real” universe, it will cease to exist when my life ends. But as long as I can still derive pleasure from visiting it, I have no wish to abandon it.


The Machine Age

I know that you’re not interested in hearing about my first computer — an IBM PCjr with a single floppy drive, a 4.77 MHz processor, and 128 KB (yes, that’s kilobytes) of memory. When an old guy like me talks about how rough he had it compared to kids today, you naturally want to tune him out and go back to your phone.

The thing is, though, I didn’t have it rough. I loved my PCjr. How was I to know that it would be totally obsolete in a couple of years? At the time I bought it, it changed my life. I no longer had to remember and process loads of information in my head — I could outsource it to a machine. I could write, edit, and type a finished draft, all at the same time. With the addition of a modem, I was able to communicate with people anywhere in the world, do research, and even buy things without having to leave my studio apartment.

Those of us who grew up before the 1980s tend not to think much about that decade. The 1970s had disco, energy crises, and hard-fought rights for women and gay people; the 1990s had hip-hop, the fall of the Soviet Union, and the World Wide Web. But what did the 1980s have, other than mixtapes and Ronald Reagan? Personally, I passed some important milestones during that decade: I quit my secure publishing job to go freelance; I met and married my wife; I moved with her from the east coast to the west. Putting aside those personal events, however, I think of the 1980s as the time when the technological environment that we now take for granted began to take shape.

In addition to the aforementioned computer, the 1980s brought my first phone-answering machine, my first cable TV, my first VCR, and my first microwave oven.2 For the first time, I was able to get money out of a machine anywhere in the world, instead of having to go to my local grocery store to cash a check. Cash itself was less necessary, as bank-affiliated credit cards had become ubiquitous. Thanks to the same network infrastructure that made ATMs possible, sellers were now able to approve credit card purchases instantly, without having to manually look up deadbeat card numbers in a printed booklet.

Deregulation of the aviation industry made flying affordable to people who were not well-to-do. In some cases, it was more than affordable — an upstart airline named People Express offered flights from New York to Boston or Washington, DC for $29 (sometimes discounted to $19), making it easy to visit distant friends. Flying People Express was an adventure. There were no tickets; passengers would stampede onto the plane until all the seats were taken, at which point the doors would close. Not until the plane was in flight would a flight attendant wheel a cart down the center aisle, taking each passenger’s credit card and imprinting it through several layers of carbon paper with a satisfying ka-chunk.

I would venture to say that before the 1980s, the average person’s lifestyle would have been comfortably recognizable to someone from the 1950s. By the end of the 1980s, it was entirely different. I remember the day when I suddenly grasped the possibilities of this new era. I was in a city — I don’t recall which one, because my frequent People Express flights have all blurred together — with a friend, and we decided to split up and meet later. But in those days before cell phones, how could either of us let the other know if we’d been delayed, or if we couldn’t find each other?

“I know!” I said. “We can use my phone-answering machine.” It had recently become easy and cheap to make long-distance calls from a pay phone, thanks to the emergence of new networks such as MCI and Sprint. If either of us had an urgent need to contact the other, we could call my machine back in New Jersey and record a message. And if either of us was concerned about the other’s whereabouts, we could call my machine to check whether a message had been left. This was not what answering machines were invented for, but their existence had nevertheless opened the door to something formerly impossible: two people making contact when they couldn’t find each other in a big city. What other previously unimaginable things would we soon be able to do?

Coda: Most of those life-changing innovations from the 1980s are either gone or going away. Nobody uses answering machines anymore. VCRs are a thing of the past, and cable TV is rapidly being eclipsed by streaming services. ATMs are far less necessary, due to the growing reliance on cashless transactions and the ability to deposit checks remotely. Long-distance phone networks such as AT&T, MCI, and Sprint have been supplanted by wireless carriers (although some of those familiar names remain). Even desktop computers — the descendants of my primitive PCjr — are fading in popularity, with many of their functions being taken over by smartphones, tablets, and wearables. Still, many aspects of the way we live now can be traced back to the big technological shift that began forty years ago.

Interestingly, the one thing that still exists virtually unchanged from its 1980s precursor is the microwave oven. It’s hard to remember what life was like before it became a standard kitchen appliance. Come to think of it, it’s not that hard, since our microwave oven recently broke down, and we were at a loss as to how to reheat leftovers. I poured my day-old Chinese food into a pot and impatiently stirred it over a gas flame. The others in the household ate theirs cold.


Idle Worship

“Last Embrace”: Marcia and me (far right) with our co-stars Roy Scheider and Janet Margolin

In the summer of 1978, the cast and crew of the film “Last Embrace” came to Princeton for a week of shooting on the university campus. I had already graduated, but was still living in town, so I was lucky enough to get cast as an extra (along with my girlfriend Marcia) in one nighttime scene. Marcia and I were directed to walk behind the stars of the movie, Roy Scheider and Janet Margolin, as they strolled along a walkway having a conversation. I don’t remember much about the shoot itself, but I do remember that there was lots of waiting around, that we got surprisingly good food, and that one attractive female student got invited to the afterparty at a nearby hotel while the rest of us didn’t.

Mostly, though, I remember getting my first closeup look at Roy Scheider. It’s not as if I was a particular fan of his; the only reason I was familiar with him was that (like everybody else) I had seen him in “Jaws.” Yet here he was, standing just feet away from me, brushing his hair with a pocket hairbrush just like the one I had. He looked exactly like Roy Scheider, except that now he was three-dimensional and breathing the same air I was breathing. The thrill of that moment is firmly etched in my memory.

Why should that be? Why do we always get a thrill when someone we’ve seen on the screen appears before us in person? For me, at least, it doesn’t even have to be someone who’s famous. I’ve been to film screenings where the subject of a documentary takes the stage for a Q & A session afterward, and even that person exudes a special aura.

It’s even stranger when places and things we’ve seen onscreen take on that air of specialness. I’ve never watched “Game of Thrones,” but when our tour bus in Northern Ireland stopped at a key location from the series, passengers went nuts taking photos. I’ve been known to stare reverentially at the Brocklebank Apartments in San Francisco, because that’s where Kim Novak’s character lived in “Vertigo.” Sometimes it gets just plain silly: I was once on a tour of Paramount Studios in Hollywood during the time the series “Monk” was in production, and our tour guide sneaked us onto the set. “Ohmigod,” I remember saying to myself as I walked breathlessly through Adrian Monk’s apartment. “There’s his refrigerator!”

Back when I was a boy in Hebrew school, I remember the teacher having to explain what “graven images” were, since the second commandment admonished us not to “bow down to them nor serve them.” She explained that people used to pray to stone figures as if they were gods. I found it impossible to believe that any sane human being would attribute divinity to a human-made artifact. Yet later on, we learned the story of how the Israelites worshipped a golden calf while Moses was away on Mount Sinai. Was idolatry such an irresistible impulse, I wondered, that even the people who had been freed from Egypt by a series of divine miracles felt driven to do it?

I can’t help thinking that the thrill we get in the presence of people and places we’ve seen on the screen is a modern-day outgrowth of the ancient need to worship something concrete — to treat ordinary entities as having some connection to a realm higher than our own. Rationally, we know that celebrities are just people, and that movie locations are just places, but there’s a strong, irrational part of us that wants to experience them as special.

In my 20s, when I developed close ties with a Quaker community, I was surprised to find out that the Quakers didn’t put much stock in holidays such as Christmas and Easter. Since God made every day, they reasoned, why should we treat any particular day as being more special than another? On Christmas Day, the sun rises and the wind blows just the same as on any other day. Any special attention we pay to a holiday is just us projecting our human egos onto nature, and Quakers don’t think too highly of the human ego.

I’ll grant that it’s possible to observe holidays without thinking of the days themselves as being innately special. They’re a time on the calendar when we can all agree to engage in rituals that we enjoy — rituals that retain their specialness because they only happen once a year. (Even the Quakers hold a special meeting for worship on Christmas Eve, though it’s no different from their normal Sunday meetings.) Similarly, it makes sense to experience an encounter with a celebrity as having a heightened atmosphere — simply because it’s so rare for something that only existed as a mental image to suddenly become concrete — without necessarily conferring a hallowed status onto celebrities themselves.

Still, there’s something a little crazy about the relationship we have with the movie or TV screen. Whether or not it’s odd that we get a thrill from encountering a person or place that we’d never seen in real life, think about the opposite: when something that’s totally familiar to us suddenly appears on film.

“Oh, wow!” I thought to myself when I finally watched “Last Embrace” in a theater. “That’s the Princeton campus! There’s the East Pyne archway! There’s Holder Courtyard!” Despite the fact that I’d spent nearly every day for four years on that very campus, there was this inexplicable excitement about seeing it on the movie screen. “That’s Ron Grayson!” I said when a familiar classmate was shown running out of a dorm, carrying a trombone. Somehow seeing him on film felt so much more emotionally charged than seeing him in person. So how do you explain that? Why did I run to the theater in 1993 to see a mediocre movie called “Made in America,” simply because parts of it were shot in my neighborhood in Oakland? Why do I feel a tingle whenever my high school friend Rob Bartlett shows up on the TV screen, even though I know perfectly well that he’s played recurring characters on several popular series? Why do I, many years after my graduation, still get excited when I see a film (“A Beautiful Mind,” “I.Q.,” “Across the Universe”) that was partially shot on the Princeton campus? This is behavior that we all take for granted, yet it’s entirely weird when you take a hard look at it.
