Fair Minded

One of the highlights of my childhood was my visit to the 1964–65 New York World’s Fair. There was the Sinclair exhibit with its life-size dinosaurs, and Ford’s Magic Skyway, where you could watch the entire history of the human race go by from the comfort of a self-driving Mustang convertible. There was the Illinois pavilion, where Disney’s audio-animatronic Abraham Lincoln miraculously stood up and gave a speech, and the IBM pavilion, where the audience was hydraulically lifted into a giant egg and dazzled by an immersive multimedia show. There was DuPont’s “Wonderful World of Chemistry,” in which live actors sang, danced, and interacted with filmed actors projected onto moving screens. And there were technological innovations that I’d never seen before: color TV, “Touch-Tone” phones with buttons instead of dials, and IBM Selectric typewriters, where the type element moved along a track while the carriage stood still.

As I got older, my wondrous memories of that fair led me to be interested in another exposition that had once been held on the same site: the 1939–40 World’s Fair. Unlike the later fair, which was a hodgepodge of futuristic architectural styles, the 1939 fair was a visual delight, featuring Art Deco graphics and clean Modernist architecture. It had an overarching theme — “The World of Tomorrow” — intended to lift the spirits of a population that had weathered the Great Depression and was looking ahead to a better and more prosperous world. Its most famous exhibit was General Motors’ Futurama, which displayed an imagined model city of 1960, with gleaming suburbs connected by a network of fast, efficient highways (a new idea at the time). I’ve watched films taken at the fair and seen exhibits of its relics, but I’ve always wished I could have experienced it in person.

One reason for my emotional attachment to that fair was that my mother had been there. I remembered her stories about the majestic size of the fair’s centerpiece, the Trylon and Perisphere; about seeing television for the first time; about being introduced to nylon stockings; and about trying out a new type of pen, the ballpoint, which didn’t have to be dipped in ink. Compared to those things, push-button phones and improved typewriters felt trivial.

A few years before her death, I told my mother about how I’d been influenced by her descriptions of the 1939 fair when I was growing up. I expected her to lapse into warm reminiscences, but instead she looked at me like I was crazy.

“What are you talking about?” she said. “I never went to that fair. I was five years old! Even if I had gone, I wouldn’t have paid attention to things like pens and nylon stockings.”

I quickly did the math, and was stunned to realize that she was indeed five years old in 1939. Evidently, she had never told the stories that I so clearly remembered her telling. I still have no idea where those false memories came from.

That conversation left me shaken. What other memories, what other explanatory stories, were pure inventions? So much of my sense of who I am comes from remembered events and conversations. How can I be sure that any of them are real?

The answer is that I can’t — especially now that all of the members of my immediate family are gone. Other than me, there are no surviving witnesses to my childhood. There is no objective reality about my formative years; there is only what’s in my head. The stories that form the basis for much of what I’ve written in my blog posts may be entirely fictional.

My only consolation is that if they are fiction, they’re pretty good fiction. I don’t think I have the skill to have made them up consciously. Perhaps I’m not a product of my past; I’m just a product of what my current brain thinks was my past. If so, that doesn’t stop me from drawing lessons from it.

Still, who would have thought that The World of Yesterday was as much a product of imagination as The World of Tomorrow?

Skipping and Jumping

As with pretty much everything else I do, I’m self-taught in computer coding, so figuring things out sometimes takes a while. The animated illustration in my post “Crossing a Line” looks simple, but writing the JavaScript that makes it work took three days. (Part of the difficulty was that it has randomness built into it, so that in the diagram — as in life — the action never repeats.)
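
If you’re curious what I mean by building randomness in, here’s a minimal sketch of the general idea (not the actual code from “Crossing a Line”; the canvas id and the motion rule are invented for illustration). Each frame nudges a moving point by a random amount, so no two runs of the animation are ever the same:

```javascript
// A dot drifts across a canvas, jittered randomly on every frame.
const canvas = document.getElementById('diagram'); // hypothetical element id
const ctx = canvas.getContext('2d');

let x = 0;
let y = canvas.height / 2;

function step() {
  // Fade the previous frame slightly so the dot leaves a trail.
  ctx.fillStyle = 'rgba(255, 255, 255, 0.1)';
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  // Move steadily rightward, with a random vertical nudge.
  x += 2;
  y += (Math.random() - 0.5) * 6;

  ctx.fillStyle = '#333';
  ctx.beginPath();
  ctx.arc(x, y, 3, 0, Math.PI * 2);
  ctx.fill();

  // When the dot crosses the right edge, restart at a random height.
  if (x > canvas.width) {
    x = 0;
    y = canvas.height * Math.random();
  }

  requestAnimationFrame(step);
}

requestAnimationFrame(step);
```

Because Math.random() is consulted on every frame, the path, like the action in the diagram, plays out differently every time the page loads.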

For the most part, I enjoy the challenge, but there are times when it’s immensely frustrating. There will be a block of code that’s relatively simple and absolutely ought to work, but doesn’t. I’ll stare at every character and say, “Yup, that’s right,” and retrace the logic in my head and say, “Uh-huh, that makes sense,” and yet the code just stares back at me. I could blame this on my own ineptitude, except that professional programmers tell me that they encounter the same thing.

What amplifies the frustration is that I know that eventually I will solve the problem, since I always have in the past. The answer is right in front of me; I just haven’t discovered it yet. At times like that, I often wish that I could just skip the useless hours and jump ahead to the time when the problem has been solved, so I can get back to doing productive work.

The idea of jumping ahead in time has always held a special interest for me, because it feels almost tangible: If I know that a particular moment in the future is going to happen, why can’t I just go there? We’re all accustomed to cuts in movies, where the time and place change in an instant, so I imagine it shouldn’t be too jarring for it to happen in real life. I don’t want to change the future; I just want to do some judicious editing.

But an interesting philosophical problem emerges when I ask myself what it would actually mean to jump ahead in time. The jump isn’t something that I could perceive while it’s happening, since it would be instantaneous. I’d only be aware of it once it’s happened. Therefore, “skipping ahead” is something that can be experienced only in memory: I’d remember sitting at my computer staring at a block of code that’s not working, and then seeing a moment later that the code has been rewritten (most likely in a stupidly obvious way) and is now working smoothly.

In that case, it seems like “skipping ahead” is an illusion that could actually be accomplished retroactively. If we imagine that there were some way to surgically operate on my memory so that the problem-solving hours could be removed, my post-operative experience would be indistinguishable from one in which time itself had somehow been edited. From my perspective, it would appear that I’d actually jumped a few hours into the future.

Of course, that raises the question of which person is me — the one who does all the frustrating work and then has his memory operated on, or the one who experiences a painless jump? Ideally, I’d want to identify with the latter me, the one who doesn’t even recognize that the former me (or at least a few difficult hours in the life of the former me) ever existed.

But there’s no reason why I shouldn’t equally identify with the former me, the one who actually did the work. That me has already spent the time fretting and experimenting and eventually solving the problem, so how would it benefit him to have that time surgically removed from his memory? He’s already at the point where his memory would resume, so why not just get on with further coding?

So it turns out that the operating-on-the-brain solution really is no different from the idea of physically jumping ahead in time (whatever that might mean). In either case, someone is going to go through the mental agitation that leads up to solving the problem, and either way, that person has to be me. Consider that fantasy dashed, then.

As a postscript, have I mentioned that I was a philosophy major in college? Engaging in philosophy requires conducting this sort of thought experiment all the time — going around in circles as you try to come up with an answer to a philosophical question. If I’m going to have this frustrating experience of working on a problem, I’d rather do it with computer code, because at least I have something concrete to show for it at the end.

People Say Things

A colleague of mine at Chabot College once asked me for a favor: He wouldn’t be able to attend the annual open meeting of the Faculty Association, at which the union officers would update us about their most recent negotiations with the college administration. Could I please attend the meeting, and then let him know afterwards what happened?

I saw him in the hallway late that afternoon, and he asked me, “So, what happened at the meeting?”

“Well,” I said, “Charlotte got up and said some things, and then she introduced Tom, who said some things. Then Dave said some things….”

“Wait a minute,” he said. “What did they say?”

“I don’t know,” I said. “Probably nothing important, or at least nothing significant enough for me to remember.”

He looked at me as if I were an imbecile. “I thought you were going to tell me what happened at the meeting!”

“But that is what happened at the meeting,” I said. “Charlotte said some things, and then Tom said some things….”

Needless to say, he never again asked me for a similar favor. But I learned something from that conversation — namely, that he and I had different definitions of the word “happened.”

In retrospect, I admit to having been in the wrong in that situation. But I think my mistake was understandable. People say things all the time, and hardly ever does the content of what they say matter more than the fact that they said it. Think of graduation ceremonies: Apart from the handing out of diplomas, the only thing that happens is that people make speeches. The school administrators make speeches, the valedictorian and salutatorian make speeches, and a VIP guest makes a speech. Looking back on the graduations you’ve attended, do you remember anything that any of those people said? Most likely you don’t, because what they said doesn’t matter. What matters is that oratory was delivered, preferably with an air of great significance. If nobody gave a speech, there would be no ceremony.

What is true of graduation ceremonies is true of much human interaction. Our society offers very few ways to connect with people, other than through conversation. As much as I might want to, I can’t reach out and physically touch you unless we already know each other well. I can’t gaze into your eyes or project telepathically into your mind. I can’t even sing in close harmony with you unless we both happen to be musicians who know the same songs. All I can do is talk with you, and the fact of our talking matters much more than whatever we happen to be talking about. I have been known to claim (admittedly with some hyperbole) that the true subject of any conversation is “I love you.”

There are exceptions, of course. There are plenty of interactions whose primary purpose is the transmission of information — getting directions, for example, or listening to a news report. (Clearly, the union meeting I attended should have fallen into this category.) And even in ordinary conversations, the meaning of the words has some importance. But — like the cat who ignores the fancy pet bed you bought in favor of the cardboard box that the bed came in — I find much more satisfaction in the vessel that contains the words than in the words themselves.

I’m not sure how much of this is universal, and how much is just me. I’ve long known that my brain is wired funny, and one symptom of the miswiring is difficulty with processing spoken language. If someone is talking, I can concentrate on parsing the words for meaning, or I can relax and experience the energy of the person who is speaking, but I can’t easily do both. As you can imagine, the more I like a person, the more I tend to savor the feeling of being in their presence — which means that I’m less likely to take in the literal meaning of what they’re saying. This often proves embarrassing later, when they assume that I’ll remember something significant that they told me, and I don’t.

But it can’t all be me. Think about the last time you went to a movie with someone, and how different that was from going to see a movie by yourself. You and your companion don’t converse during the movie — at least I hope you don’t — and yet simply having that person in the seat next to you changes the nature of your experience. That impalpable element, I believe, is what gives most conversations their flavor.

Writing about this makes me sad, because we’re in the midst of a pandemic in which most human contact is off-limits. Conversing by phone or screen feels empty, because information is the only thing those media can transmit. Even meeting in person falls short, because it’s hard to feel the visceral presence of someone who is masked and sitting six feet away. All we have to offer each other is words, and words are inherently unsatisfying. I long for the return of a time when meaningful things don’t just get said, but happen.
