Saturday, March 29, 2008

Time, Again

Two days ago, my firstborn child turned 32; two days from now would have been my grandmother’s “eleventy-first” birthday (she was born in 1897 and died in 2001)—and although there won’t be wizard-designed fireworks and a big cake and party, I will probably celebrate by lifting a glass in her name (even though she herself was not a tippler). My son’s a year younger than Frodo was when Bilbo turned 111, and the dates aren’t exact, but I thought the parallel (for a couple of Tolkien fans) was close enough to mention—and he’ll undoubtedly have lifted a pint or two of the Gaffer’s best brew in honor of himself. The passage of time is, consequently, much on my mind, especially since spring has sprung and growth is becoming increasingly visible and tangible.

As if the mere season weren’t enough, however, one of my favorite comic strips, Rick Kirkman and Jerry Scott’s Baby Blues, is on a “Days are long/years are short” kick, reminding me every day how quickly children grow up. I’ve loved this strip for years because the mom has breastfed all three of her kids, and the nursing jokes alone are worth their weight in . . . well, you get the idea. As the strip’s kids grow, they frequently remind me of the trials and tribulations of raising siblings who don’t get on all that well, which often slips my mind now that mine live half a continent apart from each other and are great chums. The distance probably helps.

At any rate, unfinished business from my last post is still lurking, especially since it was originally inspired by an article in last Sunday’s Dallas Morning News: Carolyn Johnson’s piece on “The Joy of Boredom,” which I never got around to mentioning last time. As I’ve undoubtedly mentioned before, I’m fond of telling my students (when they tell me something I’ve asked them to read is “boring”) that boredom reflects a lack of imagination. I go on to boast, in my best preacherly tone, that I have never been bored in my life. It’s probably a lie, because I’m sure I’ve been faced with an interminable lecture or two, complete with bullet points, that I actually found boring. But at least I can’t ever remember being bored, and even when told to read something I wouldn’t otherwise have tackled, I always seemed to find some value in it (if only the knowledge that God wouldn’t assign me to the fires of hell for having disobeyed Sister Francisca). Nonetheless, my own students are difficult to convince, and sometimes I wish I were more like a Dominican. The fear of God was a prime motivator in my day, but the most evangelically oriented of my students don’t seem to be cowed by visions of hell, not even when they’re illuminating Canto V of Dante’s Inferno.

Johnson’s point is that boredom is not only inevitable, but “a primordial soup for some of life’s most quintessentially human moments”:

Jostled by a stranger’s cart in the express checkout line, thoughts of a loved one might come to mind. A long drive home after a frustrating day could force ruminations. A pang of homesickness at the start of a plane ride might put a journey in perspective.

She goes on to note that technology has come to the aid of the bored, offering them myriad ways to fill up those moments of ‘microboredom’ (she’s quoting Motorola here), putting folks out of potential misery by making sure they’re engaged (whether or not in anything worthwhile) at all times.

To me, time is the important element in this assessment: the frenetic filling-up of time that seems to preoccupy modern culture more and more, so that people feel compelled to be “doing” something in addition to what they’re already doing. (Who the hell really needs to tap out a text message while driving the bloody SUV through a school zone, anyway?)

Now, I will admit to having answered my cute little iPhone exactly three times while driving on the freeway. But even just picking it up and figuring out how to answer it (I’m not exactly adept with the thing yet) was distracting enough to make me slow down and become very deliberate about my actions, and to be very happy to hang up. But most of the time I try to call home and answer any missed calls before I get on the road, so that my drive time can be spent (there’s that pesky little time is money metaphor again) sorting through the day and—oh, yes: attending to the driving of the car.

Boredom, it would seem, is tied these days to how we think of time, and the “spending” of it. Our most common cliché reflects our attitude: it flies when we’re having fun, even though the Latin (from which it probably derives) implies regret. Tempus fugit (time flees) suggests that it escapes, gets away from us. It does seem to be the case that the older we get the more conscious we are of time’s passing, and the less anxious we are for it to get away from us. The experience of time is always relative—it often dilates when we’re engaged intensely in some activity, and contracts when we’re enjoying ourselves—but our consciousness about time might also increase as we proceed, kicking and screaming, toward its “end.”

I tend to rail at my students for using the phrase “since the beginning of time” because it’s meaningless (when exactly did time begin, anyway?), and its opposite is equally silly: “until the end of time” (that’s how long I’ll love you, say the songs). Human beings are babies in the chronology of the world, but we think we’re incredibly wise (whether through religious belief or scientific knowledge). And we treat time cavalierly, wasting it, spending it, losing it, gaining it, playing with our clocks, and yet pretending that it’s something terribly important and unknowable. We might do better to adopt a less linear view, as many other cultures have done, or notice more carefully the “events” on our natural calendars. A cyclical, recursive notion of time might forestall boredom entirely, allowing us to incorporate reflection into our daily lives rather than snatching at it only when we have nothing better to do, or when we have “time on our hands.” The link, by the way, leads to a lovely short film about Mt. Athos, in Greece, at Easter time.

According to the Maya, we’ve only got a few years left, anyway, since their current Long Count ends on my birthday in 2012. Wouldn’t you know—just when I’m due to retire and have more time to myself.

Photo: My father holding my eight-week-old son, in the chair my grandfather bought for my parents when I was born.

Thursday, March 20, 2008

Spring Time

If I had my druthers, we’d be on “daylight savings time” all the time—even though the notion of “saving daylight” is pretty silly. The very idea of mechanical time is problematic to me, since it always seems to interfere with “real” time: i.e. seasons, circadian rhythms, phenology, and other natural forms of “telling” time. But the idea that one can “save” daylight by moving the hands on mechanical clocks around (or resetting digits on a digital clock) seems absurd. It doesn’t really save anything, since the number of hours of sunlight doesn’t change—we just do a bad job of training ourselves to get up earlier for part of the year. It would be more tolerable if the clock switch took place regularly at the vernal and autumnal equinoxes (at which time we would at least acknowledge the fact that temporal changes have something to do with seasons), but when we decide to “spring ahead” and “fall back” now seems pretty arbitrary, and it hasn’t been very constant lately. So the switch tends to leave me fairly befuddled these days.

For some reason I can’t really explain, I’m fond of old clocks and timing devices, although I don’t even wear a watch. I find it amusing to check with my students on the timing of my lectures and usually get a laugh when I’m spot-on (I try to keep the talking and slide-showing down to 90 minutes at a time). Nevertheless, I own two lovely old clocks, one each from the maternal and paternal branches of my family. One, known as “Uncle Fred’s Clock,” is a Seth Thomas oak shelf or kitchen clock with a carved case and a pretty etched design on its glass front. It’s supposed to be an “Eight Day Half Hour Strike” job, but I’ve only ever gotten it to work for a few minutes at a time. The eponymous “Uncle Fred” was my great-granduncle, Fred Uhlmeyer, who settled in the Owens Valley in the nineteenth century, and who washed the ore from his mining claim at the spot now known on the U.S. Geological Survey as “Uhlmeyer Spring.” He’s mentioned on p. 142 of the 1975 edition of The Story of Inyo, by W. A. Chalfant, as having “squatted” in the Valley. There’s also a nod to him in More News From Nowhere, since most of the “action” (if you can call it that) takes place nearby in my future version of the area. [Thanks, by the way, to my Uncle Art—my Dad’s youngest brother and only surviving sibling—who helped me figure this out, and who’ll swear it’s true as long as I get it mostly right.]

At any rate, the clock looks great on one of the built-in bookshelves that flank the fireplace in our bungalow living room. It’s balanced on the other side by an onyx and glass Ansonia crystal regulator (of unknown name or vintage) that came from the Worden side of the family. I remember my grandmother’s winding it before bed each night, when she and her second husband (another Fred) lived in Coos Bay, Oregon. What I really love about this clock, however, is that although it works, it keeps crappy time. And even though it strikes on the hour and half-hour, it’s pretty arbitrary about how many times it “bongs”—and never any more than seven, no matter what time it is. I simply wind it occasionally, but don’t re-set it, so that it keeps its own peculiar time. There’s something reassuring about its complete arbitrariness, a reminder that we human beings are really not in control of anything, and certainly not of time.

Today, on the vernal equinox, the sun shone into my dining room window for the first time since dawn on the autumnal equinox last year. The single east-facing window in that room (the others face north) serves as a time-henge, marking the beginnings of two seasons. I’m not sure when I first noticed it, but I’ve made a point of noting the coincidence ever since; it provides a non-mechanical marker that delights us both. It takes the place of Easter and Passover in a non-religious household as a way of reminding us of “when” we are in the year. This morning my dear, sweet husband, who thinks I’m only a little nuts, gleefully woke me (and the three cats under which I was helplessly pinned) to announce the Arrival of Spring.

All this reminds me that despite my love for machines like clocks and orreries (those wonderful models of planetary motion like the one in The Dark Crystal, or Penn’s Rittenhouse model), I truly mourn the demise of the physical, sensual ways in which human beings used to tell time. When people had to chip a notch into a bone to mark the passing of a day, or a phase of the moon, it made us more conscious of our connection with the celestial movements that affect our lives directly in such phenomena as tides and growth cycles. There were once ceremonies to celebrate such momentous events as the onset of menses (from the Latin for “month”), and there should be one for reaching menopause—because both events signal the parameters of human fertility, and may once have made women seem magical (that is, before males discovered that they had a role to play and then decided that it was the major role). But the more we insulate ourselves from the natural world, taking drugs that alter our biological clocks, “saving daylight,” building temperature-controlled houses, flooding our cities with incessant light, growing monoculture lawns watered by automatic sprinklers, riding in cars to work and on stationary bikes in gyms for exercise (instead of walking to work and working in our gardens, feeding ourselves)—the more we move away from our physical knowledge of the world, the less we understand about it, and the less we will have to say about what becomes of it.

It occurs to me that one of the more subtle, yet ominous, effects of global warming is that in addition to causing all sorts of havoc (like this week’s drenching and unseasonal rain) it’s beginning to muck with phenological indications of seasonal change. I hadn’t realized that folks were actually taking note of the potential for problems until I went looking around the internet for links on the topic, and found one for the USA National Phenology Network, which “exists to facilitate collection and dissemination of phenological data to support global change research.” So it’s already clear that we’re not just gumming up the atmosphere and causing extinctions; we’re causing the signs of change themselves to change. What’s really going to be interesting is to see whether we’ll allow ourselves enough time to learn an entirely new language, especially since most of us seem to have forgotten, or never even learned, the existing one.

Life was somewhat simpler, and probably a bit sweeter, when we gauged time by the spawning of fish or the arrival of sun in a window—or even by the daily winding of Uncle Fred’s clock. But now that it’s officially Spring, I’ll comfort myself by keeping watch for the blooming of the wild gladioli, and get busy with the planting, already.

Monday, March 10, 2008

Going backward, looking forward

The utopian impulse seems to have occupied human consciousness at least since the Bronze Age, when biblical stories of a perfect world (the Garden of Eden) describe the past before human beings learned how to screw things up (in this case by seeking knowledge). The Greeks also imagined a long-gone golden age, and the idea developed further through the imaginations of Plato (Atlantis) and Francis Bacon (The New Atlantis) and myriad others—including William Morris and yours truly. But there’s another phenomenon that seems to spring from the same yearnings: a kind of short-sightedness about the more recent past that makes us look back at our childhoods or some previous historical moment as—if not actually perfect—better than now. In trying times, this seems to make some sense, and it might actually explain my own interest in utopian ideas.

But the idea of utopia seems to elicit both forward- and backward-looking responses. In her 1982 essay “A Non-Euclidean View of California as a Cold Place to Be,” Ursula K. Le Guin writes about the utopian impulse as fundamentally progressive:

I am not proposing a return to the Stone Age. My intent is not reactionary, nor even conservative, but simply subversive. It seems that the utopian imagination is trapped, like capitalism and industrialism and the human population, in a one-way future consisting only of growth (85).

In the same essay, Le Guin mentions that some people actually think of North Texas as a kind of utopia (specifically, Arlington—p. 88; it’s a story told by Prof. Kenneth Roemer, who teaches at UT Arlington), and it’s clearly the case that many American citizens think of the United States itself in utopian terms: the best of all possible worlds, if not still, then in the past. This has got me to thinking about the kind of historical blindness that seems to spring up when things aren’t going well, as they haven’t been in recent years, and ties in with Susan Jacoby’s article in the Sunday Dallas Morning News on the rise of anti-intellectualism in the US (“An incredibly incurious America”). It’s also related to an anonymously authored e-mail that’s been circulating (I’ve been sent copies by two people).


I hadn’t really noticed it before, but folks who live in north Texas suburbs actually seem to be pretty happy about doing so. After all, they’ve got nice big houses, clean streets, and shopping close by. Fast food is plentiful, uniform, and predictable, so there aren’t any surprises. Chances are no one’s going to run across anything unique, but hey—that’s why the suburbs are attractive: nothing to upset the equilibrium. Things may be getting a teensy bit unsettling with the spate of foreclosures and rising gas prices, but area ’burbs aren’t nearly as affected as the rest of the country, and, well, one can always downsize the Hummer to a standard SUV if gas gets to be uncomfortably expensive. So the notion of Arlington (or Frisco, or the west side of McKinney) as utopia doesn’t seem that far-fetched, if one envisions comfort and placidity as the ideal.

But many older folks (my generation and my parents’) seem to be looking back on an unlikely utopia: the years surrounding World War II, when (as the e-mail puts it), “they had just come out of a vicious depression. The country was steeled by the hardship of that depression but they still believed fervently in this country. They knew that the people had elected their leaders so it was the people’s duty to back those leaders.” Furthermore, “Often there were more casualties in one day in WWI than we have had in the entire Iraq war. But that did not matter. The people stuck with the President because it was their patriotic duty. Americans put aside their differences in WWII and worked together to win that war.”

Today, however, we’re “a cross between Sodom and Gomorra and the land of Oz” and are “subjected to a constant bombardment of pornography, perversion, and pornography” and “have legions of crack heads, dope pushers, and armed gangs roaming our streets.” (One does wonder where the author of this piece actually lives—since it’s clearly not around here!) Then was good; now is bad. As a reminder to those who bought into this message wholesale, I offer this list from Al Devito’s blog, Vineyard News: What a Difference 60 Years Makes.

Now, I’ll admit to giving my students the occasional speech in which I wax nostalgic about my rather rich education, imposed in part by the stern Dominican nuns who must have learned a thing or two from the Inquisition. But I also know that current problems in education are in large measure our own fault—the fault of my generation, for buying into “progress” for its own sake, and for embracing without question the technologies that we blame today for everything from short attention spans to sexual license. The kids, to be sure, whined endlessly, begging for the newest electronic gizmo, but most of us went along with it, upgrading from Atari to Commodore 64 to Xbox to whatever. They learned from this that whining is a winning strategy.

But nostalgia for the war years seems equally misplaced, as does the unquestioning acceptance of “patriotic duty” (the people of Germany, after all, felt it their patriotic duty to follow their elected chancellor and to ignore what the SS was doing to guarantee German sovereignty). Having lived in Japan shortly after that same war ended, I clearly remember weekly air raid drills that scared the bejeezis out of me, and duck-and-cover practice sessions that haunted my dreams for decades. During that same war, we also deprived thousands of American citizens of their rights, simply because of their ethnic origins, and transported them to “relocation” camps like the one outside of my home town, Manzanar, while also depending on many of them to help fight the war for us. We were not perfect then, any more than we are perfect today.

I see no reason to disagree with Susan Jacoby’s fundamental premise that Americans are becoming increasingly ignorant of history and decreasingly curious about anything that will help remedy that ignorance. But we could start by looking back honestly at what has happened, and using the knowledge we gain to help us build paths toward a more viable future. In her essay, Le Guin quotes Howard A. Norman (from his book on Cree folk tales) describing a kind of porcupine philosophy: “He goes backward, looks forward” in order to back safely into a rock crevice. She goes on to interpret this in terms of thinking about what lies ahead for us:

In order to speculate safely on an inhabitable future, perhaps we would do well to find a rock crevice and go backward. In order to find our roots, perhaps we should look for them where roots are usually found . . . With all our self-consciousness, we have very little sense of where we live, where we are right here right now. If we did, we wouldn’t muck it up the way we do (84-85).

We might, in other words, learn to get along, learn to be, by being conscious of where we are and where we’ve been. Ignorance of history is a dangerous indulgence, and lack of curiosity is even more deadly. The human imagination requires both historical knowledge and curiosity in order to thrive. We owe our kids the inspiration to learn from the past without holding it up as some glorious utopian moment, and we sure as hell have no place moaning helplessly about the present, when our generation helped to create it. Nevertheless, when we vote for a nebulous idea like “change,” we also need to understand exactly what we mean; the only way we can do that is to fully understand what we’re changing from.

Quoted material: Ursula K. Le Guin, “A Non-Euclidean View of California as a Cold Place to Be,” in Dancing at the Edge of the World: Thoughts on Words, Women, Places. New York: Harper Perennial, 1989. The link is to the 1997 edition from Grove Press.

Photo Credit: Cemetery shrine, Manzanar, Japanese internment camp, photo taken on 2002-03-24 by Daniel Mayer © 2002. Wikimedia Commons.

Tuesday, March 4, 2008

Happiness is Bliss?

I once helped run a campaign for student government president at UC Riverside on behalf of a guy named Larry Bliss. Clever person and half-assed artist that I am, I ripped off Charles Schulz’s dancing Snoopy and under it drew (in cute ’60s-style balloon letters) the slogan that I’m using to (sort-of) title this post. I don’t even remember if Bliss won the election, but I’m assured by the alumni magazine that he’s alive and well.

As usual, the urge-to-post is prompted by a confluence of newspaper articles and NPR broadcasts, together with latent notions brought to mind by snow and the realization that we’re not going to be able to escape north Texas after all, at least until summer.

I spent the entire day yesterday calculating a route, distances, and time for our proposed trip to California in a couple of weeks. I had made it only to Cathedral Gorge in eastern Nevada by suppertime, when my husband discovered that because of a mix-up in dates, we wouldn’t be able to go after all. It turns out that his spring break and mine do not overlap as we had originally thought, and he would miss two important tennis tournaments. So, instead of letting the players and the other coach down, we decided to postpone the trip, and I was plunged into a fit of melancholy that lasted only until snow started falling a couple of hours later.


This morning there is a six-inch-deep layer in some parts of the yard (actual snowdrifts!), and the puppies—who haven’t seen snow since they were infants—are ecstatic. That last time, on Valentine’s Day four years ago, coincided with my last trip to see my father before he died, so that memory is rather bittersweet. I thought of him when I looked out and saw the ancient metate that has been in our family for nearly a hundred years, now ensconced in my back yard and looking a bit like a caldera, with its rim covered with snow; he would have loved the picture I took with my new iPhone (yes, I know; another Luddite bites the dust).

As I settled into the Comfy Chair and began to read the paper, I came across mention of Eric Wilson’s new book, Against Happiness: In Praise of Melancholy. I’d heard a brief bit of it on All Things Considered last month, but (because I have yet to start carrying a note pad in the car) had forgotten about it. I also remembered hearing about Eric Weiner's travelogue, The Geography of Bliss: One Grump's Search for the Happiest Places in the World, on Weekend Edition, and although I've yet to read either book, I was pleased to note that the world may not be as full of happiness-addicts as I had feared. Wilson's book, especially, seems to have garnered some interesting responses, from a blog for Moms to Book Forum, so it's got my interest piqued. Now I'm going to have to read both of them.

I have long been intrigued by the attractions of melancholy, and by the improbability of a sovereign nation’s having been founded on some foggy idea of ensuring life, liberty, and the pursuit of happiness, none of which it could actually guarantee. Utopian literature often focuses on an equally fuzzy notion that everyone should be “happy,” and this is one of the stumbling blocks to writing about any kind of “ideal” society. I have also long been disturbed by the often vapid expressions of people who claim to have completed their lives by having found Jesus (or Buddha, or the Way), as if all the struggle involved in actually living (including its pains as well as its joys) were something to be done away with. Perhaps our collective reluctance to embrace the “down” along with the “up” is what has prompted pharmaceutical companies to keep inventing drugs to “cure” every minor ache and pain, as if to numb us from reality the way soma did in Brave New World. (As an aside, there actually is a drug called Soma, and the name looks familiar to those of us who have read Huxley’s quintessentially dystopian novel.)

If, as I also read this morning (in a review of the new Fox series, New Amsterdam), death is what makes life meaningful (the protagonist calls it “God’s joke”), surely the absence of happiness is what makes happiness mean anything at all. Philosophical happiness, in fact, seldom has anything to do with the feel-good sensibilities of self-help books. It’s about being intellectually fulfilled—recognizing the good, even if one can’t actually attain it (Plato). Rather than seeking to forget about death (The Ultimate Sad Thing), philosophers tend to find in it a reason for living. Martin Heidegger’s term Sein zum Tode (being-toward-death), for example, actually describes what it means to be human: the human being is the only one aware of its own mortality.

In my utopia, happiness does not describe a blissful state of existence. I’m not sure I’ve even used the word in the story. What my characters try to build is a world in which people can exist without all of the artificial angst-makers the modern world presents: pollution, inequality, terror, greed, environmental degradation, violence, racism, and the like. Their freedom lies in the ability to do what people need to do, and to live meaningful, creative, fulfilling lives that center on be-ing, and not on the denial of death.

Happiness is a slippery notion, and I welcome books that buck the trend of trying to guide people out of “negativity” by urging them not to be “cynical” or “pessimistic,” or by developing step-by-step programs that earn their authors untold wealth by fleecing people who’d rather not think. Thinking is difficult and frequently uncomfortable, but it keeps the mind alive. Certainty is unattainable (and probably not any more desirable than blissful happiness), but the search for understanding is boundlessly rewarding. All the “definitions” of happiness promoted by popular culture seem to suggest that one should choose to be a contented fool, rather than a discontented Socrates.

And so I go forth to wallow in my uncertainty (Clinton or Obama?), and to mull over my delayed escape from Texas. I can, however, assuage my melancholy by planning to spend the time re-glazing needy windows and planting my new garden. And I can go out and watch the puppies, contented fools that they are, and ignorant of death and destruction, while they romp in the snow—which is, sad to say, melting as I type.

Photos: Chinaberries laden with snow; my grandmother's metate.

Saturday, March 1, 2008

Of Comfy Chairs and Soft Cushions

I have many things for which I am grateful, not the least of which is the fact that I can spend my mornings tucked into a comfortable, if not beautiful, chair, with a couple of pillows to support my aging lumbar region. Here I read the paper, and then, if there’s time, I read another chapter in one of the books I keep on the stand next to me, under the stained glass lamp we got on sale not long after we moved in. The chair is set in front of the living room windows, three double-hung jobs, badly in need of re-glazing and painting, that face east: the best orientation for the house, according to my Asian-influenced upbringing. Artifacts from our stints in Japan and Taiwan lie around the room, and many of the aforementioned soft cushions are covered in fabrics from China—probably woven in some sweatshop or other for the export trade. I am always torn between my love for the colors and patterns of the East, and the ethical questions that arise from purchasing them so far away from their origins. I have no way of knowing how the people who made them are faring; so I can only hope that some portion of their cost makes it back to the weavers, fullers, and tailors responsible for their manufacture.

In fact, a good deal of the time I spend in my chair each morning is devoted to thinking about utopia—or reflecting on these dystopian times. The pile of books from which I glean ideas, and which provides the fodder for most of my blog posts, almost always focuses on what human beings have done to the world that makes it necessary for thoughtful people to wonder about the environmental and cultural impact of everything we buy, eat, wear, or otherwise consume. It’s not that I mind being mindful; it’s just that it would be better for us as a species, and better for the world as a whole, if we were able to “spend” more of that time living in the world rather than worrying about it.

Writing and thinking occupy almost all of my “off” (i.e. non-teaching/grading/prep) time during the winter. I often spend Saturdays ensconced in my chair with my laptop, working on the “Farm” or on my latest literary effort (a science fiction novel about an older woman on an archaeological adventure), at least until the sun has risen enough to suggest the possibility of working in other parts of the house. The study is, unfortunately, the coldest and darkest room in the house, even though it faces south, because of the deep eaves that shade the windows. These conditions make it pleasant in the summer, but in winter I often have to sit at my desk with a comforter around my knees and a shawl over my sweater. It’s a bit like writing in a garret in a tenement somewhere, but not terribly romantic. On sunny mornings, though, the Comfy Chair is the venue of choice, especially when there’s a fire going, and one of the puppies is napping on my feet.

Come spring, when the morning temperatures are above 60°F and the weather fine, we move into the garden for our morning read. This year, since we’ll be rearranging things a bit, we’ll be able to sit at a table, with the coffee carafe and the newspapers, and perhaps even the laptop (I’ve recently discovered that the wireless connection reaches out there), which should make for some pleasant writing-mornings. The seasonal migrations through and around the house are something I’ve become much more aware of since I began working on the “Farm” in the first place. As I was writing More News From Nowhere, I spent much of my time “living” there—thinking about alternatives to civilization. But I have since begun to think more about the quality of life in my immediate vicinity, and how what I imagined in the book could in any way be implemented in “real life” (or the “RW” as they call it on my forum).

One of the few ways to escape the constant, intrusive, nagging, and mounting problems in the world has always been to become a hermit, and I can certainly understand the impulse. If I did not find myself regularly in the company of young adults who are inheriting what those of my generation have bequeathed them, I’d be very tempted to simply enclose myself in my little domain and lose myself in the works of Morris and the other utopians I’ve spent the last twenty years reading, or in the ancient world, or some other place not here, not now. But the kids I teach—creative, funny, sardonic, frustrating, but sometimes quite wise and much more optimistic than I—draw me back into the world at least four days a week, and make me want to fix things for them. I can’t, of course. And I’m too old and too tired to do much other than help them know some of what I know, so that maybe they can do something to fix it for themselves. Maybe our job now is to help them keep wanting to figure out a way to keep the world going.

Meanwhile, I need to get out of the chair and into the garden. I picked up herb seedlings and lettuces at Whole Foods yesterday, and need to get them in the ground—or at least into pots where they can be protected if another freeze sneaks in on us. Planting a garden is always a sign of hope, so perhaps I’m actually more sanguine than I pretend. And I now have to go make entries in a new garden journal, which gives me something else to look forward to doing in the Comfy Chair—at least until the puppies start poking me with their soft noses and luring me out to get some actual work done. Perhaps we should have named them after Michael Palin and Terry Gilliam, in honor of their antics in the Python sketch that suggested the title of this post. After all, as long as we live in a world that can remember its past well enough to make fun of it, perhaps we really do have a future.