
Wednesday, January 1, 2020

(Yet Another) Meditation on Time and Memory


Time is relentless, the tide which measures
the perturbations of the cosmos.

Once again I've come upon a story, serendipitously, just when I was musing about how quickly things seem to be happening these days. The link is to a short story written in 2010 by SF and fantasy writer Jay Lake, who died in 2014. In it he considers the possibility that if "human thoughts moved with the pace of bristlecone pines, we would never have invented the waterwheel, because rivers flash like steam in that frame of reference. Likewise if we were mayflies—flowing water would be glacial." The slow pace of geological time, we are reminded by Carl Sagan and others, renders the whole of human existence but a mote in time. The bristlecone pines, I should note, grow not far up the White/Inyo Mountains from Uhlmeyer Spring, where my banner shot was taken. They are about 5,000 years old.


A good friend and former colleague came to town a few days ago, and we got together for the first time in a year to catch up and to celebrate each other's birthdays and the holidays. We usually manage to do this twice a year, but he's been busy with his new life in his home town, and The Beloved Spouse (TBS) and I have been preoccupied with dog-rearing and a bit of house renovation, so this was the first chance we'd had to reconnect since last December.

The time issue first came up in conversation in regard to retirement life, when neither TBS nor I could remember how long we'd been voluntarily out of work. Later, when I was trying to remember what had been occupying me during the last three years of not being a wage slave to a proprietary educational institution, I came to marvel at the number of things I used to accomplish while I was teaching five four-hour classes per week, eleven weeks per quarter, four quarters per year.

Not only was I wrangling the class load (with its incessant grading, prep, faculty development, course and syllabus updating, and the hour-or-so drive to and from campus), but I was maintaining a massive course website (Owldroppings, now retired) with extensive original content for each of my courses (Art History I and II, Writing I or II, Intro to Humanities, and an upper-level elective), and at one time maintaining four blogs--including this one. Only one of the others is still more or less active (Owl's Cabinet of Wonders), but for a year or two I really was trying to post on the others as well, and on all of them much more regularly than I do on this one. I was also pretty active on a fanboy website devoted to Joss Whedon's film Serenity (and indirectly to the TV series it was based on, Firefly), which meant several hours of online conversations per week. During some of that time, I was still writing letters home to my Dad, for which I'm glad I found the time, because, thanks to e-mail, I still have most of that correspondence.

In addition to all that, I successfully completed several online courses through Coursera and other venues, during a fit of MOOC-ing: Archaeology's Dirty Little Secrets (Brown U); Ancient Egypt: A History in Six Objects (University of Manchester); Japanese Culture Through Rare Books (Keio U); Live! A History of Art for Artists, Animators, and Gamers (Cal Arts); Philosophy and the Sciences (Edinburgh); Photography: A Victorian Sensation (Edinburgh); Sagas and Space (Zurich); Ideas from the History of Graphic Design (Cal Arts). I didn't quite make it through Imagining Other Earths (Princeton) or Introduction to Sustainability (UIUC) or the image-making course from Cal Arts. Not finishing those three suggests that I ran into time constraints somewhere along the way.

Somehow, during the more-than-fifteen years we were both teaching after we moved into our labor-intensive house, we also managed to keep a garden, herd several cats and a couple of dogs, make an occasional trip out to visit the Auld Sod in California, and do things to keep the house livable.

Nowadays it seems impossible to me that I could have done all that. The only online activity I've got going on at the moment is Quora, and I'm writing less and reading more on that. I no longer think that there's a whole lot more I can offer to the body of Quora information on breast-feeding, child-rearing, Star Trek, science fiction, nineteenth-century American literature, and cookery after mouthing off on all of this since 2011. I'm pretty sure that very few people really care what I have to say about any of it, although I do get occasional upvotes. The most "popular" answer I've ever written was on whether Sam Clemens could really have met up with the Enterprise crew as depicted in the Star Trek: The Next Generation episode "Time's Arrow," but even that drew few comments--so for me Quora is more a means of keeping my brain working, and a platform for food philosophy, than anything really interactive.

As those few who still come by The Farm these days know, I don't show up often, although I frequently think of stuff that might make good content. I'm now thinking of retiring Owl's Cabinet, since much of what I write there would work quite well here, and I'm even more neglectful of that blog than I am of this one.

Things could change a bit here, though. The Beloved Spouse and I are currently engaged in a bit of kitchen renovation that might provide good blog fodder, and my latest archival project will almost certainly find its way onto these pages. I'm transcribing nearly two years' worth of letters my (maternal) grandfather wrote to my grandmother from France during WWI. Not only that, but we're planning a fact-finding, visit-neglected-family trip out west as early in the spring as we can make it. The noisier this little "farm" gets (from increasing local highway racket and neighborhood construction), the more we long for acreage and quietude, so our trip west this year will wind northerly through Oregon and Washington, where we can see my son and his wife and my father's brother and his family. Likely new home venues will be part of the itinerary. After that, we'll wander south to try to catch up with my late brother's kids in Nevada before heading to the Owens Valley for some boondocking in Porco Rosso (now to be towed by Totoro the Gladiator) and dog/cat adventuring.

Other plans for the new year include a quick trip down to Padre Island to see how Porco does on the road behind the new truck, and to re-acclimate the animals to RV life before the longer trip. I've also got ideas for transforming Owl's Farm: The Website (formerly known as Owldroppings) into a sort of lifelong-learning resource. One possibility is an Eltern-garten (an Old Folks' equivalent of a kindergarten); another is a modification of the old course site, with updates in the fields I covered. A good deal of research and archaeological discovery has taken place in the last five years or so that might make it worthwhile to revisit some of my old fields of expertise.

In the last couple of years I've also been fairly faithful at keeping a reading journal, and a little less faithful at keeping a design/sketchbook.

Which brings me to the topic of memory. As we all know, our memories begin to transmogrify as we age, and mine is doing so somewhat predictably. I don't have too much trouble with the big events, but short-term memory is pretty iffy, and that's one reason for my fidelity to the reading journal. This is, of course, a familiar pattern to many folks my age--as is my preoccupation with firming up family history and uncovering mysteries that no one has yet solved. So I'll continue trying to complete family trees and compile historical events drawn from old letters and documents left to me by my grandmother(s). I'm calling this latter effort "epistolary archaeology," and I think that my grandfather would be especially happy with what I'm doing, since his expressed wish in his letters was to use them in lieu of keeping a diary, so that he (and presumably his offspring) would have access to his war memories. I expect that these will also someday find their way into the appropriate archive.

Is it really 2020? I can remember when 2000 still sounded like a science fiction date, and I wasn't sure (in 1995) I'd make it that far into the future. My grandfather's letters were written in 1918 and 1919 (I was reading them a century after they were written!). He didn't die until 1973, and I do wish I had known about the letters earlier, so I could have asked for more of his story. But they were in my mother's keeping, and she didn't die until 1999, and I didn't even see them until 2000.

I can't say that I'm terribly sanguine about our collective future, but I do hold out some hope that during the next decade we'll become smarter, more thoughtful, and kinder. As Virgil noted rather famously in his Third Georgic, "Sed fugit interea, fugit inreparabile tempus." What we most often see shortened to "tempus fugit" ("time flies") more accurately points out that time is also irretrievable--not unlike the tides of time Jay Lake so aptly notes in his story, where he also observes that "we are all time travelers, moving forward at a speed of one second per second. The secret to time travel was that everyone already does it."

The pace suits me. It gives me time to think, to imagine alternatives, to read what others have imagined, and to be grateful that we don't yet have household robots or flying cars. For some reason I always enjoy looking toward the future, appreciating the past, and being happy about whatever time I have left. There always is, it seems, hope.

Happy new year, Folks. Live long and prosper.

Image credit: Graham, Joseph, Newman, William, and Stacy, John, 2008, The geologic time spiral—A path to the past (ver. 1.1): U.S. Geological Survey General Information Product 58, poster, 1 sheet. Available online at http://pubs.usgs.gov/gip/2008/58/ via Wikimedia Commons.


Saturday, January 9, 2010

Who Knows Where the Time Goes?

Maybe it's my advancing age, but I suspect that my current discomfort with the space-time continuum has at least as much to do with the way the modern world works as with my having just become eligible for Senior Citizen discounts at the local bijou.

Time was (as they say) when folks thought that vacuum cleaners, automatic dishwashers, and such "labor saving devices" would add to the amount of available leisure time and make us all happier, healthier people. I'm not quite sure what the proponents of these adult toys thought we'd do with that time, but I suspect (in retrospect) that it had something to do with having more time to shop for more adult toys.

I do have a vacuum cleaner, although I'm not happy with the way it works, and I'm not sure it saves me all that much time. A carpet sweeper and a broom are lighter, and although they don't do the "deep" cleaning that hoovering is supposed to accomplish, I suspect that taking my area rugs out a couple of times a year and beating the bejeeziz out of them would do the same thing.

I've only had a dishwasher once in my life, and it took every bit as much time to prepare dishes for automatic washing as it did to do them by hand. In those days the machines also used more water than hand-washing did. I understand that that's no longer the case, and that one doesn't have to pre-wash everything. Still, about the only advantage to having one seems to be that it gives people somewhere to hide their dirty dishes. But both The Beloved Spouse and I find dishwashing to be soothing (and, in winter, hand-warming), and there's nothing quite like a kitchen newly cleared, wiped down, and at peace. Some of my best memories involve conversations with my grandmother at the kitchen sink, washing up after a family meal. The only dishwashers she ever had were her children and grandchildren.

But this isn't really a rant about products. It's about truthfully wondering why I seem to have so much less time to do things I love, like gardening, reading, writing, hanging out in the Carbon Sink with the puppies, and home-keeping.

When I started writing Owl's Farm two and a half years ago, I managed to sit down for several hours a week to work on posts--and eventually to divide the content into three separate blogs. But something has happened between then and now, and I'm beginning to wonder if my internal clock wasn't knocked askew while I was under the knife last spring.

I also wonder if illness isn't more conducive to thinking than health is. When I couldn't run around like a fool, chasing my own tail or fighting fires for others, I frequently managed to think about what needed doing, and then to make a stab at doing it. I actually managed to fix up a couple of rooms in the house last spring, even while my aortic valve was narrowing down to a pinhole. But now that I'm bionic, it's as if time itself has narrowed, and I don't seem to have nearly as much of it. As soon as my leave was up, only six weeks after surgery, I was back at it in full force. The summer slipped away so quickly that I can't remember what happened. Now that it's bitter cold, I can barely remember the heat.

The winter holiday positively thundered past, leaving me reeling in its wake, and bereft of two weeks I thought were mine in which to relax and take it easy. I wasn't even caught up in any particular rush, because our holiday events included only our daughter's annual overnight stay on Christmas eve, and a leisurely afternoon dinner. No big New Year's bash, no parties, no mad dashes to malls, very little shopping. But my two-week holiday was over last Monday, to be replaced by meetings and course preparation, some of it valuable, some of it useless.

Perhaps it's because I'm reading Morris again. I'll be teaching him this quarter, in particular his essay on Useful Work vs. Useless Toil. Re-reading it frequently makes me wonder about how much of my time is taken up in doing things that some higher-education guru thinks are necessary in order for me to do my job well--but that end up producing nothing really worthwhile. It's not that I mind creating lesson plans and making sure that I'm delivering information and sharing ideas in such a way as to promote real learning. But some of it deprives me of time to read and think, which is where my expertise originates. It's times like this that I long for the way my utopians learn in More News From Nowhere: by talking and doing, rather than taking tests and being graded.

The topic of assessment and modern modes of teaching will undoubtedly be dealt with energetically over this next year in The Owl of Athena, because my college is merging with another and will be undergoing another round of scrutiny by regional accreditors. But for now I'm trying to figure out how I'll have time to prepare for any of that, if the minutes keep flowing by at light speed.

I long for languor, relishing small moments like this one, when I'm sitting by the fire on a very cold morning, laptop on lap, writing and musing. Last year at this time I was wondering if I'd make it to another new year, and now that I have, it's up to me to wrestle my time back so I can enjoy the greater number of days now available to me. So I'll quit whining, and when the fire dies down, I'll go out into the bare winter yard to take the frost-covers off the rosemary and lavender plants. The afternoon will be a good deal warmer, so I'll let them breathe and enjoy the sunshine. Then I can come in and start a stew simmering, pour a cup of good tea, and open a book.

I'm pretty sure that's when time slows down: when we stop the clock ourselves and refuse to let modern life take over. This seems like a plan, actually. Were I one to make resolutions, this would be it: not to "waste" or "spend" or "save" time as if it were a commodity, but to just take it and live it and let the other stuff go. This is, I think, why the Sabbath was invented.

Good plan. But we shall see.

Image credit: Evelyn De Morgan (1855-1919), The Hourglass, via Wikimedia Commons.

Saturday, March 29, 2008

Time, Again

Two days ago, my firstborn child turned 32; two days from now would have been my grandmother’s “eleventy-first” birthday (she was born in 1897 and died in 2001)—and although there won’t be wizard-designed fireworks and a big cake and party, I will probably celebrate by lifting a glass in her name (even though she herself was not a tippler). My son’s a year younger than Frodo was when Bilbo turned 111, and the dates aren’t exact, but I thought the parallel (for a couple of Tolkien fans) was close enough to mention—and he’ll undoubtedly have lifted a pint or two of the Gaffer’s best brew in honor of himself. The passage of time is, consequently, much on my mind, especially since spring has sprung and growth is becoming increasingly visible and tangible.

As if the mere season weren’t enough, however, one of my favorite comic strips, Rick Kirkman and Jerry Scott’s Baby Blues, is on a “Days are long/years are short” kick, every day reminding me how quickly children grow up. I’ve loved this strip for years because the mom has breastfed all three of her kids, and the nursing jokes alone are worth their weight in . . . well, you get the idea. As the strip’s kids grow, they frequently remind me of the trials and tribulations of raising siblings who don’t get on all that well, which often slips my mind now that they live half a continent apart from each other and are great chums. The distance probably helps.

At any rate, unfinished business from my last post is still lurking, especially since it was originally inspired by an article in last Sunday’s Dallas Morning News: Carolyn Johnson’s piece on “The Joy of Boredom,” which I never got around to mentioning last time. As I’ve undoubtedly mentioned before, I’m fond of telling my students (when they tell me something I’ve asked them to read is “boring”) that boredom reflects a lack of imagination. I go on to boast, in my best preacherly tone, that I have never been bored in my life. It’s probably a lie, because I’m sure I’ve been faced with an interminable lecture or two, complete with bullet points, that I actually found boring. But at least I can’t ever remember being bored, and even when told to read something I wouldn’t otherwise have tackled, I always seemed to find some value in it (if only the knowledge that God wouldn’t assign me to the fires of hell for having disobeyed Sister Francisca). Nonetheless, my own students are difficult to convince, and sometimes I wish I were more like a Dominican. The fear of God was a prime motivator in my day, but the most evangelically-oriented of my students don’t seem to be cowed by visions of hell, not even when they’re illuminating Canto V of Dante’s Inferno.

Johnson’s point is that boredom is not only inevitable, but “a primordial soup for some of life’s most quintessentially human moments”:

Jostled by a stranger’s cart in the express checkout line, thoughts of a loved one might come to mind. A long drive home after a frustrating day could force ruminations. A pang of homesickness at the start of a plane ride might put a journey in perspective.

She goes on to note that technology has come to the aid of the bored, offering them myriad ways to fill up those moments of ‘microboredom’ (she’s quoting Motorola here), putting folks out of potential misery by making sure they’re engaged (whether or not in anything worthwhile) at all times.

To me, time is the important element in this assessment: the frenetic filling-up of time that seems to preoccupy modern culture more and more, so that people feel compelled to be “doing” something in addition to what they’re already doing. (Who the hell really needs to tap out a text message while driving the bloody SUV through a school zone, anyway?)

Now, I will admit to having answered my cute little iPhone exactly three times while driving on the freeway. But even just picking it up and figuring out how to answer it (I’m not exactly adept with the thing yet) was distracting enough to make me slow down and become very deliberate about my actions, and to be very happy to hang up. But most of the time I try to call home and answer any missed calls before I get on the road, so that my drive time can be spent (there’s that pesky little “time is money” metaphor again) sorting through the day and—oh, yes: attending to the driving of the car.

Boredom, it would seem, is tied these days to how we think of time, and the “spending” of it. Our most common cliché reflects our attitude: it flies when we’re having fun, even though the Latin (from which it probably derives) implies regret. Tempus fugit (time flees) suggests that it escapes, gets away from us. It does seem to be the case that the older we get the more conscious we are of time’s passing, and the less anxious we are for it to get away from us. The experience of time is always relative—it often dilates when we’re engaged intensely in some activity, and contracts when we’re enjoying ourselves—but our consciousness about time might also increase as we proceed, kicking and screaming, toward its “end.”

I tend to rail at my students for using the phrase “since the beginning of time” because it’s meaningless (when exactly did time begin, anyway?), and its opposite is equally silly: “until the end of time” (that’s how long I’ll love you, say the songs). Human beings are babies in the chronology of the world, but we think we’re incredibly wise (whether it’s through religious belief or scientific knowledge). And we treat time cavalierly, wasting it, spending it, losing it, gaining it, playing with our clocks and yet pretending that it’s something terribly important and unknowable. We might do better to adopt a less linear view, as many other cultures have done, or notice more carefully the “events” on our natural calendars. A cyclical, recursive notion of time might forestall boredom entirely, allowing us to incorporate reflection as part of our daily lives, and not having to snatch at it only when we have nothing better to do, or when we have “time on our hands.” The link, by the way, leads to a lovely short film about Mt. Athos, in Greece, at Easter time.

According to the Maya, we’ve only got a few years left, anyway, since their current Long Count ends on my birthday in 2012. Wouldn’t you know—just when I’m due to retire and have more time to myself.

Photo: My father holding my eight-week-old son, in the chair my grandfather bought for my parents when I was born.

Thursday, March 20, 2008

Spring Time

If I had my druthers, we’d be on “daylight saving time” all the time—even though the notion of “saving daylight” is pretty silly. The very idea of mechanical time is problematic to me, since it always seems to interfere with “real” time: i.e., seasons, circadian rhythms, phenology, and other natural forms of “telling” time. But the idea that one can “save” daylight by moving the hands on mechanical clocks around (or resetting digits on a digital clock) seems absurd. It doesn’t really save anything, since the number of hours of sunlight doesn’t change—we just do a bad job of training ourselves to get up earlier for part of the year. It would be more tolerable if the clock switch took place regularly at the vernal and autumnal equinoxes (at which time we would at least acknowledge the fact that temporal changes have something to do with seasons), but when we decide to “spring ahead” and “fall back” now seems pretty arbitrary, and it hasn’t been very consistent lately. So the switch tends to leave me fairly befuddled these days.

For some reason I can’t really explain, I’m fond of old clocks and timing devices, although I don’t even wear a watch. I find it amusing to check with my students on the timing of my lectures and usually get a laugh when I’m spot-on (I try to keep the talking and slide-showing down to 90 minutes at a time). Nevertheless, I own two lovely old clocks, one each from the maternal and paternal branches of my family. One, known as “Uncle Fred’s Clock,” is a Seth Thomas oak shelf or kitchen clock with a carved case and a pretty etched design on its glass front. It’s supposed to be an “Eight Day Half Hour Strike” job, but I’ve only ever gotten it to work for a few minutes at a time. The eponymous “Uncle Fred” was my great-grand-uncle, Fred Uhlmeyer, who settled in the Owens Valley in the nineteenth century, and who washed the ore from his mining claim at the spot now labeled “Uhlmeyer Spring” on U.S. Geological Survey maps. He’s mentioned on p. 142 of the 1975 edition of The Story of Inyo, by W. A. Chalfant, as having “squatted” in the Valley. There’s also a nod to him in More News From Nowhere, since most of the “action” (if you can call it that) takes place nearby in my future version of the area. [Thanks, by the way, to my Uncle Art—my Dad’s youngest brother and only surviving sibling—who helped me figure this out, and who’ll swear it’s true as long as I get it mostly right.]

At any rate, the clock looks great on one of the built-in bookshelves that flank the fireplace in our bungalow living room. It’s balanced on the other side by an onyx and glass Ansonia crystal regulator (of unknown name or vintage) that came from the Worden side of the family. I remember my grandmother’s winding it before bed each night, when she and her second husband (another Fred) lived in Coos Bay, Oregon. What I really love about this clock, however, is that although it works, it keeps crappy time. And even though it strikes on the hour and half-hour, it’s pretty arbitrary about how many times it “bongs”—and never any more than seven, no matter what time it is. I simply wind it occasionally, but don’t re-set it, so that it keeps its own peculiar time. There’s something reassuring about its complete arbitrariness, a reminder that we human beings are really not in control of anything, and certainly not of time.

Today, on the vernal equinox, the sun shone into my dining room window for the first time since dawn on the autumnal equinox last year. The single east-facing window in that room (the others face north) serves as a time-henge, marking the beginnings of two seasons. I’m not sure when I first noticed this, but I’ve made sure to note the coincidence ever since; it provides a non-mechanical marker that delights us both. It takes the place of Easter and Passover in a non-religious household as a way of reminding us of “when” we are in the year. This morning my dear, sweet husband, who thinks I’m only a little nuts, gleefully woke me (and the three cats under which I was helplessly pinned) to announce the Arrival of Spring.

All this reminds me that despite my love for machines like clocks and orreries (those wonderful models of planetary motion like the one in The Dark Crystal, or Penn’s Rittenhouse model), I truly mourn the demise of the physical, sensual ways in which human beings used to tell time. When people had to chip a notch into a bone to mark the passing of a day, or a phase of the moon, it made us more conscious of our connection with the celestial movements that affect our lives directly in such phenomena as tides and growth cycles. There were once ceremonies to celebrate such momentous events as the onset of menses (from the Latin for “month”), and there should be one for reaching menopause—because both events signal the parameters of human fertility, and may once have made women seem magical (that is, before males discovered that they had a role to play and then decided that it was the major role). But the more we insulate ourselves from the natural world, taking drugs that alter our biological clocks, “saving daylight,” building temperature-controlled houses, flooding our cities with incessant light, growing monoculture lawns watered by automatic sprinklers, riding in cars to work and on stationary bikes in gyms for exercise (instead of walking to work and working in our gardens, feeding ourselves)—the more we move away from our physical knowledge of the world, the less we understand about it, and the less we will have to say about what becomes of it.

It occurs to me that one of the more subtle, yet ominous, effects of global warming is that in addition to causing all sorts of havoc (like this week’s drenching and unseasonal rain) it’s beginning to muck with phenological indications of seasonal change. I hadn’t realized that folks were actually taking note of the potential for problems until I went looking around the internet for links on the topic, and found one for the USA National Phenology Network, which “exists to facilitate collection and dissemination of phenological data to support global change research.” So it’s already clear that we’re not just gumming up the atmosphere and causing extinctions; we’re causing the signs of change themselves to change. What’s really going to be interesting is to see whether we’ll allow ourselves enough time to learn an entirely new language, especially since most of us seem to have forgotten, or never even learned, the existing one.

Life was somewhat simpler, and probably a bit sweeter, when we gauged time by the spawning of fish or the arrival of sun in a window—or even by the daily winding of Uncle Fred’s clock. But now that it’s officially Spring, I’ll comfort myself by keeping watch for the blooming of the wild gladioli, and get busy with the planting, already.

Tuesday, July 31, 2007

Family/Time

During a lovely conversation with a group of students yesterday after my lecture on the Bronze Age Aegean, the subject of time arose. Not an unusual topic for a humanities class, but the context was the question of violence in video games, rather than timelines or conceptions of time through the ages. Everyone at the table (a group whose task it is this quarter to conduct research on ancient Japan and to create a presentation based on what they learn) was anxious to defend their favorite video games and to supply reasons why reasonable people might want to play hyper-violent games like the subject of my previous rant, BioShock.

I was pleasantly surprised at how cogent their reasoning was, even though most of their conclusions relied on somewhat limited notions of human “nature” and the role of violence and conflict in history. One chap even brought up a psychological study (that he’d heard about from, I seem to recall, his sister, who is a psychologist) about conflict and bonding to support the notion that tendencies toward violence are staple components of human behavior. The main problem with the whole conversation, however, stemmed from an all-too-common confusion of the is/ought question. Human beings have always encountered violence, goes the premise, and have, more often than not, solved problems through wars, etc. Human “nature” is violent; just look at the statistics, the news, and other evidence and one has to come to the inexorable conclusion that we’re a violent species. Everyone seems to have the same picture in their heads: the scene from the beginning of 2001: A Space Odyssey in which primates get started on the road to humanity when they not only discover tools, but discover how to beat the crap out of their fellow primates with these same tools. I kill; therefore I am.

But, I countered, do we really have to be this way? If we do, in fact, possess free will, is it not possible to learn how not to solve our problems by blowing each other up?

One difficulty seems to stem from the fact that this generation lacks the metaphors that were available even just a half-generation ago: Gandhi, Martin Luther King, Jimmy Carter and Arafat. Not only that, but what they have instead simply reinforces the presumed inevitability of conflict: Iraq, Darfur, Palestine–and their parents’ memory of Viet Nam. The present government in the U.S. refuses even to discuss matters with “rogue nations” lest conversation beget legitimacy, so the ability to solve problems through dialogue has diminished considerably. The ultra-violent video games, I suggested to my students, only add to the dilemma, because they reinforce the notion that brutality is necessary.

“But that’s the nature of video games,” they countered. “They have to be exciting. They’re entertainment.” Sitting around talking about problems, it seems, is boring. Or at least it’s not something you want to watch on the screen. Apparently, what I need to be able to do to save the world is to inspire a bunch of game designers to create an “exciting” game about nonviolent negotiation. Hmmm . . . .

As we sat around talking about the matter (which most of us seemed to find fairly stimulating, even if it didn’t raise our blood pressure significantly), I posed my concern about the issue of children raising children in the modern world, because I see this as a crucial element. “How many of you ever sit down to a meal with your parents?” I asked. None, as it turned out–but they’re all in college, living away from home, and probably only see their parents at holiday gatherings. “Okay,” I countered, “What about when you were still in high school and living with your parents? How much time did you spend with them?” “Seldom” and “little” were the answers. “Too busy” was the excuse.

“Doing what?” I asked, with mounting trepidation.

“Playing video games.” This got a laugh. In all seriousness, however, they each expressed some regret that their final years at home–what should have been the capstone of their education as a family–were woefully short on contact hours with Mom and Dad and the sibs. Everyone had his or her own “thing” to accomplish. Parents were working (in part to help put their kids through a pricey college), brothers and sisters were hanging out with their own friends (playing video games?), so there wasn’t much time for conversation. There might have been time for arguing and conflict, however, so I’ll have to pose that question later. How much of domestic conversation these days is devoted to sharing ideas, and how much to arguing about whether or not the kids need to have a curfew or wear baggy pants–or play video games?

Now, my childhood was far from idyllic–or so it seemed to me as I was growing up. Frequent moves meant that leave-taking and emotional upheaval were common events; my parents divorced in my mid-teens; two cousins were killed in violent accidents: the usual stuff of human comedy. So the fact that I can look back on it all with fondness and even (at times) awe, means that it went well, in the long run. And the memories that stand out are significant: being taught to cook by an Italian friend of my mother; leisurely Sunday afternoons in a boat on Green Lake in Taiwan, where my parents and their friends taught me to swim; trips to the beach with the parents and our dog Griso (in a headscarf and sunglasses) in a beat-up ‘48 Chevy convertible with a hole in the floor; the “Mission Rounds” in Wulai, where we met with priest-friends and Taiwanese Aborigines and brought back home-made salami and fresh veggies from the mission garden; lunch at the Friends of China Club or the Foreign Correspondents Club, where I met people like Roy Crane (Buzz Sawyer), who drew a little cartoon for me on a note pad. Some of this was extraordinary, to be sure, but some was everyday stuff. The common ingredient was this: my brother and I were raised around and by adults. We saw our peers at school (in small classes taught by very strict Dominican nuns), and occasionally one would spend the weekend, but generally our world was dominated by adults. Nowadays, the situation seems to be reversed, so that children spend far more time with other children than with parents.

Which brings me back to the question of time.

Time is a concept invented by human beings, and mediated by technology. The commodification of time in the post-industrial, capitalist world (“Time is money,” the old saw goes) means that whatever “time” is, it doesn’t “belong” to us any more. We are constantly at the mercy and dictate of the clock (my new computer has a cute “gadget”: an image of an analogue clock on my desktop, right under the “local weather” icon). A major modern metaphor is the clock (remember Modern Times?) and the sound of seconds ticking away.

I can remember being astonished when my students started looking at their watches to see what day it was, and I’m not yet used to the idea that they don’t even wear watches anymore (except as accessories, like jewelry)–they simply look at their cell phones when I ask them what time it is (since I neither wear a watch nor own a cell phone). If I’m not at my computer, I only have a relative sense of time, and try to be aware of “when” according to habits, the position of the sun, whether or not the dogs are acting hungry. But so many of our technologies revolve around time (television, radio, kitchen ranges, microwaves, coffee machines, automobiles, computers–everything’s got a clock) that avoidance is really impossible. And since I’m paid to be in the classroom at a certain hour each day, and for a measured length of time, there’s really no escaping it: hence my need to ask students what time it is in the first place (although my ability to sense when a break is due is something of a legend among them; the first part of class is inevitably an hour and a half, usually within five minutes of being spot on).

So: time and video games? Two items.

One, playing video games (violent or not) suspends the sense of time. When I’m playing, I lose all notion of “when.” The world consists of me and whatever is happening on the screen, and I’m isolated from everything else. I’m pretty sure that this divorcement from the outside world is radically different from what happens when I read a book or work in the garden, even though these are also solitary activities. My students pointed out that gaming provides its own kind of community (I remember my son’s LAN parties, with huge CPUs strung out all over the house, connected with wires in the days before wireless systems existed), but that’s probably a topic for a later post. I’m pretty sure there are fundamental differences in the quality of the community experience between game-playing online (or even on separate units in the same room) and face-to-face interactions that don’t involve blowing things up.

Two, if families were to take time back from the corporations, the technologies, and whatever other phenomena compromise our ability to form communities with our own children, it might be possible to help them imagine a future that is not dictated by violence and conflict. “That’s the way it’s always been” is not an argument. Mohandas K. Gandhi, Nelson Mandela, and others have shown us that there are alternatives that can be made to work. But the political will has to exist for these alternatives to develop, and it won’t develop if people keep throwing up their hands and saying “it’s human nature” or whatever the excuse du jour is. William Morris himself, despite his ability to imagine a nonviolent future, thought that violent revolution was an inevitable precursor to utopia. But during Morris’s time, human beings did not yet have the power to eradicate life from the earth. We must find an alternative.

If we want to ensure the survival of our children (or at least our children’s children), taking time to help them learn from us, rather than from their peers or from those terribly exciting but morally dangerous games, is vital. Educators have a particular responsibility, because we form a sort of bridge between family and other, "external" communities. My recent encounter with my students showed me how hungry they all are for real conversation: for people to listen, and to share ideas, to argue, to affirm, to refute, to engage. For all our sakes, we really do need to take the time.