Tuesday, July 31, 2007

Family/Time

During a lovely conversation with a group of students yesterday after my lecture on the Bronze Age Aegean, the subject of time arose. Not an unusual topic for a humanities class, but the context was the question of violence in video games, rather than time lines or conceptions of time throughout time, etc. Everyone at the table (a group whose task this quarter is to conduct research on ancient Japan and to create a presentation based on what they learn) was eager to defend their favorite video games and to supply reasons why reasonable people might want to play hyper-violent games like the subject of my previous rant, Bioshock.

I was pleasantly surprised at how cogent their reasoning was, even though most of their conclusions relied on somewhat limited notions of human “nature” and the role of violence and conflict in history. One chap even brought up a psychological study (that he’d heard about from, I seem to recall, his sister, who is a psychologist) about conflict and bonding to support the notion that tendencies toward violence are staple components of human behavior. The main problem with the whole conversation, however, stemmed from an all-too-common confusion of “is” with “ought.” Human beings have always encountered violence, goes the premise, and have, more often than not, solved problems through wars, etc. Human “nature” is violent; just look at the statistics, the news, and other evidence, and one has to come to the inexorable conclusion that we’re a violent species. Everyone seems to have the same picture in their heads: the scene from the beginning of 2001: A Space Odyssey in which primates get started on the road to humanity when they not only discover tools, but discover how to beat the crap out of their fellow primates with these same tools. I kill; therefore I am.

But, I countered, do we really have to be this way? If we do, in fact, possess free will, is it not possible to learn how not to solve our problems by blowing each other up?

One difficulty seems to stem from the fact that this generation lacks the metaphors that were available even just a half-generation ago: Gandhi, Martin Luther King, Jimmy Carter and Arafat. Not only that, but what they have instead simply reinforces the presumed inevitability of conflict: Iraq, Darfur, Palestine–and their parents’ memory of Viet Nam. The present government in the U.S. refuses even to discuss matters with “rogue nations” lest conversation beget legitimacy, so the ability to solve problems through dialogue has diminished considerably. The ultra-violent video games, I suggested to my students, only add to the dilemma, because they reinforce the notion that brutality is necessary.

“But that’s the nature of video games,” they countered. “They have to be exciting. They’re entertainment.” Sitting around talking about problems, it seems, is boring. Or at least it’s not something you want to watch on the screen. Apparently, what I need to be able to do to save the world is to inspire a bunch of game designers to create an “exciting” game about nonviolent negotiation. Hmmm . . . .

As we sat around talking about the matter (which most of us seemed to find fairly stimulating, even if it didn’t raise our blood pressure significantly), I posed my concern about the issue of children raising children in the modern world, because I see this as a crucial element. “How many of you ever sit down to a meal with your parents?” I asked. None, as it turned out–but they’re all in college, living away from home, and probably only see their parents at holiday gatherings. “Okay,” I countered, “What about when you were still in high school and living with your parents? How much time did you spend with them?” “Seldom” and “little” were the answers. “Too busy” was the excuse.

“Doing what?” I asked, with mounting trepidation.

“Playing video games.” This got a laugh. In all seriousness, however, they each expressed some regret that their final years at home–what should have been the capstone of their education as a family–were woefully short on contact hours with Mom and Dad and the sibs. Everyone had his or her own “thing” to accomplish. Parents were working (in part to help put their kids through a pricey college), brothers and sisters were hanging out with their own friends (playing video games?), so there wasn’t much time for conversation. There might have been time for arguing and conflict, however, so I’ll have to pose that question later. How much of domestic conversation these days is devoted to sharing ideas, and how much to arguing about whether or not the kids need to have a curfew or wear baggy pants–or play video games?

Now, my childhood was far from idyllic–or so it seemed to me as I was growing up. Frequent moves meant that leave-taking and emotional upheaval were common events; my parents divorced in my mid-teens; two cousins were killed in violent accidents: the usual stuff of human comedy. So the fact that I can look back on it all with fondness and even (at times) awe means that it went well, in the long run. And the memories that stand out are significant: being taught to cook by an Italian friend of my mother; leisurely Sunday afternoons in a boat on Green Lake in Taiwan, where my parents and their friends taught me to swim; trips to the beach with the parents and our dog Griso (in a headscarf and sunglasses) in a beat-up ‘48 Chevy convertible with a hole in the floor; the “Mission Rounds” in Wulai, where we met with priest-friends and Taiwanese Aborigines and brought back home-made salami and fresh veggies from the mission garden; lunch at the Friends of China Club or the Foreign Correspondents Club, where I met people like Roy Crane (Buz Sawyer), who drew a little cartoon for me on a note pad. Some of this was extraordinary, to be sure, but some was everyday stuff. The common ingredient was this: my brother and I were raised around and by adults. We saw our peers at school (in small classes taught by very strict Dominican nuns), and occasionally one would spend the weekend, but generally our world was dominated by adults. Nowadays, the situation seems to be reversed, so that children spend far more time with other children than with parents.

Which brings me back to the question of time.

Time is a concept invented by human beings, and mediated by technology. The commodification of time in the post-industrial, capitalist world (“Time is money,” the old saw goes) means that whatever “time” is, it doesn’t “belong” to us any more. We are constantly at the mercy of the clock and its dictates (my new computer has a cute “gadget”: an image of an analogue clock on my desktop, right under the “local weather” icon). The clock itself is a major modern metaphor (remember Modern Times?), along with the sound of seconds ticking away.

I can remember being astonished when my students started looking at their watches to see what day it was, and I’m not yet used to the idea that they don’t even wear watches anymore (except as accessories, like jewelry)–they simply look at their cell phones when I ask them what time it is (since I neither wear a watch nor own a cell phone). If I’m not at my computer, I only have a relative sense of time, and try to be aware of “when” according to habits, the position of the sun, whether or not the dogs are acting hungry. But so many of our technologies revolve around time (television, radio, kitchen ranges, microwaves, coffee machines, automobiles, computers–everything’s got a clock) that avoidance is really impossible. And since I’m paid to be in the classroom at a certain hour each day, and for a measured length of time, there’s really no escaping it: hence my need to ask students what time it is in the first place (although my ability to sense when a break is due is something of a legend among them; the first part of class is inevitably an hour and a half, usually within five minutes of being spot on).

So: time and video games? Two items.

One, playing video games (violent or not) suspends the sense of time. When I’m playing, I lose all notion of “when.” The world consists of me and whatever is happening on the screen, and I’m isolated from everything else. I’m pretty sure that this divorcement from the outside world is radically different from what happens when I read a book or work in the garden, even though these are also solitary activities. My students pointed out that gaming provides its own kind of community (I remember my son’s LAN parties, with huge CPUs strung out all over the house, connected with wires in the days before wireless systems existed), but that’s probably a topic for a later post. I suspect, though, that there are fundamental differences in the quality of the community experience between game-playing online (or even on separate units in the same room) and face-to-face interactions that don’t involve blowing things up.

Two, if families were to take time back from the corporations, the technologies, and whatever other phenomena compromise our ability to form communities with our own children, it might be possible to help them imagine a future that is not dictated by violence and conflict. “That’s the way it’s always been” is not an argument. Mohandas K. Gandhi, Nelson Mandela, and others have shown us that there are alternatives that can be made to work. But the political will has to exist for these alternatives to develop, and it will never develop if people keep throwing up their hands and saying “it’s human nature” or whatever the excuse du jour happens to be. William Morris himself, despite his ability to imagine a nonviolent future, thought that violent revolution was an inevitable precursor to utopia. But during Morris’s time, human beings did not yet have the power to eradicate life from the earth. We must find an alternative.

If we want to ensure the survival of our children (or at least our children’s children), taking time to help them learn from us, rather than from their peers or from those terribly exciting but morally dangerous games, is vital. Educators have a particular responsibility, because we form a sort of bridge between family and other, "external" communities. My recent encounter with my students showed me how hungry they all are for real conversation: for people to listen, and to share ideas, to argue, to affirm, to refute, to engage. For all our sakes, we really do need to take the time.

Sunday, July 29, 2007

Virtual Unreality

Yesterday a former student with whom I correspond frequently sent me a link to a trailer for a game called Bioshock. It featured some pretty brutal treatment of a little girl (by her brother? I couldn’t quite figure it out) and her “rescue” by a machine-like creature after some grotesque activities that included injecting something into an arm that produced a swarm of bees, and the impaling of one of the characters by a device that looked like an over-sized drill bit. What prompted the student to send me the clip in the first place was his fondness for insects, and his message was accompanied by a rather ironic quip about what we really need to be spending our defense dollars on (bee-arms?). My response was something like “Who thinks up this crap?” to which he replied that, in all fairness, the game was actually about how far one would go to save one’s own life, even if it meant sacrificing one’s own humanity. To which my reply was “Yeah, but . . . .”

Because I teach in a school that offers a degree in animation, and thus know more gamers (including my son and daughter-in-law) than the casual observer, I’m quick to latch onto potential problems like a baby to a nipple (sweet image, that). Truth be told, however, I have never understood the attraction of games like Doom and Quake (which my son and his friends played, and even created additions to), I have no interest whatever in Second Life, and the only computer games that even mildly amuse me are puzzle-solving efforts like Myst or Qin or silly arcade candy like Luxor that involve nothing more violent than shooting down strings of brightly-colored scarabs to earn points and defeat Egyptian gods. I suppose that simulation games like Sims can be educational (although I once tried playing one that involved building space colonies and just got frustrated because I didn’t know what the hell I was doing), but what I’m really beginning to think is that all of the energy that’s going into developing games that explore our humanity could be better spent actually exploring our humanity.

Part of the problem is that I can’t get “into” virtual worlds. The idea of virtual reality actually goes back to the Middle Ages, according to Margaret Wertheim (see especially her book, The Pearly Gates of Cyberspace). In situations where things are not going well, folks have a way of imagining something better--and the great Medieval cathedrals provide visible evidence that their builders and their communities were anticipating the kingdom of heaven. Not having the talent or energy to create the twenty-first century equivalent, however, I take refuge in books by people like William Morris, who combined a tangible effort to better the physical world with a plethora of imaginary (“virtual”) worlds of his own.

To top it off, even the most carefully-constructed of these games feature figures that move so unnaturally through their landscapes that I wonder, "why bother?" Why not just go to a damn movie, for crying out loud?

I used to think that I understood why people play video games. When Myst and its early sequels came out, I was truly fascinated by the surreality of the environments players entered. I loved the fact that they were about books, and that the creators had spent some serious time studying the history of architecture and technology. But they lost me when they put people in their “worlds” and made play too immediate for cranky old misanthropes like me. I have enough reality going on around me every day (maybe if I didn’t read the newspaper things would be better) that I don’t need to escape into some kind of “otherness” unless it’s one I make myself. So, when I want to "escape," I write.

I have begun to wonder what would happen if, instead of imagining grotesque, violence-ridden scenarios, game designers used their virtual worlds to set up real-life problems and enabled people to work through actual possibilities. I suppose Second Life has that potential, but from everything I’ve heard, it’s simply an extension of the MySpace mentality that I already find so repugnant (even though most of my students, many of my colleagues, and one of my own children are all devotees of the latter).

I know that all this stuff is supposed to foster community, but what kind of community? Certainly not the face-to-face interaction imagined by Adam Smith and the other participants in the Scottish Enlightenment--and on which Smith’s notions of capitalism and moral sentiments were based. Certainly not the in-your-face experiences of our earliest ancestors, which eventually led to cooperative hunting and gathering and then agriculture on a larger, participatory scale (and, I have to admit, such other ramifications as conflict and war). Certainly not even the salons and conversations that produced the public intellectuals of the early twentieth century: the avant-garde in general, and specific movements like Cubism and Abstract Expressionism in particular. The fundamental difference seems to lie in the fact that many of today’s children find the real world too “boring” or too disturbing to deal with, and playing through the games provides a kind of power missing from actuality.

I know that this treatment of the problem is superficial, and that it needs a lengthy, careful, phenomenological analysis in order for it to make real sense. It’s just that watching the trailer for yet another violent, disturbing, angry game made me long for the days when one could “walk” through a virtual world, uncovering clues and solving puzzles, until one reached a satisfactory conclusion. Maybe we didn’t save the world, but we didn’t slather it with blood and guts, either. It’s not that I want everything to be pink and peachy; it’s just that there ought to be room for thinking about our humanity and working through our problems in ways that don’t include unspeakable acts of violence that--let’s face it--have more to do with adolescent notions of power and sexuality than with creating a better alternative to the problems of the modern world.

Tuesday, July 24, 2007

The Gothic Critique of Modernity

I’m lecturing on Ruskin, Turner, and the Pre-Raphaelites, along with Morris and the Arts & Crafts Movement this week, and it seems time for a defense of what I have come to call the “Gothic critique of modernity.” This was actually dreamed up as a subtitle for my never-to-be dissertation, “Medievalism, Art, and Technology”—although I’d have also added “William Morris and” to it once I began to focus on his work and how it influenced later writers like Henry Adams and Lewis Mumford.

Instead of writing a book, however, I now devote a segment of my History of Art and Design II class to a discussion of the implications of medievalism for the development of modern art in Europe, and later in America. I begin with a reprise of the problem of Turner (whom I’ve already discussed in the previous class as a bridge between Romanticism and Impressionism), using his paintings of the burning of the Houses of Parliament to introduce the Gothic Revival, and Ruskin’s defense of Turner to connect it with his similar defense of the Pre-Raphaelites. Into the mix go Carlyle and Pugin, and everyone always wonders why these guys were so fascinated by an era that produced the Black Death and immutable social hierarchies (they seem to forget about illuminated manuscripts and Gothic cathedrals momentarily; that was last quarter, after all). They’ve also apparently never heard of Blake’s “dark satanic mills,” being much too young to have seen Chariots of Fire and thus not remembering the lovely rendition of “Jerusalem” used in the film.

This attitude seems to support my “veils of technology” theory (significant portions of which I probably stole from Gerald Holton, physicist, historian of science, and flat-out prophet). Almost every week in every class I encounter evidence that imagination today is wholly mediated by technological devices that increase the distance between human beings and the natural world. The thicker the veils, the greater the distance; they not only limit vision, but seem to extend from top to toe, insulating people from any immediate contact with anything that isn’t (in Wilhelm Dilthey’s words) “mind affected.” Just yesterday I watched a colleague slather on “hand sanitizer,” instead of running off to the Ladies to use water (made available through plumbing) and soap (machine-delivered). Much of this is further reinforced by the necessity of efficiency (it would be much less efficient to have bars of soap available next to faucets) and “the bottom line” (the porters would have to be paid more if it took longer to tidy up the loos). Having a bottle of hand-sanitizer handy probably also means that workers visit the facilities less often, and only for more urgent matters. Time on task, etc.

When I was younger, discussions of “labor-saving” devices were the staples of suburban cocktail party chatter. Or at least frequent feature articles in “Home” sections of newspapers made it sound as if this were the hot topic of the moment. Does a vacuum cleaner really make a housewife’s life easier? Does a steam iron lessen the onus of ironing day? Do an automatic washing machine and dryer leave time for a romance novel? These same feature sections now happily picture the latest versions of the same equipment, which prompt new questions: Does Martha Stewart really do her own house-cleaning, ironing, and washing using her state-of-the-art European vacuum cleaner (or is it a whole-house vac?) in the beautifully-designed laundry facilities in each of her houses?

Picture this: A group of women gather up the week’s laundry (which doesn’t actually seem like a week’s because people actually wear clothes for more than one day, unless they fall while feeding the pigs) and collectively carry it to a communal trough, or down to the local river. They use washboards, brushes, and soap (made from fireplace ash and leftover animal fat) to scrub out stains and dirt, rinse it in the troughs (carrying dirty water over to garden plots) or in the river, and either hang the wash or lay it in the grass to dry. Their daughters and very young sons are with them, doing some of the work, listening to gossip, learning about laundry, being told stories, fishing in the river, being around adults. They’re out in the world, smelling, seeing, feeling, hearing, and none of this experience is masked by complex chemical concoctions, artificial perfumes, iPods, or camera phones.

I don’t mean to make this sound idyllic, but to me it almost does. I have lived in places where this scenario (or one close to it) was played out on a weekly basis (in rural Japan, in the post-war fifties, and in suburban Taipei in the sixties). Stateside, one grandmother owned the only washing machine in the neighborhood (an old Westinghouse wringer-job that was almost more trouble than it was worth), and three families did their washing on her back porch. My other grandmother owned a Laundromat in coastal Oregon, where the neighborhood did its wash, since nobody had room for a washing machine. In both of these cases, laundry was still a communal affair. Women would gather in the kitchen to talk, have coffee, sometimes bottle up the summer’s blackberries, and either hang the wash together, or take it home to hang. The café next to the Laundromat was the next best thing to a pub, and women would gather for coffee while the sheets and knickers spun in the machines, then carry home the wet wash to hang in the yard.

Nowadays, of course, we have our own private laundries in our homes, and we do it by ourselves. Women are no longer the sole practitioners of the science; in my house, my husband does a significant portion of the work—which is particularly welcome since he plays and coaches tennis in addition to teaching, and thus generates far more laundry than I do. I’m not sure how many men participate in this activity, but I suspect that they still lag behind women, so that it still falls within the traditional realm of women’s work.

For a long time I had no dryer, and one grandmother never did. To this day I still enjoy hanging out the wash whenever possible. The other grandmother's Laundromat eventually put in drying machines, and the clientèle gradually became more coeducational. Later, that grandmother sold the store and moved into a trailer park in Southern California, where they still did their laundry communally. But the rituals have since become more and more isolating and self-contained, and the wealthier one is, the further removed from the old scenario one becomes. I’m pretty sure that Martha’s maids and laundresses appreciate her fine facilities, and perhaps they get together for coffee during the breaks. But the community of women who once gathered by the stream has disappeared in the technologized West. And I’m not sure what we’ve gained except time—to spend on other technologically-imposed tasks.

Neither Carlyle nor Pugin, from what I can gather, gave a rat’s ass about how women did their laundry. Ruskin and Morris were actually concerned with all manner of work (although mostly men’s), and Morris even did some that was considered “women’s”—such as embroidery. And they all had servants to do the nasty bits. But all of these men saw in the Middle Ages some potential for a life richer than what was developing as a consequence of the Industrial Revolution. Perhaps only those who have lived on the shores of Love Canal, or along the Cuyahoga, or next to big nuclear facilities or industrial parks can really understand how the plague-ridden centuries before the Renaissance began to look good even to those who lived relatively well in the decades before the fin de siècle. It takes more effort than I care to admit to lift the veils, even for someone who didn’t live with a television set until 1962. And we’re now so blind to the consequences of unfettered technological acquisition that we can’t imagine life without it unless somebody writes an alternative into a science fiction novel.

My first thought experiment concerning technology started out with a question: What could we do without? In future posts, I’ll ruminate on the answers I came up with. But for now, I think it’s useful to consider the Gothic critique of modernity as just such a thought experiment. What if, mused Carlyle and Pugin and Ruskin and Morris, we had the will to resist the onslaught of technological innovation (steam engines, railroads, electricity), or at least to consider the consequences? What if we modeled possible results (they didn’t have computers, but they did have pubs, and that’s where the real philosophy had been done since the Enlightenment) and came up with alternatives? What if we examined the historical problems (disease, inequality, oppression) and used what we’ve learned to frame a different future that would incorporate what worked well (community, low environmental impact, artistic achievement)?

If we were to attempt what these men did, it would require a much greater effort, precisely because we are now so dependent on our technologies that the very idea of “losing” them throws us into paroxysms of post-apocalyptic angst. If all this went away, the new scenario goes, we would revert to mud-soaked, aristocracy-dominated, dreary, sunless, plague-ridden misery, just like those poor sods who succumbed to the Black Death and lived solitary, poor, nasty, brutish, and short lives. Never mind that these same folk gave us illuminated manuscripts, Gothic cathedrals, Gregorian chant, guilds, and the longest period of religious cooperation in history during what Richard Rubenstein calls the Aristotelian Revolution (in his recent book, Aristotle’s Children). I don't think it takes too much foresight to suggest that we might now be standing on a precipice similar to that on which Morris and his ilk teetered, or upon which late Medieval scholars found themselves poised. However, if we choose to look forward without remembering what our intellectual ancestors faced, perhaps the doom-sayers will have got it right after all. The old saw about forgetting the past and repeating its mistakes always seems to prove itself true, and without becoming more mindful about our technological choices we only seem to be laying a path toward yet another collapse.

If it's any comfort, Morris seemed to remain optimistic about the future, and in his post-apocalyptic vision England had become, once again, a green and pleasant land.


Friday, July 20, 2007

Utopian Pizza

This is primarily an addendum to my last post, because although I mentioned being vegetarian and Kosher, I neglected to place these (what? Events? Conditions? Practices?) in context. The context, as it turns out, was important to the point I was trying to make about the connection between cooking and community.

I began keeping a Kosher home when I married for the second time and was firmly established in an urban community rich with Jews, many of whom were much more religious than I. The nice thing about being newly married is that the couple receives presents, and we made our kashrut plans known to our friends and relatives--who graciously helped stock the kitchen with two of everything, coded with colors (cool for dairy, warm for meat) to help simplify the process. But I never was a true believer; I wanted a Kosher kitchen because I wanted any friend of any religious persuasion to be able to eat at my table. So my initial instincts were communitarian, if not doctrinally pure.

The move toward "vegetarianism" (which sounds like a religion itself) came about as I began to explore philosophy in graduate school, and probably in a subconscious effort to pay tribute to my beloved grandmother. I wouldn't be surprised if a psychoanalyst were to uncover another motive: to make keeping Kosher even simpler (if one eats only dairy or pareve, one needn't worry about mixing meat with milk. Tah Dah!). But even as I became a full-fledged, full-time vegetarian, one of my goals was to feed my friends so that no one would miss the meat. I remember throwing a holiday party for my then-husband's colleagues, and after everyone had raved about the food, I expressed some surprise that nobody had remarked about missing Swedish meatballs or cocktail weenies. Those gathered around the heavily laden table, piling their plates with marinated mushrooms and slices of veggie quiche looked at one another and laughed; nobody had even noticed. By this time we were living in a community with far fewer Jews, and even fewer who were concerned with dietary laws, and eventually any dietary restrictions could be handled with vegetarian meals.

Somewhat later, as the children grew and began to behave like starving, deprived orphans in front of the supermarket meat counter, I relented. I began to cook, serve, and eat meat, albeit sparingly and infrequently. Fish was easier (I hadn't ever gazed fondly into the eyes of a flounder, and was thus much less sentimental about scaly things), and soon after it became fairly simple to acquire poultry (and, for a hefty premium, beef) that hadn't been marinated in hormones, sprayed with insecticides, and/or tortured to death, and my conscience was soothed. My friends could then gather for an organic meal featuring well-raised and thoughtfully killed animals, and I could remind my children of their supper's origins by announcing that they'd be enjoying "dead baby chickens." We never did have "dead baby cow" because of the way veal is raised in the United States, although they once did taste Kosher, humanely-raised veal in London after being reassured about its free-range origins.

Both of my children are, today, much more aware of ethical considerations than are their peers. The specter of congenital heart disease now haunts us all (I had bypass surgery due to genetically astronomical cholesterol levels at the age of 47), so we eat even more healthfully than we did while they were growing up. And, because I teach at a school with a culinary arts program, I have become increasingly interested in the history, philosophy, and culture of food. As a result, being mindful about eating comes quite naturally, and solutions to dilemmas about what to eat are solved fairly simply. I'm still squeamish about eating certain kinds of foods (lobsters=bugs in my book, and I could probably only eat a cricket if I were starving to death), and I still eat things I shouldn't (take-out pizza, for example) more often than I would like.

And this leads me to the main obstacle in the way of mindful eating in modern culture: time. Until we can learn to slow down and to choose our priorities more wisely (do we really need to haul our children around to five different organized activities per week?), we will never put Papa John's out of business. My own hand-made pizza, baked with freshly ground whole wheat flour, home-grown tomatoes and peppers and herbs, and locally-crafted mozzarella cheese, tastes infinitely better, is far more nutritious, and is economically more satisfying and sustaining than the one the pizza guy brings to the door.

One thing about thought experiments is that they're often fairly inspiring. So I'm going to finish this post and go out to the garden (more like a jungle after all the recent rain) in hopes of harvesting the makings of what I'll have to start calling Topsy's Pizza, in honor of the terrific meals Morris describes in News From Nowhere. He undoubtedly never had pizza, but if he'd run into it in his imagined future London, this is what it would've tasted like.

Tuesday, July 17, 2007

The Raw and the Cooked


I'm fairly new to Anthony Bourdain's show on the Travel Channel, No Reservations, but I'm already a fan. I was a bit taken aback last night, though, when he went off on Charlie Trotter's book, Raw, at the end of the segment on New Zealand. Having spent a couple of years in Chicago, where Trotter is spoken of in soft, reverent tones, and having enthusiastically watched his jazz-infused television show (The Kitchen Sessions), I'm also something of a Charlie Trotter fan. Although the contrast between Bourdain and Trotter couldn't be more evident, I was still surprised, because Bourdain strikes me as a sort of food-libertarian, allowing for almost any perspective in the quest for really good things to put in one's mouth.

I must confess that I haven't read Raw (co-authored by Roxanne Klein), so I don't know much about it except its premise, which involves the idea that raw food can be exquisitely prepared and every bit as tasty as cooked, and is thus consistent with the apparently growing Raw Food movement. Bourdain, on the other hand, is an aficionado of most things edible, however they're prepared, as long as the results please the palate. His point on the New Zealand show had to do with enjoying ethnic cuisines without culinary prejudice. Part of his agitation may have resulted from an on-camera near-death experience, but Bourdain insisted that eschewing certain foods for whatever reason indicates profound disrespect for the culture that produces them.

Over the past forty years, I have come to embrace culinary anarchy myself, after a long journey through the history and philosophy of food. I grew up primarily in Asia, where, as the typical picky-eating ugly American child, I ate little of what was prepared for me by excellent cooks. People were always having to boil eggs for me. I do have some fond memories: soba in broth in both Japan and Taiwan; Mongolian barbecue (now available in this very town, albeit in a somewhat homogenized, tamed version); steamed rice buns, sour plums, and other "fast food" one could buy from street-vendors in Taipei; tempura and sukiyaki in rural Japan, along with sweet milk-candies that I can now buy in the local World Market. What I loved best were fresh lychee nuts, carambolas, and pomelos that grew on trees in our yard on Yan Ming Shan near Taipei, or on the slopes of Seven Star Mountain, where I wandered at will under the collective eyes of our cook-houseboy and villagers along the mountain trails. We had two different cooks named "Lee" whose wives and children were my closest companions when I wasn't in school. Both Lees were superb cooks, and tried continually to find ways for me to enjoy Chinese food. Oddly enough, the person who finally succeeded in awakening my culinary instincts was an Italian friend of my mother's, so that I learned to cook northern Italian food in Taiwan. My favorite restaurant was the Marco Polo in Taipei, where my thirteenth birthday party was held (to the embarrassment of my parents when my classmates made rude jokes about the salamis in the kitchen).

Back in the States, my major influence was my grandmother, who had been a vegetarian since the age of fourteen, when she came home from school to a meal that featured (she found out later) the calf she had been raising. All of her children were omnivores, and she dutifully roasted chickens, turkeys, and all sorts of meat, but never touched it herself. I spent most of my holidays with her, and although she'd supply me with meat if I asked for it, she was always clearly pleased when I didn't seem to miss it. Her rejection of meat was so strong that even when she began to slip into dementia at the age of 99, she continued to insist on meatless meals until she died five years later.

In college I had two culinary muses: the Greek wife of the Classics department chair, and later the Greek wife of an archaeology professor (she was also my Modern Greek instructor). Both of these women not only taught me how to cure olives and make spanakopita; they also embodied the connection between food and family, and drove home the importance of culinary traditions in the preservation of cultural meaning.

As I became more thoughtful, I also became more selective in my food choices, and eventually became a vegetarian myself--not because I thought it was wrong to eat meat, but because I reasoned that if I couldn't take the responsibility for killing an animal, I didn't deserve to eat it. This went on for about fourteen years until I buckled under pressure from my own children. During most of this time, I also kept a Kosher home--which may have prolonged the vegetarianism, since it made the whole process of kashrut much easier to accomplish. Both of these experiences certainly fortified a notion of the power and importance of food in human culture. Further studies in anthropology and the history of technology made it clear that we are, in fact, what we eat (books like Claude Lévi-Strauss's The Raw and the Cooked and Margaret Visser's Much Depends on Dinner). In addition to anthropological treatises, I read everything M. F. K. Fisher ever wrote, along with myriad books about artists and writers and food (The Joyce Cookbook, Monet's Table, etc.), and even more about the history of food and regional traditions. But it wasn't until I had a family of my own that all of these strands came together to make me think seriously about what it means to eat.

Once I became interested in Morris, I began to ruminate on what it would mean to eat well in his terms. In News From Nowhere, food is an emblem of community: carefully grown, cheerfully prepared, and gratefully enjoyed. Mindfulness about what we eat precludes wastefulness, overindulgence, greed, and many of the ills that have produced the "fast-food nation." One of the best remedies for problems like obesity, disrupted families, stress, and other modern dilemmas lies in the way we eat: if we were to eat mindfully, we could begin to reverse most of these symptoms. Movements like Slow Food could lead us in the right direction, but this particular effort seems to have taken hold more quickly in countries where people have traditionally taken food more seriously than they do in the U. S.: Italy and France. (I must mention, however, that like many such efforts, mindfulness involves choices that not everyone has the means to make. Those of us who can, however, should.) Every time I order pizza, I enjoy it less because I see it as a cop-out: acquiescence to the status quo, and an inability to consistently follow principles that I know are necessary for change.

Which brings me back to Anthony Bourdain and Charlie Trotter. Both of these men practice culinary mindfulness, albeit in significantly different ways. Trotter is a culinary priest; his kitchen, while encouraging improvisation, is grounded in the rituals of the trained chef. Watching Charlie Trotter cook is like participating in a discussion of the Talmud, following the Rabbi along as he leads us through the knotty texts of both Torah and commentary, until we emerge slightly wiser than we were before, and with a new perspective. Traveling along with Bourdain is a much different experience. His is the serendipitous exploration of variety and diversity, but grounded in a fundamental realization that different cultures' cuisines can tell us more about who they are than any amount of museum-going or any number of visits to "culturally important" sites. I probably couldn't ever gut a wild boar (no matter how much of a scourge it is on the New Zealand landscape; I have enough trouble cutting up a chicken), but I still admire those who can kill and butcher what they eat. Seeing Bourdain tuck into slow-roasted pork and native root vegetables with a group of Maoris who clearly appreciate his enthusiasm provides a different model for how to accomplish a broader culinary perspective.

My uneasy compromise between being a vegetarian and being an omnivore has been made simpler lately by the increasing availability of humanely raised and killed animals. Michael Pollan's new book, The Omnivore's Dilemma, is also helpful, because it explores the difficulties involved in becoming mindful in the modern world. As a result, although bits of the picky-eater child still remain (along with remnants of my Kosher training, so that creatures like shellfish and crustaceans still seem largely inedible), I have become rather more adventurous. I will always love Mediterranean cuisine more than Asian, but I am slowly trying to "educate my desire" by including more foods I should have taken advantage of as a child. And while I admire Charlie Trotter's expertise, and understand, to some extent, the attraction of raw foods, Anthony Bourdain is my master now.


A note: The image included at the beginning of this post is the work of one of my students, Chelsea Gilmore, in response to a design problem that focuses on the relationship between photography and art during the nineteenth century. We recreated a Cézanne still life, photographed it, and then students manipulated the photographs back into "paintings" in the style of one of the movements we had studied. Chelsea's response was to create a new post-Impressionist version.

Saturday, July 14, 2007

Fire and Water


Even though I live in north Texas, for some time I have regarded doing so as a form of exile. I followed a husband here, and raised my children here, and (partly from inertia) I stay. I will probably die here, but my ashes will be tucked into the family plot in a little town in eastern California, among those of my father's family.

Last week, the family plot itself was threatened by this season's biggest California wildfire--a 35,000-acre conjunction of lightning-caused fires among the sagey open spaces and canyons of the Owens River Valley, and on the slopes of the Sierra Nevada mountains. The fire, known as the Inyo Complex, is now under control, but at its height it closed U. S. Highway 395 and forced the evacuation of 200 residents in the county seat, Independence. Popular campgrounds west of Independence and northwest of Big Pine were closed, and one was burned over by the blaze. (A fellow blogger has commented on the fire at Blogged In The Desert.)

I have often railed (and have witnesses to prove it) against the propensity of human beings to build homes in areas demonstrably prone to disasters of one form or another. It seems to me that we're asking for it when we build in flood plains, fire-climax vegetation areas (like many of those near Los Angeles), some types of forest (like the Pine Barrens of New Jersey and Long Island), and especially on the edge of the ocean in places subject to periodic (and inexorable) beach erosion and/or hurricanes. As much as I love the idea of New Orleans, it should never have grown up where it did, and it certainly never should have grown as large as it has. Los Angeles is another example, not only because it was built on or near the conjunctions of several earthquake fault lines, but also because the region couldn't possibly sustain the population it has acquired without pulling water from the Colorado and Owens Rivers--both of which had other communities to support, thank you.

But the original inhabitants of the Owens River Valley (the Numa, members of the Paiute and Shoshone tribes) and early white settlers must have thought that they had found an idyllic home. They may well have known about the earthquake hazard there--at least, they did by 1872--but there was abundant water to support a farming/ranching economy, and they probably didn't suspect that a behemoth of a city would grow up to the south or that one of capitalism's great heroes would arrange to carry away most of the river's water by the 1930s and cause the great Owens Lake to dry up.

Don't get me wrong. I love L. A. Not nearly as much as I love the Owens valley, but it does have its charms. I graduated from high school near there (Orange County) and began my academic career not far away (Riverside County). I always returned to Inyo County, however, in part because my family had inserted such a palpable sense of place into my genes that even when I lived in Texas for a while as a teenager, I would hop on a Greyhound bus every holiday break and take the two-day ride to L. A. and then to Lone Pine to stay with my grandmother (who always seemed glad to pay for the trip because I was the grandchild who loved the valley the most). Sometimes I'd spend 4.5 days on the road just to spend 3 days breathing the granite- and cottonwood-scented air that meant "home."

I'm also fully aware that if Mulholland hadn't arranged for the aqueduct that still pumps millions of gallons of water out of the valley daily, the towns for which I hold such affection would probably have grown into a massive metroplex of expensive homes and resorts, instead of making a living for themselves by providing backdrops for science fiction movies and SUV commercials, and acting as a gateway to the national parks and campgrounds of the eastern Sierras. But my grandparents' photographs of the early homesteads and Owens Lake when it was deep enough for steamboats, and even Ansel Adams's photos of the infamous Manzanar Relocation Camp during World War II, evoke a sense of open spaces, clear air, purple mountains' majesty--and all the nostalgia anybody could ever want (unless, of course, you were interned during the war; but I've met people who spent several years at the camp, and they remember the valley itself fondly).

A few years ago, on a trip home from Texas, we drove by Owens Lake, hoping to see the results of the new wetlands project. I noticed that an imposing new building of corrugated metal had sprung up just outside of Olancha, blocking the view of the lake. A somewhat ironic development had brought to the valley a new industry: bottled water from the Crystal Geyser people. So now the thirsty minions of L. A. can help themselves to even more water--from the mountains above the lake they had already drained.

The fact that people have built and continue to build homes in the valley indicates to me that there are many others who are still drawn to this very place. Infrequent wildfires are much less of a threat than inevitable hurricanes or--here in exile--tornadoes, so that living between two mountain ranges that frame the clearly visible Milky Way at night seems a far better choice. The valley is pretty arid (even though the city of Los Angeles has been forced to allow more water to flow into Owens Lake to eliminate the pollution it causes when it's dry)--but then so was north Texas until this spring. The impending climate crises will affect everyone, and the outcome of global warming is largely unpredictable. (I wasn't going to plant tomatoes here this year because we all expected the drought to continue--and then the rains came.) I can't even begin to imagine what will happen in the valley as the temperature rises.

Many strategies exist to mitigate the problems brought on by climate change. We could, for example, learn from past mistakes that building huge, ungainly cities that drain natural resources of their value and require huge investments of technology and capital cannot provide a sustainable future. Small, self-sufficient communities based on appropriate technologies and minimal exploitation of local resources might help see us through another century. Siting these communities should also include risk-assessment so that we don't try to mold the environment to satisfy perceived human "needs" by abdicating common sense. We don't have to build houses on the beach in order to enjoy the ocean. We don't have to build expensive wooden cabins in areas prone to forest fires, just so we can get away from monotonous nine-to-five jobs. While it's certainly not possible to prevent natural disasters (the high desert does burn on occasion), we can certainly avoid the obvious locales (flood plains, hurricane-prone shorelines) or at least build houses designed to withstand the inevitable. And perhaps if we were to spend more time trying to develop meaningful work-lives, the need for "vacation homes" might be reduced.

The earliest examples of "civilization" (from the Latin civis, "citizen") were built along rivers: the Nile, the Tigris/Euphrates, the Hwang Ho, the Indus. Carefully tended (rather than ruthlessly exploited), rivers can provide many of life's necessities (food, water, plant materials, natural beauty) for as long as their sources exist. But they can also become polluted, over-fished, over-traveled, or generally over-used. Studying the various "collapses" of ancient civilizations (in Mesopotamia, Mesoamerica, North America) inevitably leads to the conclusion that misuse or abuse of an area's water-sources is always a major factor. Although human beings seem to be drawn to riverine environments (only to exploit them ruthlessly), imagine what would happen if we learned to use our rivers lightly and well. The Thames, for example, may not be the pristine stream Morris imagined in News From Nowhere--but it's far cleaner than it was in the nineteenth century. And New York's Hudson River has undergone significant improvement thanks to groups like the Hudson River Foundation and new efforts to clean up toxic dump sites. Hope abides.

So what does any of this have to do with utopia? In my thought experiments regarding "how we might live," I always conjure up a prelapsarian image of the Owens River Valley: pre-Mulholland, pre-aqueduct, even pre-mining days (although they didn't last long and didn't do all that much damage). This version of the valley is a bit like the New Zealand of Peter Jackson's Lord of the Rings films: open valleys, impossibly tall mountains, rushing rivers, expansive landscapes. Morris imagined his utopia along a clean, unpolluted Thames, on which he traveled and re-imagined London after the "revolution." A free, unfettered (un-pilfered) Owens River offers similar opportunities. I'd build a mote house near Uhlmeyer Spring, overlooking the river and the valley (see the photo, above), and like-minded folk could discuss and practice sustainability, permaculture, social responsibility, and other "utopian" ideas to their hearts' content, while breathing clean, desert air.

That is, of course, as long as the whole place doesn't go up in smoke.

Wednesday, July 4, 2007

Simplify, Simplify

All the talk about "living simply" (or, as the Philistines would have it, "living simple") is starting to stick in my craw. Crabby person that I am, I find it at best disingenuous, and at worst self-serving and fundamentally dishonest.

What can be simple about spending a fortune on storage items for accumulated crap? Unless it is simply good for the economy (as the recent sale of The Container Store might indicate). Now, I have nothing against the Container Store itself; I owe what organization exists in my house in large part to the versatile Swedish bookcases I can only buy there. But the very existence of a store whose sole purpose is to help us stow away (fashionably) all of the excess detritus of our consumer-driven lives seems to be counterproductive if our aim is "simple living." Such simplicity seems to come easily to those who live in half-million dollar McMansions with pristine carpets, designer furniture, California Closets, and a plasma television set in each of its five bathrooms. A trip to the Whole Foods in a nearby neighborhood (a bedroom community for the telecom industry) one recent winter brought me into contact with a mink-cloaked woman whose grocery basket was packed with $40 wine, exotic cheeses, organic frozen dinners, and a copy of Real Simple. My inner communist was so offended that I didn't return to the store for over a year, and then only because it's located right down the street from my cardiologist (allowing me to combine trips, and save time and gas--major "simple living" goals). Some of the efforts to effect simple lives seem to have the right idea (the Simple Living Network, for example), but others seem to be missing the point.

So here's the conundrum. How do we channel Thoreau in a modern world that constantly militates against anything that even vaguely resembles what happened at Walden Pond all those years ago? The situation reminds me of what Morris faced as an early Socialist who could only do what he did because he had inherited wealth (and wealth from mining interests at that). His dilemma arose because although he advocated well-designed, hand-crafted items of what we would now call (shudder) "home decor" for everyone, only the wealthy could afford them. The problem persists today, because many of us who still appreciate the Arts and Crafts aesthetic don't have the income to afford it. At best we can buy cheaply-made "Craftsman" or "Mission" style knockoffs because they resemble the real thing; but then we're stuck with piece-of-crap imitations that quickly show their true colors. The Craftsman ideal involved honesty, after all, and the imitations are anything but honest.

If we truly want to simplify our lives, it seems that what we really need to do is stay out of stores altogether: Educate our desire, as Morris would put it. Determine what we really need, versus what we only want. A bit of navel-gazing in that direction is usually instructive, as I have often found when I'm short of cash. I automatically switch into what I call "poverty mode" and what had seemed like a compelling need for a new (insert item) the day before is seen for what it truly was: a desire brought about by reading one too many bungalow shelter magazines (what somebody has appropriately termed "house porn" because they're so arousing). I don't leave the house except for work, don't shop for anything except bare necessities (usually coffee, milk, and/or wine--none of which are, in fact, necessary), and around pay day the mode subsides and I hit the bookstore--sometimes for more house porn. Because I teach in a design school, I regard my magazine fetish as an occupational hazard. But I'm cutting back, in an effort to simplify.

I wonder, however, if we have lost the ability to educate our own children's desire. When mine were young, I exercised a certain power: no Cabbage Patch dolls, no Barbies, no GI Joes, no Izod--no clothes with logos (why should I pay some guy to advertise his name?). But we were awash with Star Wars toys, Strawberry Shortcake (smelly, but cute), Matchbox cars, and Happy Family dolls. We were the last family on the block with a color television (a little 13-inch job bought in 1980 solely so I could watch Cosmos in color), but bought a Commodore 64 as soon as they came out, even though we couldn't afford one. As a result, I think my children are somewhat more skeptical about advertising and somewhat more thoughtful about consumption than their peers--but neither of them lives particularly "simply." They are kind-hearted (rescuers of stray dogs and cats), and more ecologically-aware than most, but they live in lofts and condos, and spend far more money than I ever would on modern technology.

It's hard for most of my students (who are now somewhat younger than my thirty-something children; I've been teaching for twenty years or so, and there used to be much more overlap) to understand that all of this technology has become available in the last century or so. My great-grandfather ran a stage-coach station in western Nevada, where my grandmother was born in 1897. The family moved to the Owens River Valley when she was about 11, traveling over the mountains on the narrow-gauge railway (The Slim Princess). By the time she died, at 104, she had seen just about every major technological innovation that had occurred since the Industrial Revolution--including electricity and standard indoor plumbing. She used to remind me, when I was feeling particularly picked-upon because I didn't have all the stuff my friends had, that her family got on quite nicely without electric lights and air conditioning, and without cars. She did own up to rather liking flush toilets, radio (she avidly listened to Night Owls, one of the original talk shows), and the souped-up '69 Nova she bought to replace her '57 Chevy. And she especially appreciated not having to boil up her own bathwater on a wood-burning stove.

The idea that imperialist types have to save the world from "poverty" by inflicting modern technology and consumer desires on unsuspecting folks dazzled by the glamor of iPods, cell phones, computers, televisions, and the like says more about our own guilt than about any true humanitarian impulse. We need to re-think the idea of "poverty" in the first place, because some aspects of it lie at the very heart of real simplicity. Not having electricity does not make someone "poor." Not having access to basic medical care, clean water, sufficient food, and reliable shelter does. Educating one's children, fostering community, taking care of the land, and living thoughtfully are all possible with minimal technology. Yet the West seems hell-bent on eradicating "poverty" among people who could sustain themselves if we just left them alone to live the way they have for millennia. Most of us couldn't last a week in the wilderness without at least a space blanket and a good-sized bag of trail mix, but we think the way we live is "rich." It is certainly rich in stuff, but we seem to be having a great deal of trouble trying to understand what to do with it all.

Instead of buying fancy (expensive) new boxes and bins to house our clutter, perhaps we should think about not buying all the clutter in the first place. I guess it starts with leaving the magazine on the bookstore shelf, but I really do understand how difficult that can be. Especially when confronted with a beautifully-photographed essay on pared-down living in a Greene and Greene bungalow. Sigh.