Sunday, December 23, 2007

The Assessment Obsession

I’ve been absent from the blogosphere for a couple of weeks now, due to the inevitable, inexorable periodic pressure of grading inflicted upon those who practice my profession. Few teachers manage to avoid the necessity of testing and assessing their students’ work—and none of us enjoys the task.
Education has preoccupied utopian thinking at least since the time of Plato, and it figures prominently in his design of the Republic. William Morris’s response (in part stemming from his experience in the boarding school “boy farm” at Marlborough) to the question of how we should educate our young was—as he described it in News From Nowhere—to let it happen naturally. All children are hungry to learn, and adults need only supply them with the means to do so, by example primarily, and by allowing them to pursue what interests them while learning alongside adults about how to farm, how to develop a craft, how to build a boat, and how to live. Although he didn’t rule out “book learning,” Morris recognized that children require little formal instruction and learn best through the example of adults. He completely rejected, however, Edward Bellamy’s concept of compulsory education (through the age of 21) and preparation for the Industrial Army—centerpieces of Bellamy’s utopian vision in Looking Backward.
When I look back on the significant moments of my own rather eclectic education, it’s clear that the memorable experiences are not those that took place in a classroom, but rather the explorations of my environments and encounters I had with the adults who helped shape my world view. Books were another major component, especially in a childhood that afforded little opportunity to watch television.
Now, as an educator of young adults, I’m faced with trying to combine a broadly-conceived philosophy of education (one that focuses on doing rather than rote learning) with a badly thought-out outcomes-based educational system. According to this regime, all courses need to show immediate results, and those results have to be quantified. I have to design course objectives that use “action verbs” to describe outcomes that can be achieved, demonstrated, and measured by the end of the course.
But what if what I’m trying to build is an intellectual toolkit that will allow students to learn how to determine what they need to learn in the future? How do I measure the ability to absorb information over time, make interesting connections, draw on experience, and combine new understanding with an existing knowledge base? What if my goal is to offer ways of seeing the world that can’t be measured in any meaningful way? What if I’m just trying to teach my students how to consider new ideas before jerking their knees in response to some pre-digested pabulum set forth in a textbook?
So I agonize over how to determine whether or not my students have learned anything during the quarter, and how I can measure that learning at the end of eleven weeks. I then must assign grades, for which I have drawn up rubrics to describe what they mean, in an effort to provide an assessment that says something about what students are actually accomplishing.
Ideally, my students would produce a portfolio of design work and essays that indicate what they’ve learned about the history of art and design. But I have upward of 150 students every quarter (that’s 600 students per year), and reading and responding to essays is enormously time-consuming. I’d never sleep if I were to assign a meaningful number of essays (say four per course per quarter) and a couple of design problems that would help me determine what students were learning about the principles and ideas being discussed in the class. So I end up giving exams that ask them to identify images, recognize terminology, and remember the characteristics of particular movements. I assign only one design problem (illuminate a literary text, for example, or produce a poster in the style of a particular design movement) that asks them to engage in the process of design and to describe that process in a concept essay. I ask them to conduct research—but not to write research papers, because I want their research to inform their design solutions. They produce annotated bibliographies of their sources to show me how they used their research, in an effort to make them aware that using others’ materials must be acknowledged. I’m generally quite pleased with the results of these “hands-on” projects (although the bibliographies generally leave much to be desired), and they usually tell me much more about what a student has learned than the exams do.
In other words, I do what I can to accommodate the “system,” but probably engage my students less and allow them to learn in a less meaningful way than I could if I didn’t constantly have to figure out how to “grade” them on what they learn.
Mind you, I have it fairly easy. Providing art students with even one creative assignment makes my job somewhat easier than my husband’s. He teaches philosophy courses (intro, ethics, social/political) in which he requires his students to think on paper. But his students (in a local community college) are not any more prepared to write coherently than mine are, and many of his students are on their way to four-year universities where they will major in business and politics and other professions that will determine the shape of our future economy.
The problem is that these kids have been taught to take multiple-choice, true/false exams that require only their ability to retain information long enough to pass a test. They have not been taught to think. They have not even been taught to write, except for the standard five-paragraph, “tell me what you’re going to tell me; tell me; then tell me what you told me” essay—which is hardly an essay at all because it explores nothing. It simply weaves a few facts together, artificially, mechanically, and inelegantly. And the only reason they’ve been taught this is so that they can pass the essay component of a standardized test, which will then be read by someone who’s reading for mechanics, not real analysis or synthesis.
What my husband’s dean calls “the assessment regime” has infected education in this country at all levels. In order for my college to be regionally accredited, for example, we periodically have to jump through a series of hoops governed by the latest fad(s) in higher education. I’m not knocking the necessity for improvement and internal accountability, but the accreditation process requires major input of statistics, and immediate results—an inappropriate goal in many fields that require long study in order for “results” to emerge. So the statistics give us a snapshot of the moment, but they tell us nothing about what’s really going on over the long term. I’m sure that ten years ago people were noticing that writing and math skills were both on the skids in this country. If anything, after all this assessment, and despite programs like “No Child Left Behind,” nothing has improved, and our students are falling further and further behind the rest of the world. In fact, one report (Math and Science in a Global Age: What the U. S. Can Learn from China) suggests that even though we know about practices that will produce significantly better results in math and science education, they are seldom employed in U. S. classrooms. The report cites several reasons for this, including badly-designed textbooks, but I would imagine that the pressure of immediate assessment “results” has at least something to do with it.
If anything is to change, the system has to be shaken up radically. I don’t have the energy, at my age, but I’d urge people interested in teaching to read Morris and reimagine the possibilities. I’d also urge parents to yank their kids out of preschool and take them into the garden, to the zoo, to the natural history museum, the art museums. When they’re older and the state requires them to be in school, do it yourself, or enroll them in a program, like Montessori, that promotes experiential learning. If you can’t afford that, volunteer in the schools, participate in their governance, and make time out of school to compensate for what’s lacking or what you can’t change. No double income is worth the sacrifice many parents make. The stuff you can buy, the big house you can live in, the SUV—none of this is worth what you lose by not being the principal educator in your child’s life. Work at home, or don’t “work” at all; spend your own time learning, and pass what you learn on to your children. By the time I get them, it’s almost too late.
Oh, and before you do anything, get rid of the television set. Doing so couldn’t hurt, and it might make all the difference.

Friday, December 7, 2007

Thinking About Nowhere

Since I started this blog, I’ve spent quite a bit of time ruminating about different aspects of utopian vs. dystopian “life,” but I haven’t said much about what a utopia might actually “look” like, or what might go on there.

In part that’s because I’m still working on the book that prompted the blog, in preparation for putting it online (I bought the domains last weekend, actually). But since the book is in more or less its final form, I’ve begun to think about its implications, and how implementing it might actually happen. Throughout the time I’ve been interested in the idea of a good/no place (where things are somehow better than they are in the here and now, but remote in one way or another: physically, temporally, or philosophically), I’ve been asked about why I’m focused on this particular notion. Visions of utopia are usually relegated to the science fiction section of the bookstore—unless they’re concocted by political theorists or social critics like Thomas More, Edward Bellamy, William Morris, or even Aldous Huxley. To me they’re particularly valuable forms of speculation: thought experiments not unlike those that take place in a scientist’s lab. The conditions are laid out, and the writer lets the characters work through ideas rather than go looking for a cheese reward.

But people who think about utopia are most often considered dreamers who lack a firm grasp of reality—or, even worse, unrealistic, pie-in-the-sky seekers of the land of Cockaigne, the Big Rock Candy Mountain, the Golden Age, Shangri-La, or some other impossible and/or imaginary goal. The philosophical pursuit of utopia, however, goes at least as far back as Plato, who outlined his ideal state in the Republic, and whose heirs have included many of history’s most imaginative thinkers.

For me, the idea of utopia has long been the subject of intellectual inquiry, and when I was teaching and working on my since-abandoned doctorate at UT Dallas, I created a course called “Utopia and Technology” that considered the history of utopian thought and its relationship to technological development. Two other courses, one on the philosophy of technology and another on the Arts and Crafts movement, considered the ways in which thinkers responded to new technologies, either positively or negatively. Edward Bellamy’s Looking Backward and William Morris’s News From Nowhere, for example, represented opposing reactions to the Industrial Revolution.

What seems clear now, early in the twenty-first century, is that many of the utopian expectations about what technology could do for us have not panned out. In fact, it’s becoming more and more apparent to me that we’ve completely lost control of our machines—in the sense that we’ve no real sense of where we’re headed. Problematic paths such as genetic manipulation, cloning, and nanotechnology are being pursued with no true assessment of what they imply or what they might produce were they to get out of hand. In addition, the frantic pursuit of fossil fuel sources has led to potentially irreversible destruction of vital ecosystems, and the potential unleashing of microbes or pathogens to which human beings and other species will have had no time to develop any resistance or defense.

In such a world, utopian thought experiments would seem to be a necessary antidote to the almost compulsive quest for economic “growth,” endless accumulation of material possessions, bigger and more complex everything, and less and less consideration of potential consequences.

I am greatly encouraged by the recent interest in sustainability and “green living” reflected in the popular press. I am, however, skeptical about its depth, because it may turn out to be just another economic fad destined to fade away when people realize how difficult it is to give up some of their treasured luxuries (especially their big cars and granite counter tops) in order to reduce their impact on the environment. As the blue recycling trash containers appear more and more frequently on curbsides in my very conservative town, I am reminded that most people around here are not actually recycling to save the planet—they’re doing it so that they won’t have to pay taxes to build a new landfill. I’ll become rather more sanguine about motives when I see the number of the larger green bins dwindle.

Sustainability must become a moral pursuit, and not just a tax dodge. People really need to spend time thinking about the consequences of their choices, rather than just pursuing life as usual, with the occasional token “green” choice thrown in. This will be especially difficult in a culture not given to introspection, except on a “personal growth” level directed by the latest self-help guru. We don’t teach philosophy in our schools, and although there’s a great deal of talk about critical thinking in educational circles, I see little evidence of its actually being taught. Most of my students are ignorant of basic reasoning principles because they’re taught to write according to a formula (the infernal Five Paragraph Essay) that leads more frequently to sophistry than to logic.

As a country we’ve become very concerned about the lack of math and science knowledge among schoolchildren, but the pundits seem unaware that reasoning develops within a much broader scope. If we just focus on math, science, and rote writing skills, we won’t be any better off than we are now, because thinking and understanding can only be done well within a much larger context that includes historical and cultural perspectives. Our children not only score badly compared to those in other “developed” nations in math and science, but they don’t seem to know much about the outside world, either. Geography is apparently no longer taught much at all, because my students (as bright and creative as they tend to be) come to me not even knowing how to read a map properly. My hope is that hot new technical toys like Google Earth will help to ameliorate this problem—but only if someone provides children with a reason to be interested in something outside their immediate life-sphere. But unless we stop insisting that every other country be just like us, it won’t make any difference in the long run.

Monocultures can be lethal in biological communities, but they’re dangerous to human communities as well. The intuitive understanding that cultural variety is important seems to lie behind continued efforts to foster social diversity in this country. But we’re creating a world for our children in which the same restaurants serve the same basic menus, people live in identical suburbs with no architectural variety (or interest, for that matter), view the same kinds of television programs, see the same kinds of films, etc. Our kids dress virtually identically to one another, thinking that this reflects their individuality. Malvina Reynolds warned us about this, back in the sixties (“Little Boxes”—nicely performed here by Raymond Crooke), but we don’t seem to have learned much since then. I am reminded of Ursula K. Le Guin’s novel, The Lathe of Heaven, in which a well-intentioned (but insufficiently imaginative) psychiatrist directs the hero (who has the power to effect change through his dreams) to create a world without conflict, and everyone turns the same shade of gray. To me this all seems entropic, as if we’re all slowly drifting toward identity with one another: all living the same kinds of lives at the same temperature so that conflict will no longer exist.

All of these problems and issues seem to provide fertile ground for utopian thought, and that’s more or less what I’ve been trying to do in this blog. The book, More News From Nowhere, has been my “laboratory,” where the parameters have been set up to allow the citizens of my eu-topia to address many of them in an effort to find a better way to live. Blogs have a way of generating conversations of the kind that precede my characters’ arrival in their ou-topia—and when I first began to write the book (about ten years ago), I had no idea that this particular format would develop; nor had I any idea that internet forums would proliferate as they have. So it will be interesting to see, when I get the new website up and running later this month, if a little conversation about utopia can happen here.

Photo: A view of Big Pine, California, from near Uhlmeyer Spring. This is one of the settings for More News From Nowhere.

Tuesday, November 27, 2007

Writing the Desert

Three years ago, American desert writer Ellen Meloy died at home in Bluff, Utah. She was only 58. By the time I “met” Meloy, when I happened on a copy of The Anthropology of Turquoise at the local Half Price Books, she had already been dead several months. But her impact on my understanding of the world—especially the world of the west where we were both born—has been considerable.

Small coincidences drew me to her work: we were both born in California, at opposite ends of the aqueduct that deposited Owens Valley water into the San Fernando Valley. We were about the same age, and both solstice children (she summer; I winter). But most of all we were both desert rats. Drawn to the stark drama of the landscape, we both felt out of place elsewhere. I still do, of course, here in exile on what’s left of the prairie. But Meloy was the best interpreter of the desert I’d come across since Edward Abbey and Joseph Wood Krutch, and although I’ve read only two of her books (Turquoise, and the last book—published posthumously—Eating Stone) I have thought of her every year since, near the anniversary of her death. I’m saving her other two books, The Last Cheater’s Waltz and Raven’s Exile, for when I retire and can savor them without distractions. Preferably I’ll take them to Independence, California, and hole up in the Winnedumah Hotel for a couple of weeks to read them on the front porch, or up on the road to Onion Valley.

Every self-respecting, budding environmentalist (including, I think, Meloy) read Edward Abbey in the seventies. Of course, I didn’t really become a budding environmentalist until I moved to Long Island, and spent a few years raising a son in the Pine Barrens, taking courses in regional geology at Stony Brook, and later moving to Texas where I spent an inordinate amount of time trying to learn to love the prairie. Any time I made a trip west, however, I’d bone up on the desert and hoard books like Roadside Geology of Arizona and John McPhee’s Basin and Range in preparation. So it was that I finally came upon Abbey late, when I read Desert Solitaire in anticipation of a trip home. And thus began my political awakening.

Abbey, when he died in 1989, was only two years older than I am now. Someday I hope to come upon the stone that’s said to mark his grave, out in the Arizona desert somewhere. Before he and Joseph Krutch died, though, the former interviewed the latter, and published the interview in the collection One Life at a Time, Please in 1988.

Joseph Wood Krutch, about whom I had once planned to write a master’s thesis, was born on November 25th, 1893. He became a kind of public intellectual who began by writing significant books like The Modern Temper (1929) and then fell in love with the natural world. By the time he died in 1970, he had written books like The Desert Year (1951), The Voice of the Desert (1954), a “biography” of the Grand Canyon (1957), and one of my favorite books, The Gardener’s World, a compendium of garden writing from Homer to modernity. One of the very first bits of nature writing I ever encountered was “The Day of the Peepers” (1949), which was recommended to me in the early seventies by my boss, Daniel J. O’Kane, who was then acting Dean of the Graduate School of Arts and Sciences at Penn. O’Kane, a microbiologist by trade, had already introduced me to the Pine Barrens in New Jersey—thus fostering an interest that blossomed later when I moved to Long Island and could explore them at will.

One more recommendation: Mary Austin. I once came upon a real estate ad offering for sale the house where Austin lived in Independence, and fantasized about buying it. There’s a honking great huge historical marker outside it now, and it’s still sort of possible to imagine how it looked when she was writing The Land of Little Rain (1903)—a book my grandmother nudged in my direction the minute I showed the slightest interest in such things. Austin had married in Bakersfield and moved to Independence, but after the Valley lost the “water wars” to Los Angeles, she moved back over the mountains to Carmel. In the meantime, however, she had written the first and one of the most evocative accounts of the desert and the valley that my family had then only just begun to inhabit.

There are other such writers, of course, and I’ll consider them another time. But for the moment, I’m remembering Ellen Meloy, and others of her ilk who have enriched my reading life immeasurably, and who have significantly deepened my affection for the land of little rain. While I was looking through the books mentioned above, I noticed that my copy of Krutch's The Voice of the Desert was one I had sent my father in 1992, as a Father's Day gift, and in which he had made a number of notations. On the last page he had highlighted the following passage, and I think it makes a fitting end to this post:

Of all answers to the question, "What is a desert good for?" "Contemplation" is perhaps the best.

Saturday, November 24, 2007


Once again the coalescence of news items has generated a rant—this time on my old favorite, greed. Part of it has to do with seeing on last night’s news broadcasts huge crowds of people racing into malls and big-box electronics stores (“Buy More” as they call “Best Buy” on my favorite new TV show, Chuck) on Thanksgiving day to get a head start on their Christmas shopping. Then, a report on last night’s local news show about the theft of plasma TVs from a downtown Baptist church and this morning’s item about Oral Roberts’s scion, Richard, and his ousting from Dad’s university brought it all together.

Today’s Dallas Morning News clarified the story about the heist in its article “Thanksgiving thieves rob First Baptist of Dallas.” Not only did the crooks get away with eight (count ‘em) plasma TVs, but they tied up three security guards to do it. Now, my first question ran something like this: why the hell did a Baptist church feel it necessary to buy eight plasma TVs worth (according to the News) $5000? And why does a church need to employ three security guards (to whom they were no doubt—or at least I would hope—paying overtime for working on Thanksgiving)? Good grief.

And then, of course, there’s Richard Roberts. According to the New York Times, Roberts has resigned from the presidency of the university founded by his father Oral amid “allegations of a $39,000 shopping tab at one store for Richard Roberts' wife, Lindsay, a $29,411 Bahamas senior trip on the university jet for one of Roberts' daughters, and a stable of horses for the Roberts children.” His activities have apparently been an object of concern for at least twenty years, and who knows what the tipping point might have been (one too many horsies?).

I’m not picking on evangelicals in particular; these activities are simply a manifestation of an overall phenomenon: the modern (American) propensity for overindulgence. But since they’re supposed to be preaching the gospel of the guy who purportedly said that it would be easier for a camel to pass through the eye of a needle than for a rich man to enter the kingdom of heaven . . . well, I’m certainly not the first person to point out the inconsistency.

But greed manifests itself everywhere, not just in the higher echelons of evangelical churches. Way too big houses, Hummers, giant plasma TVs—all the trappings of North Texas middle class life are essentially wasteful, extravagant symbols of the current idea that more is more, and that if my neighbor has it, I need to get it too.

We simply don’t seem to understand the concept of enough. We always think we need a bigger house, a fancier car, trendier clothes. The sheer excess emanating from the advertising sections of the newspaper is enough to spark communists to riot—if there were any left.

Last night, my husband and I watched the Tim Robbins film, The Cradle Will Rock, about the WPA/Federal Theatre Project production of the musical of the same name by Marc Blitzstein in 1937. The film was highly entertaining, and rather poignant in places, but its images of the depression-era lines of folk seeking work melted into the background amid the personal stories of Blitzstein and his actors caught up in the anti-Red fever and congressional inquisitions that prefigured the McCarthy era. It’s hard to believe that the film was about a moment in U. S. history that occurred only seventy years ago. It doesn’t seem that long ago to me, even though the events in the film happened ten years before I was born.

It also doesn’t seem that far away when I open my electronic copy of the Times and read about the impact of the sub-prime mortgage fiasco in Bob Herbert’s column, “Lost in a Flood of Debt.” Ordinary people, extraordinary events—not exactly the stuff of Greek tragedy, but reminiscent of what preceded the throngs of unemployed people standing in long lines waiting for the remote opportunity to earn a few nickels at odd jobs, or in a theatrical production.

I’ll be lecturing on “art between the wars” this week, and showing Robert Hughes’s film “Streamlines and Breadlines” from his series, American Visions. I’ll have to recommend that students see the Robbins film as well, not only for its amusing portrayal of Diego Rivera and the story of the ill-fated Rockefeller Center mural, but for a sense of the WPA’s importance as a public project focused on the arts and humanities. I doubt if anything of the kind will ever emerge again, because the concept of art as a necessary part of life will have been gobbled up by a focus not on need (art to enrich the soul), but on desire—a wish list of stuff everybody “needs,” like Hummers, huge houses, and plasma TVs in their churches. And whatever you can pick up for cheap if you shop on Thanksgiving day, instead of waiting a whole twenty-four hours for the sales frenzy to begin.

Monday, November 19, 2007

Measure for Measure

Human beings seem to like grids. While the earliest models for houses might have been caves, followed by huts made of natural materials—neither of which conform to particularly rigid outlines—it doesn’t seem to have taken long for us to start building rectangular dwellings, especially when we started living in increasingly larger groups. The outlines of the houses at Çatalhöyük (one of the first city-like enclaves), for example, are already rectangular pueblo-like spaces that share common walls, a solution to the problem of living on the arid Konya plain, just as the ancestral Puebloans lived in Chaco Canyon, in the dry San Juan basin in New Mexico.

As I perambulated the Accidental Garden and its surroundings the other day, I was struck by the rigid boundaries set up by the original plat-drawings (the half-acre within the grid that became McKinney), reinforced by the fence builders, and then compounded by whoever was responsible for laying down the railroad ties, landscape-timber borders, and the rectangular space enclosed by a sidewalk just outside the back door. The only curves on the property lead to the alley, just in front of the garage, so that folks can drive in from the street, through the porte-cochere (which I realized only the other day means, literally, “car port”—but sounds so much more elegant if a car is a “coach”), and then exit into the alley to make their retreat from the house.

I had more or less acquiesced to this grid, but am now planning to bust it up on the way to completely rethinking the back yard, and eventually messing about with the front. Even my own drawings (like the silly cartoon of the pecans from my last “effort”) tend to place things within tidy rectangles, so that my new food garden design is angular, and the new herb garden will happen within the sidewalk rectangle I’ve designated for the potager. My Child Groom built a patio from salvaged bricks within that same rectangle (which is, itself, contained within the sidewalks). The only non-angular elements of the back yard are the round copper fire pit, and the path the puppies take when they go screaming out the back door to protect us from the marauding dogs and babies that get walked in the alley behind the house (also part of the city’s grid).

What made me start thinking about wreaking anarchy on at least part of the property was an essay by American Bungalow’s editor, John Brinkman. It’s called “Linear Thinking” and makes note of modern man’s desire for control, evident “in the way he categorizes, organizes and measures everything in his grasp, often through the use of straight lines and rectangles” (issue 56, 1). He likens what he has seen of this country from the air to a library globe, with its meridians lined out carefully (more grids) and color coding to delineate one country from another. Nature, of course, doesn’t follow grids, unless they’re designated by cleavage planes in rocks or other physical necessities. And even flat, sedimentary structures can be bent by metamorphic forces over geologic time—and we can see these graceful undulations exposed in numerous road-cuts throughout the country, where human beings have plowed through hills to make straight, geometrically obedient highways.

Brinkman thinks that there’s something innate that makes us appreciate the order that linearity imposes. “Could it be that this appreciation of definitive, confining lines,” he asks, “is the reason we are drawn to the Craftsman style?” He goes on to wonder if it’s something else (its honesty, instead of its rectilinearity), the connection of human hand and human eye—perhaps (and this is me, not Brinkman) the evidence of human creativity expressed in our ability to tame nature, bend it to our will. I’ve never really understood why I love my house’s Craftsman roots (distorted though they are by a 1922 north Texas mentality), but refuse to worship on the altar of authenticity so often reflected in magazines like American Bungalow and Style 1900. It may simply stem from the fact that I’m uncomfortable with human hubris and our desire to impose our will on everything around us. The most obvious reflection of this ambivalence shows up in my increasingly untamed garden.

During the last two weeks I’ve been lecturing in my art and design history classes about the human predilection for grids—such as in the framework for illuminated manuscripts (delimited with pin pricks and thinly drawn lines) and the foundational grid used by the Cubists on which they re-hung body parts after they’d “dissected” them into fundamental volumes, shapes, and lines. Next week I’ll be talking about Piet Mondrian (to whom I apologize profusely for the above illustration; it’s not the first time I’ve run roughshod over that poor man’s ideas) and his quest for the most fundamental, natural expression of art: in horizontal and vertical lines, primary colors, black, white. Mondrian saw nature in these basic elements, but we see it (or at least my students tend to) as the ultimate expression of human power over nature—the obliteration of the natural world into rigid geometry. Yet we live in houses that are every bit as linear as Broadway Boogie Woogie, and not nearly as playful.

A couple of months ago (September/October 2007), Natural Home magazine featured an astonishing house that is the antithesis of most modern houses. Modeled after a chambered nautilus, the house spirals around a central core: the kitchen—with living areas that seem to grow out of its center. It includes a greenhouse and a green roof, hand-cut stone, handmade glass, and virtually no straight lines. It must have been horrifically expensive to build, but surely a joy to live in. [See the gallery of photos; details are available here.]

I have neither the money nor the inclination these days to build a house. I like living in a recycled house, and have enough trouble trying to do what needs to be done in this one—although I am seriously thinking of using clay paint in some of the rooms to soften some of the angles. We’ll have to rely on furnishings for our curves—but even those are few (at the moment I can think of a single round table, and the curved handles on the library steps—but little else). Any move away from angularity will have to take place in the garden, where we can take out the railroad ties and timbers, and let the puppies help us decide where to build paths that curve into the expanding wild bits, the tangled masses of wisteria, and the little round bodies of pecans that still litter the entire yard.

Saturday, November 10, 2007

When's the Stage Coming In?

My great grandparents, Thomas and Esther Tate, ran a stage-coach station out of Big Smokey Valley (Nye County), Nevada, from 1886 to 1901. My grandmother, Clarice Tate, was born there, and that’s where the family lived until they moved to the Owens River Valley in California. Clarice, who lived to be 104, and was pretty lucid until she was 99, was fond of indoctrinating her granddaughter with tales of life in the fin de siècle desert, sans plumbing, electricity, automobiles, and even trains (hence the need for the stage—it ran mail and people to places where trains didn’t go). When they moved west, they rode the narrow gauge “Slim Princess” over the mountains, and I imagine that they arrived through Queen, the station in the Inyos mentioned by Frank Norris in his novel, McTeague (about which I wrote in a section of my master’s thesis, Science and Scientific Models in American Literary Naturalism; the link is to an historic map of the area).

At any rate, Gram lived through the entire twentieth century, and saw every technological change that took place. Somewhere I even have pictures of the family’s first car, taken in a stand of old-growth forest, probably somewhere in Yosemite. In Big Pine, where my father was born, they saw the coming of electricity, indoor plumbing, and all the “mod cons” that eventually showed up in the Valley. I’m pretty sure that my current views about the role of technology in human life were nurtured by my frequent, long conversations with my grandmother about what life was like in the “olden days” (a bit hardscrabble at times, but not unpleasant), and how much she thought we had given up in exchange for efficiency and convenience. After her short-term memory gave out, she spent her remaining years in the remembered places of her childhood, and was known to ask my Dad when he came to visit, “When’s the Stage coming in?” and “Who’s going to be on it?” As the State Historical Marker says, Tate’s Station became a “local social center,” and it must have been pretty exciting for a little girl growing up in a vast expanse of desert.

This latest musing about my grandmother’s life came about as a result of several new connections: the activities of a child in Robert Gardner’s 1973 film, Rivers of Sand, a story by Ursula K. Le Guin (“The Building”), another essay in the latest issue of Orion, by Robert Michael Pyle (“Pulling the Plug”), and the current toy-recall frenzy.

Last night’s Visual Anthropology class watched Gardner’s film, which considers the lifeways of the Hamar people of Southern Ethiopia. At one point, Gardner’s camera lingers on the activities of a child, carefully selecting and placing stones within a circle, which prompted a bemused question from one of the clever folk who populate the class: “Are those the kid’s Star Wars action figures?” Since this student is a fellow science fiction fan, and one who is (as I am) fond of things like Star Wars action figures, I found the comment both pointed and funny.

I was reminded of the film moment and the comment later that night as I was re-reading Le Guin’s story “The Building” (from Changing Planes), which involves children who build small models of structures with found stones (and later participate in a more organized building of a much larger structure). Twenty-first century Western children, of course, play with blocks (if they’re lucky) or, more likely, “playsets”—models of items from films (such as the Black Pearl playset I was ogling at Costco the other day, briefly fantasizing about swinging around the masts with Johnny Depp) or construction sets of Legos. The company my son works for makes tiny styrene Millennium Falcon models and X-wing fighters (which he designed) for a collectible card game (Star Wars Pocket Models). I’m not knocking these games, because they’re so wonderfully low-tech, or even the idea of the “playset” (because for reasons I don’t really understand, I love small, manageable “worlds” like dollhouses and train sets). But, as I peruse the holiday toy catalogues, it does seem that things are getting out of hand.

In a pinch, and without toy and game companies to insist otherwise (through their advertisers), kids would probably be able to reenact scenes from Pirates of the Caribbean or Return of the Jedi with sticks and pebbles—only we don’t much give them the chance to do so anymore. Instead, toy designers come up with more and more elaborate ideas that they farm out to Chinese factories where probably-unsuspecting workers spray them with lead-based paints or bizarre chemicals, clog the airwaves with commercials for them during the holiday season, seduce children into wanting them desperately, and then have to recall them when somebody gets around to testing them for harmful content. Sometimes that doesn’t even happen until somebody gets sick (as with “Aqua Dots” the other day). The world seems to become more complicated and less safe by the moment, for reasons that just don’t seem to make sense. The only “reason” people buy this crap for their kids is because toy corporations plow billions of dollars into advertising. The ads are inescapable if you watch children’s television shows, and the kids’ desires are thus formed. Not needs (like simple play): amplified, bare-faced desires, capable of being satiated only by the latest Next Big Thing. Baby greed. [An interesting coincidence that occurred to me as I was editing the first paragraph of this post: The film made from McTeague, produced by Erich von Stroheim, was called, simply, Greed.]

If we really want to grow imaginative, creative, healthy, vigorous, intellectually independent children, we need to minimize the amount of technology and heavily-structured play we expose them to. And we need to begin with the television set, from whence all desire seems to spring. This brings me to Pyle’s Orion essay, about weaning himself—after decades of not having lived with a television—from oppressive technologies in general, and e-mail in particular.

I love the idea of a television-free life. I’ve managed to lower the number of hours I spend in front of the tube somewhat, but I still enjoy watching the few decent SF offerings available on nights when I’m not teaching. Baseball season’s over, so I’ll be reading more, at least until next April. But the TV is still on more than I would like because I share the house with a part-time tennis coach, who uses the Tennis Channel as a learning tool. However, the living room, where I do most of my reading, is far enough from where the TV sits that I don’t even hear it. Pyle only uses his for watching movies—the ideal use of a television set in my book. He also doesn’t have internet service, and goes to a community source to do web searches and check e-mail. The consequences of doing so, however, mean that his "mail" box gets much fuller than mine does, and he’s rather oppressed by the volume. I understand that, because I’ve got four accounts, two of which tend to get very full if I don’t check them several times per day. But I can’t give up e-mail because of what I do, and my use of this particular technology stems from my refusal to adopt another: the cellular telephone.

Both Pyle and I find telephones intrusive, but I actually tend to think of them as verging on the demonic. They not only interrupt me at home when I’m busy working (despite being on the “no call” list, solicitors and pollsters manage to find me), but they’re a constant and annoying presence in other areas of my life, even though I don’t own a cell phone myself. My students’ phones go off in class (despite dire warnings of consequences), people talk on them constantly in the hallways and elevators, and it’s rare to see someone driving a car who doesn’t also have a phone attached to his or her ear. E-mail helps remove me from this maelstrom, and for that I am grateful. But I’m also pining after the good old days—those described by my grandmother, when a letter arriving on the stage was treasured. I am blessed to have had parents and grandparents who wrote frequent letters to me, to and from faraway places, and blessed to have been raised by historians who thought saving those letters was essential. On the other hand, the last few years of my father’s life included a voluminous e-mail correspondence between us, and I have faithfully saved every one; re-reading them brings him right back, makes him present, refreshes memory.

And so, Dear Reader (all one of you), today’s lesson ultimately involves choices. If we decide that technologies can be selected carefully to enhance our lives, I think we can learn to live well with them. Nobody, after all, really wants to give up electricity or indoor plumbing. But I’ve found that I can do without cell phones and even air conditioning during a North Texas summer. And good on Robert Michael Pyle for unplugging from e-mail and re-embracing surface mail. Later today I’m going to write a real letter to an old friend in Buffalo, who sent me one not long ago, and mail it with the great stamps I've got tucked away for just such an occasion. I’m also going to go out to the garage and locate the crate of blocks my kids played with when they were little, and bring them into the house. And I’m going to take another set to a friend’s child tonight when we join them for dinner: small choices, with not much impact on the grand scheme of things. But gestures, symbols—and hopeful ones at that.

Note: The photo of the Nevada State Historical Marker for Tate's Stage is borrowed (until I can find my own shot) from a terrific website with lots of great stuff on Nevada (geo-caching, hiking, rock art). I hope linking her site (and she's got two blogs) will suffice for permission, since I don't plan to use it for long.

Saturday, November 3, 2007

And Now For Something Completely Different

The world will end, not with a bang or a whimper, but with a barrage of pecans. There are so many that I’ll never begin to get them all, and will shortly be conscripting children to go after them. I thought of soliciting unsuspecting trick-or-treaters (all you can haul!), but had to teach that night.

At any rate, in honor of the plenitude, I’ve decided to provide recipes. Pecans and everything. Perhaps even Spam, Spam, Pecans and Spam.

Pecan Pie

Use a pre-baked short crust, or a crumb crust made with gingersnaps and toasted pecans. In a pinch, a whole-wheat pastry crust will do. Then pour in your grandmother's pecan pie recipe, because who am I to mess with somebody else's Texas tradition?

Maple Pecan Cheesecake

Dedicated to my favorite cheesecake baker, Sterling Handrick

Make a gingersnap or graham cracker crumb crust as for pecan pie. Bake and set aside.

Mix 16 oz. of Neufchatel cheese (or combine Neufchatel and fat-free cream cheese to lower the fat load), 2 T cornstarch, ¼ t. salt, 1.5 c. maple syrup (use bulk Grade B from Whole Foods for better flavor), 3 large egg whites, ½ c. toasted chopped pecans. Mix cheeses, salt, and cornstarch at high speed, then gradually add maple syrup. Beat in egg whites last, until just blended. Pour into prepared crust, sprinkle pecans on top. Bake at 525F for 7 minutes, then lower the temp to 200F and bake 45 minutes longer or until set. Let cool, then cover and chill for about 8 hours. (Recipe adapted from Cooking Light.)

Pecan Shortbread

Coarsely chop a half cup of cooled toasted pecans (up to 10 minutes at 350F). Add them to your favorite whole-meal shortbread recipe and proceed as usual. If you don’t have a favorite, here’s one adapted from the Crank’s Recipe Book (Oxford: Alden Press, 1982).

4 oz. butter; 6 oz. wholemeal flour (white whole wheat works well); 2 oz. turbinado or raw sugar. Cut butter into flour, add sugar and pecans and work together. Press into 8” round tin, tidy the edges, and then score into 8 portions. Bake at 300F for about 45 minutes. Cut into pieces while still warm.

Salad with Pecans

Toast a cup (or more) of whole pecans; let cool. Mix a salad of sturdy greens (Romaine, endive), cubed pink lady or other tart-sweet apples (unskinned), green and/or red and/or black seedless grapes and/or dried slightly sweetened cranberries (like Newman's Own). Mix up a raspberry or other fruit vinaigrette and toss salad with enough dressing to suit. Add the pecans and toss lightly again.

Salmon Encrusted with Pecans

Chop a cup and a half of pecans finely; add fresh-ground pepper, lemon zest (about two tablespoons), and about a teaspoon of sea salt. Brush salmon fillets (two to four) with good strong olive oil, and spoon pecan mixture over the fish to cover. Press in lightly. Bake fillets at 400F only until hot all the way through, and pecans are toasted (be careful not to burn the pecans or overcook the fish).

Spam with Pecans and Sliced Peaches

My wonderful stepmother, back in the days when we didn't have much cash, could do more with a can of Spam than anyone but the Python crew. My favorite was Spam sliced (not all the way through; leave about 1/4 inch at the bottom of the loaf between each slice) with peach slices slipped between, and then baked. My variation adds chopped pecans on top before baking at 350F for about 25 minutes. The upscale, ecologically more appropriate version would use organically raised ham and fresh peaches, with a bit of turbinado sugar mixed in with the nuts.

Things We Have Lost

The last two weeks of my Visual Anthropology class have been fraught with lessons about the costs of modernity. And David Brooks’s recent op-ed piece in the Times, “The Outsourced Brain” (reprinted in today’s Dallas Morning News) points to the same problem: technology is quickly taking the place of natural abilities human beings developed over long periods to adapt to their environments.

I’m rather gratified that after watching two of John Marshall’s early films about the !Kung bushfolk in what is now Namibia (The Hunters, from 1958, and N!ai, The Story of a !Kung Woman from 1978), my students were suitably outraged by the effects of imperialism on the lives and traditions of these people who had lived relatively well for thousands of years in the Kalahari. Studies of the !Kung and related groups are standard fare of introductory anthropology courses, because they represent the possibilities of simpler, vanishing lifeways. Criticism of Marshall’s work abounds, but his record, especially in The Hunters, of his subjects’ hunting skills provides invaluable information about the ability of human beings to use cognitive skills—such as dead reckoning—that are all but lost to us now. In N!ai, Marshall shows us how women use their understanding of their physical world to gather fruits, roots, nuts, and plants that had provided a living diet for their band, which was supplemented by the occasional meat the men supplied. I don’t think that it’s fair to accuse Marshall—as some have—of romanticizing these people, because N!ai herself tells most of the story, and the segment of the film from the seventies is almost gruesome in its depiction of the changes wrought by the coming of the White Man (caps on purpose) and the imposition of a completely alien economy—if it can be called that. Both food-gathering and tracking knowledge among the aboriginal societies of the Kalahari will undoubtedly be lost within a generation.

But these losses are not being suffered just by the indigenous peoples of the world; what few skills remain among the descendants of Europeans in North America are even more rapidly disappearing, and not because anybody’s imposing anything on us. We are choosing, daily, to embrace technologies that, as Brooks notes, have begun to replace various of our cognitive skills: “the magic of the information age,” he admits, “is that it allows us to know less.” In his case, the particular technology in question is his Global Positioning System. Whereas the !Kung hunters of the Kalahari could find their way back from anywhere through their ability to read the landscape, we can now find our way back from anywhere by using a pricey little device with a soothing female “slightly Anglophilic” voice. And I’m pretty sure his wife never has to tell him to pull into a service station to ask directions any more.

We haven’t needed food-locating skills for some time—ever since the dawn of agriculture. Now food-distribution sources are available on practically every street corner, and food comes to us from far and wide. Some of us are, of course, beginning to recognize that the “far and wide” component is adding exponentially to the problem of carbon footprints and global warming, as well as the destruction of indigenous lifeways. Slowly, awareness seems to be growing that the way food is obtained, and what it costs in human terms, is as important as having enough to eat in the first place. We discussed in class last night the fact that even though N!ai’s family has regular access to “mealie meal” (a staple made from a variety of corn), the people see themselves, or at least they did in 1978, as “starving.” Human beings do not, it seems, live by bread alone.

Interestingly enough, the latest issue of Orion Magazine, which just arrived this week, contains two related articles. One, “Don Berto’s Garden,” is about tree gardens in Belize, where Don Berto raises native plants and where the author (David Campbell) and his students go to learn about their uses. This single article offers me more hope than anything I’ve read in recent times that the effects of imperialism might not be permanent. The second is about re-discovering a native North American wild food, “Stalking the Wild Ground Nut,” and seems hopeful about increasing awareness of the value of the most local of food—what grows nearby, naturally.

Suddenly the Accidental Garden is taking on new meaning. The idea of a tree garden is intriguing enough, but the notion of turning a quarter of my small corner of north Texas into a native food plant refuge is even more suggestive. Of course, if I were to manage it properly, the Chinaberry would have to go (not native, though lovely)—unless I could figure out some use for it other than for making grackles drunk. I’m heartened by the fact that I can identify most of the plant species growing there now, and can decide what to keep or remove on the basis of its usefulness (as food, medicine, dyestuff, etc.). I can also put in some new things without feeling guilty—choosing native plants that can work with the hackberries and box elders and the bur oak in some kind of permacultural relationship. Yet another experiment to try.

At least I don't need a GPS system to get around my garden. And now, thanks to my silly map, neither do you (although the Accidental Garden doesn't show on the map). First try with the Wacom—just having fun, folks. I'll get better, I promise. Besides, it's supposed to be a cartoon.

Monday, October 22, 2007

Back to Work

The Times this morning featured an article on a home foreclosure auction in Minneapolis, where buyers hoped to acquire bargain-rate properties from victims of the recent sub-prime mortgage fiasco—yet another manifestation of the role bare-faced greed plays in our economy. Having participated in such an auction once, years ago in Philadelphia, in hopes of buying (with a group of friends) a big old house being sold for back taxes, I can understand the hope such an event generates: a buyer of modest means imagines that a foreclosure offers the chance to pick up a decent house for an affordable amount of cash. Unfortunately, it seldom seems to turn out well—it didn’t for us, because the bank that held the original note bought it back for just over what we could afford—and the same kind of speculation that brought on the sub-prime crisis in the first place is still going on. One pair of buyers were looking for houses they could buy for cheap, rent out for a year, and then resell when prices went up.

But this is just another symptom of the devaluation of needs and the valorization of wants that characterizes so much of modern life. The work that sustains us is underpaid (sometimes barely paid at all), the things we really need are turned into luxuries by people whose business it is to sell us bigger, more expensive, and/or more environmentally costly items than are actually necessary. Our ideas about what we need become colored by what the wealthy say is a “right”: such as having as big a house as we want, according to a recent letter-writer to Natural Home Magazine (July/August 2007). This was a response to an article extolling the virtues of smaller houses, a clarion call first sounded by architect Sarah Susanka with her book, The Not So Big House. A reader responded to the article with a huffy assertion that it was his right to build whatever size house he desired. It was not, according to him, Natural Home’s business to preach otherwise. In response to this letter, however, the magazine printed a very thoughtful counter-argument from reader Charles Flickinger, who noted that “As a culture, we desperately need new values (actually old values) to divert us from the insane path that consumerism has us blindly running down. We should know that the most important things aren’t things, and that conservation and thinking small are virtues” (September/October, p. 12).

Similar distortions of the relationship between want and need are apparent in the realms of food and clothing. Rather than concentrating agricultural efforts on securing safe, healthful foodstuffs, the agricultural industry (!) has begun to focus its attention on non-food products, such as corn and sorghum for biofuels. In addition, such phenomena as genetically modified crops (and the patenting of particular genes), and hormone injections in cattle are all designed for efficiency and volume—not for health (although the food engineers will claim that their patented genes reduce the need for chemicals), but to provide cheaper food whilst procuring greater profits. Never mind that the results are far less palatable than locally-grown, heirloom varieties (with unpatented genes), or that hormones fed to dairy cows inflate their udders to preposterous sizes (no woman who has ever breast-fed a child can see engorged cows without feeling sympathy). But the food is cheaper, and the better stuff is more expensive, so the less well-off will be stuck with tasteless, mass-produced pabulum unless they have access to community gardens. But we will be able to drive our big, gas-guzzling internal-combustion engines even when our sources of foreign oil turn off the taps.

Clothing, originally designed to protect members of a relatively hairless human species from the cold, has become a mega-multi-gazillion dollar industry. The “need” for fashion designers and retailers in the industry is reflected in the recent implementation of two BFA degrees in these areas at the college in which I teach. Beyond food, and beyond housing, the designing, manufacturing, and marketing of clothing have become the symbolic epicenter of the modern substitution of want for need. Once tied to cultural traditions and governed by the availability of materials, clothes have become the symbol of the person: you are what you wear. And what you wear is marketed to you by celebrity designers and manufacturing conglomerates, who have no concern at all about who you are, or whether or not what you wear is going to add to the turmoil in the world about garment workers’ wages, exportation of jobs, pesticide use on fiber crops, or whether or not our running around half-naked is going to color other countries’ views of what we stand for. Along with the slow/simple food and the small-house revolutions, I would really love to see a simple-clothing movement—all of which would focus on sustainability infused with conscientious design. In the nineteenth century, the women involved with the Arts and Crafts movement eschewed fashionable corsets and replaced them with more comfortable styles that allowed for freedom of movement. If a piece in the International Herald Tribune is any indication, such a change may already be afoot: Arts and Crafts: A New Organic Spirit in Fashion (although the article’s from May 2005). Homework for fashion design students: locate more indications that an Arts and Crafts revival is brewing in fashion. A sustainable clothing movement could start with a simple experiment, like the one that went into The Little Brown Dress. (Thanks to Jenny Lewis for pointing this one out to me.)

Undergirding all of the above (re-educating desire regarding basic necessities) is the very idea of work. One of the basic principles of Morris’s philosophy of work was that it not be onerous, monotonous, dangerous, mindless—but rather satisfying, purposeful, enjoyable, meaningful. It would be interesting to survey most workplaces today and ask employees to tick off a list of the above adjectives according to how they described the present situation. Perhaps we could add a few more: discouraging, exasperating, frustrating, as well as stimulating, exciting, creative. But I doubt that many of the jobs people have to do today would earn significant points on the positive side. Few of us have entirely satisfying jobs, nor should we expect that any job would be completely satisfying all the time. But too many people in the world slave away at jobs that have no meaning other than to satisfy consumer markets and corporate lust for profit. On a recent broadcast of PRI’s Fair Game, Faith Salie interviewed John Bowe about modern-day slavery in unlikely places like Florida (podcast here; and see a review of Bowe’s new book on the Cup of Joe Blog and Bowe’s own blog, Nobodies), where he pointed out that slave-labor conditions are alive and well in places awfully close to home.

The sad truth is that slavery aside (and it is difficult to put it aside, once you know how widespread it is), those jobs that produce basic necessities (food, clothing, shelter) earn the lowest wages, and reflect conspicuous consumption better than anything else: “gourmet” food, “designer” clothing, housing “estates.” The managers do quite well, while the people who do the slogging get paid next to nothing. I’m tempted to place teaching in this category as well—but that’s another blog altogether, and we do tend to be better paid than your average garment worker or fruit picker.

The only thing most of us can do about any of this is to take small steps. We can begin by recognizing what’s going on. For example, until I heard the interview with John Bowe, I had thought that the closest we came to slavery in this country was at some of the maquiladoras on the Texas-Mexico border. But now I know better, and I now have to conduct a bit of research in order to buy oranges and orange juice without contributing to the problem. Many of my students are already far more aware of these situations than I am, and our conversations often lead me to new insights. These same students often have less disposable income than I do, and so already shop at thrift stores—but they aren’t as able as I am to make more expensive food choices, and most are not exactly in a position to grow their own veg.

They are, however, in a position to make choices about how they work, and for them I have the following website: WhyWork (Creating Livable Alternatives to Wage-Slavery). At least one of my former students is living what these folks describe as the “portfolio life”—and the promise of being able to fulfill Morris’s quest for a life of useful work vs. useless toil seems to be more attainable in the digital age than I had originally thought. We don’t all have to go back to plowing the land in order to do meaningful work—as long as we’re conscious of where our food, shelter, and clothing come from, and as long as we make sustainable choices that ensure a decent life for those who provide us with our real necessities.

Thursday, October 18, 2007

Work, Work, Work

What will you do with your life? What work will you do? How will that work sustain you, your family, your community?

Do we ask any of these questions when, as teenagers or young adults, we plan our futures?

Most people seem to think of work as what they do to earn enough money to live where they live (or perhaps what will enable them to move somewhere more desirable—for whatever reason), buy food, clothing, pay rent, and support all of the other mundane realities of everyday life—both concrete and ephemeral. But the question of work keeps coming up in my reading, and keeps connecting with other concerns. And it's all related to Morris's basic question about how we live and how we might live.

Some folks, like Curtis White in the May/June issue of Orion Magazine (The Ecology of Work), and Wendell Berry, clearly think of work as something more, something with the potential to sustain a view of culture that extends beyond the everyday: something more permanent, like teaching and learning, making art, building, creating community, healing. I think agriculture fits into this general category, unless it’s seen simply as a business (an individual/family enterprise, instead of a corporate one). This second category of work also provides the means to satisfy basic needs, but adds the possibility of intellectual (or, for some, spiritual) satisfaction.

One of the first questions asked of a sentient child (one who has reached the age of actually having contemplated the answer to the question) is this: “What do you want to be when you grow up?” The answers we expect, such as “a fireman” or “an astronaut” or “a doctor,” often occasion a further question: “Why do you want to do that?”—and the answers are again fairly predictable: “Because they save people in fires” or “because I want to save us from aliens” or “I want to save people the way the doctor saved Grandpa.” Children’s answers often include rescuing others, which indicates somewhat altruistic tendencies—although the real reason may have to do with being seen as a hero. I’m certainly no psychologist, and have nothing but anecdotal evidence and experience to support my claims. But I do know that no six-year-old ever says “I want to be a philosopher” when asked about career choices. And few of them, these days, aspire to becoming farmers—unless they’ve already had experience on a farm, and then only if they haven’t had to watch Farmer John off the piggie that got made into the brats that went on the barbecue.

More and more, we are insulated and removed from the work that actually sustains the first category I mentioned: food, clothing, shelter. We purchase all of the above from someone else, who has purchased it from a wholesaler, who has purchased it from (perhaps) the initial provider and has warehoused it to sell to the retailers. In the case of shelter, few of us build our own housing. Instead we pay someone rent to live in a dwelling he or she owns, or we “buy” a home through a bank, eventually paying a considerable amount more than the actual selling price in order to live in it while it’s being paid off. Somewhere down the line, someone whose work it is to build has contributed his or her labor to the construction of that dwelling. Sometimes we add our own labor by repairing or augmenting our homes, but more often than not this contribution requires few skills—scraping, re-glazing, and painting windows, for example. But the real repairs and the big jobs generally go to contractors or handy-folk who do the work for us. And then we grouse about how nobody seems to do a very good job at this sort of thing any more.

I am beginning to notice, however, that at least one aspect of “category one” seems increasingly to involve closer contact with actual work: food-gathering. Many of us grow at least a small portion of what we eat—summer tomatoes, a few herbs, occasional fruit and nuts. A growing number of us seek out local growers at farmers markets and even participate in co-ops that make the job of locating local food less onerous. And an even larger number buy from outfits like Whole Foods, Central Market, and Sprouts that feature foods from smaller (often local) farms and more humane dairies and ranches, along with more access to organically grown products. That this last phenomenon is a growing trend (and has caught the marketing departments’ eyes) can be seen in my local Tom Thumb (Safeway), which has recently undergone a radical makeover and been transformed into a Whole Foods/Central Market clone.

But that’s just it: it’s a trend. There’s not as much evidence of a philosophical sea-change—a realization that farm workers are exploited and underpaid, that people who work in agriculture, providing the rest of us with trendy foods, are generally paid much lower wages (some are actually modern-day slaves) than people who sit in front of a computer all day—doing what? Crunching numbers? Designing ads to sell the latest food fad to the minions? Although we’re paying more attention to what’s in our food, the interest seems to be driven more by fear of cancer and heart disease (and the growing portion of our wages that goes to health care insurance, if we’ve actually got it) than by any genuine, widespread concern about the people, animals, and environment that make the food possible in the first place. I may, of course, be too harsh on my fellow beings, but the experience of seeing women decked out in full-length fur coats buying organic wines at Whole Foods is pretty depressing. I can only hope that they go for the humanely raised beef tenderloin while they’re at it.

After the success of this summer’s experiment with air conditioning (or the lack thereof), I’m primed for a new one: to see how much of my own food I can grow on my half-acre plot in historic McKinney, Texas (whose motto is “Unique By Nature”). Plans are afoot to redesign the back yard (leaving the Accidental Garden mostly to its own devices), to lay out a veggie patch nearer the house, and to solve the perennial lawn problem by getting rid of most of it. The barren patch in the front yard will become home to herbs and edible flowers, as well as stuff birds and butterflies like, to replace the now-defunct herb garden that’s been shaded out of existence by volunteer trees. The front patch will get ample sunlight and confirm my neighbors’ conviction that I’m a communist—or at least an unrepentant tree-hugger. I plan to placate them by actually cultivating the space, and not just letting anarchy reign.

All this will, of course, require work: digging, actively composting (not just throwing rotting veg into the bin and praying for biological activity to occur), careful landscaping (this will be the hardest part; my tendency to let nature design itself, higgledy piggledy, would get in the way of food production), and actual attention. It may also mean fewer blogs, because one reason people don’t do the kind of work I’m talking about is that it takes “too much time” away from “more productive” activities—like sitting in front of a computer.

And now, if you don't mind, I must (before I head out to enlighten my students about the Classical tradition in Western art, and on what started this whole blog in the first place: William Morris and the Arts and Crafts Movement) go out and make my small daily dent in the bumper crop of pecans (from eight trees of at least three varieties) that are littering the entire property. The fuzz-tailed tree rats are not holding up their end of the bargain, so I am becoming--involuntarily--a nut farmer. Maybe I should change my name.

photo: Where the new food garden is going to go. This photo was taken two years ago, before the drought had ravaged the area.

Tuesday, October 9, 2007

Tangled Webs

Once again, a confluence of articles in the newspapers and magazines I read regularly, combined with events occurring in my classes, has led me blogward. It continues to amaze me that the focus required to produce an essay or two per week seems to generate spontaneous couplings of information, and some interesting partnerships.

Friday night I taught my first-ever class in visual anthropology. I started off showing a few short films from the Faces of Culture series, and introduced concepts like cultural relativism and ethnocentrism. Now, I’m a jaded old fart who’s known about all this stuff for the last fifty years, but my students were essentially unaware that in addition to introducing smallpox and other diseases to the Americas (about which they already knew), the colonialist impulse had wiped out the entire population of Tasmania, and is still in the process of eliminating most surviving small-scale economies. So the number of people on earth who maintain close, day-to-day contact with the mother planet is dwindling as I write. And while colonialism per se has largely disappeared, the ethnocentric impulse that fostered it is alive and well. Latent imperialists, especially in the United States, are still trying to convince others that we know what’s best for them, and that our values and ways of life are far preferable to theirs. Of course, the big difference now is that some of the “others” are fighting back—in ways completely anathema to those very values and ways of life.

Having just heard and read several stories on local and national media about how French women are now getting fat (along with their children and husbands), I realized how much of modern angst is connected with the techno-imperialism that has pretty much begun here and is spreading everywhere. Even the French, who have long been able to turn up their noses at American “lifestyles,” are buying into the technologically infused, too-rapid pace of life exported from North America. They no longer “have time” for the activities that kept them slim: long, leisurely meals with family and friends, walking everywhere, not snacking on fast food between meals.

This realization resonated with an article in the Sunday Dallas Morning News by Barbara Kingsolver (“Dig into the dirt”), whose new book, Animal, Vegetable, Miracle: A Year of Food Life joins a growing number of similar reflections on eating locally and reconnecting with our lunch. In her short piece for the News, she reminds us that her (and my) generation “has absorbed an implicit hierarchy of values in which working the soil is poor people’s toil. The real labors of keeping a family fed are presumed tedious and irrelevant” because “we have work to do, the stuff that happens in an office or agency or retail outlet.” As sympathetic as my students might be to the plights of aboriginal communities, I doubt that many of them are plunking down big bucks at a technical school in order to go back to farming the land and growing their own. As a culture, we have so thoroughly transformed ourselves into this radical form of homo faber that we cannot even imagine returning to our agricultural roots, except, perhaps, to put in a small veggie garden.

As Ronald Bailey puts it, in a piece critical of Kingsolver’s “latest fiction” for Reason Online, “I am very glad that people want to spend their lives raising tasty Mortgage Lifter tomatoes and Albemarle Pippin apples. And I am also very glad that I don't have to.” Bailey, donning the mantle of the Enlightenment to protect him against what he apparently sees as a growing tide of irrational Romanticism, reels out the stats: factory farming allows for more production on less land, results in cheaper prices, and has “liberated many like me from farm labor so that we could do other work.” Having grown up on a farm himself, and having participated extensively in its highly unglamorous tasks, he makes fun of people like Kingsolver and Michael Pollan (The Omnivore’s Dilemma) for what Bailey seems to think of as agricultural dilettantism. But he misses the point entirely, and ends up making Kingsolver’s case for her. He goes on to list all of the advantages of modern agriculture (which are primarily financial), including increased incomes and the ability to choose other work. He talks about Peruvian farmers and New Zealand sheepherders being able to sell their products to gourmands in America, but neglects to mention what the advent of modern farming techniques has done to other parts of the world, especially in places like Bali and India. But my students, who will be viewing a segment from the film The Goddess and the Computer, will know that the “green revolution” has exacted a heavy price from the people it was designed (by us wise Westerners) to assist.

The fact that we simply have not found a balance between technological desire and the needs of ourselves and our planet is becoming increasingly obvious. In an article in last week’s New Scientist (only the abstract is available unless you're a subscriber), Daniele Fanelli cites growing evidence that we are nowhere near being able to achieve sustainable development. The only nation anywhere close, Cuba, will undoubtedly fall away as it grows more economically successful. The article focuses on research, conducted by a team from the Global Footprint Network, that “quantifies the area of land required to provide the infrastructure used by a person or a nation, the food and goods they consume, and to reabsorb the waste they produce, using available technology.” The result is the EF (Ecological Footprint) index, which the World Wildlife Fund recently used “to calculate that two more planets would be needed to support everyone in the world in the manner of the average UK citizen” (Fanelli). Like all such studies, this one has its critics, but the science seems solid enough that it should serve as a warning that we really do need to rethink priorities—and these probably should not include foisting our technological dependence on others.

Finally, I want to recommend a book: Alan Weisman’s The World Without Us, the most imaginative thought experiment to arrive since Orwell’s 1984 or Huxley’s Brave New World. Sometimes, when I’m feeling especially cranky, I like to think that the best thing that could happen to this planet would be for us just to leave. Weisman imagines what would happen if we were to disappear, suddenly, and he goes on to explore piles of evidence about how things would change, and how long it would take. It’s a riveting read, and a compelling reminder that we’ve spent most of the time since we left Africa (and a considerable amount of time before that) altering our environment willy-nilly, seldom taking into consideration anything, or anybody, else. The material on the book’s website is also worth a look, because there are some lovely CG images and some nice interactive features.

It’s perhaps ironic, but probably inevitable, that a book describing in such minute detail how long the effects of our technological juggernaut will last also makes use of high-tech web-based materials to augment the message. I guess it’s not called the “web” for nothing; the metaphor becomes richer the more we connect information, ideas, and everyday events, and our dependence on our technologies grows with every connection we make.

Photo: This was taken in Chaco Canyon, New Mexico, in 2003; it serves as a reminder of the impermanence of human presence--especially in light of Weisman's book--but also that some cultures do not want their "footprint" to be permanent. The ancestral Puebloans who built this place meant for it to fall away when they had gone.

Monday, October 1, 2007

Aliens, Redux

Well, I was wrong. The Dallas Morning News did print my letter. Sort of.

Thus begins a small rant about the cavalier form of editing that seems to be going on these days: never mind the context, just print the sound bite or whatever fits the space.

The context here is a story the News ran about a woman who was seriously injured by a man whose failed suicide attempt (by jumping from a relatively low highway overpass) knocked her out and caused her car to veer into adjoining lanes and into the paths of oncoming cars. As a result, she has been unable to work at her job as a manicurist, and the story recounted her plight as a non-English speaking immigrant who depended on her co-workers as translators. These same co-workers had been collecting donations at the salon to help her through her ordeal. The story prompted one reader to send the following letter:

Ten years, no English?

Re: "His leap almost took her down – Stranger's suicide try leaves driver with injuries, nightmares," Sunday news story.

I sympathize with Lan Nguyen and her ordeal and understand how that would surely traumatize anyone.

What really got me fired up was not what happened to her, but the fact that she has lived in the United States for 10 years and still cannot speak English.

If you're going to live here and be a productive citizen, then have enough respect to learn the language spoken by the customers that provide your income.

Kelly Williamson, Kaufman

Well, this letter got me fired up, leading to my sending the News the following response (copied here from my last post):

In regard to letters from Kelly Williamson and other readers complaining about immigrants’ English language skills (or lack thereof), I have one question for these critics: when was the last time you tried to learn a complex language as an adult? It’s one thing if you’re a child, at peak language-acquisition age; it’s entirely another if you’re an adult—especially if you don’t happen to live in a particularly supportive community. I have also noticed that folks who live around here are neither very good at understanding (nor very tolerant of) “foreign accents.” They even need subtitles on the news to understand interviews with non-native English speakers! I’ve even heard adult, native-born Texans complaining about “Yankee accents,” insisting that they can’t understand what’s being said.

As long as people can make themselves understood, and translators are willing to help them out, what’s the problem? It’s not as if immigrants don’t want to learn; but who would even attempt the long and difficult process if they knew they’d face impatience or even ridicule for their efforts?

And this is what the paper printed:

Patience for novices

Re: "Ten years, no English?" by Kelly Williamson, Wednesday Letters.

When was the last time you tried to learn a complex language as an adult? It's one thing if you're a child at peak language-acquisition age, but it's entirely another if you're an adult, especially if you don't happen to live in a particularly supportive community.

As long as people can make themselves understood, and translators are willing to help them out, what's the problem? It's not as if immigrants don't want to learn, but who would even attempt the long and difficult process if they knew they'd face impatience or even ridicule for their efforts?

Candace Uhlmeyer, McKinney

So, the letter ends up sounding like a rather personal attack on Mr./Ms. Williamson, and much less like a general indictment of local attitudes and lack of tolerance for accents. There's no indication in the printed letter of how impatience and ridicule might be manifested, which makes me sound rather like one of those whiny liberals who don't like it when communities aren't "supportive."

For a while I thought I might fire a snippy note back to the editor, but finally decided to vent here, where I'm the only editor, and I get to decide what's said. And, in today's letters to the editor, I've found someone who agrees with me, so I'll let her have the last word:

What's important here?

Re: "Ten years, no English?" by Kelly Williamson, Wednesday Letters.

America is made up of immigrants. If Lan Nguyen's customers are happy, what difference does it make what language she speaks?

Luckily for Ms. Nguyen, most Americans are much more compassionate than those who would exploit a sad situation just to make a political point.

Carol Perkins, Dallas

But then, perhaps Ms. Perkins and I have been taken over by pod people . . .

Photo: Again, I'm including what I hope is fair use of an image from Wikipedia.