Wednesday, October 1, 2014

Coming of Age in the Twenty-first Century


Early on, in my misspent youth, I aspired to be both good-looking and smart. So I read Vogue magazine when I could afford it, looking for interesting ways to dress on my embarrassingly meager salary, so as to attract interesting men. I didn’t get very far with this ambition, but I was apparently attractive enough that after my first divorce I dated highly intelligent men with Good Prospects. As it turns out, though, it might have been the other aspiration that attracted them, rather than my questionable gorgeosity.

To foster the “smart” end of the equation, I pretentiously read The New Yorker (again, when I could afford it), and made sure I picked up the latest issue whenever I found myself flying home for a holiday. This act was only partly a sham, because I really did enjoy many of the articles, and actually appreciated their length. I read prose by Ursula LeGuin, Pauline Kael, and others whose work I admired, while my seatmates on the plane read Time (whose articles were already getting shorter and shorter). If I wanted a news magazine, I chose U.S. News & World Report, because it seemed more heady—but also because its articles were more thorough than those of its competitors.

Although my motives may not have been entirely pure, I did learn a great deal, and was pointed toward interesting writers, topics, and points of view.  By shamming intellectualism, I actually became an intellectual.  I continued to pursue my part-time Ivy League degree, and have enjoyed an intellectually challenging and rewarding life ever since.

And so it was with some amusement that I added The New Yorker to my Newsstand subscriptions on my iPad a couple of weeks ago. My one regret is that I’ll probably never have the time to read everything I’d like to, now that my interests are so much broader and deeper than they were during my callow years. I’ll probably have to cancel it when I finally retire, because it’s awfully pricey—although its digital features are quite wonderful—but for now I’ll use part of my Social Security check to keep it going.

What brought all this to mind was an article by A. O. Scott in the New York Times Magazine (to which I also subscribe online), called “The Death of Adulthood in American Culture” (September 11, 2014). In it Scott bemoans the deaths of “the last of the patriarchs” (in television shows like The Sopranos, Breaking Bad, and Mad Men, none of which I have seen, although I kind of wish I had watched Mad Men, and may yet someday). He doesn’t miss the sexist aspect of patriarchal authority, but notes that in progressing beyond patriarchy “we have also, perhaps unwittingly, killed off all the grownups.” He goes on to note the popularity of “Young Adult” fiction among so-called grownups between the ages of 30 and 44. (Oddly enough, though, anything labeled “adult” fiction tends to be pornographic.)

Now, while I’ll admit to being geekily fond of comic book movies, I haven’t been able to bring myself to pick up copies of the Divergent or Hunger Games books. I never did get into Harry Potter, either, beyond the first chapter of the first book. The plots and characters of all these franchises are so familiar, so clearly borrowed from the older, wiser works I read while earning the aforesaid degree, that I can only see reading them as an effort to get inside the heads of my “young adult” students. To those students I always recommend Ursula LeGuin’s books written for younger audiences, because she’s pretty egalitarian in her treatment of her readers.

It took me eight years to get my B.A., and during much of that time I was also working as an admin around scholars at Penn—many of whom are now dead, but from whom I learned how to become (eventually) an adult. So I find myself trying somewhat desperately (and perhaps pathetically) to find ways to lead my own students toward at least a few more worthy literary endeavors.  They will probably never read War and Peace.  But how about a bit of Dickens?

The Beloved Spouse and I watched David Suchet’s tribute to Agatha Christie the other night on PBS and wondered at Christie’s ability to write lucid, careful prose as a very young child.  It shouldn’t be all that surprising, once one is familiar with what children read in those days: Lewis Carroll’s Alice stories, J. M. Barrie’s Peter and Wendy, Kenneth Grahame’s Dream Days. The prose in these “children’s” books is so rich, complex, and erudite that they’re well worth reading by adults today—rather like some of the cartoons we watched as kids (Rocky and Bullwinkle, anyone?) that amused both children and parents. 

It seems a bit ironic to me that some of the animated films out today offer more intellectual stimulation than do novels aimed at teenagers. The Boxtrolls, for instance, which is amazingly animated and richly written, is far less juvenile than what I’ve seen of the Harry Potter movies. I’m not talking here about the Disney princess movies (however engaging they might be) that make me happy I don’t have a small daughter today and don’t have to deal with the princess phase. Rather, it’s Lemony Snicket and Despicable Me that seem to hark back to Kenneth Grahame’s hilarious, luminous story, “Its Walls Were as of Jasper.” I tried reading the Grahame story to my Art History students when I was teaching about manuscript illumination. But they didn’t get it: too many unfamiliar words.

Scott reminds us in his essay that Huckleberry Finn is commonly relegated to the children’s section of libraries, and that’s about as close as most of today’s students get to literary depth.  Numerous attempts to ban it suggest its potential power to do just what Socrates died for: corrupt the minds of the young by teaching them how to think. I’m pretty sure they don’t read Moby Dick anymore, and if they get any Shakespeare at all, it’s Romeo and Juliet.

Another irony exists in all this, and it has to do with J. M. Barrie. Peter Pan didn’t want to grow up. The Disneyfied Peter Pan never did, and he lives on in a popular culture reluctant to offer any grownup models at all. In the end, Scott’s essay has explained to me why I watch so few television series today, and perhaps why most of the contemporary literature I read is science fiction or works about science. I’ve long held that some of the most compelling prose comes wrapped in covers that depict galaxies far away. One of the first science fiction novels I ever read was Wilmar Shiras’s Children of the Atom, which sparked an interest in science and speculative fiction that hasn’t left me even in my dotage. That particular novel probably inspired the X-Men comics and characters, and although it was aimed at the young, it treated its readers as if they were—at least potentially—adults.

Image credit: The cover of Kenneth Grahame's Dream Days, illustrated by Maxfield Parrish, via Wikimedia Commons.

Sunday, September 7, 2014

Celebrating--and Questioning--Wilderness

On September 3, the Beloved Spouse and I celebrated (or at least acknowledged) our twenty-third anniversary. As it turns out, the Wilderness Act marked its fiftieth anniversary on the same date--unbeknownst to me until I read a message from Orion Magazine, which is my main source of environmental commentary these days.

My mention of the occasion, however, won't rank up there with the joyous celebrations of tree huggers everywhere, because I'm rather skeptical about the whole notion of wilderness in the first place. I'm pretty sure it doesn't really exist in the present day, because there is almost no place on earth that isn't (as Wilhelm Dilthey would put it) human-affected (with the possible exception of volcanoes in the remote hinterlands of Iceland).

This is even more true now than it was in the nineteenth century (Dilthey's Introduction to the Human Sciences was published in 1883): there isn't even a tiny corner of this planet that doesn't bear at least a chemical imprint of human activity. Not only that, it's difficult for many people even to imagine what real wilderness, truly unaffected by so-called homo sapiens sapiens (the wise wise human species), would look like. In my first-level art history class I ask my students to choose a photograph of an object (animal, vegetable, or mineral) as untouched by humans as possible, and I have to warn them that a picture of an elephant in a zoo, or a trail through a forest, or a field with a fence, or even a single rose is not a good choice. But there's an obvious irony built into the task, because a photograph is itself evidence of human interference.

In the midst of my pondering the notion of the wild, this morning I read Rob Nixon's review in the New York Times of Diane Ackerman's new book, The Human Age. The book seems both fascinating and terrifying, and I'll probably buy it for the iPad--but I'm not sure I really want to be reminded just how much we're interfering with the future of the world. Looking up Nixon's recent Slow Violence and the Environmentalism of the Poor led me to a June 2011 article in the Chronicle of Higher Education, "Slow Violence: Literary and postcolonial studies have ignored the environmentalism that often only the poor can see," which advocates moving the locally focused understanding of environmentalism away from Thoreau and other members of our national canon (including those, like Wendell Berry, who are often invoked in this blog) and outward: toward the rest of the world, which is far less cushioned from the ravages of human power than we are here.

As Nixon points out in the article, there are other voices to be heard:
Figures like Wangari Maathai, Indra Sinha, Ken Saro-Wiwa, Abdul Rahman Munif, Njabulo S. Ndebele, Nadine Gordimer, Jamaica Kincaid, Arundhati Roy, and June Jordan have recorded the long-term inhabited impact of corrosive transnational forces, including petro-imperialism, the megadam industry, the practice of shipping rich nations' toxins (like e-waste) to poor nations' dumping grounds, tourism that threatens indigenous peoples, conservation practices that drive people off their historic lands, environmental deregulation for commercial or military demands, and much more.
While some of those enshrined in the American pantheon are now dead, none risked the fate of someone like Ken Saro-Wiwa, who was executed by the Nigerian regime of General Sani Abacha for defending his people not against some marauding horde of infidel-purging jihadis (which might have garnered some coverage on CNN), but against us: Big Oil in America and Europe.

Earlier in the week I'd been reading about the current discussion of GMOs: whether they're "safe," and whether people have a right to know when they're in our food. As the general controversy involving GMOs has developed, I've frequently thought that we might not be asking the right questions when it comes to allowing their use in the first place. Just as I think the real question about preferring organically over conventionally grown food isn't whether it's "healthier" but whether it's culturally better for us (more sustainable, economically more equitable, etc.), I wonder whether the real question behind gene manipulation in our food crops is less about whether it's going to poison us or cause birth defects and more about whether it's ethically defensible and environmentally sustainable.

The question up for debate at the moment is really about labeling. Science seems to be bolstering the notion that modified foods do not pose a danger to our health. One long-time opponent of GMOs, Mark Lynas, has recently switched sides in the argument over their safety (as did Neil deGrasse Tyson), and is now advocating that we embrace them in order to stave off the looming cloud of hunger related to population growth. As both of these guys point out, the science backs the safety claim; not only that, if we're going to blast the climate-change deniers and the creationists for ignoring scientific evidence, we ought not to base opposition to GMOs on unfounded skepticism.

For my part, I'm not particularly bothered by crops like Bt cotton--because we don't ingest it, and because cotton is one of the most pesticide-dependent crops human beings grow, so the chemicals sprayed on it do significant harm both to the environment and to those who harvest it--but I'm also not so sanguine about our ability to ensure that modified food crops will never produce unintended consequences down the line. The people who develop and test the modifications are, after all, human. And we can be utterly blind at times, unable to foresee potential outcomes. Two words: atomic bomb. Or maybe one: thalidomide. Or an acronym: DDT.

Simply labeling products so we can make choices doesn't seem like it should be such a big problem, but "folks" (corporations are people, after all) like Monsanto, Bayer, and Dow are afraid that labeling will imply a lack of safety and discourage people from buying their Roundup Ready products. Well, I already don't buy them, for economic and ethical reasons; I don't like the fact that altered genes are escaping from the plants they've been engineered into, spreading through the environment, and generating unforeseen consequences (see this 2012 Mother Jones article). I don't buy milk with hormones in it because I identify too closely with those poor engorged cows I've seen in feedlot dairies.

In the end, though, it seems a bit of a luxury even to be able to question our food sources and to make choices at all, because so many people in the world have to take what they get--some of which may be imposed upon them by the denizens of rich, still-imperialistic countries like our own. And while environmental consciousness seems to have risen here over time (as people become more aware of how what we do to the planet is hurting them), we don't seem to hold the same regard or concern when it comes to developing countries with resources we want to exploit. So any benefits GMO foods might produce may come with cultural and economic costs in other realms. And even with science currently able to proclaim the safety of genetic modifications, our track record isn't stellar when it comes to predicting future consequences.

Although humanfolk have been using artificial selection to manipulate crops since the beginning of agriculture (the very word implies manipulation), as Tyson points out, we're now messing with Mother Nature in ways that go far beyond simply mixing pollen from different species to increase the food supply. And even though some GMO crops (like golden rice) might well help keep some people in Asia from debilitating malnutrition, the encroachment of Big Agriculture, like that of Big Oil, isn't exactly sustaining human cultural traditions. It sure as hell doesn't bode well for the preservation of the wild.

I'm just not at all sure that we're actually capable, as a species, of making particularly wise decisions about the future. Wilderness as an idea is still with us, but as an actual, existing phenomenon I'm pretty sure there will be little left in another fifty years.

Image credit: The Bardarbunga volcano erupting in Iceland on September 4, 2014. The coincidence of this eruption and the idea that volcanoes are one of the few truly wild, non human-affected natural phenomena in existence was too much to resist. The image originally came from the Wikipedia article on the volcano--where you can also learn how to pronounce it. Blogger didn't like the coding, so I couldn't post a link.

Thursday, June 26, 2014

Summer Is Icumen In

Rather than celebrating the coming of this particular season with Medieval rapture, I'm tempted to substitute Ezra Pound's snitty little salute to winter ("Lhude sing Goddamm!") in place of "Lhude sing cuccu!" This is, after all, Texas, and it's only by the grace of some current climatic weirdness that we're not slow-cooking in our own juices already.

I'm generally a bit ambivalent about summer anyway, because I fully realize that what's actually happening is that the daylight hours are now beginning their long decline toward the winter solstice, the "darkest evening of the year," and the end of my 67th year--even though I'm "only" 66 (and a half) at the moment.

Until I read Akiko Busch's op-ed in the New York Times last Friday ("The Solstice Blues"), I thought I was alone in not necessarily seeing this moment as some grand seasonal celebratory event:

The moment the sun reaches its farthest point north of the Equator today is the moment the light starts to fade, waning more each day for the following six months. If the summer solstice doesn’t signal the arrival of winter, surely it heralds the gradual lessening of light, and with that, often, an incremental decline in disposition. 

In North Texas, the end of June is usually marked by the onset of 100-degree days; but this year we've been treated to a bit of rain and temperatures hovering around 90 during daylight hours and low enough at night to sleep without air conditioning. Even the dogs don't seem all that uncomfortable with the humidity--although we do tend to hole up in a small room with a window unit to watch TV in the late afternoon/evening, by which time the house has heated up enough to make the humid air a bit heavy.  Still, by bedtime, it's tolerable again; with a breeze it's even better, because the bedroom has windows on three sides.

When I was looking for an image to illustrate this post, I had wheat in mind for several reasons. For one, a search for "summer" on Wikimedia Commons inevitably produces photos of wheat. In addition, I'm working on a new philosophical perspectives course called "Food and Culture," and my first lecture involves the early history of human victuals--inevitably involving the ancestors of modern wheat. Since a later topic in the course will consider food and art (or food in art), I snapped up the Carl Larsson painting (which I've probably used before) because it touches on both, and because his stuff always makes me think of utopia.

The Food and Culture course has been (ahem) cooking in my brain for several years, but it was only recently that I realized I could teach it without having to go through the bureaucratic effort involved in getting a new course accepted.  Years ago, when I helped design the general studies offerings for our baccalaureate program, I pinched an idea from SUNY Stony Brook and crafted a course called "Philosophical Perspectives," thinking that whatever philosophical types we might hire could then teach what they knew, from within whatever intellectual context they were trained.  I've taught several topics under this catch-all rubric (The Arts and Crafts Movement, Pioneers of Modern Design, and Technology and Utopia), and it finally occurred to me last quarter that Food and Culture could easily fit under the umbrella.

As it turns out, I'm only catching up with a trend that's been evolving since the '90s, and even in the immediate region there's a Philosophy of Food project developing out of the University of North Texas. What this means is that although I'm not exactly on the cutting edge of things, I can take advantage of the work people are already doing in order to locate resources for my students (30 as of the last time I checked). It also means that thinking about food and its cultural meanings is being taken more seriously than it was at first. A Chronicle of Higher Education article I had saved from 1999 made it clear that the scholarly value of food studies was then somewhat suspect. Since then, however, a growing variety of looming food crises (scarcity, waste, obesity, malnutrition, and myriad others) has forced folks to start talking about the many ramifications of food consumption, including the effects of global climate change on the future of food.

William Morris's effort to move the education of desire into the center of our consciousness about how we live (and how we might live) fits squarely into the aims of my course: to get students to start thinking about the difference between need and want, and how understanding that difference can inform sustainable futures.  I'm not sure how this lot will respond to having to read and write and think about something most of them don't consider except at mealtime. But there's some potential here to poke a stick into complacency.  Since most of those enrolled are in a culinary management program, the results could be quite interesting.

We will, of course, be suffering through the worst part of this region's summer weather. But we'll be doing so within an overly air-conditioned classroom, sucking down cold bottled water, iced tea, diet sodas, and energy drinks--matters I'll have to design another course to deal with.

Image credit: Carl Larsson (1853-1919),  Summer (date unknown). Uploaded to Wikimedia Commons by Niklas Nordblad.

Monday, April 21, 2014

Earth Day 2014: Getting By

Despite the fact that I haven't posted in rather a long spell, I couldn't let Earth Day go by without at least a small mention. I do have plans afoot for renewing my commitment to this blog as soon as I've finished up the latest MOOC, especially since TBS and I are heading west in late June to visit family and the Auld Sod.  I suspect that adventures will ensue, since we're taking the puppies (who are now ten) and driving--although what kind of a vehicle will be involved is far from certain.  Traveling with dogs requires all manner of prior arrangements, and since these guys have only ever been to the vet and to Tyler and San Antonio, they're essentially an unknown factor in the equation.

The "Getting By" subtitle refers to our continuing ambivalence about this place. While we still love the house, the neighbors are problematic (see the fence in the opening photo for one source of angst), and somebody ratted us to the local feds on account of the overabundance of plant life in the Carbon Sink a few weeks ago.  As a result, I've spent days off mowing and whacking, and have even fired up the little electric chain saw a couple of times.  The Sink itself is now mostly mowed down, except for a patch I left because it's a ladybug nursery.  By Wednesday they'll have metamorphosed and flown (I hope to the new raised bed, where there are zucchinis and bush beans growing), and I can finish whacking with impunity.  

We currently plan to have a new wire and cedar fence erected, with gates to accommodate the lusted-after vintage travel trailer (or one of the newer, even cooler ones from Canada) we want to buy before we retire.  Since we have no guest room in the house, I'd like to do what my grandmother did in Lone Pine, and park the Shasta in the back yard for visitors.  Just in case. But the existing fence is really kind of ugly (though not as ugly as the new wooden job on the north, which was damaged in the winter ice storm and badly repaired; I get to look at it while I wash dishes), and a nice new one will provide a support for berries and climbing plants.

The good news is that after the nastiness of the winter, and an evening hiding in the closet with the dogs when a tornado hit about five miles away, we've had some welcome rain, and the plants damaged by the late freeze(s) have started to come back. This year's Earth Day photo is, alas, a fake. Well, not so much a fake as a lie. Or at least somewhat inauthentic. It represents what the herb garden looked like last year about this time, with the wild gladiolus fully abloom, and the delphiniums and pincushions and lavender up. The lavender is now gone, and the pincushions are just now beginning to emerge. The rosemary has been hacked back, the delphiniums replanted, the lavender still to come. But I also have a rain lily (a wonderful surprise, that), and some wild geraniums mixed among the primroses I'm letting grow along the fence on the east.

There is still much more work to be done, especially in front, but the iris border is blooming better than it has since we moved in, and I think that might be a good sign.  My one hope is that if things do get colder here (who knows how the changing climate is going to affect us in the long run), maybe I can grow some lilacs.

Happy Earth Day, People. I've got a poster in my kitchen to remind me of the first celebration in 1970, when folks first acknowledged the need. That need is even more evident today, so the moment is very much worth noting.

 

Wednesday, January 1, 2014

Another New Year

As I slide into antiquity, I'm becoming more and more aware of shortcomings that wouldn't have bothered me a few years back: things like starting stuff and not finishing it. Like blogging.

When I first began this enterprise, some seven years ago, give or take a few months, I never thought I'd have the energy to keep it up. But I did for quite some time, and even spun off a couple of other efforts when it seemed as though not everything I was talking about had to do with concerns that matched those of the nineteenth-century Medievalist utopians who inspired my work. I'm not sure now how I found the time, because each of the essays took considerable effort to write, and each was frequently prompted by readings on other people's blogs, or news items, or articles of interest in sympathetic books and magazines.

Nowadays I struggle to keep up with work, and seldom manage to scrape together the moments I need for sheer reflection. Recent encounters with the monumental aspects of age--enrollment in Social Security and Medicare, primarily--have drained my spirit further, and I'm now realizing in earnest that I have to make the time; it's not simply going to appear magically as the due reward of a life well spent.

It is also the sad state of things these days that most of us will not enjoy any sort of leisurely retirement.  My Beloved Spouse and I are planning to wrap up our working careers within the next ten years, but will keep slogging for now in order to pay off our mortgage and tuck away a bit of cash to augment rather meager retirement funds.  Some of my even more elderly friends (five or more years older than I am) are still at it, and don't foresee quitting as soon as they would like. 

The good news is that many of us are also in decent health, having been careful to avoid being naughty about food and smoking and such.  I've probably been repaired and Borged up enough to survive longer than my genes would indicate I have any right to expect.  With the right drugs and reasonable dietary and exercise habits, we should abide on this earth longer than we might have done in earlier times.  When my Grandmother was my age, she had already been a widow for six years, but had another thirty-eight years left to her. She also worked well into her eighties, as a receptionist in the local hospital, but was fond of good clean living, sans meat or alcohol.  Since I indulge in both, I'll have to take my chances, but moderation should help. 

One challenge of the new year is to find some balance between what I would like my students to accomplish, and what I have any right to expect of them.  I'm tempted to roll over, belly-up and give in to the ravages of modernity, admitting defeat and just muddling through.  But I do think these kids are worth some effort, so for as long as I can I'll keep searching for ways of engaging them, hoping that something will emerge from research or practice that will help bridge the temporal and experiential gaps that keep emerging like crevasses in a glacier.

My few reflections over the last year seem somewhat maudlin to me now, but I'm not as pessimistic or self-absorbed as I probably come off. Most of us wouldn't be able to get up in the morning if we really thought that this is as good as it gets. Every now and then, some small gleam comes wafting in, like Tinker Bell into the nursery, whether in the form of an interested student, a terrific book or film, or a good conversation with a colleague or child. The Beloved Spouse builds a fire (of wood that didn't hit the house during the recent ice storm, because we'd had the foresight to have the trees trimmed), the dogs snore on the hearth, and we enjoy the company of a sister or a new nephew or folks we haven't seen in some time. Life may not be superb, but it's certainly okay.

So, happy New Year to all. Be well, and be good to each other. May 2014 be somewhat better than just okay.

Image credit: January, from the Très Riches Heures du duc de Berry by the Limbourg Brothers, ca. 1412–1416. This is one of the liveliest of the Calendar pages from this most famous of the Books of Hours painted by the brothers for the Duke. Via Wikipedia.