James D. Berkley
The unholy trinity: me, myself, and I.
Leadership Journal, November 1, 2005
The sacrifice you want is a broken spirit. A broken and repentant heart, O God, you will not despise. Psalm 51:17
Ashleigh Brilliant, that odd vestige of the seventies who scribbled his offbeat humor on hippie postcards, once penned: “All I ask of life is a constant and exaggerated sense of my own importance.” People chortled at that observation thirty years ago. People absolutely live by it today.
Is egoism one of the greatest sins of Christian leaders, especially of those who are effective, successful, and respected? Perhaps even of you? The ego is a strange entity, a matter of proportion and moderation. Psychologically, a person with no ego would be a basket case, without self-control or a concept of personal identity. Practically, a person with a deficient ego flounders in self-doubt, failure, and a lack of confidence. On the other hand, we find a person with an inflated sense of self-worth an insufferable blowhard, an egomaniac, a self-aggrandizing mass of arrogance. Every life needs some balance between single-minded self-centeredness and excessive self-deprecation.
The famous rabbi Hillel wrote, “If I am not for myself, who will be for me? But, if I am for myself alone, what am I?” Rabbi Hillel, for all his wisdom, appears to be leaving out the most important factor: God. God always alters the ego scale. King David—rich, highly successful, and therefore, someone who had every reason to develop a robust ego—eventually realized that next to God he was nothing! In fact, the most pleasing thing he could do for God was to declare his own absolute ego bankruptcy.
A broken ego won’t ever hold enough air to become overly inflated. Next to God, no puny mortal ego dare claim hyperdimensions. We are the wearers of filthy rags next to God’s splendor. We are the vessels made for destruction, apart from the Potter’s grace. We are but dust and chaff. No, next to God, ego cannot inflate.
But because of God, we are fitted in regal finery. We are adopted as members of the royal family and destined for eternity. We are made just a little lower than God. Because of God, our personal significance takes on staggering proportions. When we get that picture in our minds, an ego of proper dimensions ought to fit into place. We won’t, as a Texan once said, be forever trying to put a ten-gallon hat on an eleven-gallon head. A superheated ego and a true sense of the lordship of Jesus Christ just don’t fit in the same personality.
—James D. Berkley
Reflection
How can I acknowledge, celebrate, and take satisfaction in what God is doing in and through me without becoming or appearing egotistic?
Prayer
Lord God, I give you the praise and glory for the good things you’ve placed in my life, and I ask you to break my arrogance in any areas of personal pride.
“The biggest addiction we have to overcome is to the human ego. Why? Because ego stands for Edging God Out.”—Kenneth Blanchard, popular author
Leadership Devotions. Copyright © Tyndale House Publishers. Used by permission.
Russ Breimeier
Christianity Today, November 1, 2005
Sounds like … classic progressive rock in the tradition of Kansas, Yes, and Spock's Beard, possibly appealing to fans of contemporary prog rock from the likes of The Mars Volta
At a glance … though perhaps a little less pop-accessible than his previous efforts, Morse's latest is still quite admirable as a complex progressive rock symphony exploring the mystery of God and his holy temple
Track Listing
- The Temple of the Living God
- Another World
- The Outsider
- Sweet Elation
- In the Fire
- Solid as the Sun
- The Glory of the Lord
- Outside Looking In
- 12
- Entrance
- Inside His Presence
- The Temple of the Living God (reprise)
Progressive rock has long carried a stigma as a dying niche genre: its lengthy compositions, heady lyricism, and intricate instrumental solos keep it from being radio-friendly and commercially viable. But that seems to be changing now that corporate radio is losing favor with listeners, and bands like The Mars Volta, Sigur Ros, and System of a Down continue to sell out venues. It's not quite a revival, but the public does seem more receptive to the complexities of prog rock, pioneered by the likes of Genesis, Yes, and Kansas.
Which makes this a prime time to discover Neal Morse, an artist too good to be ignored in Christian music. It's remarkable that someone can create three albums in three years with such artistic depth. His first major solo release was 2003's Testimony, a stunning two-disc autobiography that put his lifelong spiritual journey to music. Nearly as impressive was 2004's One, a modern-day mix of prog rock and pop that traced back to the Garden to explore the rift between God and man. Now comes the unconventionally titled ?—that's right, a simple question mark—exploring the mystery of God and his relationship with man as represented by the law and his temple. It confirms that Morse is progressive rock's answer to Michael Card, probing deep biblical topics and theology with intelligence and artistry.
The album shouldn't be viewed as a collection of 12 songs as much as a continuous hour-long rock symphony with 12 movements—a number that's almost certainly intentional since one of the movements deals with the significance of "12" in Scripture. The initial premise is that we all long to see "The Temple of the Living God" and be in his presence, yet our imperfection and unworthiness keep us from experiencing communion with him. Both pop ballad "Outside Looking In" and "The Outsider" take the perspective of an unclean leper and shame-filled sinner longing to bask in God's glory: "I am lonely and dead inside/Clearly God doesn't love me, so I'll just wait outside."
"The Glory of the Lord" offers a glimpse of just that, with choral bombast that would do Handel or Wagner proud. But as Christians, we know that because of what Jesus accomplished for us, we don't have to wait outside the temple. "In the Fire" confirms the need to sacrifice our sinful nature, while the funk-flavored "Solid as the Sun" suggests there is One who has been appointed to stand in God's presence to represent mankind. The symphony concludes with joyous rocker "Entrance" and the peaceful power ballad of acceptance "Inside His Presence," in which Morse declares, "When He died and was born the temple walls were torn/And God's Spirit poured out to all the ones without/Now the temple of the living God is you."
As with most prog rock efforts, the themes are deep and perplexing, but in contrast to a confusing album like Genesis' The Lamb Lies Down on Broadway, a little bit of effort clearly reveals what ? is all about. The words are poetic, yet biblically informed, and the booklet cites Scripture references throughout the lyrics. Thus we have an album that's both artsy and evangelical, appealing to Christian and non-Christian prog rock enthusiasts even more so than Testimony and One. It can still be hard to follow, since the movements occasionally swing back and forth between unworthiness and acceptance rather than adhering to a clearly established journey, but perhaps this was done to pace things stylistically.
All of this is performed by a band that sounds somewhat less orchestrated and produced than on Morse's previous two albums, relying on the traditional prog rock base of guitars, keyboards, and rhythm to more closely evoke classic Kansas and Yes. The musicianship remains breathtaking with Morse's amazing vocals, guitars, and keys at the core, surrounded by the equally impressive drums of Mike Portnoy and bass of Randy George. The album also features such guest prog rock luminaries as Steve Hackett (longtime Genesis guitarist), Jordan Rudess (Dream Theater), and Neal's brother Alan (from their band, Spock's Beard).
Still, I miss the pop feel of Morse's previous two albums, which helped make his music sound a little more timeless. This album seems slightly more dated because of the overall flow of the music, the style of the instrumental solos, and dated effects like the talking guitar sound made famous by Peter Frampton. So unless you're already a fan of classic progressive rock, chances are you won't latch on to this one as easily as others. Yet even if ? may not be quite as enjoyable as Morse's past work, it's unquestionably admirable for its artistic and spiritual merits.
Copyright © Christian Music Today.
Amy Laura Hall
The brave new world of meticulously planned parenthood.
While the nation engages in acrimonious debate over when life begins, the "suburban Washington, D.C."-based company GIVF is clear: "Life begins at the Genetics and IVF Institute." This, GIVF's motto, runs through its advertisements for human ova, appearing in magazines across the country. GIVF recently ran a full-page ad on the inside back cover of the New York Times magazine, just opposite the "Lives" feature. The advertisement promises "Doctoral Donors in advanced degree programs, and numerous other egg donors with special accomplishments, talents, or ethnicity." GIVF also very helpfully offers both "Adult and Childhood Photos" of its donors. After all, "your decision has lifelong implications."
Designer babies? The Childbirth Center at Duke Health Raleigh Hospital recently ran an advertisement that unabashedly embraced this imagery, likening itself to a boutique. A petite young woman in silhouette gazes at the window of a brightly hued shop, where three fashionable frocks hang on mannequins. The bold print reads: "finally, a childbirth center that's as stylish as you are." The smaller print continues, "In the world of hospital birthing centers, consider us the smart little boutique where you always find the latest thing (exactly in your size)." The last line concludes: "Just the place to find something perfect to take home with you."
Soon after entering the dubious field of reproductive bioethics, I began singing (to myself) David Byrne's "Once in a Lifetime." Surrounded by the various advertisements for procuring beautiful children to adorn beautiful houses, I heard myself singing (in a rather irate voice), "How did we get here?!" and, more to the point, "My God, what have we done?!" Yet in the midst of my incensed little ditty, I have found myself also asking another question. Are we chugging up the biotech slope towards a qualitatively post-human world, or is this the "Same as it ever was?" This brave new world may be newly creepy, but is it really new? The project of perfecting our offspring through human ingenuity is as American as Benjamin Franklin. Suburban mommies buy our daughters spun-sugar polyester dresses brought to us by DuPont; is the boutique childbirth center simply the logical extension of "Better Things for Better Living … Through Chemistry"? Is there a radical difference between choosing the most élite piano teacher available for our budding young musician and choosing the very best gamete donor available? The question "How did we get here?" is important, but digging into the mess of parenting in the last century becomes, well … messy. It requires a hard and meddling look at what mainstream, middle-class women expect when we are expecting.
For the past several years, I have been digging into mainline Protestant and mainstream American culture from the 20th century, in an attempt to understand the turn to "Designer Children." Working as a bioethicist, I recognized that the default mode of bioethical reasoning among many upper-middle-class, mainline Protestants (my people) is a potent mix of social Darwinism, utilitarianism, and faith in scientific progress. In order to gain some critical purchase on these assumptions, I explored dusty issues of the Ladies' Home Journal and Together ("The Magazine for Methodist Families"), National Geographic, Parents' Magazine, and others. I have looked for photographic images of ideal family size and domestic cleanliness, for infant formula advertisements and the promises of pediatric psychopharmaceuticals, and for the links between mainstream, Protestant domesticity and the American eugenics movement. Although the various parts of this project do not easily fit into a master narrative, there are, I believe, discernible patterns.
I have come to believe that the repro-biotech revolution poses anew some very old questions: But who do you say that I am? And who is my neighbor? When was it that we saw you hungry or thirsty or a stranger or naked or sick or in prison, and did not take care of you?
"Progress is Our Most Important Product"
General Electric's longtime motto—"Progress is our most important product"—might have served as the American credo for the 20th century. "Is your baby enjoying The Results of Progress in infant feeding?" asked an advertising letter to modern-minded mothers of 1933, signed by Dan Gerber. The letter continued, "When you are confused about anything you do not understand, you ask someone who knows. Why not do this in the vitally important matter of food for your baby?" The advertisement announced that it was also the cover page for a brochure entitled "Progress in Infant Feeding," handed out "to thousands of visitors" at Gerber's Exhibit in the Hall of Science at the 1933 Century of Progress Exposition in Chicago. The theme of the Exposition was "Science Discovers, Genius Invents, Industry Applies, and Man Adapts."
Two decades later, during the postwar period, the geniuses of industrial invention focused their attention on the women of a newly burgeoning middle class. Ladies' Home Journal became "The Magazine Women Believe In" (the LHJ motto), and the cumulative question posed by way of advertisement after advertisement was: "Is your baby enjoying the results of progress?" DuPont indeed promised Ladies' Home Journal readers in March of 1955 "Better Things For Better Living … Through Chemistry," the better living embodied in a photograph of three smiling children in their Easter best standing in front of mom, who is wearing pearls and playing the piano. The sterile, uniformly blue backdrop reflects the carefully controlled antics of the child models. The "better living" featured in such images involved a particular configuration of "better." As summarized in an Ivory Snow advertisement (Parents' Magazine, 1958), women were to expect purity, safety, and efficiency. Through their buying power, they were to help ensure these things for their family.
Themes of domestic security were enormously potent in the Fifties. Consider the marketing of nuclear power during that decade. The same year that DuPont Nylon was running its "Better Living" ads in LHJ, President Eisenhower's special assistant on disarmament promoted "Atoms for Peace" to LHJ readers:
Imagine a world in which there is no disease … where hunger is unknown … where food never rots and crops never spoil … Where "dirt" is an old-fashioned word, and routine household tasks are just a matter of pressing a few buttons … a world where no one ever stokes a furnace or curses the smog, where the air everywhere is as fresh as on a mountaintop and the breeze from a factory as sweet as from a rose … Imagine the world of the future … the world that nuclear energy can create for all of us …
—"Atoms for Peace," Ladies Home Journal, August, 1955.
The "Atomic Age" was to provide limitless sources of power, fueling shiny new refrigerators and other gadgets to perform routine household tasks in a jiffy. To naysayers, the author offered an ultimatum. Those American citizens who retained a sense of wariness about the technologically enhanced future needed to "try living in a primitive society without doctors, sewers, medicines and machinery of any but the most basic sort for about six weeks—and then see if they can still work up an argument against it."
"Not on the Move"
The threat of being labeled "primitive" runs through the last century. To progress, to move one's children and family forward, was not so much a right, perhaps, as a responsibility. This responsibility was made quite explicit in the immediate postwar period, in pieces such as Life magazine's May 5, 1947 photo-essay, "Family Status Must Improve: It Should Buy More for Itself to Better the Living of Others," where Americans were encouraged to stop saving and start spending—for the sake of their children and for the good of the nation.1
For this feature, an ordinary American family—Ted and Jeanne Hemeke and their children—was drafted to offer Life's readers a pictorial lesson in economically responsible domesticity. Contrasting "what is" and "what should be" photos serve to accentuate the aesthetics of properly ordered and appointed family life. On the one side, Ted Hemeke arrives home from his job in well-worn clothing that bespeaks his working-class status, with the child at his side in shorts, a wrinkled shirt, and shoes with no socks. His wife stands at the doorway of their "drab" home, one child in her arms and another sitting listlessly nearby. The yard is unkempt, the pathway to the door strewn with sticks and dry, wayward grass. In the photo below, Jeanne bends to shovel ashes from a "dirty" furnace. A child sits on the floor sucking her thumb as a shaggy dog and two kittens romp in the dusty mess.
On the other side, Ted arrives home in a business suit, holding the hand of a child with a knit hat, fashionable swing coat, and shiny shoes with socks. The yard is manicured, the home clearly a modern, suburban ranch-style. His wife again stands at the door, but the child in her arms is now in puff-sleeved dress, the other pedaling cheerfully on a trike. And in the photo below, Jeanne is no longer bent over an old furnace; instead, dressed in flowered frock, nylons, and shiny high-heels, she's cheerfully using an electric mixer in a bright kitchen, all decked out with the latest appliances. The baby of the family sits in a high chair sucking on a clean plastic toy. Her bottle of milk awaits her on the tray.
The accompanying text underlines the assumed connections between the responsibility to purchase consumer goods, broad economic growth, and proper domesticity. It is a civic duty to aspire to the "decency" standards represented by "a pleasant roof over [the family's] head, a vacuum cleaner, washing machine, stove, electric iron, refrigerator, telephone, electric toaster, and such miscellaneous household supplies as matching dishes, silverware, cooking utensils, tools, cleaning materials, stationery and postage stamps." For all of the optimistic democratic rhetoric of bringing "everyone" up to the "health and decency standard," however, I have come to suspect that the message of progress depended on the threat of being associated with those who were deemed "backwards"—with those families who were deemed not on the march forward.
Another photo essay from the immediate postwar period makes this point quite clearly. In its September, 1946 issue, National Geographic featured a series of photographs entitled "America on the Move." From travelers boarding a shiny new jet airplane, the piece moves on to show GI Bill recipients in a University of Wyoming trailer camp as well as vacationers in upstate New York, "Sheep on the Move" across the Grand Coulee Dam, and some of the "More than 110,000 Passengers" who travel through Washington National Airport every month.
In the middle of the feature, a two-page spread unambiguously signals the difference between families "on the move" and those who are not. Three children look into the horizon, pointing upward at their high-flying kite. The caption reads, "Chicagoans Enjoy High Winds and Soaring Kites on Outings to Sand Dunes at the Michigan Shore." Below are two children, separately photographed, with their heads touching the magazine's binding. The caption on the right reads, "This Tennessee Hill Boy's Traveling Days Will Come Later: Now, with food shortages world-wide, he is better off at home." The caption for the other photograph leaves even less open for interpretation: "To a Deep South Farm Urchin, There's No Place Like Home: His world is where his short legs can carry him between meals." The Americans "On the Move," those with whom the National Geographic readership were to identify, were "On the Move" inasmuch as they were able to distinguish themselves and their children from children like these. They were "On the Move" inasmuch as they were evolving, through technological progress, toward a new existence.
"The Future of the Race Marches Forward on the Feet of Little Children"
Very early on in the midst of my digging I came across a series from a 1950 Ladies' Home Journal that stopped me in my tracks. A photo essay called "Baby's First Year" featured pictures of the photographer's own wife and children taken during the first year of their third child. In one of the earliest photos, the mother breastfeeds her newborn, her hospital bed sheets wrinkled under her plain button-up gown. The baby's sister (3 years old) and brother (18 months) peek into an obviously thrice-used bassinet with a well-worn blanket. The little girl's fingernails show signs of outside play, and her sweater has a small hole in it. In another photo, mother appears with all three in the doctor's office, the toddler barefoot and wearing only a diaper. As I flipped through the pages, these details lodged in my mind immediately at a subconscious level. The signs of worn, used, subtly soiled life with children were palpable. My snap-second thought was "The magazine must have been doing a feature on poor families."
My subsequent realization left me silent. This was merely the real, un-airbrushed life of a woman the age of my grandmother. Three small children in one family, the used blanket, the signs of joy but also weariness on the mother's face, the barefoot and diapered toddler—these seemed slightly off-kilter to my early 21st-century eye, signals of insufficient planning and purchasing and preparation. I registered the images in this way in no small part due to the contrast between the father's relatively spontaneous photographs from 1950 and the exquisitely staged images of domestic advertising in the decades that followed. My unthinking judgment rested on the accretion of maternal good sense passed down by "The Magazine Women Believe In" and by aspiring grandmothers to hopeful mothers to promising daughters in mainline America.
For the first two decades of publication, from 1929 until 1951, Parents' Magazine closed its editorial page with a quotation from Phillips Brooks, the Episcopal Bishop of Massachusetts at the end of the 19th century. The quotation sums up a basic assumption of American family life in the 20th century: "The future of the race marches forward on the feet of little children."
Alongside advertisements for potty-training devices, scientifically enhanced infant milk, and selective summer camps, the staff writers of Parents' presented mothers with the conceptual tools to form children who would participate in the march forward. Indeed, maternal expectations in the United States have been shaped by a subtle, working distinction between families whose children are marching forward and those whose children are backward. Legal, efficient forms of controlling birth changed parenthood from a probable given in marriage to a task that must be chosen responsibly and performed well. Aspiring young couples today often speak about parenthood as if each potential child, each possible life, must be justified—each conception brought about only under the best timing and after obviously adequate preparation. This is not surprising; they know that they are being watched, and judged. The cumulative message emerging from the mainstream conduits of better homes and households in the last century has been that a prospective mother should choose well in order to set her household on the right side of the divide between lives that are atavistic and lives that are evolving.
These norms shape maternal choices today, not only when a woman decides whether to buy her daughter new dresses for school or accept hand-me-downs from a neighbor, and when she decides whether to pursue ova donation or adoption, but also when she is faced with news that her fetus will not predictably advance "the future of the race." At present, women who go through prenatal testing and discover a genetic disability are deciding 9 times out of 10 that the life cannot be justified. One testimony from this world of prenatal testing consistently haunts me as I write in the field of repro-bioethics. While I have quoted it elsewhere, I believe it bears repeating, and repeating again, for the woman dared to speak out loud in her interview what has become a subtle but persistent logic of death at play in the United States:
I had my abortion on June 30th, and I was a mess. I was weeping all the time, I was inconsolable, and we went away for the 4th of July, and I couldn't calm down at all. We were watching the parade on Main Street in Hamlet, at my in-law's cottage, and a family with a kid with Down's was standing in front of me. Right there at the parade, honest to God, like a sign direct to me. And the thing was, I really looked at the kid, how she dripped her ice cream all over, how she couldn't be made to do what the other kids wanted. I looked at her and thought, "She doesn't belong in that family." She didn't look like them, she looked like someone else. Like a lot of someone elses, not quite from the same race, if you know what I mean. And it made me feel, well, that I'd done the right thing, that the one I aborted wasn't quite from my family, either.
—Emily Lockhardt [name changed in original], 37, white antiques restorer.2
This human striving to justify oneself and one's children and one's household seems today a particularly Protestant heresy, but it is also a heresy that Protestantism should know how to name. To all presumptions of human striving, the Protestant tradition has posed one word, a Word through which we are each created, a Word that justifies us in spite of us. It is also a Word that may send each middle-class, aspiring household out into the world with gratitude and a new sense of abundance—ready to risk association with the very children and households for whom Christ showed preference.
When speaking to the public about a "culture of death," Pope John Paul II often went to great lengths to encompass the many manifestations of violence. This seemed to confound secular reporters, eager as they were to isolate a sound bite on the pope's opposition to abortion, embryonic stem-cell research, or euthanasia. Yet the call to affirm life, each and every life, involves a seemingly disconnected myriad of affirmations and denials.
In the midst of this research project, I have heard from couples who intentionally moved into struggling neighborhoods, each buying a house large enough to share with teenagers fleeing drug violence or single mothers in need of help. One young doctor chose primary, pediatric care over the high-end research promoted in the City of Medicine because he wanted to be present to his future children and to be a physician for a community considered a lost cause. One new father risked solidarity with new mothers by becoming the first professor at Duke Divinity School to take paternity leave. A congregation proudly founded as the proper Methodist downtown church now welcomes (somewhat awkwardly) its future in the form of children whose clothing and manners are considerably less formal. One pastor in an otherwise bow-headed and bow-tied suburban church allows her children to wear t-shirts and baggy jeans to worship, over the grumblings of the deacons, in order to help make the space more hospitable to others who wander in. Some couples have refused private school, carving out time to volunteer to wipe noses and spread peanut butter at the struggling public school. Some parents count "other people's" mainstreamed children with special needs to be in an important sense also their own. Some mothers have refused prenatal testing, others have embraced "at risk" adoption, others have thrown baby showers and offered babysitting for the pregnant teenager in the congregation who took the road increasingly less traveled.
These stories reflect a kind of holy foolishness. I have gathered from such testimonies the sense that to resist the norms of meticulously planned parenthood requires tackling head-on two facets of mainline Protestant life in the United States. First, resistance involves faith in a future secured neither through scientific progress nor by way of the march of children to advance the race but through the inscrutable birth of one child, the Word made flesh in an inauspicious manger surrounded by donkeys. Second—and this is for most parents the trickiest part—resistance involves refusing to justify my children according to the measures of ostensibly good housekeeping. Resistance involves eschewing the means by which I am to distinguish my own daughters from children who seem vaguely "backward," from those who are considered "at risk," from neighborhoods that seem godforsaken and from schools that are deemed by quantifiable results to be "subpar." Resistance means not only following the Word born in Bethlehem but bringing one's own children along, to identify with and live among those who are considered today to be the least of these.
Amy Laura Hall teaches ethics at Duke University Divinity School. She is also an elder in the United Methodist Church. This essay reflects her research as a Henry Luce Fellow in Theology, for a book called Conceiving Parenthood: The Protestant Spirit of Biotechnological Reproduction, scheduled for publication by Eerdmans in 2006.
1. See Lizabeth Cohen, A Consumer's Republic: The Politics of Mass Consumption in Postwar America (Knopf, 2003).
2. Quoted in Rayna Rapp, Testing Women, Testing the Fetus: The Social Impact of Amniocentesis in America (Routledge, 2000).
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
John Wilson
Bestseller lists are useful, but they are no substitute for what you see with your own eyes: the books people are reading on trains and planes and buses, in coffee shops and waiting rooms, the paperbacks that emerge from a backpack or a capacious purse, twice- or thrice-read already. There was a time, about ten years ago, when Anne Rice was evidently the most popular writer in America. And I hadn’t read even one of her books. (Not my cup of tea.)
In 1997, an editor at another magazine asked me to review The Anne Rice Reader: Writers Explore the Universe of Anne Rice, edited by Katharine Ramsland. I went to our splendid public library and came home with a stack of Rice’s novels.
The Reader turned out to be even more dreadful than I expected after a look at the contents page. In addition to editing the collection, Ramsland contributed several pieces, including an account of the convoluted history of the filming of Rice’s novel Interview with the Vampire. (There you can find Tom Cruise reflecting on his character, the vampire Lestat: “Lestat is an adventurer. There were no other vampires in New Orleans when he arrived. That is an adventurous spirit. Here’s a guy who goes out among people and goes to the opera and studies music. He’s a fascinating character.”)
What a change to move from earnest psychobabble to the creepily mesmerizing monologue of Interview with the Vampire. Rice drafted this novel—her first, and the foundation of all her subsequent triumphs—in five weeks late in 1973. The year before, her daughter Michelle, her first child, had died of leukemia, shortly before her sixth birthday. Rice and her husband, the poet and painter Stan Rice, had been drinking themselves numb, and she had just recovered from a serious viral infection when she began the novel, basing it on a short story she’d written and set aside in 1968.
Great titles seem to condense the essence of an entire novel into a phrase or even a single word: The Great Gatsby, Hud, The Crying of Lot 49. So the title of Interview with the Vampire, with its insolent incongruity, at once draws the reader in and displays the swaggering imagination—the attitude—that set the book apart. Interview: the quintessential late 20th-century form, medium of celebrity. Vampire: discarded mythology, hokey figure of darkness. But Interview with the Vampire: explosive fusion.
A less likely recipe for bestsellerdom could hardly have been imagined. (To the commentators in the Reader, armed with a fistful of archetypes and 20-20 hindsight, the novel’s success seems obvious, predictable.) Rice broke every rule laid down in textbooks and writing seminars. Talkiness, the experts said, is to be avoided like the plague. Interview is all talk, endless talk, with the monologist taking his emotional temperature every page or two. Readers loved it.
Yes, as many have observed, the book enacts a seduction. In a cheap room in San Francisco, a vampire tells his life story to a boy with a tape recorder. When he finishes his tale at the book’s end, the listening boy begs to be made a vampire too—never mind the breast-beating of Louis, the storyteller. And so it happens.
The primary seduction, though, is that to which the reader submits. We’re not quite 25 pages into the book when Louis describes to the boy his first “kill,” his first human victim, a runaway slave. “Killing is no ordinary act,” the vampire tells the boy:
It is the experience of another’s life for certain, and often the experience of the loss of that life through the blood, slowly. It is again and again the experience of that loss of my own life, which I experienced when I sucked the blood from Lestat’s wrist and felt his heart pound with my heart. It is again and again a celebration of that experience; because for vampires that is the ultimate experience.
Now it is possible to read this with detachment, noting that the language is sometimes powerful (that “slowly” is masterful), sometimes maddeningly slipshod (as in the slack concluding clause: “because for vampires that is the ultimate experience”). It is possible to read it without endorsing the claim, implicit here, that we are being told something profound about human sexuality. But if, thus warned, we continue to read as the boy continues to listen, then—the logic of Rice’s narrative suggests—it is because we long to be vampires too. I finished the novel with the sense of moral contamination that some books leave us with.
Which doesn’t mean that—in this book or in the novels that followed—Rice simply argues that killing is OK if that’s your inclination. What the books suggest instead is rather murky. On the one hand, Rice celebrates the free spirit, rejecting the Catholicism in which she was raised and all its strictures—and so also the claims of any moral absolutes. (As Ramsland puts it in the Reader, “To her mind, writing about pure abstractions like the traditional notions of good and evil hinders real understanding.”) And yet Ramsland quotes her as saying, “I do not think I could go on if I didn’t believe in goodness.”
In short, there was a profound contradiction at the heart of Rice’s work. And so I concluded that review in 1997 by recalling Simone Weil—“Imaginary evil is romantic and varied; real evil is gloomy, monotonous, barren, boring. Imaginary good is boring; real good is always new, marvelous, intoxicating”—and wondering if, having taken imaginary evil to its limits, Rice might be poised to taste the intoxicating waters of grace.
That review was never published. I’m not sure why. I stuck it in a folder and forgot about it for eight years. Then I received from Knopf an advance copy of Rice’s new novel, Christ the Lord: Out of Egypt, with an afterword in which she explains how she lost her faith as a young woman and how, in 1998, drawn by the magnetic person of Jesus, she asked a friend if “she knew a priest who could hear my confession, who could help me back to the Church.” She recounts her plunge into the strange world of New Testament scholarship and the years of reading that lie behind this new novel. Among the scholars she most warmly acknowledges is N. T. Wright.
Have you ever seen the painting I loved as a child, Jesus holding the lost sheep? Kitschy? Perhaps. But today there must be great rejoicing in heaven.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Alan Jacobs
What does it profit a man to defeat the Dark Lord but lose his soul?
The stab of envy came instantly, unexpectedly. I was somewhere quite new to me: on one of the enormous ferries that run between the mainland of British Columbia and Vancouver Island. As we moved westward we traded shifting clouds for brilliant morning sunshine. My wife and I had every expectation of a delightful day on the island, and had even managed to procure some surprisingly good coffee from a helpful machine. We sat at a small round table, sipping the coffee and gazing on the small islands in the Strait of Georgia; all was well indeed. But then my eye strayed to a neighboring table. There sat a ten-year-old boy, gazing fixedly upon the face of his father, who was reading in a tense whisper from Harry Potter and the Half-Blood Prince.

It was July 16, 2005. The book had been released just eight hours earlier, at midnight, and though I had felt a slight pang when I discovered that I would be vacationing in Canada at the time—celebrating my 25th wedding anniversary, as it happened—I dismissed it immediately, and gave the matter no further thought. (Except, that is, to order a copy from Amazon Canada and have it sent to the B&B where we would be staying. With my wife’s permission, of course.) I had every reason to believe that the book would be waiting for me when we returned that evening, but at the moment that prospect yielded little comfort. (I got still less when the book didn’t show up at all. But that’s another story.)

It occurred to me that this was the first time since the first book in the series that anyone I knew read a Potter installment before I did. When the second one, Harry Potter and the Chamber of Secrets, appeared in Britain some months before it was scheduled to appear in the United States, I ordered that volume from Amazon U.K.—as did thousands of others, a practice that quickly led Scholastic, J. K. Rowling’s American publisher, to insist upon simultaneous release of future volumes. From then on I read each book on the day of its publication, and even wrote an essay in praise of J. K. Rowling (one that received much critical commentary from my Christian brothers and sisters).
Harry Potter and the Half-Blood Prince (Book 6)
J. K. Rowling (Author), Mary GrandPré (Illustrator)
Arthur A. Levine Books
672 pages
$16.44
Why this excitement? Why would a middle-aged man—who also happens to be a professor of literature—get so worked up about a series of books for young people? Indeed, why do so many millions of people get similarly worked up, as they have about no other books? There is no real answer to this question, though every time another book in the series is released the newspapers of the world fill with speculations. The closest we can come to an answer is to note that J. K. Rowling does three things exceptionally well: first, she creates characters readers really care about—not just Harry but also Ron, Hermione, Hagrid, Dumbledore, Neville, etc.—usually because they possess some admirable trait (kindness, or courage, or wisdom) but are also somehow vulnerable; second, she writes suspenseful plots, so that you really want to know how it’s all going to come out; and third, she creates a whole imaginative world that people love to inhabit, even after they already know what happens in the stories. Many writers can do one of those things; a few can do two; hardly any can achieve all three. (Tolkien is one of them, which is why he also, though a very different and much greater writer than Rowling, is equally beloved.) It’s the combination that makes her special.
Critics who complain that Rowling’s writing style is pedestrian or cliché-laden—Harold Bloom being prominent among them—therefore miss the point. She is certainly not much of a stylist, she does indeed fall sometimes into cliché, and in fact a key moment in the new volume, one meant to be deeply moving, is marred by the kind of grammatical error that makes an English teacher like me grind his teeth and mutter about the decline in the professional skills of editors. But the last thing I want when I’m reading a Harry Potter book is to pause and admire the felicity of the diction. This ain’t Emily Dickinson, after all. And I found that grammatically erroneous passage deeply moving anyway because I cared about the characters involved, I cared about the story, I cared about the world.
That world—let’s start there—has been a source of great delight to me over the years. Rowling’s imaginative universe takes every dusty old piece of furniture from the common stock of tales about witches—pointed hats and cloaks, flying broomsticks, eye of newt and toe of frog, the whole shebang—cheerfully accepts it, and raises it to the next power. She adds to that the love of odd names that also characterized Charles Dickens, matching his Dick Swiveller with her own Argus Filch, and his town of Eatanswill with her village of Hogsmeade. It is tempting to heap up examples. She has a keen ear for the absurd, and has picked up curious words and phrases from all over the place: the names of two of her main characters, Dumbledore and Hagrid, seem to have been taken from a passage about country dialects in Thomas Hardy’s Mayor of Casterbridge. (A “dumbledore” is a bumblebee, and to be “hag-rid” is to be worn out.) The portraits at Hogwarts School of Witchcraft and Wizardry talk, and the subject of any one will occasionally depart to visit the inhabitants of the others; in the great wizard shopping street called Diagon Alley one can buy Self-Stirring Cauldrons; rooms and tents and even automobiles are often bewitched so that their insides are larger than their outsides. Each book in the series has added to this storehouse of treasures and curiosities.
But Harry Potter and the Half-Blood Prince does so less than any of its predecessors. What new information about the magical world we do acquire is disturbing, if not terrifying: we learn, for instance, of the Horcrux, an object enchanted to receive a portion of a person’s soul—but only when that person has severed a bit of his soul by murdering someone. One of the few light-hearted moments in the book comes early on, when Harry and his friends visit Weasleys’ Wizard Wheezes, the joke shop run by Fred and George Weasley, and see a variety of magical pranks and tricks. But one of the new comical items Fred and George are proud of—Peruvian Instant Darkness Powder—much later in the book enables one of Harry’s enemies to escape capture, and this escape leads, indirectly at least, to the death of a beloved character. There is no less magic in this book than in any of the others, but any distinction between serious and frivolous magic is being occluded, or even erased.
So too is the distinction between “good” and “dark” magic—or, as the magicians of the Renaissance would have put it, between magia and goetia. In the previous installment of the series, Harry Potter and the Order of the Phoenix, a group of students wants to learn how to defend themselves against possible attacks by Dark wizards, especially the Death Eaters, the most trusted servants of the greatest and Darkest of Dark wizards, Lord Voldemort, Harry’s great antagonist. They are all taking a course called Defense Against the Dark Arts, but it is useless, so they determine to study under the tutelage of Harry, who by this time has had to defend himself against the Dark Arts more than a few times. Harry’s dear friend Hermione Granger invents a way to inform people of future meetings: she enchants coins so that their serial numbers are replaced by the date and time of the next meeting of the Defense Association. Clever indeed! But the same enemy who buys Peruvian Instant Darkness Powder from Weasleys’ Wizard Wheezes learns of the trick and employs it to bring Death Eaters into Hogwarts Castle. Moreover, the meetings of the Defense Association are held in a place called the Room of Requirement, which alters its shape, size, and furnishings in order to meet the needs of the people using it; and this room is also commandeered by Harry’s enemy, again following our heroes’ example.
These are sobering events that require some reflection. In Harry Potter’s world, magic does not involve communing with spirits. (The contrast with the recent Bartimaeus books of Jonathan Stroud—in which the only power that wizards have is the power to summon and command spirits—is noteworthy.) Rowling has imagined magic as a kind of technology, but one that works only for some people. And even those people have to study and practice to be able to use the technology correctly: learning to use a wand is not so different from learning to drive a car. Like many of the technologies we are familiar with in our Muggle world, magical ones tend to be morally neutral: insofar as they have power, that power can be used for good or evil, and the greater the power, the greater its effect in either direction. So one is tempted to say that what Hermione designed for good purposes was taken by a Dark wizard and used for evil ones; but such a judgment would be too facile.
Yes, Dolores Umbridge—the Defense Against the Dark Arts teacher who, in her other capacity as High Inquisitor of Hogwarts, prohibited secret meetings—is a nasty piece of work; and yes, though she is not a Dark wizard herself, her policies aid and abet the forces of Darkness, and inhibit the ability of good wizards to combat those forces. Hermione’s little invention would seem, then, perfectly justified in the troubling circumstances; and at the time no one questions it. But here at the end of the next volume we see it in a new light: we are reminded that, after all, it was a device to ensure secrecy, to prevent the faculty and staff of the school from learning what some students were up to. And when the school is led by Albus Dumbledore rather than Dolores Umbridge, the success of such deception becomes disturbing.
Yet it must be said—and this too is a reflection prompted only by the concluding chapters of book 6—that Dumbledore himself has not only tolerated deception by Harry and his friends, he has positively encouraged it. Key to many of Harry’s secret adventures is the Invisibility Cloak that he inherited from his father—but it was actually given to him by Dumbledore, and once when Harry had lost it, Dumbledore returned it to him. Near the end of Chamber of Secrets, Dumbledore acknowledges that Harry has “a certain disregard for the rules,” but he does so with a twinkle in his eye—even though he makes this comment in listing the traits prized by Salazar Slytherin, the ancestor (literally or figuratively) of the Dark wizards that plague the wizarding world in these books. Rowling raises the possibility here that Dumbledore’s encouragement of deceptive practices by his most gifted and devoted students has been a significant mistake.
If so, it would not be his only one. In the latter pages of Order of the Phoenix Dumbledore confesses that he had withheld important information from Harry—information about the link between Harry and Lord Voldemort—for several years. He says that he did so out of concern and affection for Harry. But in fact secrecy seems to be habitual with Dumbledore. In a recent interview, Rowling made this intriguing comment: Dumbledore’s “wisdom has isolated him … where is his equal, where is his confidante, where is his partner? He has none of those things.” By the time book 6 begins, Dumbledore has recognized this problem, because he immediately begins taking Harry deeper and deeper into his confidence, trusting him more fully and even relying on him. Indeed, one of the most moving passages in the entire series occurs at a crucial moment in this book, when Harry is trying to help a Dumbledore weakened by powerful Dark magic: “It’s going to be all right, sir … Don’t worry,” Harry says. “I am not worried, Harry,” the great wizard replies. “I am with you.”
But by the time I put down Harry Potter and the Half-Blood Prince—rather hag-rid from the excitement and pain of it all—I wondered if Dumbledore had not learned his lesson too late. In the course of the book he reveals much to Harry, but when he has the chance to answer a question that has been of obsessive concern to Harry, and to many other characters, since the first book in the series, he refrains:
“Professor … how can you be sure Snape’s on our side?” Dumbledore did not speak for a moment; he looked as though he was trying to make up his mind about something. At last he said, “I am sure. I trust Severus Snape completely.”
But why, Professor, why do you so completely trust Severus Snape? That question, along with many others, most of them less consequential, will be answered in the final volume, which Rowling has said she will not begin serious work on until next year. Therefore fans of the series will have plenty of time to reflect on the disastrous (or apparently disastrous) events of book 6, and to speculate on possible ways the story could be brought to conclusion.
I find myself thinking especially of something I have already mentioned: the draining away of delight from the books, the narrowing of Harry’s horizons to a point, that point being an ultimate encounter with Lord Voldemort. At this stage in the series—the last book could of course surprise me—it is hard to imagine that there will be much room in Harry’s mind for any other thoughts. Earlier, when the threats were less immediate, when Harry could be confident in the protection of others, and when he had not yet learned of the depth and strength of his perverse bond with Voldemort, he could revel in the distractions of Quidditch, the wizard sport at which he excels; but Rowling has already said that we have seen a Quidditch match for the last time. In book 6 the only real refuge from war with the Dark Lord is found in adolescent romance; and that, while often funny, is never felt by those who experience it as light-hearted pleasure.
More dismaying is the book’s suggestion that Harry (and therefore the story) may not return to Hogwarts at all. Now, I strongly suspect that it will be necessary for Harry to return to Hogwarts in one way or another—he needs to return to the Room of Requirement, I think, and there may be some relics of the Hogwarts founders that he should investigate—but if he did not, there would be a great gaping hole at the heart of the book, because Hogwarts has been a key character in the books, and almost as central to the series as Harry himself. In any case, that such a suggestion can even be made indicates the seriousness of the crisis that has come upon Harry and the whole wizarding world. Everything is expendable except struggle with the Dark Lord; and everything that pleases us can be used by the forces of evil for their own purposes.
This foreclosure of possibilities for Harry, the narrowing of his world to a single dreadful task, is an exaggerated and intensified version of what growing up is for everyone. As Robert Nozick once wrote, “Although [young people] would agree, if they thought about it, that they will realize only some of the (feasible) possibilities before them, none of these various possibilities is yet excluded in their minds. The young live in each of the futures open to them … Economists speak of the opportunity cost of something as the value of the best alternative foregone for it. For adults, strangely, the opportunity cost of our lives appears to us to be the value of all the foregone alternatives summed together, not merely the best other one. When all the possibilities were yet still before us, it felt as if we would do them all.” The “opportunity cost,” for Harry and for many others, of defeating Voldemort is terrifyingly high. Handled in a certain way, the denouement of this story could confirm every child’s worst suspicions of what it means to grow up.
But I do not think that Joanne Rowling wants to say that adulthood consists in foregoing all delight, all leisure and playfulness, and that young people had better get used to it. Rather, she is showing that there are times when some people, at least, must forgo such pleasures so that they may be retained, or regained, by others. And it is at this point that the comparisons between Rowling’s books and The Lord of the Rings—comparisons that I have tended to dismiss—begin to ring true. Reading the last pages of Harry Potter and the Half-Blood Prince, I found myself hearing in my head some of the last words Frodo utters to Sam: “I tried to save the Shire, and it has been saved, but not for me. It must often be so, Sam, when things are in danger: some one has to give them up, lose them, so that others may keep them.” Harry has indeed given up many things: all the delights of Rowling’s imaginative world that I have mentioned, and many more. We are left to wonder whether he must give them up permanently, or whether, his quest complete, he will remain whole enough to reclaim them.
Four people very dear to Harry have died trying to protect him from Lord Voldemort, and at the end of book 6 he is determined that no others shall do so. From this point on he will move forward alone: he ruthlessly, if regretfully, cuts as many ties as he can. But—here again we are reminded of Tolkien, of the refusal of Merry and Pippin and (above all) Sam to abandon Frodo—Ron and Hermione make their position clear: “We’re with you whatever happens.” I expect that the final book of the series will pay proper homage to the first one, in which the skills of all three friends were necessary to prevent Voldemort from claiming the Philosopher’s Stone and thereby achieving endless life. Which is another way of saying that I believe that Voldemort will, in the end, be defeated.
But what will be the cost of victory, to Harry and to those he loves? I am not confident that Harry, Ron, and Hermione will all survive the seventh book. But even if they do, I wonder what the agon will do to them. I especially wonder what will be left of the Harry Potter we first met almost a decade ago. Let us meditate on this: in each of the two most recent books in the series, Harry has tried to use an Unforgivable Curse, each time on a person whom he has great reason to hate. Yet he has been unable to perform the curses, because his heart is not in them, his will is not fully behind them. “You need to mean them, Potter,” says one enemy; “You need to really want to cause pain—to enjoy it.” “No Unforgivable Curses from you, Potter!” says the other of Harry’s intended victims. “You haven’t got the nerve or the ability.” Harry, for all the misery and loss he has suffered—perhaps because of all the misery and loss he has suffered—finds it impossible to summon and will true hatred. Without that will, without that hatred, will he be able to do what he knows he must do: kill Voldemort? It seems unlikely. But would a Harry who can summon the hatred to kill, even if the Dark Lord himself is the victim, still be the Harry Potter we have come to love?
In the early books in the series—indeed up through the fifth book—the obvious and recurrent historical analogue to the story is the beginning of World War II: the Minister of Magic, Cornelius Fudge, is a Neville Chamberlain figure, an appeaser, in denial about the real state of affairs even though all the evidence is right before his eyes; while Dumbledore (in a kind of “political wilderness” at Hogwarts) is the clear-eyed, straight-talking Churchill of the tale. But the sixth book treats life, not in conditions of open battlefield warfare or air assault, but under the constant but unpredictable threat of terrorism. Thus the debate at the end of the book about whether Hogwarts should remain open: some want it closed to protect the students, while others argue that the students would be safer at Hogwarts than at home, and in any case, they say, the supporters of Voldemort must not be given the satisfaction of knowing that they had closed the school. (This strongly resembles the debates that go on in, say, Israel—though Israelis seem almost fully to have chosen the second option, opting for at least the semblance of normalcy no matter what.)
Rowling denies conscious reference to the current historical moment, and indeed her description of this new wizards’ war seems mandated by the intrinsic shape of the story—by the necessary form of Voldemort’s rebellion. Still, the first chapter of this book is rather eerie: Cornelius Fudge and his successor show up in the office of the Muggle Prime Minister, who is troubled by a series of strange and destructive events. When he learns that these are not accidents or natural disasters, but rather the work of Voldemort and his Death Eaters, he splutters, “But for heaven’s sake—you’re wizards! You can do magic! Surely you can sort out—well—anything.” To this, Cornelius Fudge, with a wan smile, replies, “The trouble is, the other side can do magic too, Prime Minister.” (Rowling held a midnight book-release party at Edinburgh Castle on July 16, and had originally planned to read this chapter to the children whom she had invited; but the then-recent Underground bombings in London caused her to decide instead on a chapter from an earlier book.)
Therefore, the great question facing readers who look forward to the seventh and last Harry Potter book is not just which side will win, but which magic will triumph. Dumbledore has always fought Voldemort through overt and covert action—again, his honesty and courage counter Fudge’s head-in-the-sand befuddlement—but he has refused to fight on Voldemort’s terms, always refraining from Dark magic (like the Unforgivable Curses). But the effectiveness of that noble refusal now seems to have been called into question. As Harry moves towards his final confrontation with Voldemort, he, by contrast, seems determined to use the weapons of evil against evil. But what does it profit a man to defeat the Dark Lord but lose his soul?
Alan Jacobs is professor of English at Wheaton College. He is the author most recently of The Narnian: The Life and Imagination of C. S. Lewis, just published by HarperSanFrancisco.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Harry S. Stout
The Mind of the Master Class is a masterpiece.
The Mind of the Master Class: History and Faith in the Southern Slaveholders' Worldview
Elizabeth Fox-Genovese (Author), Eugene D. Genovese (Author)
Cambridge University Press
824 pages
$41.84
In The Mind of the Master Class: History and Faith in the Southern Slaveholders’ Worldview, Elizabeth Fox-Genovese and Eugene Genovese embark upon a task of rehabilitative intellectual history remarkably similar to that undertaken by the Harvard historian Perry Miller in the 1930s. Both have chosen a decidedly unfashionable subject for serious study. In the 1920s and ’30s, the Puritans were the bêtes noires of serious American culture. When he began work on the two-volume opus that would become The New England Mind, Miller recalled in the foreword to the 1954 edition,
Oddly enough, I found myself driven to study the structure of the original Puritanism of New England in a time when the perverse tendencies of the American sensibility were most excited against my subject. All around me, in the 1920s, I was being shown by pundits and philosophers whom I respected, that “Puritanism” was the source of everything that had proved wrong, frustrating, inhibiting, crippling in American culture.
In his magisterial reassessment Miller came to the opposite conclusion. Far from being incidental or marginal to “serious” American intellectual history, the Puritans represented “one of the major expressions of the Western intellect” in American culture. Whatever feelings of personal revulsion or disagreement Miller harbored for his subjects (and, as a self-confessed atheist, he certainly harbored some), he recognized that an enormously significant component of America’s cultural and intellectual legacy had been missed by his smugly superior intellectual peers.
In 21st-century America, antebellum Southern slaveholders are the new Puritans, who stand for everything that is repulsive in American history. Racist, violent, misogynist, willing to destroy the nation to preserve their “peculiar institution,” slaveholders in post-civil rights movement America are about as politically incorrect a subject for sympathetic study as any scholar could choose to explore. “To modern sensibilities,” the Genoveses recognize, “it is a preposterous idea that a slave system could engender admirable virtues. … In our own time it seems perverse, not to say impossible, to try to separate the horror of slavery from the positive features of an ordered and independent social system.”
Yet, like Miller, the Genoveses have chosen to invest years of significant research into reconstructing the slaveholders’ intellectual world and its place in the larger currents of Western thought. They do not come to their subject as fellow-believers caught up in some neo-Confederate madness, and in fact have written often and compassionately on the inhumanity of slavery. But still they persist in their intellectual project. In so doing, they disentangle the “horror” of slavery from the genuine virtues of a corporate social ethic that has virtually disappeared in modern industrial America. As well, they issue a powerful critique of northern conceits by showing how the defeat of the Confederacy meant not less racism, but more. Northern victory promoted a “new racism” that empowered the American white race “to rule the world, civilize the heathens of Asia, Africa, and Latin America, and rightfully put them to work for the master race.”
The parallels between Miller’s intellectual history and the Genoveses’ go beyond engaging hostile intellectual cultures to encompass remarkable similarities in style, method, and argument. First, style. Both Miller and the Genoveses adopt a style of discourse and argumentation that might best be labeled bombastic. For both, ideas are not trivial matters for casual talk at cocktail parties but utterly serious pursuits worthy of being treated in life-and-death terms. Occasionally humor appears, but usually in the service of satire or reductio ad absurdum arguments. Nor are they shy about putting forward their interpretations or belittling their opponents, both historical and contemporary. It is a style of discourse whose roots are ultimately medieval, grounded in what the Jesuit scholar Walter J. Ong termed “agonistic structures,” in which words are weapons. In these embattled terms, the really critical question becomes, do they fight fair? And to this reviewer, the answer—in both cases—is yes.
In terms of methodology, the Genoveses, like Miller, are fanatical scholars and researchers. In a mark of characteristic hubris, Miller refused to footnote all of his sources, but when those sources were independently compiled years later, the list confirmed that he had read virtually the entire canon of Puritan texts—mostly sermons—before the age of Evans micro-cards and photocopiers. A scholar’s scholar, Miller ransacked the primary sources, achieving a depth of understanding and knowledge that no one else of his time—including Christian clergy and theologians—could begin to approach. Implicitly, Miller’s prodigious archival research challenged the historical community to match his industry and consequent interpretation—or shut up.
Central to the “mind” that Miller elucidated were classical Western history and Protestant Christianity. By looking at Cambridge and Harvard, as well as Puritan literature, Miller described a culture of enormous philosophical erudition, well steeped in the Christian classics, the ancients, and the Renaissance and Reformation. One figure in particular loomed large over the intellectual enterprise: the French Protestant philosopher Peter Ramus, whose new system of “logic” (really rhetoric) presented “reality” as it existed in the mind of God. That reality became the organizing device for Puritan preaching and social engineering.
The depth and range of the Genoveses’ exploration of Planter intellectual culture and education is no less thorough and encyclopedic. Like Miller, they probe deeply into the antebellum world of sermons and theology, and like Miller, they also examine higher education and the authors read and studied by the slave-holding élites. In the Plantation South, no less than in Puritan New England, public culture was defined by a learned mix of classical history and theology. The ancients were read widely in the South, and knowledge of Greek and Latin was a highly valued skill that any gentleman should possess. The medieval Schoolmen were also read with approval, despite their Roman Catholic context. (By war’s end, some Southern intellectuals were actually wondering if the Reformation—with its individualistic ethos—was a good thing after all.) “Modern” philosophers from Hume to Locke were read, critiqued, and integrated into a distinctive Southern worldview that privileged the corporate and hierarchical social ethic upholding slavery as a positive good.
Theologically, Southern intellectuals tended to take their cues from Presbyterian theologians like James Henley Thornwell, Robert Dabney, Thomas Smyth, and Moses Hoge, who all defended slavery and the larger planter household it supported with carefully grounded Scriptural arguments and precedents. Indeed, in a fascinating—and certainly provocative—evaluation of Northern and Southern biblical arguments over slavery, the Genoveses argue that the South got the better of the argument if one stuck to the literal “Word” of the biblical text in contradistinction to some vague “Spirit” of Scripture based on abstract understandings of neighbor love or the Golden Rule: “To speak bluntly: The abolitionists did not make their case for slavery as sin—that is, as condemned in Scripture. The proslavery protagonists proved so strong in their appeal to Scripture as to make comprehensible the readiness with which southern whites satisfied themselves that God sanctioned slavery.”
Of course, the notion of a “literal” understanding of Scripture is itself embedded in theological debates stretching from the patristic era to the present, and the Genoveses rightly concede that the slaveholders’ hermeneutic would have relatively little bearing on 21st-century debates. (Even among self-identified fundamentalists you’d be hard pressed to find churchmen invoking St. Paul to defend present-day slavery in West Africa and Southeast Asia, where a conservatively estimated 12.5 million human beings—mostly women—suffer in bondage.) But in the antebellum South, clergy of all persuasions condemned Northern critiques of slavery as sinful, and some went so far as to suggest that the institution would continue in heaven.
If the worldview of antebellum Planters was remarkably similar to that of colonial Puritans (including the practice and acceptance of slavery), the Genoveses assert baldly that the same could not be said of the Puritans’ Northern intellectual descendants. The reasons? Slavery and capitalism. While antebellum Planters remained strongly anchored in a hierarchical, patriarchal, and orthodox Protestant past, Northerners moved ever more in individualistic and “liberal” directions which, the Genoveses argue, were rendered necessary by the evolving market economy that undermined all sense of community (theological and practical). Antebellum Planters, like the Puritans before them, understood the extended family (including slaves) as a “Little Commonwealth”—the indivisible unit of society. Northern Protestants, steeped in Lockean epistemology, the egalitarian rhetoric of the Declaration of Independence, and a humanized Christ, bereft of all sense of sin and judgment, privileged the individual as the basic unit of society.
When one adds the Genoveses’ interpretation of post-colonial Northern Protestant sell-outs to capitalism, the similarities in interpretation between Puritans and Planters become even stronger. Both Miller and the Genoveses tell a story that features what Miller termed a “jeremiad” for a “lapsed” America. In Miller’s telling, “declension” began as early as the second generation, among children who could not measure up to the orthodox giants the Founders had been. In the Genoveses’ account, Puritan orthodoxy in the North held at least through the hyper-Calvinism of Jonathan Edwards in the mid-18th century, but then, sure enough, declension set in and formerly orthodox Calvinists capitulated to the sirens of rationalism, capitalism, and faulty biblical exegesis.
Both Miller and the Genoveses emphasize declension over and against persistence and continuity. Miller saw a flip side of declension: the boon of Americanization. Even as the Puritans declined in the 18th century, Miller argued, their democratic spirit lived on and informed the great national transition “from Puritan to Yankee.” For the Genoveses, in contrast, there is no flip side: Northern declension led to unfettered capitalism and the loss of continuity with an organic past. In the “knotty theological debates” of 19th-century America, the Genoveses argue, divergent material realities and social constructions shaped divergent theologies in ways more significant than any surface similarities:
No simple dichotomy between Trinitarianism and social corporatism versus anti-Trinitarianism and individualism would bear examination, but a tendency toward correlation does exist and did exist in the minds of orthodox Southerners. Anti-Trinitarianism correlates nicely with the bourgeois individualism of modernism, whereas revolts of both the antibourgeois Left and Right have repeatedly fallen back on Trinitarian theology.
But as brilliant as Miller and the Genoveses are, one must ask if their jeremiads stand up to scrutiny. Miller’s declension model never could explain Jonathan Edwards, whom in a later biography he portrayed as almost divine, nor could it account for the pervasive religiosity and orthodoxy that characterized Congregational churches down to the Revolution. Nor can the Genoveses’ declension model explain the 19th-century Northern neo-Edwardseans, be they of the “New England Theology” or of the “Old School” Presbyterian orthodoxy represented by Charles Hodge at Princeton Theological Seminary.
To document the supposed lapse of 19th-century Northern Protestants into Unitarianism or proto-Unitarianism and “liberalism,” the Genoveses rely less on the sort of exhaustive analyses of primary sources that accompany their treatment of the South than on the dated work of historical theologian Joseph Haroutunian, Piety Versus Moralism: The Passing of the New England Theology, which argued that Northern Protestantism “drenched its theology with humanism.” Haroutunian’s thesis was certainly provocative and appealed to a coterie of neo-orthodox theologians lamenting the rise of liberalism in their own generation. But it simply does not hold up under rigorous analysis. Many—perhaps most—Protestant clergy in the North did not embrace abolitionism but, like Hodge, argued that slavery was not a sin per se. As for their supposed latent universalism and proto-Unitarianism, these simply did not exist. Recent and painstaking research by scholars such as Joseph Conforti, David Kling, and Douglas Sweeney reveals an amazingly consistent “Edwardsean” tradition. In sum, relying on Haroutunian to characterize 19th-century theology in the North is roughly analogous to invoking Ulrich B. Phillips on the subject of slavery and the plantation household over and against the more recent scholarship of Eugene Genovese and Elizabeth Fox-Genovese. It simply doesn’t compute.
In discussing the South—the main theme of the book—the Genoveses are on much firmer ground. But even here, one yawning gap remains, namely the Civil War, or, as they call it, “the War for Southern Independence.” References to “the War” recur in the book, to be sure, but with little amplification or analysis. Were the Genoveses to pursue more extensively the arguments rendered in the crucible of war, it would become apparent that the similarities between Northern and Southern Protestants were more significant than they assume.
Both adhered to a Christian orthodoxy grounded in Puritan notions of a “covenanted nation,” a chosen people who, by virtue of their orthodoxy, could claim God was on their side. With that claim, both sides could enter into the bloody business of killing one another without any effective limits on the butchery. Since both sides read their Old Testament alongside their Puritans to arrive at an identical national jeremiad, they knew that defeats were only calls to reform and repentance—which, when properly pursued, would induce a covenant God to render victory to “His” people. Only when we recognize the profound similarities between Northern and Southern Protestantism does it become possible to understand the ferocity of the conflict. In fundamental ways, the Civil War represented a fratricidal war between two similarly grounded theologies struggling for the soul of the continent.
The originally Puritan idea of a covenanted Christian nation dominated Confederate discourse. Indeed, in a rare slip, the Genoveses assert that the Confederate clergy mounted an “unsuccessful campaign … to declare the Confederacy a ‘Christian society.’ ” Sorry, but nothing could be further from the truth. Confederate pastors and moralists enjoyed an immense rhetorical advantage over the North because of their Christian Confederate Constitution. And they didn’t hesitate to exploit it. Following the resounding Confederate victory at First Bull Run, the Rev. Edward Reed preached a thanksgiving sermon at Flat Rock, South Carolina, and reiterated the stock Confederate truism that the Federal Constitution was fatally secular: “Whether through inadvertence, or, as is unfortunately more probable, from infidel practices imbibed in France by some members of the Convention … it contained no recognition of God. Our present Constitution opens with a confession of the existence and providence of the Almighty.”
Much to their dismay, orthodox Northern Protestants found themselves forced to agree, and issued repeated calls for a constitutional amendment identifying America as a “Christian nation.” So adamant were they in seeking to emulate the South with a constitutional amendment that a desperate Abraham Lincoln threw them a sop with piously phrased proclamations of fasting and thanksgiving and a new motto, “In God We Trust.” Ironically, when Lincoln determined that his new theocentric motto would be stamped on the nation’s coinage, he inadvertently fashioned a telling symbol of the North’s conflation of capitalism and Christianity.
Unfortunately, the Genoveses have relatively little to say on the actual war years except to romantically describe Confederate ministers as “prophetic” in their “brave efforts” to endorse a just but humane war. No doubt there were individual ministers who lived up to this description, but they were not representative. Far more typical was the case of Robert Dabney, whom the Genoveses frequently cite. Dabney, a famed preacher and theologian, served as Stonewall Jackson’s aide-de-camp and chaplain. In matters theological he was rigorously orthodox, but in terms of war, he proved as bloodthirsty as any. Preaching at the funeral of a fallen comrade, Lieutenant Abram Carrington, Dabney singled out the young men in his congregation for a ringing affirmation of hatred and blood revenge:
Let me exhort the young men of this community to be “followers of him [Carrington] as he also was of Jesus Christ.” And especially would I now commend by his example, the sacred and religious duty of defending the cause for which he died. … Surely [his] very blood should cry out again from the ground, if we permitted the soil which drank the precious libation, to be polluted with the despot’s foot! Before God, I take you to witness this day, that its blood seals upon you the obligation to fill their places in your country’s host, and “play the men for your people and the cities of your God,” to complete the vindication of their rights.
With rhetoric like this, emanating from the Southern pulpit no less than the Northern, it becomes clearer how the war could continue until there were no more bodies to sacrifice on the altars of their nations. Calvin’s God, after all, was on their side.
Whatever quibbles readers will have with The Mind of the Master Class, the book represents a stunning tribute to the power of the mind in American culture and the central role that religion has played in that mind, for better and for worse. Humanitarian and liberal wishes to the contrary notwithstanding, scholars of American cultural and intellectual history will now have to reckon with Planter ideology alongside Puritanism as major expressions of the Western intellect in American history. One can only hope that this powerful book will lead to a renaissance in constructive scholarship as far-reaching as that which flowed from Perry Miller’s achievement seventy years earlier.
Harry S. Stout is Jonathan Edwards Professor of American Religious History at Yale University. He is the author most recently of Upon the Altar of the Nation: A Moral History of the American Civil War, forthcoming from Viking in 2006.
1. Joseph Haroutunian, Piety Versus Moralism: The Passing of the New England Theology (Harper & Row, 1932).
2. See Kenneth P. Minkema and Harry S. Stout, “The Edwardsean Tradition and the Antislavery Debate, 1740-1865,” Journal of American History, Vol. 92 (2005), pp. 47-74; or Mark A. Noll, America’s God: From Jonathan Edwards to Abraham Lincoln (Oxford Univ. Press, 2002), pp. 414-15.
3. See, e.g., Joseph A. Conforti, Samuel Hopkins and the New Divinity Movement: Calvinism, the Congregational Ministry, and Reform in New England between the Great Awakenings (Eerdmans, 1981); Douglas Sweeney, Nathaniel Taylor, New Haven Theology and the Legacy of Jonathan Edwards (Oxford Univ. Press, 2003); or David W. Kling, Field of Divine Wonders: The New Divinity and Village Revivals in Northwestern Connecticut, 1792-1822 (Penn State Univ. Press, 1993).
4. Edward Reed, A People Saved by the Lord (Charleston, 1861), p. 9.
5. Dabney’s sermon was reprinted in the [Richmond] Central Presbyterian, March 12, 1863.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Culture
Peter T. Chattaway
Evidence that demands a verdict.
For some, the existence of evil is one of the great arguments against the existence of God; for others, it is one of the great arguments in his favor. Many films about demonic possession and exorcism fall into the latter camp, and the film that defines this genre more than any other is, of course, William Friedkin’s The Exorcist (1973). Based on the bestselling novel by William Peter Blatty, it draws a strong contrast between modern scientific rationalism—depicted as cold, harsh, and mechanical, a view of the world that reduces body and mind to a mere collection of parts—and a more traditional worldview that boldly affirms the supernatural. Ironically, while there is something dehumanizing about the medical treatment that a possessed young girl is subjected to, the demonic possession itself affirms her personhood, as well as the reality of a mysterious unseen world beyond what science can prove or explain. And Blatty’s original novel makes a point of linking the cosmic conflict to more familiar forms of evil, reminding us that evidence of this spiritual battle is before our eyes all the time. The novel begins with a page that cites the Holocaust, the persecution of Christians, and similar examples of real-world cruelty, as if to say, Why do we need a “sign” such as demonic possession in order to believe that this struggle is real?
Nevertheless, this generation asks for signs, and writers and artists step up to provide them. Blatty called Mel Gibson’s The Passion of the Christ “a tremendous depiction of evil,” and Thomas Hibbs, author of Shows About Nothing: Nihilism in Popular Culture from “The Exorcist” to “Seinfeld,” noted that Gibson’s film, like Friedkin’s, set “primitive” faith against the smug skepticism of post-Enlightenment culture.1 The latest example is The Exorcism of Emily Rose, directed by the openly evangelical Scott Derrickson from a script he co-wrote with Paul Harris Boardman, who is more skeptical; the longtime writing partners joke that theirs is a Scully-Mulder sort of relationship, with Boardman providing the doubts that complement Derrickson’s beliefs. The two have collaborated on several screenplays, primarily horror films like Dracula 2000, Urban Legends: Final Cut, and Hellraiser: Inferno (all 2000), the last of which Derrickson also directed. For his part, Derrickson has said that his interest in this genre is fueled by C.S. Lewis’ The Screwtape Letters—a didactic but entertaining collection of letters written by a senior devil to one of his underlings—and Walker Percy’s Lancelot, in which the protagonist says the search for something “purely evil” is “the only quest appropriate to the age.”
But if The Exorcist responded to modernity by taking us back to a premodern sensibility, The Exorcism of Emily Rose forges ahead into the even murkier waters of postmodernity. The Exorcist was the story of a demon-possessed girl, but The Exorcism of Emily Rose is the story of people who tell the story of a demon-possessed girl—and competing versions of her story, at that. All of this is complicated further by the fact that the film, which is loosely based on actual events, blurs the line between reality and fiction. Between 1968 and 1976, a young Bavarian woman named Anneliese Michel experienced symptoms that she came to believe were a sign of demonic possession. Eventually the local Catholic bishop authorized an exorcism, which lasted several months—but she died of malnutrition and pneumonia, and her parents and two priests were tried and found guilty of negligent manslaughter. The film preserves and transmits a number of the facts involved in Michel’s case, but revises many of them and adds its own fictitious details, too; and then, in the closing titles, it speaks in the past tense of the film’s characters as though they themselves had actually existed. So the movie encourages the viewer to seek the truth behind cases of possession like Michel’s, but it also gives the viewer one more screen of fiction to cut through in search of that truth.
The Exorcist, as Hibbs notes, was a hybrid of sorts: part classic horror film, part murder mystery. Likewise The Exorcism of Emily Rose, in which classic horror elements are framed within the narrative conventions of a courtroom drama. Father Richard Moore (Tom Wilkinson), the priest who has been charged with criminal negligence in the death of Emily Rose (Jennifer Carpenter), cares little about his freedom or reputation; he just wants to tell Emily’s story. Erin Bruner (Laura Linney), the lawyer assigned to his case, cares more about legal strategy, and initially takes the job because she thinks it will help her make senior partner at her law firm. Father Moore comes across as naïve and uncritical; the Catholic church is officially skeptical with regard to claims of demonic possession until certain criteria have been met, but if Father Moore ever subjected Emily to that process, we don’t see it. Erin, however, is an agnostic who drinks too much, keeps a Carl Sagan book on her bedside table, and speaks glibly about the way she recently defended a killer who is now “sunbathing on a Miami beach.” Naturally, it isn’t long before cracks begin to show in her cynicism.
Their opponents include prosecutor Ethan Thomas (Campbell Scott), an active Methodist with a reputation for being anything but a choirboy in the courtroom, and a couple of medical experts, who come across as arrogant or at least a little too sure of their superior knowledge. Ethan himself not only doubts Father Moore’s story, he is openly hostile to the priest and other witnesses for the defense, which makes him a less than appealing representative for those who may share his skepticism. And when Ethan tells the jury that Father Moore’s beliefs are rooted in “archaic and irrational superstition,” one cannot help but wonder if he is also meant to represent Protestant hostility towards certain kinds of Catholic belief. The film distances us from Ethan in other ways, too. While we share certain private moments with Erin, Father Moore, and Emily herself, we never see Ethan outside the public spheres of the courtroom or the bar where the lawyers gather and sometimes do business—and where, in yet another distancing move, Ethan turns down an offered drink and asks for water instead.
So while the film does present arguments for both sides of the case, the viewer is still aware that the conversation is being steered in certain directions. Every time a witness describes the strange phenomena Emily saw, the voices that came from her mouth, or the contortions her body went into, another witness offers a scientific or naturalistic explanation, and it is left to the viewer to decide which of these explanations makes the most sense. Often, both explanations are depicted in flashback sequences, but the film has been sold as a horror movie, so the more sensational flashbacks are longer and better developed. Even so, the filmmakers must have sensed that the courtroom scenes were outweighing the scary flashbacks, and so Erin is haunted by strange phenomena too, some of which—like the way spooky things keep happening at 3:00 am—feel a bit hokey.2 A subplot involving a frightened psychiatrist (Duncan Fraser) who witnessed the exorcism and dithers on whether to testify in Father Moore’s defense also falls back on clichés, such as a car accident that happens at the worst possible moment.
However, it would be wrong to say that The Exorcism of Emily Rose offers a clear apologetic for the faith. In fact, there is quite a bit to this story that might give a Christian pause. Dr. Adani (Shohreh Aghdashloo), a cultural anthropologist who specializes in demonic possession, testifies that Emily died not because her priest told her to abandon her medical treatment, but because the drugs the doctors gave her interfered with the “psycho-spiritual shock” that exorcism is supposedly intended to provide. The viewer may be gratified to see the medical establishment’s logic turned on its ear, but is this not another naturalistic explanation for what is supposed to be a supernatural matter? Does the power of Christ compel demons only when chemicals stay out of the way? We are also told that Emily was a devout Catholic, but many Christians would assert that baptized, Spirit-filled believers cannot be possessed by demons. Father Moore goes even further and says that Emily will one day be recognized as a saint precisely because she was possessed by demons. He bases this on Emily’s claim to have seen the Virgin Mary, after which she experienced the stigmata.3 In a letter to Father Moore, Emily says the Virgin offered to take her into the afterlife, but she chose to stay behind and cope with the demons instead—and to refuse further treatment, including further rites of exorcism. “People say that God is dead,” Emily writes, “but how can they say that if I show them the Devil?”
Thus the film spells out what was only implicit in The Exorcist: by proving the reality of evil, we can prove the existence of God. But there are problems with Emily’s argument, not the least of which is that many cultures have believed in demons and wicked spirits without believing in the Almighty God of Judeo-Christian faith. (In the film itself, this point is underscored by Dr. Adani’s cross-cultural testimony.) I am also reminded that Linney starred in another recent movie about alleged real-life supernatural events, The Mothman Prophecies (2002). That film was based on a book by occult specialist John A. Keel, which concludes with a quote attributed to Charles Fort, the collector of bizarre stories whose accounts of raining frogs inspired P.T. Anderson’s Magnolia (1999): “If there is a universal mind, must it necessarily be sane?” Thirty years ago, The Exorcist told a modern, mechanized world that the spiritual world is real. But today’s postmodern world might need to hear something slightly different. Getting people to believe in the supernatural realm is one thing. Getting them to believe in God is something else.
Peter T. Chattaway lives in Canada and writes about movies.
1. Thomas Hibbs, “The Horror & The Passion,” National Review Online, April 9, 2004; http://www.nationalreview.com/hibbs/hibbs200404090651.asp
2. Perhaps demons really have adapted to modern clocks and become so punctual, but scenes like these always bring to mind that moment in End of Days (1999) when Arnold Schwarzenegger asks which time zone a prophecy refers to.
3. Interestingly, the film’s very first shot—of blood dripping from a barbed-wire fence—is taken from the skeptical prosecutor’s explanation for the wounds on Emily’s hands.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Matthew Lundin
The debate over confession in 16th-century Germany.
In the Presbyterian church of my youth, we were admonished each week to confess our sins silently to God. To an impressionable young mind, a minute or so of silence could provoke all sorts of uncomfortable speculations—whether I had committed the unforgivable sin, whether I had confessed all of my sins, why it was I could not focus my attention on God. To me, these were great, lonely pauses, despite the soft coughs and sniffling. Each week, however, the silence was always quickly followed by a warm reassurance of God’s free grace and forgiveness. Though I may have been theologically naïve, I knew intuitively, even then, that this proclamation of grace was the main thing—that its validity did not depend on my secret worries.
The Reformation of the Keys: Confession, Conscience, and Authority in Sixteenth-Century Germany
Ronald K. Rittgers (Author)
Harvard University Press
330 pages
$93.00
Like many who grew up Protestant, my experience of confession has been of a general or purely private sort—either a liturgical reading or a silent prayer. Only Catholics, it seemed, were required to confess their sins to a priest. To the best of my memory, I do not recall having heard a Protestant sermon on the passage in Matthew—the traditional proof-text for Catholic confession—in which Jesus entrusts Peter with the “keys to the kingdom of heaven.” I may have heard one, but the exotic powers invoked in the verse—”whatever you bind on earth will be bound in heaven, and whatever you loose on earth will be loosed in heaven”—were no doubt too subtle and obscure to remain long in my mind.
There was a time, however, when the issue of the “keys” in general—and private confession in particular—aroused great interest among Protestants. In The Reformation of the Keys: Confession, Conscience, and Authority in Sixteenth-Century Germany, Yale University historian Ronald Rittgers tells a fascinating story of how the first generation of Protestants in the German city of Nürnberg struggled to determine exactly what role private confession and absolution should have in their new church orders. Because English-language sources on the subject are scarce, it may come as a surprise to many readers that German Lutherans developed a distinctively evangelical form of private confession, which lasted into the 18th century. In The Reformation of the Keys, we can relive one 16th-century city’s struggle to work out the practical and institutional implications of the new Protestant doctrine of grace, particularly as it related to confession and clerical authority.
“I will give you the keys of the kingdom of heaven.” Jesus’ words to Peter had momentous consequences for Western civilization. For centuries, the Church in the West would use them to support its exclusive claims to religious authority. It alone could bind and loose sins; it alone could dispense the divine grace necessary for salvation. On this foundation was built the powerful and omnipresent Church of the Middle Ages, able to hold sway over princes and emperors and to rouse all of Western Europe to crusade. The power of the “keys” also justified the medieval Church’s interest in the most intimate details of everyday life, especially through the sacrament of confession—the “second plank” thrown to drowning Christians when they sinned after baptism.
During the Middle Ages, the Church remained more or less secure in its possession of these powers. Though ecclesiastical authority suffered severe setbacks during the 14th and 15th centuries, only in the 16th century did Protestant Reformers radically challenge the traditional interpretation of the “keys”—and with it the sacerdotal authority the medieval clergy had claimed for itself. If God’s mercy was fully and freely available through faith, then the clergy’s traditional right to parcel out grace was a human fiction, abused for the sake of money, power, and status. Emboldened by this new interpretation, communities throughout Europe began dismantling the innumerable institutions, penitential rites, and clerical prerogatives that had clustered around the Church’s monopoly on forgiveness.
The German city of Nürnberg, the setting of Rittgers’ book, was one of the first European locales to enact these revolutionary changes. In the early 16th century, Nürnberg was a self-governing community with a population of approximately 40,000—one of the largest and most powerful cities in the fragmented Holy Roman Empire.
As in many other German cities, the force of the Reformation in Nürnberg had a great deal to do with its powers of negation. Though Luther’s theology offered a new, positive vision of grace, it gained strength and popularity largely as a rejection—both of traditional clerical authority and of the elaborate system of penances, pilgrimages, processions, and pious bequests through which medieval Christians had sought to appease God. In the new “evangelical” doctrine, grace was not in the possession of a clerical caste, to be dispensed incrementally during each sinner’s lifetime. Rather, it was offered freely and fully to all sinners. This new vision radically simplified religious life. The effect seems to have been something akin to a flood sweeping over fields crisscrossed by an elaborate irrigation network. Abundant waters made the old, grooved channels irrelevant. At times, they threatened to destroy the fields altogether.
By the mid 1520s, just a few years after the publication of Luther’s 95 Theses, the Nürnberg city council had dismantled much of the traditional religious system. The magistrates abolished penitential rites, shut down the city’s main monasteries, and took control of poor relief and marriage (two jurisdictions that had previously belonged to the Church). The secretary to the city council, Lazarus Spengler, a layman, wrote a theological treatise, An Apology for Luther’s Teaching (1519), in which he attacked medieval churchmen as “preachers of fables,” who taught the laity that they could buy their way out of purgatory as at a “fair” or “merchant’s market.” Reformers aimed to convince Nürnbergers that they need not rely on the elaborate penitential rites they had learned as youths.
After depriving the clergy of their traditional monopoly on grace, however, the city was left to determine exactly what authority a new evangelical pastorate would possess. How much power would they have to drive sinners to repentance and bring them safely into grace? For Protestants, of course, it was primarily the preached Word that enacted and promoted the sinner’s rebirth, by leading her to trust God’s free promise of forgiveness. Even so, a great quarrel arose in Nürnberg about the role private confession should play in bringing about true faith and salvation.
Medieval theologians had vigorously debated exactly how the clergy administered grace, but almost no one questioned that priests alone could administer divine forgiveness. Before laypersons took communion each year during Holy Week, they were required to confess all their sins to a priest, receiving from him absolution as well as penances for repaying God and shortening one’s time in purgatory. Rittgers stresses the mercantilistic aspects of the traditional system. For late medieval men and women, purgatory was a sort of “debtors’ prison.” Penance was a means of settling one’s spiritual account with God. If we extend these metaphors further, we might liken the priest to a banker or cashier—the middleman without whom these transactions could not take place.
For the new Protestants of Nürnberg, there was no longer any need to pay God back for past sins. The gospel proclaimed a free and full remission of sins and their penalties; all that was required of the sinner was heartfelt trust in God’s promises. By the 1520s and 1530s, however, some of Nürnberg’s leading theologians began to wonder whether the city’s residents were abusing their newfound spiritual freedom, taking it as an excuse for indifference and license. Many residents of the city had simply stopped going to confession. Some preachers argued that more direct and immediate clerical authority was necessary to guide sinners to a correct faith.
Luther himself was a strong advocate of private confession. For him, it was a source of invaluable consolation—reassurance that the gospel was truly pro me (“for me”). In some ways, Lutheran confession was meant to be an antidote to the penitential mentality of late medieval confession. Congregants need not confess all their sins, but only make a general statement of sinfulness; the pastor was to offer unconditional absolution. There was no need for further penance. Where the medieval sinner was to be kept suspended “between hope and fear,” Lutheran confession was meant to instill absolute confidence in personal salvation.
During the late 1520s and early 1530s, Nürnberg magistrates and theologians developed and then promulgated a new church order reflecting these new ideas. Like Luther, civic leaders were reluctant to do away with private confession altogether. Fearing that congregants might partake of the Lord’s Supper unworthily, the new church order required pastors to interview Christians privately before communion. This was no Catholic interrogation. Rather, the interview was intended primarily to gauge the individual’s knowledge of the “evangelical” message, since faith for Protestants depended on the correct understanding and perception of the gospel. During the interview, the pastor proclaimed free absolution to the sinner, commending it as a defense against despair. The congregant could confess as many or as few specific sins as he saw fit.
Though the new Nürnberg church order became a model for other Lutheran communities, it did not end debates in the city about the power of the keys. The city council had long striven for full control of the city’s church life and now, thanks to the Reformation, had final say in all religious matters. Thus, it only reluctantly granted the clergy the power to bar unworthy individuals from communion. Moreover, the new church order seemed to provide an alternative to private confession. In response to popular demand, the city council had authorized the inclusion of a general declaration of God’s forgiveness in the Sunday liturgy.
General absolution in Nürnberg soon provoked fierce controversy, and the intricacies of this debate are discussed in great detail in Rittgers’ book. Suffice it to say that the city long retained the practice of private absolution, though the confession of particular sins remained voluntary. For Rittgers, evangelical private confession—and indeed, the evangelical faith in general—was all about confidence. Contrary to what some historians have suggested, Protestant leaders in Nürnberg were not “puritanical.” They generally avoided using sin as a means of social control—of threatening or manipulating the population. Indeed, in Rittgers’ view, the city council deliberately sought to protect lay consciences from such exploitation. The new confession was meant to take away fear of the afterlife, not increase it.
At times, The Reformation of the Keys may present a slightly too rosy picture of the new Protestant order. Even so, the book offers a valuable corrective. Ever since the Reformation, critics have been quick to point out the antinomian potential of Protestant doctrines of grace. Now, with Rittgers’ magisterial study, we can follow one community as it sought to find the right balance between “freedom and discipline.”
Matthew Lundin is a doctoral candidate in early modern history at Harvard University.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Lauren F. Winner
Jan Karon and the clerical novel.
I am in minor literary mourning. Light from Heaven, the final Mitford novel, has just been released. I stayed up all night reading it, and when I had finished, I remembered a story my mother used to tell me. As a girl, she was a devoted reader of Hugh Lofting’s Dr. Dolittle series. When she came to the last page of the last installment—Dr. Dolittle’s Puddleby Adventures—she bawled. She hated knowing that she would never again encounter new Dolittle tales.
For those who do not number themselves among Jan Karon’s millions of fans, here’s a quick summary: The Mitford novels, set in small-town western North Carolina (think Lake Wobegon with less irony), follow the quaint adventures of an Episcopal priest named Father Tim and his next-door-neighbor-turned-significant-other-turned-wife Cynthia. I use the term “adventures” loosely, because the most adventurous thing Father Tim ever does is take an airplane ride. Most of the time he’s making hospital calls, drafting sermons, packing picnic lunches, reading Wordsworth, walking his remarkable dog Barnabas (who responds not to the usual canine commands, but to the recitation of Scripture), and writing a lot of letters (and, increasingly, emails).
Light from Heaven finds Father Tim and Cynthia just outside Mitford, looking after Meadowgate Farm, while Meadowgate’s owners, the Owenses, spend a year in France. As ever, Father Tim wrestles with that Protestant demon, Usefulness. Especially now that he has retired, he is deeply concerned that he not Waste Time, but find a way of going about the business of being useful to someone. Fortunately, his bishop calls with a charge: go revive a small mission church whose doors have been closed for decades. Of course, Father Tim and Cynthia are just the ones for the job. Along the way, they take in not one but two stray children. Meanwhile, a few Mitfordians die, a few more get married, a lost sibling gets found, a million orange marmalade cakes get baked … just a typical year in Karon-land.
People either love these novels or hate them. Some readers treasure their sojourns in Mitford because real life lacks a certain warm community feeling that Mitford has in spades. Others dismiss this very sensibility as a tad too twee. (An aside: I learned the word twee from a Mitford novel. Cynthia drops it into a letter to Father Tim in A Light in the Window, a fact that itself might inspire naysayers to rest their case, screeching “Who on earth uses the adjective twee?”).
I’m obviously in the first camp, but nonetheless I must repeat a disclaimer I issue every time I ruminate about Jan Karon’s Mitford novels: I realize that they are not Great Literature. I realize that they are not comparable to the very novels I will, in a few paragraphs, compare them to. But they are excellent specimens of what they are. I have read just about every Mitford knockoff published in recent years, and Karon’s stylistic sensibility, humor, and local color beat the copy-cats by a country mile. Not to mention the fact that the first two novels in the series were hugely significant in my own conversion to Christianity. This, it seems to me, is one of God’s little jokes: other people get to tell about how Dostoyevsky or Karl Barth drew them to Christianity, while intellectually prideful me will spend the rest of my life explaining that I was converted in part through the ministrations of fictional Father Tim.
Still, Great Literature or no, the Mitford novels do participate in a venerable literary tradition: clerical fiction, a capacious category which would include everyone from Trollope and Hawthorne to Susan Howatch and Marilynne Robinson (whose Gilead is, among other things, a superbly unconventional clerical novel). F. Scott Fitzgerald could even squeeze in there if you count his short story “Absolution.”
Many clerical novels spotlight the challenges of clergy’s lives. To wit, James Street’s novels The Gauntlet (1945) and The High Calling (1951). Street’s hero is a Southern Baptist pastor called London Wingo, who’s sympathetic, even if he has absorbed a flabby sort of humanism, and who struggles to balance the needs of his congregation with the needs of his family. Other clerical novels—one thinks here of George Eliot, and Trollope—expose the changing role of the minister in society. Still others, like Sinclair Lewis’s Elmer Gantry and Peter De Vries’ too-little-read 1958 novel, The Mackerel Plaza, seem born of the author’s desire to unmask Christian hypocrisy. But every clerical novel can prompt reflection on what the life of the church can and should be. A parody like De Vries’ may rightly be interpreted not as a dismissal of Christianity but rather as a heartfelt expression of distress at expressions of Christianity that have gone totally off the rails.
In one respect, the Mitford novels, though decidedly evangelical, are more reminiscent of the Catholic clerical tradition than the Protestant. It is not too gross an oversimplification to suggest that in novels featuring Catholic priests we more often find portraits of faithful lives well lived. In fiction, Protestant clergy seem given over to other tasks: wrestling with doubt inflamed by scientific criticism, Darwinism, or humanism (as in Harold Frederic’s The Damnation of Theron Ware), or getting mired in hypocrisy and blatant sin (as in John Updike’s A Month of Sundays). It is in Catholic clerical literature that we find priests who, though flawed, are nonetheless devoted to pastoring, to the cure of souls.
Put differently: I once began to write a novel (I have begun to write about 23 of them) about a widow who had insomnia and read a lot of sermons in the middle of the night. That, at the start, was all I knew about the widow. I shared my idea with a novelist friend, who responded in some alarm, “Well, something has to happen, some plot, other than this woman’s spiritual development.” In Mitford, not a whole lot happens other than the characters’ spiritual development, and in this way—this unashamed willingness to place Christian growth at the center of a novel—Karon recalls not Frederic or Updike or De Vries or Street, but rather some of the great Catholic novelists.
Consider, for example, Georges Bernanos’ peerless The Diary of a Country Priest. Like Bernanos, Karon is an unabashed apologist, even evangelist for the Christian faith. Like Bernanos’ hero, Father Tim is unafraid of (in Bernanos’ phrase) the “red-hot iron” that is the Word of God. Like Bernanos’ priest, Father Tim understands that he loves his parishioners best when he suffers with them. Similarly, Father Tim’s pious, selfless devotion to the Barlowe boys—abandoned by their alcoholic mother, threatened by their violent father—recalls François Mauriac’s Abbé Calou, the priest at the center of A Woman of the Pharisees, who, like Father Tim, is determined to introduce rebellious boys to Jesus Christ. Even Father Tim’s struggles with the witchy Edith Mallory, struggles that find an evangelistically satisfying conclusion in Light from Heaven, might well be modeled on Bernanos’ priest’s struggles with the Countess, or, in Mauriac, Calou’s struggles with the titular pharisienne.
Many of my clergy friends won’t read the Mitford novels. They say that Mitford is no escape—it reminds them too much of their own jobs. Or they say that Father Tim is utterly unrealistic, laughably so. One interlocutor told me dismissively that Father Tim had “boundary issues” (I wondered if my friend’s penchant for psychologically analyzing fictional characters she claimed to have no time for reflected her own boundary issues).
But maybe Father Tim’s willingness to take in all those stray kids, and his readiness to weep at the bedsides of parishioners who’ve become family to him, are not indicators of an inability to maintain good boundaries. Maybe, instead, Father Tim embodies an ideal of the self-sacrificial servant who knows more about love than he does about the strictures of our therapeutic culture. (Again, shades of Bernanos’ country priest, who tells us “I have undertaken to visit each family once every three months at least. My colleagues consider this excessive.”) Some might say that Karon’s portrait of Father Tim is idealistic to the point of hagiography. The riposte: perhaps he is a portrait of Christian maturity. Perhaps after lives of service and self-sacrifice we, too, would embody his faith at age seventy.
These novels sparked my conversion for lots of reasons: I was marooned in Manhattan, missing home, which for me is western North Carolina. I was attracted by the vibrant community Jan Karon imagined. But the final attraction was this: I saw the way that faith pervaded the lives of the Mitfordians, and I wanted that faith. Today I look at Father Tim and see his solid, unwavering faith and his heartfelt service, and I want it still.
Lauren F. Winner is the author of Girl Meets God (Algonquin/Random House) and, most recently, Real Sex: The Naked Truth About Chastity (Brazos).
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.
Edward E. Ericson, Jr.
A disease that left its mark.
Editor's Note: This article first appeared in the November/December 2005 issue of Books & Culture, taking note of a cluster of books occasioned by the 50th anniversary of the Salk polio vaccine.
Polio and Its Aftermath: The Paralysis of Culture
Marc Shell (Author)
Harvard University Press
336 pages
$37.99
Post-Polio Syndrome: A Guide for Polio Survivors and Their Families
Julie K. Silver (Author)
304 pages
$29.15
Living with Polio: The Epidemic and Its Survivors
Daniel J. Wilson (Author)
University of Chicago Press
312 pages
$45.76
"Hey, driver! Hold up!" a voice cried from the bus' back door. "I can't wake up my buddy, and he's gotta get off here." Two boys were coming home from a Cubs game at Wrigley Field, and Richie, age ten, finally got Roosk, age nine, off the bus and down the half block to home. Roosk told his mom he was tired and dropped on his bed, not hearing her ask if he was ok. "Oh no, polio? Please God, not polio." Thus did she pray the prayer of all Chicago parents in the epidemic year of 1949. Roosk woke up 16 hours later with a fever. Doc Olson came over, did doctor things, and announced, "No polio." Later, Doc said to take the boy to the church's family-week camp. "I'll be there and can keep an eye on him."
Polio was no respecter of persons. The most famous victim was Franklin Delano Roosevelt. "Super Crip" crusaded against the plague by helping establish the National Foundation for Infantile Paralysis (NFIP), later known as the March of Dimes, the greatest health-related fund-raiser ever. Other notables stricken were Justice William O. Douglas, Canadian Prime Minister Jean Chrétien, track star Wilma Rudolph, actress Mia Farrow, filmmaker Francis Ford Coppola, writer Wilfred Sheed, scholar Edward Le Comte, and … and Hitler's propagandist Joseph Goebbels.
The first major polio epidemic in the United States hit in 1916, spreading out from New York City, and for the next 39 years, the public simply took what poliovirus dished out. In the 1920s and 1930s, the annual rate of cases was 4 per 100,000; by the early 1940s, the rate doubled; by the late 1940s, it redoubled; by the early 1950s, it had reached 25 per 100,000, with a peak of 37 per 100,000 in 1952. Between 1937 and 1955, 415,624 cases were reported, 57,879 in 1952 alone. Newspapers ran tallies of local victims—like baseball box scores—by age, sex, type of paralysis. The United States has 1.6 million living polio survivors, 600,000 of whom show ongoing effects; comparable figures for the world are 24 million and 7.5 million. Fears that vaccination is part of the West's conspiracy to commit genocide keep polio from being stamped out worldwide.
Polio is a viral intestinal infection; it is contagious. Polio is difficult to diagnose early, because its typical symptoms are the common ones of fever, headache, sore throat, muscle pains, and nausea. Poliovirus is carried by fecal waste and enters the body through the mouth. It then attacks nerve cells, killing some and leaving the muscles they served "orphaned." Surviving nerves can sprout new connections to those muscles, thus doing double duty. This neurological disease is erratic, is relatively seldom fatal, and can leave temporary or permanent paralysis as an aftereffect. The legs are the favorite target of spinal polio, the most common type. Bulbar polio, the other familiar type, attacks the brain stem (bulb), impairs swallowing and breathing, and is deadlier; iron lungs are used for this type.
At the camp Roosk's gait had become stiff-legged and ungainly. Other mothers hurried their children indoors. The family women got him into the cabin. Said one aunt, "I don't care if it is polio; he's our Rooskie Boy." When Doc arrived, he took one look and said, "Oh no, it's polio. Sometimes we can't tell until paralysis sets in. Take my wife's car and drive the boy straight to the Municipal Contagious Disease Hospital." For all seventy-five miles back to Chicago, Gram prayed aloud.
Polio is, counterintuitively, "a disease of cleanliness." Until modern hygiene and sanitation kicked in, newborns picked up the virus from their mothers, but in mild doses that produced the antibodies needed for lifelong immunity. Thus, the cleaner, the riskier; the better, the worse.
A hospital doctor stuck a huge needle right into Roosk's spine. It hurt, hurt, hurt. Then came three long days in a men's ward where, as Roosk remembers it, no one spoke. His parents wrote chalkboard signs that he read through two walls of glass. Did he want ice cream? No. A newspaper? He shook his head no again. That's when they got really scared.
The fiftieth anniversary of the Salk vaccine in 2005 has brought forth a raft of books about polio, four of which are sampled here. David Oshinsky writes the definitive history of the war against polio in America. Daniel Wilson traces the experience of polio from beginning to end. Marc Shell subjects polio to a cultural-studies examination of the disease and of all the books, movies, and assorted cultural artifacts directly or obliquely related to it. Julie Silver offers practical advice on how to manage the post-polio syndrome.
Oshinsky, a University of Texas historian, traces the dramatic race to find a polio vaccine. Amid a large cast of characters, FDR and his law partner, Basil O'Connor, play major roles. In the late 1940s, their March of Dimes tried house-to-house solicitations: "Turn On Your Porch Light! Help Fight Polio Tonight!" The Mothers' March on Polio, 2,300 strong by 1950, became "one of the indelible images of postwar America." Between 1951 and 1955, the NFIP raised $250 million. The money funded research and defrayed the medical costs of needy patients, with eighty percent of them qualifying.
Roosk's family was asked to pay a grand total of 24 dollars. Roosk's father vowed to repay the March of Dimes for the full bill.
Rival researchers Jonas Salk and Albert Sabin raced to create a workable vaccine. Salk, then at the University of Pittsburgh, got there first. The Salk vaccine trials of 1954 involved more than 1.3 million children, one of the largest clinical tests ever undertaken. The favorable outside evaluation announced in 1955 set off huge celebrations. Car horns honked; church bells rang out; banner headlines screamed, "Polio Is Conquered." The moment had come to re-punctuate "Oh no, polio" as "Oh, no polio." Salk appeared on Time magazine's cover and at President Eisenhower's White House. He accepted all too gladly the public's desire for a singular hero. At his coming-out news conference, Salk said not a word about the dedication of the many assistants ranged behind him onstage. Julius Youngner, for one, never forgot and never forgave. Fifty years later, he observed that Salk did nothing else of scientific note: "Being small-minded myself, I take some pleasure in that. Schadenfreude, it's called."
Roosk used to sell his blood for its antibodies, which went into a stopgap pre-vaccine serum. Salk put Roosk out of business.
If Salk was a glory-hog, Sabin was worse: "arrogant, egotistical, and cruel." He sneered at Salk's lionization: "You could go into the kitchen and do what he did." Salk's was a killed-virus vaccine. Sabin, who had been working at the University of Cincinnati on a live-virus vaccine, did not finish first because it is harder to attenuate live virus than to kill it. Sabin was the professionals' favorite. But Salk was the people's choice—until one batch of his vaccine went out with some live virus and caused five children to die. Salk is named among history's most famous scientists, but Sabin's vaccine triumphed, its victory cinched by its oral delivery system. By 1961, only a thousand new cases in the United States were reported.
Readers who want to know what it was like to be a polio victim should try Wilson. True, the Muhlenberg College historian opens with forbidding dissertationese and informs us dullards that polio narratives "slight the experience of polio patients who died during the acute phase of the illness" and that "the earliest narratives shape those that follow." Don't be discouraged. Citations from many narratives by "polios" spice the subsequent chapters. Wilson takes readers through the whole polio experience: the onset, the acute stage, early recovery in rehabilitation hospitals, life on the polio wards, the long process of recovery, the efforts to reestablish normal living, the demands of sustaining the new normalcy—and then, after all the travail, the return of the "old foe" in the guise of post-polio syndrome, when "Use it or lose it" becomes "Use it and lose it."
Wilson's book is a hall of mirrors for "polios." Anyone, for example, who underwent a spinal tap "never forgot it."
It hurt, hurt, hurt. Roosk never forgot her name: Dr. Brown.
"Forty and fifty years later, polio survivors still have vivid memories of the fun they had" on the polio wards.
Roosk's 24 days at St. Luke's were some of his happiest ever. Daytime was the best time for pranks; nights were for reading books under the covers by pen flashlight. Roosk didn't learn until later that his parents met a couple there who had a son and a daughter one weekend and neither the next.
Patients felt keenly the indignities of assisted use of urinals and bedpans.
Daily, some young nurse's aide placed her hand where Roosk didn't want her to.
Polio had a "special affinity for the legs."
Roosk's paralysis, from the neck down, gradually—how to describe it?—drained down his left leg and into the foot.
Rehabilitation almost always required physical therapy.
Roosk avoided physical therapy by agreeing to skip afternoon class to rest up for after-school playtime. His playmates envied his polio.
Sometimes tendons were transplanted to restore a modicum of mobility, though not in the first year and not under the age of ten.
Roosk had a tendon transplant at age 13 to allow some minimal control of his toes. Also, a triangular chunk of bone was cut out to keep his foot from growing hooflike. The ugly result was a deformed left foot nearly two inches shorter than the right. As a high school senior, "the kid who limps" played on the football team to prove to the coach that his foot could hold up for basketball.
Outpatient treatment was common for the first 16 months or so, until whatever recovery of muscle function was possible had run its course.
Roosk was treated (at no cost) by star orthopedist Dr. Chandler, and 6 N. Michigan Ave. remains, for him, almost a shrine.
Most "polios" had a strong will to succeed, and many excelled at schoolwork and pursued careers requiring intellectual work.
Roosk became a college professor.
Marc Shell's book puzzles blurb writers Wilson and Silver; both say "there is nothing quite like" it. Shell is a Harvard professor of English, a MacArthur Fellow, and miserably unhappy. Amid the current surge of books about polio, Shell complains about "the general repression of the memory of polio." He has ransacked used bookstores on all continents except Antarctica in search of any and all polio narratives, which he suggests libraries have seen fit to make unavailable. Then he expends great effort examining the arts for treatments of polio and finds what he is looking for: "the Polio School" in literature, for instance, and "the dozens of movies about paralysis" (note: not polio) that "would, if taken together, constitute a reinterpretation of the history of cinema." The index lists 28 movies about polio and 29 more that are "polio-inspired," including—get ready—The Ten Commandments, The Greatest Story Ever Told, Star Wars, and The Wizard of Oz. With cultural artifacts appearing and disappearing—as if Prufrock's aimless women come and go, talking of polio—readers unable to connect Shell's particulars and his generalizations may feel the tug of the hermeneutics of suspicion. Only advanced education could direct a survivor to lay a cultural-studies template over his polio.
Ah, but Shell refuses to call himself a survivor, because victims only "convive" (live with) their polio. And that is his main point: Polio is not a thing of the past for those who ever went through the acute phase. Release from the disease's clutches comes only with death. This point is pretty obvious, and it applies to sufferers of many other traumas as well. But it is not false.
At age 37, Roosk had a great fall and landed with all his weight on his weakest extremity. Humpty-Dumpty could not quite be put back together again. He no longer could run. Having discovered early that other kids could now out-throw him and that power-hitting should give way to place-hitting, he learned only when he took up golf how diminished was his fine motor control. He also noticed that one leg was submerged by bathwater before the other. Little self-discoveries kept surprising the apparently incurious man.
The most memorable aspect of Shell's book is his rage—his understandable rage. For his parents told him there was "nothing wrong" with him; he just had a cold in his leg. But he remembered being paralyzed. His father, "for all the love he bore me," took a leather strap to the lad's polio-weakened body parts, as if "to whip the demon out of me." And his mother "actually put my father up to it." So Shell laid his long-range plan: "The child that I was then counted on becoming this adult that I am now, who would try to write that child's polio memoir." Adult bitterness spills over everything. Let others celebrate the "conquest of polio." But even now there is no cure, no effective treatment, merely a vaccine. The NFIP was corrupt, doctors were greedy, parents lied. Polio wards were "made" to "look like concentration camps"—and Shell's next thought is of Cherokees on their Trail of Tears and Japanese Americans in their internment camps. Shell is upset with "polios" who try to achieve some normalcy in their lives, furious with those who inject divine providence into the polio equation, and livid over Christ's "taunt" of doctors, "Physician, heal thyself!" After all, "even he can only do resurrection."
Unsurprisingly, wrath leads to faulty generalizations. "A child polio was the cause of family shame." People are in denial that "all forms of polio damaged neurons, permanently." There is "a consistent tendency to overlook" post-polio syndrome and "often to declare that it doesn't even exist" (this, at the very time when books and internet sites devoted to PPS are flourishing). Charitable polio organizations used a popular song's inspiring lines, "Walk on, walk on, with hope in your heart, / And you'll never walk alone"—about which Shell comments, "To us polio-children, the words of this love song meant both that we would never be able to walk without braces and that we would always be dependent on someone to lean on." Who is this "us," Kemo Sabe?
Wrath, even when inspired by self-pity, is a deadly sin. It is Shell's book's fatal flaw.
Julie Silver, who studied polio because her mother had it, has written a helpful "how-to" book on post-polio syndrome. PPS is about as difficult to diagnose as polio itself. Its symptoms are new weakness, unaccustomed fatigue, muscular pain, new swallowing problems, new respiratory problems, cold intolerance, new muscle atrophy. But who doesn't have such symptoms as aging proceeds? The best one can do in diagnosing PPS is to eliminate all other causes of these symptoms—an impossibility. It is difficult, even, to estimate the percentage of polio survivors who become afflicted by PPS. Silver estimates somewhere between 25 and 60 percent. Yet, since the symptoms range from mild to severe, everything remains in question.
Silver's book functions best as a reference guide. Her advice is level-headed, though limited, and her addiction to lists makes for dreary reading. Are you fighting fatigue, dear oldster? Read the list of twenty possible causes, and if you can eliminate the other 19, PPS is your culprit. If you tire quickly, read the ten steps to energy conservation and pacing. To relax without sleeping, try one or more of eleven ways. To avoid falling, keep in mind 16 intrinsic risk factors and 18 extrinsic ones. If you never had polio, follow her advice anyway.
Polio contributes to back trouble. Yes, but … Can it contribute to meralgia paresthetica? to atrial fibrillation? to … ? But society cannot afford research just to satisfy the curiosity of a dying breed.
On the biggest question, Silver can offer no help. What is polio's impact on longevity? Surely, compensatory overload will take a toll on nerves and muscles. PPS does indeed seem, now that it has been identified, to be a "cruel trick" for those afflicted. Yet, hearteningly, Silver closes with the reminder (pace Shell), "Given the magnitude of their afflictions, most polio survivors cope remarkably well."
Even those who take a certain perverse pleasure in being identified with the 20th century's hallmark disease realize that they now belong to a museum tableau. If they know what frail vessels carried society to the land of scientific promise, most appreciate the ride. They should know, as well, that their scourge bestows no unique advantages for moral reflection. The guises of suffering differ from one to another; the age-old mystery of it remains universal. Some human beings grow through suffering; others shrivel. An incipient utopianism characterizes those who complain perpetually about their polio. They seem to want a world free of sickness and war, of tsunamis and tornadoes, of sin and death. There may be such a world; some say there is. But it is not this one.
Edward E. Ericson, Jr., professor of English, emeritus, at Calvin College, is currently collaborating on two books about Solzhenitsyn, forthcoming from ISI.
Copyright © 2005 by the author or Christianity Today/Books & Culture magazine.