In Endowed by Our Creator: The Birth of Religious Freedom in America, Michael I. Meyerson describes the Founders' successful quest for a sweet spot: faith without coercion.
The following has been posted on the Books page of the History News Network site.
Perhaps the most surprising thing about the ongoing controversy over the relationship between governmental and religious institutions in the United States is the fact that it is ongoing. From the age of Paine to the Age of Aquarius, rising tides of religious skepticism have been apparent to champion and critic alike. Conversely, periodic Great Awakenings over the last 275 years have made faith ascendant. Each in its moment seemed to have unstoppable momentum. Yet here we are in the 21st century with arguments as heated as they've ever been. Inevitably, partisans invoke the Founding Fathers to bolster their respective claims. As University of Baltimore School of Law professor Michael I. Meyerson shows in this impressively researched book, each camp in the sacred vs. secular debate can find ammunition to support its point of view. But he regards such partisan exercises as misleading at best and dangerous at worst. That's not because the Founders lacked a clear vision, he says, but rather because that vision was cast in terms of a union in which church and state -- but not God and state, or religion and state -- would be separate.
One of the mistakes contemporary Americans make is assuming that the Founders' views were static. Actually, Meyerson's narrative, which stretches from the late colonial era to the presidency of James Madison, shows they lived in a world in which the state of faith was highly fluid. It varied between colonies, across time, and among the Founders themselves, who in the face of political exigencies sometimes took positions that were philosophically inconsistent. In fact, the very term "religious freedom" was subject to multiple meanings. For the Puritans, freedom meant liberation from having to tolerate the self-evident corruptions of the crypto-papist Church of England. For others, it could mean simply the right to worship without expulsion, or that mandatory taxes would be directed toward a church of a believer's choosing. It did not necessarily mean a right to vote or hold office. Even the word "Christian" could be ambiguous (Catholic membership in this category was widely regarded as suspect). Some colonies, like those of New England, were marked by a high degree of (Congregationalist) homogeneity. Others, particularly the middle colonies, were highly diverse. Though many colonists were aware of the religious terrain beyond their borders, their colonies nevertheless remained worlds of their own, even decades after the Revolution.
It is nevertheless the case that the political imperatives of the Revolution forced the colonists to reckon with their diversity and make allowances for it, which they did with varying degrees of grace. The hero of this book is George Washington, whom Meyerson sees as singularly far-sighted in his ecumenical vision, one he regarded as a practical necessity as commander-in-chief of the Continental Army, a crucible in the formation of a national identity. But Meyerson views Washington as more than simply pragmatic -- not simply tolerant but accepting of just about all religious persuasions (with the partial exception of the Quakers, whose pacifism he regarded as suspect during the war). And as president he was able to speak and act with remarkable skill and tact in his dealings with the American people, repeatedly invoking terms like "Our Creator" while sidestepping terminology with a sectarian cast.
The Constitution is widely viewed as a conservative document designed to cool the passions of the revolutionary era. But in Endowed by Our Creator, it is depicted as an instrument of ongoing religious liberalization. The mere creation of a federal frame, even one that respected states' rights, implicitly offered a contrast, if not an example, for states struggling to disestablish tax-supported churches. But again, severing formal links did not mean the suppression of religiosity, even among government leaders. There was a spectrum of opinion between the relatively orthodox John Adams and the consistently anti-establishment James Madison. But the former would sign a treaty with Tripoli in 1797 that famously stated "the United States of America is not in any sense founded on the Christian religion," while the latter once described Christianity as the "best and purest religion." In contrast to his close partner Thomas Jefferson's famous invocation of a "wall of separation" between church and state, Madison offered the less forbidding metaphor of a line. Such shadings notwithstanding, Meyerson's overall point in this book is that these people were striving for a sweet spot -- and that they found it.
As such, his book is meant as a rebuke to the aggressive evangelical and the expansive secular humanist alike. Of the two, however, he seems more concerned about the former. Meyerson challenges the judgment of a series of current and former Supreme Court justices, but repeatedly singles out Antonin Scalia for what he regards as disingenuous, if not intellectually slipshod, assertions about the government's right to confer privileges on religious institutions. The fact that there may be two sides to an argument doesn't mean the truth falls squarely in the middle, and for Meyerson it is the overweening claims of faith-based government advocates that pose the greater danger.
In its broadest outlines, Meyerson's argument is consonant with that of Jon Meacham's American Gospel: God, the Founding Fathers, and the Making of a Nation (2007). But this is a more temporally focused and rigorously documented study, and as such is more useful for scholarly purposes. At the same time, it's written in clear, vigorous prose. It should be, and, with the aid of divine providence, is likely to be, durable.
Friday, June 22, 2012
Jim is vacationing in northern New Hampshire. His Kindle reading, long delayed, is Suzanne Collins's 2008 bestseller The Hunger Games. This is a book that has already been much read, sometimes more than once, by multiple members of his household. His interest will be in the Roman analogies in the novel, which are obvious from even the most basic plot details. They were also evident in the movie, which was a shared family experience. Should the book be compelling enough, the two sequels will follow. (This from a man who only made it through two Harry Potters, estimable as they were.)
Next up: Seth Grahame-Smith's Abraham Lincoln: Vampire Hunter. (The movie, as well. Benjamin Walker was so great in Bloody Bloody Andrew Jackson that it's a good bet he'll enhance anything where he appears.)
The summer seems to arrive in layers: Memorial Day, last day of classes, graduation, final meetings, summer solstice. Now, finally, by just about any definition, it seems to be here. Best to all in savoring its fruits.
Tuesday, June 19, 2012
Rock formations
Rock of Ages and the problem of great mediocrity
In the fall of 1978, a big event happened in my life: the progressive rock band Styx released its eighth album, Pieces of Eight. I'd become a Styx fan the year before, when I went to my first rock concert, and quickly became a devotee. Pieces of Eight was just about the first time I'd experienced a new release by a band I had been following, and I regarded buying and hearing the album as a ritual of initiation.
The really great days that fall were the ones when I'd come home from school to an empty house and so could blast the album as loud as I pleased. I particularly loved one track, "Blue Collar Man." Singer/songwriter/guitarist Tommy Shaw depicted an unemployed worker declaring his determination to overcome adversity:
I'll take the long nights, impossible odds
Keeping my eye to the keyhole
If it takes all that to be just who I am
Well I'm gonna be a blue collar man
It was the closest I've come to perfect sublimity. Class consciousness, thick rhythm guitar, hot guitar solo: what more could anyone want?
Little did I know that Lester Bangs, a rock critic I later much admired, would write of Pieces of Eight in Rolling Stone that "What's really interesting is not that such narcissistic slop should get recorded, but what must be going on in the minds of the people who support it in such amazing numbers." (The album went triple platinum.) I was not yet paying attention to people like Elvis Costello and Bruce Springsteen, who would open large windows into what rock music could really be. Nor had I acquired the three academic degrees that would furnish the means by which I absorbed truly towering works of art -- works that would enrich my life and provide me with a living.

But here's the thing: no work of art, no aesthetic experience, has ever provided me with more satisfaction than "Blue Collar Man" did. That magic is inevitably gone; I watched a latter-day configuration of Styx perform the song on YouTube this morning, and my reaction was closer to Lester Bangs than to my 15-year-old self (though I do think the late Bangs's condescension toward Styx finally outstrips that of Tommy Shaw toward the blue-collar workers he sentimentalized). But I was impressed by how well Shaw was still singing thirty years later, and I could not help but feel a frisson of pleasure in hearing that old riff.
I was reminded of Styx, Pieces of Eight, and "Blue Collar Man" when I went to see Rock of Ages, the Broadway musical turned film, at my local multiplex this weekend. I went less for the music -- critically benighted heavy metal of the eighties -- than for its strong cast of veterans, which included Paul Giamatti, Catherine Zeta-Jones, Alec Baldwin, Russell Brand, and Tom Cruise. They were all fun to watch, particularly Cruise, who performed a fascinating near-parody of himself that suggested a keen intelligence behind his portrayal of a lurching rock god named Stacee Jaxx. I was also impressed by the performances of newcomers Julianne Hough, Malin Akerman, and Diego Boneta.
What also struck me -- what the musical taught me -- is how many great melodies animated the otherwise macho-encrusted songs of bands like Foreigner, Journey, and Poison. (Doesn't hurt to have Mary J. Blige reinterpret Pat Benatar's "Shadows of the Night" and Journey's "Any Way You Want It," or to have clever medleys that splice together Foreigner's "Jukebox Hero" with Joan Jett's always refreshingly unpretentious "I Love Rock 'n' Roll.") By any rational standard, the story is hopelessly hackneyed -- boy meets girl/boy loses girl/boy gets girl, against a backdrop of prudish harridans trying to prevent any fun. On the other hand, having Brand and Baldwin declare their love for each other amid a hilarious deconstruction of REO Speedwagon's "I Can't Fight This Feeling" was worth the price of admission alone.
To at least some extent, the relatively low regard for this music is a function of the people who embraced it: young, white, non-impoverished men (and a few women) have never been much admired by the literati, musical or otherwise. The fact that much of this literati is itself composed of older, white, non-impoverished men is not incidental. I don't particularly want to invert a hierarchy here or valorize a subculture with some obvious limitations. But a sense of fairness, to myself and others, compels me to pay tribute to these would-be Caesars. Rock on, boys. As is so often the case, if you listen hard enough you can hear the love in other hearts.
Friday, June 15, 2012
'Home' Front
In Home, Toni Morrison goes back to the farm -- and the Korean War
The following has been posted on the Books page of the History News Network site.
Is there a better modern American historian than Toni Morrison? In novel after novel, in a career now in its fifth decade, she has emerged as the premier chronicler of our national experience. John Updike showed comparable temporal imagination; E.L. Doctorow has built a body of work of similar breadth and depth. But it's Morrison's canon -- jagged, allusive, clarified through the searing lens of race -- that seems the most consistently vivid. In its kaleidoscopic array of characters, her fiction is reminiscent of William Faulkner's, but her world seems bigger, even as it shares a locus in the American South.
Though it seems unorthodox to say so, given the towering status of the Civil War-era Beloved (1987), I found Morrison's last novel, A Mercy (2008), which explored a seventeenth-century world in which slavery had yet to assume a recognizably modern shape, to be her most satisfying in its scope and the generosity of its vision. Her new novel, Home, zooms forward to the early 1950s. In the popular imagination, this is a moment whose representation veers between Eisenhower-era affluence and Cold War anxiety, both of which are discernible at the periphery of Morrison's vision. She blends them even as she captures the lingering shadow of the past in a setting that includes cotton fields and refrigerators, eugenics and situation comedies, fellow travelers and Klansmen, all jostling in the present tense.
Home, a novella pumped up to novel dimensions to justify a $24 list price, is a chronicle of the (ironically named) Money family, black Texans forced by racial terror to flee to Georgia and begin unhappy new lives. The core of the family is the sibling pair of Frank and Ycidra ("Cee"), whose devotion to each other sustains them amid the indifference and/or hostility of their blended, extended family. Frank leaves home to join the army, where he serves in the Korean War, an experience that leaves him with what we would now call post-traumatic stress disorder. Cee marries a lout and moves to Atlanta, where she falls under the sway of an evil doctor (there's a creepiness to this part of the story, with its echoes of the Tuskegee experiments, that's worthy of a Gothic novel). When Frank gets word that his sister is in danger, he manages to pull himself together and make a journey from Portland to Atlanta to save her. The question is whether he can, and whether they have the heart to go back home.
Home is a book studded with brutality, in which the most awful violence and degradation are as endemic in a small town or on a city street as they are in a war zone. But it's also one where the irrational kindness of strangers seems plausible and hopeful, where the bonds of community can partially repair wounds and sustain lives. As usual in Morrison's fiction, the novel is broken into chapters with multiple narrators. And as usual, too, Morrison places special emphasis on the resilience of working-class African-American women, though she's tart enough, and balanced enough, to make sure none of them are saints (some a good deal less than that). Just when you think she might be lapsing into sentiment, Frank makes a discomfiting disclosure that scrambles any easy notions of victimization and oppression.
Home is unlikely to rank at the top of Morrison's corpus; it's too slight, and too similar in structure and themes to her earlier work. But it showcases a writer at the height of her powers in evoking a moment and its historical counter-currents. And it ranks among her most readable stories. It is also, like so many of her novels, a book certain to reward re-reading: you can go Home again. And you should.
Monday, June 11, 2012
Brilliant Executions
In Bring Up the Bodies, Hilary Mantel continues brewing a powerful historical concoction
The following has been posted on the Books page of the History News Network site.
Toting around a book -- even one anonymously encased in a Kindle, as is increasingly the case with me -- is a natural conversation-starter. And in the last couple of years, I've had a number of conversations about Hilary Mantel, first when I was carrying around her 2010 masterpiece Wolf Hall, and lately its successor (the second installment of a planned trilogy). Most of the time, my answer to the query of what I'm reading evokes the mild curiosity of people more interested in hearing about a book than actually tackling it themselves. In the case of Wolf Hall, however, some friends had beaten me to it, and raved. Yet there was another reaction that surfaced a number of times, one for which I had some sympathy: impatience. Yes, it's good, some said. But a bit slow.
I heard that again with regard to Bring Up the Bodies, and it delayed my acquisition of the novel, which I had half-resolved to let slide. But my curiosity about Mantel's real-life protagonist, Thomas Cromwell, Secretary to King Henry VIII, got the better of me. I'm glad I read it, and endorse this novel (as I did Wolf Hall). But this book too is a bit slow, though the narrative picks up steam as it proceeds.
Mantel is an avowed revisionist. In most of the many times the saga of Henry VIII's reign has been chronicled, Thomas Cromwell is depicted as the hatchet man, a ruthless power player who destroyed the literally sainted Thomas More for More's refusal to acquiesce in the larger religious/political implications of the King's desire to dump his first wife, Katherine of Aragon, in favor of Anne Boleyn. In Mantel's telling, however, More, who's relatively incidental to the story, comes off as a pompous prig, while Cromwell's Machiavellian realism has a mordant wit that's tremendously engaging. Cromwell certainly has his personal loyalties, in particular to his mentor, Cardinal Thomas Wolsey, who was also a victim of the King's intransigent marital will. But the core of Cromwell's appeal is his lively intelligence, which he deploys with tireless energy on behalf of a monarch who rewards this upwardly mobile commoner with power and honors that elicit the admiration and envy of peers and superiors in equal measure.
Wolf Hall focused on Cromwell's role in the rise of Boleyn. Bring Up the Bodies focuses on Cromwell's role in her fall, which results from the King's frustration over Boleyn's failure to deliver a male heir -- the one child of their union became Elizabeth I -- and his growing infatuation with one of Boleyn's attendants, Jane Seymour (whose family estate is named Wolf Hall, an allusion to where the previous book was heading at its end). The narrative action here is compressed into nine months between 1535 and 1536, when Cromwell, knowing that Boleyn and her allies regard him as an enemy, pre-emptively strikes by aligning himself with former adversaries who hate her even more than they hate him.
As with Wolf Hall, Cromwell shows himself to be a ruthless political operator, which is troubling this time for two reasons. First, we're forced to confront that this attractive character commits evil acts. As he explains to one of his hapless victims, told to confess to crimes he probably did not commit, "He [i.e. Cromwell] needs guilty men. So he has found men who are guilty. Though perhaps not guilty as charged." Publicly, those charges involve incest and adultery, and they result in a series of executions, including the notorious one of the Queen herself. Privately, they are reprisals against those who doomed Cromwell's beloved Wolsey. So Henry gets what he wants even as Cromwell gets what he wants, too.
The other reason Cromwell's machinations are disquieting is pointed out by his friends: the people with whom he's allied will seek to dispose of him at their first convenience. Cromwell knows this, and his fate -- he would be executed five years later -- looms increasingly large over the story, which is slated for resolution in the third volume of what is now projected as a trilogy.
But the real satisfaction of these books lies not in the plot, but rather in the way Mantel is able to evoke the rhythms and textures of sixteenth-century life, while making that life seem recognizable amid its strangeness. Bring Up the Bodies is at heart a series of conversations, some of them internal, in which characters make striking, yet plausibly prescient, remarks. "Our learning both acquired or pretended; the stratagems of state, the lawyer's decrees, the churchmen's curses, and the grave resolutions of judges, sacred and secular: all and each can be defeated by a woman's body, can they not?" Cromwell thinks. "God should have made their bellies transparent, and saved us the hope and fear. But perhaps what grows there has to grow in the dark." Other observations almost seem to wink in their contemporary relevance: "Chivalry's day is over. One day soon moss will grow in the tilt yard. The days of the moneychanger have arrived, and the days of the swaggering privateer; banker sits down with banker, and kings are their waiting boys." But sometimes those winks are ironic: "Though the whole of England has taken an oath to uphold her children, no one abroad thinks that, if [Boleyn] fails to give Henry a son, the little Elizabeth can reign."
It's precisely because I -- God help me -- like Cromwell so much that I find myself dreading the final installment of this saga. Bring Up the Bodies ends in triumph for its protagonist, but, as Mantel concludes, "the word 'however' is like an imp coiled beneath your chair." Our greatest triumphs are temporary; our greatest disasters are trivial. So it's probably worth it to slow down and listen to the remarkable voice that emerges from these pages. In the end, time is the only thing we've got.
Thursday, June 7, 2012
Regraduating

The Felix Chronicles, # 35
In which we survey the annual spring harvest
I make a detour when I arrive at school for a final round of faculty meetings to take a look at the Quad. Surprisingly, there are no obvious traces of yesterday’s ceremonies. Less than 24 hours ago, this space was teeming with parents, grandparents, and alums, along with hundreds of students -- some of whom were wearing caps and gowns and about to dissolve into living ghosts. Today, all that remains is a sole folding chair. And since it’s brown, not black like the hundreds that had been set up, I’m not even sure it was here yesterday. The only signs that anything unusual had happened are the distressed stripes of grass running horizontally across the Quad. The maintenance crew will take care of that in pretty short order, and this space will revert to a stretch of silence, punctuated only by the occasional round of elementary school kids singing here on summer afternoons, or administrators walking to and from their cars. Birds and bees will hold dominion for a season.
I’m relieved it’s finally over. It’s been three weeks since the seniors finished classes, a period punctuated by end-of-the-year parties, final exams, the prom, the senior dinner, and other rituals. Graduation is the most tedious. Most people experience a string of graduations over a dozen or so years: elementary school and middle school, then high school, then college, each a little more bittersweet and dogged by anxiety, followed perhaps by a postgraduate degree. And then that’s it for a generation. But we teachers (especially high school teachers) go through the motions every year. The students, the speeches, the recitation of the school song: they all tend to run together. If anything is likely to be memorable, it’s the weather: hot or rainy, surprisingly cool or surprisingly beautiful. There’s usually a moment of genuine gladness at some point in the morning, as we witness the visible signs of maturity in some of our charges. And there’s often a moment of genuine regret, too, when we face an esteemed colleague’s retirement, the graduation of the final child in a cherished family, or a fond farewell from a clutch of friends who complemented each other so nicely. Any of these people may reappear at some point, in some perhaps transfigured way. But the uncertainty of such scenarios, and the certainty of time’s passage, make such moments bittersweet at best.
It’s always a relief when you get in the car and head home after such rituals, and I’m glad to seize a life, however quotidian, that’s truly my own. For years now, it’s been my habit to come home from graduation and mow the lawn. I think of Winslow Homer’s 1865 painting “Veteran in a New Field,” which depicts a recently returned Civil War soldier threshing wheat. Figuratively speaking, my campaign is over, and I’m eager to get back to my farm.
This notion of closure is among the greatest satisfactions of teaching. Other walks of life are comparably cyclical. But I don’t think any afford the kind of clean lines and closed books that a life in schools does. Many working people take extended summer vacations, but few of them are as expansive and sharply chiseled as that afforded by an academic schedule. As we are all veterans of schooling, this experience is a virtual birthright. But only teachers refuse to relinquish it.
The time will come -- unexpectedly quickly -- when my longings will turn away from completion and repose toward the rebirth that comes with the fall. In my case, the longings typically return long before it's time to actually return to the classroom. But as I make my way from meeting to meeting, from a final faculty softball game to a final trip to the local watering hole before we all disperse, I pause to savor the cadence. The present is past. And history will be born anew.
Monday, June 4, 2012
Re-presented 'Past'
My 2009 book Essaying the Past: How to Read, Write and Think about History has recently been published in an updated and expanded edition by Wiley-Blackwell. This edition features a new chapter on introductions and conclusions, as well as further information on citing electronic sources like e-books, blogs, and other new realities of the digital age. Some references have also been updated (I replaced a discussion of Avril Lavigne with Taylor Swift), and there are more recent examples of student writing to illustrate effective technique.
In addition to Essaying the Past, I'm working on two other related projects, both with Wiley-Blackwell. The first is a new edition of my 2001 anthology Popular Culture in American History, featuring some of the best scholarship on the subject from the last quarter-century. The other is a new textbook, tentatively titled Stages, Pages and Screens: A Brief History of the Modern Media. Pub dates for these books are estimated between late 2013 and early 2014. Hope you'll get a chance to have a look.
Wednesday, May 30, 2012
Poetic license (to kill)
In Bonnie Parker Writes a Poem, Steven Biel explains how a culture created a character
The following has been posted on the Books page of the History News Network site.
Over the course of the last two decades, Steven Biel has become the foremost scholar of what might be termed the folklore of consumer capitalism. His 1996 book Down with the Old Canoe (recently reissued in an updated edition) traced the collective memory of the Titanic disaster of 1912, both in its immediate aftermath and in the century since. In American Gothic (2005), he explored the meanings -- some contradictory, others downright zany -- that have been attached to the classic 1930 Grant Wood painting. Though fundamentally a different kind of enterprise, his first book, Independent Intellectuals in the United States 1920-1945 (1992), derived some of its energy from a preexisting fascination with the legendary writers whose careers he proceeded to reinterpret. Biel is unparalleled in his ability to unearth disparate sources in American culture and establish organic links between them.
Biel's new e-book, Bonnie Parker Writes a Poem: How a Couple of Bungling Sociopaths Became Bonnie and Clyde, represents another satisfying chapter in his body of work. Anyone who's managed to get farther than the 1967 Arthur Penn movie Bonnie and Clyde, starring Faye Dunaway and Warren Beatty -- which, in truth, is probably not all that many people -- considers the "bungling sociopaths" part of the title common knowledge. It's the "how" here that's intriguing. Biel's point of departure is the self-mythologizing poem the improvisational female outlaw fashioned for mass consumption at the end of her brief career as a gangster. (The poem is included as part of the e-book.) But Biel is less interested in the way Parker effectively wrote herself and companion Clyde Barrow into cultural history -- though he analyzes her work with the deftness of a literary critic -- than in the way cultural history imprinted itself on her. With an almost archeological command of detail, he sifts through the books and movies Parker is known to have known. The years preceding her crime spree were a germinal moment in the formation of the gangster genre, which Parker absorbed and recorded in surprising detail. Such an approach permits a new perspective not only on characters like those played by James Cagney, but also on real-life figures like Pretty Boy Floyd and John Dillinger (who felt Bonnie and Clyde's ineptitude gave outlaws a bad name).
From there, Biel pivots to analyze media coverage of Bonnie and Clyde in the days preceding their deaths in a hail of bullets, as well as their subsequent mythology in movies that extend from Bonnie and Clyde to Natural Born Killers (1994). This tradition extends to a series of hip-hop songs by Tupac Shakur, Eminem, and Jay-Z (with Beyoncé as the voice of Bonnie). Though I suspect the couple represent a fairly arcane pop culture reference these days, it's probably only a matter of time before the simmering resentment against bankster culture gives avowed criminality a good name again.
Bonnie Parker Writes a Poem is part of "Now and Then," a new e-book series that mixes new works by established writers (Hilton Kramer, William O'Neill) with reissues of classics by famous writers (Ulysses S. Grant, Jean-Paul Sartre). Running in the $1-3 range, these short books are part of the shifting landscape of publishing in the Kindle era, and suggest its emerging possibilities. With this one, which runs about the length of a healthy New Yorker or New York Review of Books essay, Biel makes a distinguished contribution to an emerging literary form.
Friday, May 25, 2012
Jim is observing the Memorial Day holiday weekend. His recent reading has included the latest installment of Robert A. Caro's monumental biography, The Years of Lyndon Johnson. This volume, the fourth, is entitled The Passage of Power. The first book, The Path to Power (1982), covered Johnson's early life as a young man on the make. The second, Means of Ascent (1990), describes Johnson's rise in Congress, including an unforgettable account of his 1948 U.S. Senate race, which he "won" by "87" votes, resulting in his satirical nickname "Landslide Lyndon." Volume three, Master of the Senate (2002), chronicles LBJ's years as Senate Majority Leader, culminating in his successful passage of the toothless, yet epochal, 1957 Civil Rights Act, the first such law to overcome the resistance of the segregationist Southern delegation -- of which Johnson had always been a member -- since the era of Reconstruction. In reading these books, one is alternately moved and appalled by the stunning combination of ruthlessness and altruism that drove Johnson so relentlessly.
As its title suggests, The Passage of Power covers a transitional period in Johnson's life: his departure from the Senate to make a botched bid for the presidency in 1960, followed by his risky acceptance of a place on the ticket as vice-president to John F. Kennedy, his miserable years in the political wilderness, and his subsequent accession to the presidency upon Kennedy's assassination in November of 1963. It seemed impossible to tell the story of the Kennedy assassination again in a compelling way, and yet Caro's account (published recently in The New Yorker) is absolutely riveting. Part of the drama of narrating it from LBJ's point of view comes from a literally simultaneous meeting in which investigators in Washington are learning of political corruption that seems virtually certain to sink his political career.
One of the great pleasures of these books is the way Caro stuffs them with mini-biographies of other people. So, for example, Means of Ascent offers a deeply compelling portrait of Johnson's opponent in the 1948 Senate race, the principled and politically successful Coke Stevenson. In The Passage of Power, it's Robert Kennedy -- whose mutual hatred with Johnson has long been legend -- who gets the Caro treatment. The biographer mines existing sources exhaustively, but then adds new interviews that make his interpretations fresh. So it is, for example, that we learn Texas in the presidential election of 1960 was as much a source of electoral fraud as the far better-known case of Illinois.
At 600 pages, The Passage of Power ranks as one of the smaller segments of Caro's Johnson saga. But it goes quickly. As one recent reviewer aptly suggested, Caro's books are like J.K. Rowling's Harry Potter series for history buffs. There aren't many better ways to spend a holiday weekend. One can only look forward to the fifth and final installment, but Caro himself probably doesn't know when that will be.
Monday, May 21, 2012
Successful mistake
In The Weight of Vengeance: The United States, the British Empire, and the War of 1812, Troy Bickham traces the origins and outcome of a conflict which, contrary to the popular view, was quite consequential
The following has been posted on the Books page of the History News Network site.
The War of 1812, now in its bicentennial year, is widely regarded as an asterisk in American history. Sparked by a series of British decrees limiting U.S. trading rights during the Napoleonic era that were suspended even as the U.S. declared war, the conflict was a military draw that ended with the status quo ante. Andrew Jackson's celebrated victory at the Battle of New Orleans in 1815 took place after peace terms had already been negotiated (though not yet ratified). As such, the War of 1812 seems not only unnecessary, but just plain stupid.
In The Weight of Vengeance, Troy Bickham, who teaches at Texas A&M, does not assert that the war was fought over high-minded principle. But he does think it had a logic that transcended its stated grievances over trade, the legal status of sailors who may or may not have been British deserters, or the fate of Canadians and Indians in North America. These issues were real enough. But Bickham sees the war as effectively about the two nations' respective self-image. An insecure United States felt a need to assert itself as part of the family of civilized nations. And Britain felt a need to put its former colony in its (subordinate) place. But neither belligerent was in a particularly good position to realize its objectives, and both were subject to considerable internal opposition to their official government positions.
Bickham's parallel arguments are mirrored by the book's structure, which deftly alternates chapters tracing the pro-war and anti-war constituencies in both countries. For a while, it seems this approach to the subject, however admirably balanced, will only underline the way the various players effectively neutralized each other. But as his analysis proceeds, a decisive view of the war becomes increasingly clear -- and increasingly persuasive.
In Bickham's telling, U.S. conduct in declaring war was remarkably, even stunningly, reckless. The nation's armed forces, particularly its navy, were absurdly unprepared to take on the greatest global power of the age. Its financial capacity for war-making was ridiculously weak, made all the more so by the unwillingness of even the most determined war hawks to make the commitments necessary to place and maintain soldiers in the field. Many observers have noted that there was considerable opposition to the war from the start, much of it with a sectional tenor -- the secessionist tendencies of New England, made manifest by the Hartford Convention of 1814, have long been a staple of high school U.S. history exams. Bickham duly notes this, but asserts the divisions among the presumably unified Jeffersonian Republicans were even worse (the principal threat to President James Madison, running for re-election in 1812, came from fellow Republican DeWitt Clinton). Even the one universally acknowledged advantage the U.S. military had -- its ability to strike first with an invasion of Canada -- was hopelessly botched. Once that happened, and once the defeat of Napoleon in 1814 freed Britain to redirect its energies across the Atlantic, the U.S. suffered a series of national humiliations, the sacking of Washington, D.C. only the most obvious among them. By the fall of that year, the American position was bad and getting worse, with plans for a British invasion of New Orleans on the horizon. (The lack of discussion of this strategic and diplomatic dimension of the conflict is a surprising and disappointing omission.)
Viewed in this light, the Treaty of Ghent that ended the conflict is not anti-climactic; it's deeply counter-intuitive, if not a once-in-a-century stroke of luck. As Bickham explains, the reasons for the outcome have very little to do with the United States. On the one hand, Britain was under considerable diplomatic pressure to resolve the American situation in ways that did not complicate its broader strategic objectives in Europe. On the other hand, there was tremendous domestic agitation to wind down a quarter-century of war that had taxed the patience of an electorate to the breaking point. At the very moment Britain might have permanently hemmed in American imperial ambitions, it effectively abandoned its wartime objectives in the name of tax relief. The fates of Florida, Texas, and the Native Americans -- who at one point were to get a swath of territory cutting across modern-day states like Indiana and Michigan -- were cast. Manifest destiny could now become common sense.
The Weight of Vengeance also discusses other hemispheric implications of the War of 1812, among them the emergence of a distinct Canadian identity (which Bickham feels is overstated) and the diminishing importance of the Caribbean in British imperial calculations. As such, the book reflects the increasingly global cast of U.S. historiography generally, even as it remains attuned to domestic politics. This multifaceted quality, along with readable prose, is among its satisfactions. It's doubtful that the bicentennial of the war will amount to much more than a commercial or academic blip in the next few years. Whether or not that's fair, the conflict receives a worthy chronicle here that will clarify its meaning for anyone who cares to understand it.
Thursday, May 17, 2012
Bond voyage
Curator Bruce Bustard's Attachments: Faces and Stories from America's Gates captures the dramas of immigration in a new exhibition at the National Archives in Washington
The following has been posted on the Books page of the History News Network site.
For many years now, I've dealt with the topic of late 19th/early 20th-century immigration in my teaching by relying on pieces from Life Stories of Undistinguished Americans, as Told by Themselves, a collection of first-person accounts first published in book form in 1906. There are few better ways of dramatizing this epic global transformation, which typically must be dealt with in sweeping generalizations, than vivid primary source documents like "Story of a Polish Sweatshop Girl" or "Story of a Chinaman," which render the daily texture of immigrant life in granular detail, from monthly budgets to racial harassment. I was interested in Attachments, the companion volume to a new exhibit at the National Archives, for the way it might help amplify this primary source approach to the subject. At first I wasn't so sure it would; the approximately twenty brief essays that accompany the primary source documents in the book rarely contain the voices of immigrants themselves. But the cumulative impact of those documents -- photographs, letters, standardized forms, among others -- is surprisingly forceful, given that the book runs less than a hundred pages.
The core of Attachments is three chapters called "Entering," "Leaving," and "Staying." One need not get far in the first to see the striking variety of reasons why people came to the U.S., among them political persecution, the force of family ties (which were sometimes invented to circumvent stringent rules), and economic opportunity. A number of stories involve people fleeing the Holocaust.
Strikingly, the longest chapter is "Leaving," a reminder that a large percentage of immigrants left the U.S., willingly and unwillingly, to return to their native lands. Looking at records of the deported, we see the reasons range from political radicalism to the theft of peas, with the broad category of "moral turpitude" considered capacious enough to include everyone from prostitutes to those unfortunate enough to have the wrong kinds of friends. Even those who were ultimately not deported were forced to endure long periods of waiting. One particularly striking tale in the book concerns an American-born Caucasian woman who forfeited her citizenship by marrying a Chinese man -- she became a "lawfully domiciled Chinese laborer" in South Dakota -- and who was forced to reapply for citizenship after returning to the U.S. from a trip abroad.
A disproportionate number of stories in Attachments involve Asians. This reflects the racist attitudes of the American government, embodied in the Chinese Exclusion Act of 1882 and the "Gentlemen's Agreement" barring the Japanese after 1907. While the east coast's Ellis Island was largely a way station for immigrants to get into this country, the west coast's Angel Island was largely an interception station to keep them out. Europeans had their own problems with the quotas established in 1924; in a number of cases people were thrown out of the U.S. on the basis of questionable political beliefs. Even Mexicans, who were not subject to the quotas, still had to scale bureaucratic hurdles.
The poignance of Attachments derives in part from the very fragmentary quality of the tales it contains. We (literally) get snapshots of people in motion, the facts of their lives listed on standardized forms but captured by the emotionally rich faces in their photographs (taken to prevent fraud) and accompanying documents. These people, otherwise lost to history, get resurrected, a haunting reminder of the hopes and struggles of people seeking a promised land achingly in view.
Friday, May 11, 2012
Swiftly rocking
The following post is an excerpt from a work in progress, Stages, Pages, and Screens: A Short History of the Modern Media, under contract with Wiley-Blackwell. This piece, on Taylor Swift, is one of a number of sidebar articles slated to appear in that book. --JC
“It’s my party and I’ll cry if I want to,” pop singer Lesley Gore asserted in her 1963 hit single “It’s My Party” (followed later that year by its sequel, “Judy’s Turn to Cry”). Ever since, generations of young women – Janis Ian, Debbie Gibson, Alanis Morissette, Avril Lavigne, among others – have given voice to the hopes and fears of adolescent females in pop music. As such, Taylor Swift is part of a long tradition. But in the space of a few years, she has staked a claim to cultural history that may well prove to be broader and deeper than most.
Some careers in pop music are the product of private turmoil and professional struggle; youthful adversity has shaped legends ranging from Elvis Presley to Shania Twain. Swift’s background, by contrast, is one of comfort and security. She was born on December 13, 1989 in eastern Pennsylvania, the elder of two children. Both her parents were in the financial services industry at the time of her birth – her mother left the profession to become a full-time mom – and the family had a Christmas tree business on the side. Music figures strongly in her heritage; Swift’s maternal grandmother was a professional opera singer, and both her paternal grandparents were musicians. She herself was named after singer/songwriter James Taylor (an important fact considering the trajectory of her evolution in the music business). Swift demonstrated a penchant for performing very early in life, appearing frequently in school and local stage productions and entering karaoke contests. She was inspired by the career of child sensation LeAnn Rimes, who influenced Swift’s orientation toward country music. She was a child herself when her mother began taking her down to Nashville in a quest to get the attention of record company executives. While lightning didn’t strike immediately, Swift got sufficient encouragement in the form of development deals (which paid some recording costs in exchange for a future option to sign), and the family decided to relocate to Hendersonville, Tennessee, a suburb of Nashville, when she was fourteen years old. Between 2004 and 2006 she began collaborating with professional songwriters, as well as forming a professional relationship with producer Nathan Chapman and executive Scott Borchetta, who was in the process of founding his own label, Big Machine Records. In 2006 Swift released her first single, “Tim McGraw,” named after the country star she later befriended. The song, in which she expresses the hope that a former boyfriend will think of her whenever he hears a particular McGraw song, combines an aching sense of loss with a subtle sense of retribution, two qualities that would characterize Swift’s work in years to come. A string of subsequent hits from her 2006 self-titled debut album followed, including “Teardrops on My Guitar” and “Our Song.”
For a mere adolescent, Swift showed an unusually adult degree of discipline as a songwriter and recording artist, and extended it to other aspects of her career: relentless touring (generally expected of a country music star) and assiduous attention to detail in managing her career in arenas like social media (which was not). She was really the first country music star of the digital age, selling millions of downloads in an industry only gradually making the transition from the compact disc, and one who demonstrated a desire to connect with her fans reminiscent of the young Bruce Springsteen, an artist Swift is said to admire. (She is also a fan of a favorite of her mother’s, the rock band Def Leppard, with whom she has performed.) These qualities, combined with skillful promotion, made her second album Fearless (2008) one of the most successful of the decade, spawning a whole new series of hit singles, among them “Love Story,” “You Belong with Me,” and “Fifteen,” which describes the hope and anxiety of a high school freshman on the first day of school with disarming directness.
Swift was richly rewarded for her talents, not only in terms of phenomenal sales, but also in the bevy of awards she won for her work, among them a series of prestigious Country Music Awards (CMAs). But her career took an unusual turn in September of 2009 when she won a Video Music Award (VMA) from MTV for Best Female Video. Swift had just begun her speech acknowledging the honor when she was interrupted by rapper Kanye West, who grabbed the microphone she was using and congratulated her but opined that his friend Beyoncé really deserved the honor for her song “Single Ladies (Put a Ring on It).” Swift was stunned into silence and left the stage. When “Single Ladies” ultimately took the award for Video of the Year, a gracious Beyoncé coaxed Swift back to finish her remarks. Amid the widespread condemnation of West – President Barack Obama called him a “jackass” – Swift received sympathy and a new wave of attention.
In the fall of 2010, just as she was turning 21, Swift released her third album, Speak Now. In the liner notes, she described it as a concept album whose songs “are made up of words I didn’t say when the moment was right in front of me. These songs are open letters. Each is written with a specific person in mind, telling them what I meant to tell them in person.” Though her subjects are never identified explicitly, it’s not hard in some cases to see to whom they’re directed. So, for example, the song “Innocent” seems directed at West, expressing sympathy for his well-known inner turbulence and forgiving him for his excesses (“who you are is not what you did”). Another, less charitable song, “Dear John,” is addressed to former paramour John Mayer – the bluesy style of guitar playing alone is a dead giveaway. In one way or another, Swift’s well-chronicled romantic life had always been the source of most of her music, and this album is no exception.
That said, Speak Now represented an important developmental leap forward. For one thing, Swift wrote all the songs on the album herself (though she no doubt got input from Chapman, among others). For another, the record marked a bold foray into a new musical direction: Speak Now is at heart a rock record. To be sure, Swift’s country heritage continued to be evident, nowhere more so than on the hit single “Mean,” which was marked by bluegrass elements. (The song, a cheerfully acidic rant, was directed toward a critic who complained that she couldn’t sing.) A bona fide heavy metal element, meanwhile, was evident on a number of tracks, in particular the catty “Better than Revenge,” in which she excoriates a rival for stealing her boyfriend. But the best showcase for Swift’s command of a rock idiom is the shimmering title track, reminiscent of the early Beatles in its catchy hook and hand-clapping. The song is almost cinematic, recalling the 1967 movie The Graduate – except that this time it’s the girl, not the guy, who rescues her true love from marriage to someone else.
Perhaps the most important dimension of Swift’s growth in Speak Now is a new sophistication in her songwriting. The great appeal of her early records was their emotional simplicity (albeit a deceptive one, in that such an effect was achieved through a strong sense of songcraft, something that often involves subtraction rather than addition). “You Belong with Me” is a schoolgirl’s lament that she can’t compete with a cheerleader for the heart of a boy; the cliché-riddled “Love Story” works not so much because the imagery is original but rather because you believe that the adolescent who invokes Romeo and Juliet is living a romantic drama for the first time. In Speak Now, however, the conflicts are more recognizably adult ones. In the album’s opening track, “Mine,” the narrator tells her boyfriend, “you made a rebel of a careless man’s careful daughter,” a line that manages to encapsulate a lonely childhood and suggest how liberating having a partner can be. The very exultant intensity of “Mine” seems to derive from how close a call, how truly unexpected, such an outcome was – and is. “Do you believe it?” she asks toward the end of the song, her voice mingling joy with surprise.
In “The Story of Us,” the surprise is that a love story ends not happily ever after, but miserably. The narrator, who believed she was part of a blessed union, instead finds herself locked in a stubborn struggle with a man – “you held your pride like you should have held me,” she complains – that defies the script about the way a relationship should work. Another song marked by hard-driving guitars, “The Story of Us” derives much of its power from the exasperation in Swift’s voice – and the abrupt way the song severs at the end.
Speak Now was another triumph for Swift, selling over a million copies in the first week of its release in October of 2010, and four million copies by year’s end. In the five years following the release of her first album she sold over 20 million records – this at a time when record sales were dropping sharply amid a global recession and the upheaval caused by digital music – and she was cited by the Guinness Book of World Records for scoring 11 consecutive singles on the Billboard pop charts. Even if she never made another hit record, her place in the annals of pop music history would be secure.
There are those who wonder how much staying power Swift has. Certainly, the history of pop singers, female and otherwise, is littered with sensations whose youthful work remains memorable but whose later work has, rightly or wrongly, largely been forgotten. The range of Swift’s themes – she studiously avoids politics, for example – may also lead one to wonder how much room she has to grow. (Certainly Speak Now has more than its share of love songs that could just as easily have ended up on Fearless in their adolescent frame of reference.) But she has also shown herself to be an apt pupil in the ways of pop music, and she has made the transition to adulthood with relative grace. Perhaps her fate will be closer to that of Joni Mitchell – the singer-songwriter she has expressed an interest in portraying in a forthcoming movie – whose body of work has won her generations of admirers. At the moment, at least, there are plenty of people who are eager to grow old Swiftly.
Monday, May 7, 2012
Prime time
New York Times reporter Patricia Cohen offers a sprightly rendition of mature adulthood in In Our Prime: The Invention of Middle Age
The following has been posted on the Books page of the History News Network site.
The notion that middle age is essentially a cultural construction will not surprise historians. But New York Times journalist Patricia Cohen makes the case with breadth and verve. Though it seems to sprawl at times, with a range of opinions that can become tiresome in their predictable diversity -- every opinion about middle age has its rejoinder -- In Our Prime is a serious and useful survey of the subject, likely to remain a standard of its kind for some time to come.
Cohen begins by noting that until the twentieth century, there was rarely discussion of what we have come to know as middle age. To the extent that the concept was understood, it was generally regarded as a stage of productive maturity -- often envied by the youthful, who longed for the gravitas maturity conferred. This situation began to change a century ago, heavily influenced by the advent of mass media, particularly movies and advertising, which substantially altered the terms of the equation.
The status of middle age receded still further in the first half of the twentieth century, as psychologists Sigmund Freud and G. Stanley Hall focused on infancy and adolescence as the crucial staging grounds of personal identity. Not until the path-breaking work of Erik Erikson was there much effort to delineate a notion of midlife, and even he backed into it via his attempts to segment either end of a lifetime. Ironically, it was not until the 1960s, at the zenith of youth culture, that there was any real effort to systematically define and trace midlife using longitudinal studies and neurological research backed by serious foundation money. In recent decades these efforts have led to a greater understanding of the (still imprecisely defined) concept of senescence. Current scientific opinion emphasizes the plastic nature of the brain long after maturity, with recent speculation that there are certain kinds of aptitude (like responding to unexpected stress) that older people seem to handle better than younger ones, even if there are not currently good ways to measure a quality that falls into the category of wisdom.
In the last third of the book Cohen surveys "the Midlife Industrial Complex," which she sees as a largely capitalist-driven phenomenon. She notes how a wide array of conditions associated with age, ranging from physical appearance to sexual drive, have been medicalized in recent decades by hucksters seeking to exploit the emotional vulnerabilities and relatively deep pockets of Baby Boomers. Yet even this seems to have a silver lining, as marketers are gradually realizing that their mania for the 18-49 demographic overlooks some of the most fertile terrain for their wares. Such a recognition has begun to have an impact on television, for example, where shows geared to more mature and diverse audiences have become more common.
In Our Prime maintains an even tone and an intellectual depth that allow it to talk frankly about some of the most dismaying aspects of the aging process. But its overall mood is upbeat: mid-life -- which Cohen resists defining precisely even as the book ends -- is a lengthening time of opportunity. Her message of hope is worth buying, literally and figuratively.
Thursday, May 3, 2012
Cover version
Here's a sneak preview of the cover of my forthcoming book, to be published by Oxford University Press later this year. The book looks at the way trajectories of American history are embedded in the careers of movie stars. It surveys the careers of six actors and how each body of work as a whole offers a coherent vision of U.S. history. These versions are not necessarily conscious, are never incontestable, and indeed may be marked by any number of internal tensions. But for better and worse they reflect and project collective understandings that are quite powerful and often independent of scholarly opinion (which will be a point of reference throughout). One chapter, “Tending to the Flock,” traces the surprising strain of Jeffersonian-styled communitarianism that runs through Clint Eastwood’s apparently individualistic corpus. Another, “Shooting Star,” explores the way Daniel Day-Lewis reconfigures Frederick Jackson Turner’s vision of the frontier. A third, “Rising Sons,” focuses on Denzel Washington’s recurrent choice of roles involving parenting and mentoring in the context of African American history (a motif with an often religious subtext). A fourth, “Company Man,” looks at Lincolnian accents in the movies of Tom Hanks, the generational heir of Jimmy Stewart. A fifth looks at the feminist trajectory of Meryl Streep, and the final chapter explores the career of Jodie Foster as an American loner. These are all people with considerable power to choose their roles, and thus to register patterns that would be otherwise difficult to trace among more workaday actors. The generational thread that connects these people, all born in the middle third of the twentieth century, is the climate of institutional skepticism that has dominated American life in the decades since they came of age.
There are thus three concentric circles of argument in the project: one about specific actors and the often surprising cohesion in their bodies of work; one about the generational tenor of American life in the late 20th and early 21st centuries; and one about the way a notion of history – defined here as a belief, rooted in perceptions of collective experience, about the way society changes – threads through the work of people who are often thinking about other things, an existential condition that applies to many of us.
Sunday, April 29, 2012
Fit print
In Merchants of Culture: The Publishing Business in the Twenty-First Century, John B. Thompson has written a page-turner about those who make them (virtual and otherwise)
John B. Thompson begins this book with a publishing anecdote that will be familiar even to those on the margins of the business: the story of how Randy Pausch, a professor of computer science at Carnegie Mellon, gave a talk in 2007 as part of a series at the university with the title "The Last Lecture." As it turned out, Pausch was dying of pancreatic cancer, giving his well-received presentation an element of poignance that generated a wave of national publicity. What proved truly stunning, however, was how eager New York publishers were to acquire the book that became The Last Lecture: Pausch, a first-time big-time author, was paid a $6.75 million advance by Hyperion, a Disney company. How could that possibly make sense?
In 400 chiseled pages, Thompson explains why such an offer came about, and why it made sense -- indeed, The Last Lecture proved to be a lucrative acquisition for Hyperion. He does so with the methodological acumen of the sociologist he is (at the University of Cambridge). Thompson conducted hundreds of interviews for Merchants of Culture, supplemented by new interviews with many of his sources for this newly released second edition of the book (the first was published in 2010). Much of Thompson's analysis builds on that of his 2005 book Books in the Digital Age, which focused on scholarly publishing. Here he focuses on trade publishing, the hyper-commercial industry centered in New York and London.
It's in the nature of any project of this sort that it stands to date quickly. But Thompson has done a notably good job of keeping his findings timely -- the figures here run into mid-2011, capturing the e-book transformation of the industry at the moment it shifted from an abstract possibility to an increasingly evident reality. In some sense, however, the book feels fresh and up-to-date because of an intuitive grasp of temporal proportion; Thompson's perspective dates back to the corporate consolidation of the publishing industry in the 1970s, and he traces trends that in many cases have been decades in the making.
The organizational strategy of Merchants of Culture consists of chapters focused on key constituencies in the industry: the rise (and decline) of retail chains; the growing power of literary agents; the consolidation of publishing houses; and so on. He also takes note of what is now an established trend -- the blockbuster mentality so typical of the major media -- along with emerging ones like "extreme publishing" (quickly produced books designed to plug gaps in financial projections) and the "hidden revolution" in the manufacture and distribution of books. Naturally, he gives plenty of space to major players like Amazon.com and the transformational role of the Kindle -- with attention both to those who celebrate its power and to those who fear it.
Thompson has a measured tone, and his goal here is clearly to explain how the field -- a term he identifies as a conceptual construct within sociology -- interlocks in ways that may not always be obvious to an outsider. He does, however, weigh in with some mild-mannered judgments. Thompson thinks a corporate mentality erodes the long-term attention to backlists that are crucial to the ecology of the industry. He notes that big-time publishers like Random House and HarperCollins, unwilling to tend backlists, have instead been buying them by acquiring other imprints, a strategy that has come close to running its course. He sees a polarization in the industry: business conditions are most propitious for behemoths with deep pockets or scrappy little houses, some of them academic players that run a trade operation on a shoestring. But he notes there's precious little ground for medium-sized houses like Farrar, Straus & Giroux (which leverage prestige and typically federate to maximize back-office resources). Thompson is also attentive to the fact that publishing can be most brutal not to first-time writers, but rather to those who establish a track record that is found wanting and who must then struggle to survive in an increasingly indifferent field.
As someone who has worked in publishing as well as published books with trade, academic, and specialty publishers, I must say I have never encountered a work as incisive and complete as Merchants of Culture. This one will surely be a backlist perennial, and must reading for anyone with a stake in the business.
Tuesday, April 24, 2012
Controlled energy
In More Powerful than Dynamite: Radicals, Plutocrats, Progressives, and New York's Year of Anarchy, young Thai Jones resurrects a lost metropolis
The following has been posted on the Books page of the History News Network site.
I didn't plan to read this book. I'd put it on a pile of forthcoming titles, one I consulted after finishing the last book I reviewed sooner than planned. I thumbed through the first few pages of a couple on that pile and found myself engaged by the portrait of New York City mayor-elect John Purroy Mitchel on New Year's Eve of 1913. Maybe this book about the year 1914 was worth embarking upon after all.
It was only after I was well into it that I realized More Powerful Than Dynamite has an arresting provenance that makes the particular manner of its execution all the more remarkable. At first I wasn't too surprised by blurbs that didn't quite come from the usual suspects. Kenneth Jackson, sure -- blue chip. Little odd to have him share a back cover with Noam Chomsky, though. And Marge Piercy. Don't think of Samuel G. Freedman as a fellow traveler. Bill Ayers? Don't imagine you'll find this book lying around Obama '12 campaign headquarters. Outside of radical circles, this is not exactly an endorsement a lot of writers would flaunt.
Turns out the Ayers connection is not merely incidental. The jacket copy informs us that Thai Jones was "born while his parents were fugitives from justice" and that he "went by a series of aliases until the age of four." Jones's first book, The Radical Line: From the Labor Movement to the Weather Underground, One Family's Century of Conscience (2004), describes a genealogy of radical leftist politics. In the foreword of this book, Jones explains that his interest in 1914 New York originated in a now largely forgotten anarchist bomb blast on upper Lexington Avenue that paralleled the notorious one by the Weather Underground in Greenwich Village in 1970. In both cases, radicals were victims of a blast they intended to inflict on others.
I rehearse this background for Dynamite because one might plausibly expect its author to carry the torch for his family's radicalism. Or, perhaps equally plausibly, to repudiate it with a fierceness that derives from that source. But this is a remarkably measured piece of writing by a truly gifted young man still in his thirties. Jones is a former reporter for Newsday, and this book began as a PhD dissertation at Columbia. It combines the lean prose of a journalist with the depth of an academic. But Jones's eye for detail is novelistic, and he is a master of understatement. He turns the neat trick of making moderation marvelous.
Many of the events discussed in Dynamite -- the Ludlow Massacre out in the Colorado coalfields; the reform efforts of the Mitchel administration; and, of course, the outbreak of the First World War -- will be familiar to students of the period. Ditto for a cast of characters that includes Woodrow Wilson, Upton Sinclair, and John D. Rockefeller, Jr. But this biographically-driven narrative is also populated by a host of obscure figures like the Industrial Workers of the World activist Frank Tannenbaum, police commissioner Arthur Woods, and the charismatic hunger-striker Becky Edelsohn, all of whom burst into life on these pages (nowhere more so than in the otherwise sleepy suburb of Tarrytown, which in May of 1914 gave Manhattan a run for its money in political drama). Jones narrates public demonstrations with cinematic clarity -- Occupy Wall Street was downright genteel compared to the string of uprisings in the city in the first half of 1914 -- even as he manages to capture the inner life of his characters with an empathy that's moving in its own right. So it is that we experience the radical Alexander Berkman's melancholy nostalgia for the terrorism of his youth, Mayor Mitchel's awkwardness in serving citizens he didn't particularly care to meet, and Commissioner Woods's careful, patient efforts to learn from previous police mistakes in maintaining public order. We even feel some sympathy for poor John D. Rockefeller Sr., who can't get through a round of golf without being importuned for stock tips by grasping companions.
Which is not to say that Jones suspends judgments. He notes that Rockefeller Jr. was deeply anguished by the Ludlow situation, which it was his family responsibility to manage. "But," he writes, "while Rockefeller was unwilling to ignore the inequities of business, he was equally unable to intercede against the executives of Colorado Fuel and Iron." This dithering literally proved fatal, a sin for which Rockefeller sincerely tried to atone. Conversely, Jones shows that while Woods showed far more respect for the First Amendment than any of his predecessors (more for tactical than philosophical reasons), he replied to criticism about authorizing unprecedented wiretaps of suspected radicals by saying, "There is altogether too much sappy talk about the rights of the crook . . . He is a crook. He is an outlaw. He defies what has been put down as what shall be done and what shall not be done by the great body of law-abiding citizens. Where does his right come in?" Jones wisely lets us draw our own conclusion without comment.
The author's self-control has a deeply historical quality; he shows us people living through dramas whose outcomes they could not know, struggling to understand what is happening to them and trying, not always successfully, to grow from their experiences. Young Fiorello LaGuardia was an admirer of Mayor Mitchel who honored his memory -- to a point. Leaders of Mitchel's Progressive stripe "had attempted to separate government from politics, but that does not work in a democracy" -- a mistake LaGuardia did not make. One of the few people who comes off truly badly in this book is Walter Lippmann, who coined the phrase of its title. As he is in so many accounts of this period, Lippmann is everywhere, and he always seems to have a pithy remark that's both incisive and at least faintly condescending. He's heartless, and in his way is harder to take than Rockefeller the younger.
Toward the end of this book -- a little later than it should, really -- its larger argument comes into focus, which involves the role of Progressives as mediators between the plutocrats and radicals of the subtitle. Jones asserts that the events of 1914 were decisive in swinging reformers toward the right, which had lasting implications for American politics. Perhaps there's grist here for his next book.
In any case, Dynamite showcases a rare talent notable for its equipoise in balancing heart and head. Jones serves the memory of his subject with quiet grace. And he serves his readers with stories that deserve to be remembered. Here truly is a career worth following.
Saturday, April 21, 2012
Westerns civilization
Daniel Day-Lewis and the Persistent Significance of the Frontier in American Cinema
Jim Cullen
The following is the text of my keynote address for the "Focus on Teaching" luncheon at the Organization of American Historians Annual Meeting, Milwaukee, Wisconsin, April 21, 2012
The story of the forthcoming book on which this talk is based begins in 2001, when I left academe and began working as a high school teacher. In the process of trying to plan the first semester of a U.S. history survey, I made a curious discovery after generating a slate of movies I planned to show over the course of the fall semester: every one of them starred Daniel Day-Lewis. There was The Crucible. And Last of the Mohicans. And The Age of Innocence. Later I added Gangs of New York and There Will Be Blood. All told, I ran an annual event I dubbed "The Daniel Day-Lewis Film Festival" nine times.
Maybe it's not surprising that my predilections would express themselves without conscious effort. But keep in mind that we're talking about Daniel Day-Lewis here. As anyone vaguely familiar with his work knows, Day-Lewis is legendary for the extraordinary variety of characters he has played, and the vertiginous psychological depth with which he has played them. I first became aware of Day-Lewis in early 1985, when, in the space of a week, I watched him portray the priggish Cecil Vyse in the tony Merchant-Ivory film adaptation of E.M. Forster's A Room with a View and then saw him play Johnny, the punk East End homosexual, in Stephen Frears's brilliantly brash My Beautiful Laundrette. Day-Lewis went on to have a distinguished career, winning the first of two Academy Awards for his portrayal of the handicapped Irish poet Christy Brown in My Left Foot in 1989, but between 1988 and 2007 he played a string of American figures that ranged from a seventeenth-century Puritan to a twentieth-century art collector.
What could this mean, I wondered? Every year like clockwork, I watched these films again with my students, marveling at the inexhaustible nuances of Day-Lewis's performances. I began to ask myself: Could it make sense to think of actors as historians? Could people, in the process of doing a job whose primary focus was not interpreting the past, nevertheless be performing such an interpretation? And in doing so repeatedly over the course of a career, would they articulate an interpretive version of American history as a whole?
Of course, such people are aware when they're dealing with historical situations (or contemporary situations with historical resonances), and they may make a real effort to exercise historical imagination as part of their work. But that's the point: it's part of their work. We all understand that there are many people out there who "do" history without writing books—archivists, curators, and, of course, filmmakers, including documentarians as well as writers and directors of feature films, who work consciously and conceptually to craft an interpretive experience for their audiences. What intrigues me about actors, though, are the obvious limitations and obstacles to executing a purely historical function. Their work is always embedded in a larger context in which their control of the material is limited—actors do not typically write their own lines—and their craft is collaborative, part of enterprises that will always be as much aesthetic and commercial as they are historical. What interests me is the way in which very successful actors with a good deal of control over their choices reveal patterns of thought that are widely shared but rarely so evident.
Indeed, my primary interest is less in Hollywood movies or actors than in the way history is absorbed into the fabric of everyday life—messy, fragmented, more suggestive than direct. This is actually how it's lived for students: meta-narratives—of history as progressive, or circular, or an illustration of the way you can't fight city hall—into which they plug the various incidents and movements they learn about inside and outside the classroom. Those meta-narratives are a kind of historiographic folklore. Every once in a while, historians are the source of that folklore (or, at least, powerfully shape it). In the case of Daniel Day-Lewis, I gradually realized that this Irish immigrant had somehow absorbed the frontier myth of Frederick Jackson Turner.
Turner is to the historical profession what Sigmund Freud is to psychology: a towering giant of a century ago, one whose ideas are now consciously rejected by just about everybody in his profession—and unconsciously absorbed by just about everybody else. Turner's 1893 essay "The Significance of the Frontier in American History" is probably the single most important piece of historical scholarship ever published in the United States. Written at a time when the modern research university was just emerging, it was an example of a literary genre—the analytic essay of the kind you're now hearing—that was just coming into its own.
A Wisconsin native, Turner first delivered "Significance" on the evening of July 12, 1893 at an AHA meeting in Chicago, held amid the fabled World's Columbian Exposition, staged in that city to celebrate the 400th anniversary of Christopher Columbus's arrival in America. It seems almost comical to imagine the 31-year-old Turner (then, as now, young for a historian) standing at the front of a room talking to about 200 colleagues while thousands of his fellow Americans were taking amusement park rides and surveying the huge temporary stucco buildings of the so-called White City, a site artificially lit thanks to the technological innovations of the Westinghouse Corporation. But like Westinghouse lighting, the so-called "Turner Thesis" unveiled in Chicago would prove more durable than any of these fleeting material realities, in large measure because it was so succinctly stated at the end of the first paragraph of his paper: "The existence of an area of free land, its continuous recession, and the advance of American settlement westward, explain American development."
From the vantage point of over a century later, it may be hard to appreciate just how edgy an assertion this really was. Turner had been trained back east at Johns Hopkins, under the tutelage of the legendary Herbert Baxter Adams. Adams was a proponent of the then-dominant "germ" theory, which argued that western civilization owed its origins to the forests of Germany, out of which emerged a Teutonic seed that spread across western Europe, jumped to America, and now dominated the world. Like so much academic thought of the time, this approach to history was modeled on science, both in its new emphasis on empirical research and its use of a biological model—more specifically a (Social) Darwinian model—to explain historical change.
Over the course
of ensuing decades, the Turner Thesis itself evolved from maverick idea to
common sense, ratified by Turner's appointment at Harvard in 1910. By
mid-century, it had a wide impact on subsequent historians. But in the second
half of the century the thesis came under increasing attack on a variety of
fronts. Some scholars questioned Turner's data, others its implications,
particularly his assertion that the frontier was the engine of U.S. democracy.
The most serious challenge came from those historians, notably Patricia
Limerick, who rejected the assumptions underlying the very idea of the frontier
and the implicit omissions involved in discussing "empty" land that
was in fact inhabited by multicultural populations. To Limerick, Turnerism was
little more than a racist fantasy; she joked at one point that for her and
like-minded scholars the frontier had become "the f-word."
Actually, Turner
did not consider the frontier an unalloyed good. While he viewed it as a
usefully nationalizing phenomenon as well as a wellspring of democracy, he also
recognized that a frontier mentality tended to resist even benevolent forms of
outside control, and fostered a grasping materialism. It also led to a lax
approach to government that fostered the creation of a spoils system. Moreover,
Turner clearly understood, even if he didn't dwell on it, that the extension of
the frontier was a matter of conquest for which he used the correct imperial
term of "colonization."
But the biggest
problem Turner has with the frontier in 1893 is that it's dead. He makes this
clear in the first sentence of "Significance," which discusses
recently updated information from the U.S. Census Bureau indicating the
disappearance of an unbroken frontier line in the American West, which he described as
"the closing of a great historic movement." What the Mediterranean had
been to the Greeks, the frontier had been to the Americans. "And now," he
wrote in a closing sentence laced with melancholy, "four centuries from
the discovery of America, at the end of a hundred years of life under the
Constitution, the frontier has gone, and with its going has closed the first
period of American history." The Turner Thesis, in effect, was the
frontier's obituary.
What would take
its place? Turner did not say. Richard Hofstadter would write 75 years later
that the latent pessimism of the frontier thesis was in sharp contrast to the
ebullient optimism Turner attributed to frontier communities. But while Turner never offered an alternative—indeed, he had considerable
trouble writing books, and never quite realized the huge potential suggested by
"Significance"—his politics were considered generally consonant with
those of his friend and colleague Woodrow Wilson. For such people, the frontier
was less a living reality—as it had been for the previous generation of
political reformers, the Populists—than a metaphor that denoted opportunity on
a large scale in a new domain. That’s why Turner called the closing of the
frontier the end of the first period
of American history.
The frontier
remained fertile symbolic terrain for much of the twentieth century, nowhere
more obvious than in the 1960 presidential campaign of John F. Kennedy, whose
slogan was "The New Frontier." But its appeal went a good deal beyond
politics, evident in the rhetoric of the space program as well as that of the
Internet. Nowhere, however, was its power more evident than in U.S. cultural
life. Turnerism is the bedrock of assumptions for the whole genre of the
Western, for example, and the Western, in turn, is the seedbed of other
cultural genres stretching from sci-fi to hip-hop. Along with the legacy of
slavery, the frontier is what makes American culture American.
But if people of
the 20th century experienced the transformation of the frontier from
reality into myth, those of the 21st are witnessing its
transformation from myth into memory. Now belief in the frontier as a living
symbol is itself receding in our imaginations. The proximate cause is
our economic situation, which has cast doubt on the upward mobility that so
many of us have considered our birthright for so long, and which is so deeply
intertwined with our sense of a frontier. This sense of doubt is not new. It
has recurred periodically throughout American history, as during the Great
Depression and amid the political scandals and economic stagflation of the
1970s. The current narrative of geopolitical decline, however, is one of rare
and growing depth.
Here I’ll break
to say that I don’t have time to do justice to DDL’s whole body of work, but
instead will focus on three illustrative examples: The Crucible (1996), Last of
the Mohicans (1992), and Gangs of New
York (2002).
The Crucible is a story that's typically read in one of two ways. The first and perhaps
primary one is what prompted Arthur Miller to write it: as a warning about the
dangers of social conformity and letting irrational fears—in particular a fear
of Communism that dominated American public life at the time of the play’s
premiere—govern everyday life. The second tends to see the story in terms more
specific to its time and place: seventeenth-century New England. Such an angle
of vision leads one away from viewing it as an indictment of American character
generally, and toward one of self-righteous Puritanism specifically. Both of these views have cogency, of
course. But I’d like to look at The
Crucible as a frontier story.
There are some good historical
reasons to do so. Salem, Massachusetts is not typically seen as a frontier
town; after all, it was founded in 1626, even before Boston, and was 66 years
old when the witch trials took place. Still, if Salem itself was not in fact a
frontier, it was quite close to a bona fide one: the district of Maine, which
would remain part of Massachusetts until 1820. For most of the seventeenth century,
the beaver and timber trades of northern New England were major sources of
prosperity for Massachusetts.
The outbreak of King Philip's War
in southern New England in 1675, which spread northward and lingered until later in the
decade, broke a relatively long stretch of peaceable relations with the
region's Indians. The outbreak of another war in 1689—popularly known as King
William's War, but known in the region as the Second Indian War—destabilized
the region still further. These wars destroyed lives, livelihoods, and homes,
and created a significant number of refugees, some of them ending up in Essex County,
where Salem is located. Mary Beth
Norton has documented that a significant number of accused witches as well as
their accusers had ties that can be traced to Maine in the 1670s and 80s. Just how decisive a factor Indian war
really was in triggering the witch trials is open to debate. But it is
certainly plausible to see frontier-related stresses as a factor in what went
wrong in Salem in 1692.
As far as the makers of The Crucible were concerned, this is all
inside baseball. In the original script for the play—and in the movie—Miller
has the first of the accusers, Abigail Williams, pressure her confederate,
Betty Parris, by saying “I saw Indians smash my dear parents’ heads on the
pillow next to mine, and I have seen some reddish work done at night, and I can
make you wish the sun had never gone down!” This fictive context is important in establishing a basis for the core
malignancy of Williams’ character. But it’s more in the spirit of background
information than a proximate explanation for her behavior.
The most important element in establishing a frontier
dimension for the film version is the portrayal of Daniel Day-Lewis’s John
Proctor. To put it most simply, the film version of The Crucible underlines the degree to
which Proctor was an outside man. This was true in fact: the real Proctor, who
was about 60 in 1692, lived on the outskirts of Salem proper, where he operated
a tavern. Proctor appears to have been a local iconoclast: he was among the
first to ridicule the witchcraft proceedings; allegedly beat his servant, Mary
Warren, who confessed to witchcraft and accused others; and stood up for
Elizabeth, who was his third wife. This may be why he was the first male to be
accused of witchcraft, and why he was hanged for it.
The film version of The Crucible, exploiting the
possibilities of the medium, makes Proctor an outside man in a much more
literal sense as well. Our first view of him, about ten minutes into the film,
shows him threshing wheat in a field with his sons. The imagery seems to come
straight from a Winslow Homer painting: big open spaces, water in the distance,
brilliant blue sky. The camera pans from the inlet to the interior to reveal
his wife Elizabeth (a superb Joan Allen) summoning him. Over the course of the
story, walls will literally and figuratively close in on him.
In art and life, the Salem Witch
trials were a disaster wrought by Puritans. The deaths of nineteen people and
the concomitant misery that resulted were a byproduct of the social conformity
implicit in the communitarian character of the Puritans, the most
institutionally minded people in British North America. But one of the many
paradoxes of Puritanism is that this communitarian impulse was accompanied by
another, individualistic one, that was at least as powerful. The Puritans had
always placed great value on the primacy of the individual conscience; the
belief that one’s own relationship to God mattered more than what Pope or King
might say is precisely what brought them to America. And it’s that independence
of mind that led the John Proctors of New England to stand up to, and finally
defeat, tyranny from within.
This libertarian strand of
cultural DNA that had drifted across the ocean found a hospitable climate on
these shores. As Frederick Jackson Turner would later write in “Significance,”
“the frontier is productive of individualism.” Turner would often contrast
“antipathy to control” in the frontier mentality with that of the Eastern
establishment. As he well knew,
however, the Eastern establishment was itself
a frontier product, and never entirely transcended it. In an obvious and
irrefutable sense, John Proctor is a tragic figure. But as embodied by Daniel
Day-Lewis in this movie, he is a fierce and willful force whose intensity
cannot be contained by his death. His children, literal and figurative, will
conquer a continent—a topic that would be the focus of the next film in the
Day-Lewis sequence of U.S. history.
* * *
In the almost two centuries since
its publication in 1826, James Fenimore Cooper’s Last of the Mohicans has been like the sheet music for a pop song:
a loose set of characters and plot points in a standard that has been
rearranged and embellished countless times. Like a lot of pop classics,
it drew its source material from the public domain, namely collective memory of
the French and Indian War, which ended a quarter-century before its author was
born. Cooper, who was raised in
upstate New York—his father was a large, and controversial, landowner in the
baseball Mecca we know as Cooperstown—wrote
about a time when the region was a frontier, and in so doing wrote what many
scholars of the Western consider an early example of the genre.
From a modern standpoint,
Cooper’s fiction is almost unreadable in its stilted language and slack pacing.
What has
lasted in Mohicans—what indeed has
proven to be amazingly supple—is a set of characters and a loose plot. In the
last hundred years, the principal medium through which this story has been
re-told has been film—hardly surprising, given the proto-cinematic quality of
the story. The first movie version of the novel, short and silent, came out in
1911. A 1920 version, also silent and selected for the National Film Registry,
an impressively executed piece of work with lots of exterior shots, generally
follows the outline of the novel. A 1932 twelve-part serial version of the
story was cheap, unintentionally comical, but surely thrilling to people like
my father, who would have gone to see it as a kid as part of a full slate of
Saturday matinee movie-going. The
best-known version of the movie prior to 1992 was the 1936 version starring
Randolph Scott, who went on to be a fixture of Westerns through the fifties.
So by the time director Michael Mann and co-screenwriter
Christopher Crowe tackled Mohicans in
the early 1990s, they had a treasure trove of material to work with.
That said, the most important precedent for the filmmakers of the 1992 movie was a
long tradition of artistic license. The pivotal figure in this regard—the linchpin of
the movie, and of the point I'm trying to make here—is the character of Hawkeye
(here called Nathaniel), more specifically the
Nathaniel of Daniel Day-Lewis. This is much more than a matter of which
lines of the script he utters. To put it simply, the Day-Lewis incarnation of
Cooper’s frontiersman is a singularly magnificent figure. Though he lacks the
muscularity of the typical movie-star hero, he is an impressive physical
specimen: lanky but taut, strong but agile. But Nathaniel’s presence is much
more than physical. The Hawkeye of all too many Mohicans—nowhere more so than the original—is a hayseed who’s not
(quite) as dumb as he looks. Randolph Scott’s Hawkeye is one of the better
ones, because the geniality he gives the character doesn’t undercut his sense
of competence. But Day-Lewis blows his predecessors away with the sheer
intensity of his self-assurance. He is a perfect Turnerian specimen, as at ease
in a pick-up game of lacrosse as he is dining at the cabin of his friends.
The fact that this protagonist is
not the entirely restless loner of
Cooper’s saga, that in this version there’s a place in his life for a woman who
by the end of the film will stand by his side wherever he may go, is very much
a part of the film’s larger design. The movie eschews the traditional funeral
scenes of most Mohicans by having the
last Mohican, Chingachgook, spread the ashes of his son Uncas over the western
mountains amid a setting sun. As sorry as we feel for Chingachgook, this
version of the movie—as I will discuss, there are in fact two 1992 versions,
with subtly, but significantly, different endings—has a hopeful feel. That’s because
we feel so strongly that the tragedy of Uncas notwithstanding, Hawkeye really
is Chingachgook’s son (we moderns consider race and even parenthood a social
construction, after all), and that in his presumed merger with his lover Cora—whose
name takes on a new significance—the seed of a new national identity will be
planted. As a hybrid, it will be resilient. And have plenty of room to grow. In
this, the first film Day-Lewis made about American history, he embodies the
frontier in its brightest phase and at its greatest height.
* * *
One of the more notable—and,
given the circumstances of its unveiling in Chicago, ironic—limits of Frederick
Jackson Turner's vision involved his difficulty incorporating cities into his
account of U.S. history. As the esteemed environmental historian William Cronon
has observed, “Turner consistently chose to see the frontier as a rural place,
the very isolation of which created its special role in the history of American
democracy. Toward the end of his career, he looked with some misgiving on the
likelihood that there would be an ‘urban reinterpretation’ of American history
that might ‘minimize the frontier theme’—as if frontier history had little or
nothing to do with cities.”
And yet as Richard Hofstadter,
himself a critic of Turner, admitted, "the great merit of Turnerism, for
all its elliptical and exasperating vagueness, was to be open-ended. The
frontier idea, though dissected at one point and minimized at another, keeps
popping up in new forms, posing new questions.”
It is in this spirit that a frontier perspective can help us understand the
role of Daniel Day-Lewis in the next installment of his cinematic history, Gangs of New York.
New York, it should be said, is
not typically viewed as frontier territory any more than Salem, Massachusetts
is. For one thing, it’s an island, not a continent. For another, it was
effectively urban from the moment of its Dutch inception as New Amsterdam. And
yet one can plausibly view Manhattan as a frontier in two senses. First, like the
rest of North America, New York was a geographic space that was settled along
an irregular line of development over a long period of time, albeit from south
to north rather than from east to west. And second, the frontier can be understood as a process
of demographic transformation, as immigrants of one kind or another gradually
gave way to other ethnic and racial groups, often through gentrification.
If Mohicans began
as a novel rooted in historical events, Gangs
began as a history laced with fiction. The core source material was The Gangs of New York, a 1928 book by
journalist and crime writer Herbert Asbury. The character Day-Lewis plays in
the movie, Bill Cutting, a.k.a. Bill the Butcher, is modeled on the real-life
figure Bill Poole.
It’s appropriately ironic that
the Butcher’s gang goes by the name of the Native Americans. The historically
accurate term denotes what was at the time a growing number of U.S. citizens
who were increasingly hostile to the rising tide of immigrants, especially
Irish immigrants. This tide would crest with the "Know-Nothing" Party of the 1850s, a temporary but powerful
force in 19th century U.S. politics. Of course in our day the phrase
“Native American” is a synonym for Indian. Though a passionate racist who
considers only white, Anglo-Saxon Protestants real Americans, the Butcher’s
situation in Gangs of New York
resembles no one’s more aptly than that of a Delaware sachem confronted with
growing numbers of outside interlopers and deciding to take a stand against
them.
In an opening scene set in the
winter of 1846, the Butcher-led natives prevail in a gang fight with the Celtic
horde led by Priest Vallon (Liam Neeson), victorious despite their enemy’s
greater numbers. Yet the Butcher has only bought time. He can manage, even
absorb, the steady stream of new arrivals for an interval. Indeed, it’s one of
the paradoxes of the Butcher’s character that he can employ his former enemies,
and even tease them affectionately about their ethnic foibles. But like a
hydra-headed monster, Vallon's legacy returns in the form of his son—bearer of
the ironically Teutonic name "Amsterdam"—who will ultimately challenge the Butcher for
supremacy. In the meantime, however, the unwitting chief takes a shine to the
kid and nurtures him in the ways of tribal power. As such, he’s like a
triumphant Indian warrior who incorporates the kin of vanquished foes into his
own clan.
When, about two-thirds of the way
through the movie, the Butcher learns the true identity of his protégé, he
turns on him with ferocity. Bill goes to visit the newly elected (Irish)
sheriff of the Five Points, who has allied himself with Amsterdam, and deals
with him in a manner redolent of a Wild West standoff. Watch for what might
plausibly be termed a tomahawk throw.
SHOW CLIP
Gangs of New York represents a transposition of roles for Daniel Day-Lewis: in Last of the Mohicans, he was Hawkeye;
this time he’s effectively Chingachgook. Like generations of dime novel readers
and fans of westerns, we admire him in his savagery, which has a kind of nobility
even as it is unacceptable as a basis for contemporary society. As with Indians
of the frontier, Bill the Butcher must die so that we, a non-WASP multiracial
majority, might live. It's
Leonardo DiCaprio's Vallon who represents the synthesis of cultures that will
survive as a hardy hybrid and make a modern America.
And yet we remain haunted by the
specter of the natives.
* * *
About halfway through this talk,
I mentioned that there were two different versions of the 1992 Last of the Mohicans. The first—the one
shown in theaters and in the VHS release of the movie on home video—concludes
the way most versions of the story do, with Chingachgook, sprinkling
the ashes of Uncas, declaring that he is the last of the Mohicans. It’s at that
point that the music swells, the camera pulls back, and the credits roll.
Here’s the second version.
SHOW CLIP
Frederick Jackson Turner's "The Significance of the Frontier in American History" was a lament wrapped in hope. Turner dealt with the current of existential dread that runs through his realization that the frontier had closed by writing sunny prose and by arming himself with a Progressive faith that new frontiers would come along in the twentieth century to replace the old one. "In place of old frontiers of wilderness, there are new frontiers of unwon science, fruitful for the needs of the new race; there are frontiers of better social domains yet unexplored," he wrote ebulliently in 1914, two decades after "Significance." I can't help but be moved by the old man's lyricism: "Let us hold to our attitude of faith and courage, and creative zeal. Let us dream as our fathers dreamt and make our dreams come true."
And so we did, from the moon to
that crabgrass frontier we know as suburbia, where these words are being
written. But here in the twenty-first century, the most obvious truth about the
frontier is that mythology itself is a finite resource. It gets consumed and
recycled no less than land. If
there is a saving grace—or, at any rate, a rough justice—in the racist
brutality that threads through the myth of the frontier, it is that the people who
made it are themselves compost.
But for now, we are here.