Monday, June 27, 2011

The Singles Life

Academic publishing in the age of downloads

The following essay was published in the current weekly edition of the History News Network.


As anyone who has written a book knows, about half the battle involves conceptualizing a project broadly enough to have wide appeal, yet narrowly enough to be executed in a manageable amount of time and space. An American Studies scholar in the classic mold—which is to say one trained at the intersection of history and literature—I have typically done this by formulating an argument in terms of a general proposition and crafting a series of case studies that illustrate the point. My current project, for example, rests on the proposition that certain Hollywood actors have a historiographic vision embedded in their careers on the basis of the roles they’ve chosen. Clint Eastwood’s work has been essentially Jeffersonian (albeit ambivalently so). Daniel Day-Lewis dramatized the persistent significance of the frontier in American history, even when he played an urban gangster. Denzel Washington’s work circled around the trope of paternity, in what I consider a kind of black republican fatherhood akin to white republican motherhood. And so on. The project has a set of overarching themes, among them a generational one: all my subjects came of age in the last third of the twentieth century amid a growing skepticism about institutions.
I consider it important that the book achieve a sense of cohesion in order to make the project compelling to a scholarly press. Without such stitching, my project would be a collection of essays. And essays are now the kiss of death almost everywhere in the publishing business, even university presses that were once their natural home. The prevailing wisdom is that nobody reads them.
In large measure, that’s because nobody buys them. Even cheap scholarly paperbacks cost $25, more than a new hardcover trade book. University libraries and scholars themselves will spring for academic press titles (often at a publisher’s exhibit during a conference), though sales commonly run in the hundreds, not thousands. This sense of small scale, in turn, hobbles authors and publishers further, since they lack the muscle for promotion or even standard sales tactics like online discounting. Students in particular are highly sensitive to price, buying used books when they buy any at all. Textbooks may be impossible to live without. But skipping homework in the form of scholarship: that saves money and time. It’s almost virtuous.
Students, of course, have come of age in a different media environment than their teachers have. For them, even compact disks are ancient history, and a stereo system should be no larger than a laptop. They get most of their information from websites, and it’s rare these days to get student essays whose citations consist of anything but URLs. They will pay for music, to the iTune of $1.29. As a result, the music industry has been shaken to its foundations.
The publishing industry is next. (In some precincts, it’s already wobbling.) Amazon’s Kindle has become for books what the iPod is to music, with other competitors rushing in as well. Electronic books have been on the scene for a few years now. They have become a major force in trade publishing—the New York Times bestseller list now regularly documents the large and growing proportion of e-books in total sales; at Amazon.com, e-books have overtaken print—but the phenomenon has yet to hit academic publishing in a large-scale way. Open up a publication like The New York Review of Books, whose bread-and-butter is advertising by scholarly publishers, and all the titles listed have cloth and paper prices, not e-book ones.
There are still multiple fronts of resistance. To some extent, it comes from publishers, who have been slow and tentative in issuing e-books. I’ve been surprised by the extent to which students themselves have dragged their heels. (Kids today can be so old-fashioned!) Scholars themselves are the biggest obstacle. Almost by definition, they love print—reading it, collecting it, even smelling it. Even those who have moved on to e-readers are often enmeshed in administrative machinery that’s easier to passively accept than revise. So every May and November, faculties commit to textbook adoptions months in advance of the coming semester, following the strictures of the campus brick-and-mortar bookstore.
This will change. Publishers say, honestly, that consumers overestimate the savings that come from not having to print and warehouse books, and assert they will be unable to remain solvent if they lower prices much below what print books cost. But that assumes solvency in terms of the way they’re structured now—which isn’t working anyway, given used-book sales and other pressures. We’re going to have to re-imagine the economics of the scholarly enterprise, an imperative that’s likely to have cultural consequences. Not all of them will be bad.
At some point in the process of working on my new book, I found myself wondering: does this make sense as a book? Might it not make more sense as a set of five or six chapter-length e-books? Very often, instructors only use part of a book; in any case, they rarely assign the whole thing at once. If something like this caught on, I might tell my students in class that we’re going to discuss X tomorrow, and some of them could download it before they left the room. I made a pitch along these lines to a major university press, which reacted with respectful interest. It’s in their future, I was told. But not yet.
Around the same time, I received an offer from Amazon.com to publish one of my chapters as a Kindle Single. This series, which was launched earlier this year, consists of writing that’s longer than a typical article but shorter than a typical book. Singles are priced very cheaply—no more than $2.99 per title. On the advice of my editor, I priced mine at 99 cents. He explained that shoppers really do care about price, and that a higher sales ranking can become a positive feedback system. What sweetens the deal is that I get a 70% royalty rate (minus a small fee for wireless delivery on each copy). That’s about seven times better than I could ever hope for from a scholarly press.
Amazon also has another new program, Kindle Direct Publishing (KDP), which allows writers to publish work themselves under similarly attractive arrangements (though you only get a 35% royalty if your book costs less than $3 or more than $9.99). This is a potentially excellent way for scholars to stay in circulation. I’m considering publishing some of my older work, like my out-of-print dissertation, in this format. I also seriously considered it for the remaining chapters of my work in progress (at the current time, Amazon will only publish one work by an author in the Singles format).
In the end, though, I’m still hoping to publish my work as a book in both print and e-book formats. This reflects my own cultural conservatism, along with a belief that I really am in the middle of producing something in which the whole is greater than the sum of its parts. It also reflects my hope for legitimacy in the form of having my manuscript vetted by experts, a practice that has always been the hallmark of professional scholarship. Last but not least, there are credibility and resources—copyediting, design, promotion—that a good publisher affords. Publishers of all kinds are being buffeted these days by demands that can be unrealistic. But they’re too important to disappear. We just have to find new ways for them—and us—to work.

Friday, June 24, 2011

Focus on Iris

Taxi Driver as the locus of Jodie Foster's lifelong vision

The following post is part of an ongoing series on Jodie Foster specifically and Hollywood actors as historians generally.

By the second half of the seventies, it was increasingly clear to those in the business that Jodie Foster was more than just a novelty act. Her emerging persona – bright, confident, impatient with the strictures of traditional authority – was perfect for the post-sixties zeitgeist. “It was just at the beginning of women’s liberation, and she kind of personified that in a child,” Brandy Foster recalled in 1988. “She had a strength and uncoquettishness. Maybe it comes from being raised without a father to say ‘Turn around and show Daddy how pretty you look.’” That last line is arguably self-serving – plenty of fathers are happy enough with tomboys, or at any rate wish for more than daughters who look good in a dress – but the elder Foster was surely right that her daughter’s unselfconscious demeanor would have been widely filtered through an ideological lens in ways that worked in her favor.
One of the first people to recognize this was Martin Scorsese, himself at the beginning of one of the legendary careers in cinematic history. Scorsese cast Foster in a small part as a tough-minded tomboy who befriends Ellen Burstyn’s son (played by Alfred Lutter III) in Alice Doesn’t Live Here Anymore (1974). Foster has only two scenes, one of which involves faking an injury as part of goading Lutter’s character into an act of shoplifting, but she’s a compelling presence, just as she was in real life. “Jodie just walked into our office on the Burbank lot, and she had total command,” Scorsese later remembered. “A total professional, especially at the age of twelve, is totally reassuring.” Kris Kristofferson, who starred in the film, was also impressed. “She came right up and shook my hand, all business,” he said in the commentary that accompanied the DVD release of the film. “She wasn’t like a little girl at all.” This poise allowed her to win an important part in the early Alan Parker movie Bugsy Malone (1976), a mock gangster musical with an all-child cast.
But the turning point in Foster’s career as an actor – and a landmark in providing a durable touchstone for her preoccupations as an artist – was her second Scorsese project, Taxi Driver (1976). A signature document in the emergence of independent cinema in the 1970s, the film tells the story of Travis Bickle (Robert De Niro), a deeply troubled Vietnam vet turned cabbie who takes a shine to, and is spurned by, a beautiful young woman (Cybill Shepherd) who works at the Times Square office of a slick but inane presidential candidate. In his extreme emotional isolation, Bickle becomes increasingly paranoid and determined to do something significant, which we gradually sense may be an assassination attempt on the candidate. But Bickle accidentally encounters a child prostitute named Iris (Foster) and begins an effort to rescue her from her pimp, Sport (Harvey Keitel). The movie has a conclusion of legendary violence – and irony.
This is not a film many mothers would let their children see, much less perform in. But there’s little indication that Brandy Foster had reservations about the twelve-year-old Jodie acting in Taxi Driver. Foster herself was eager to take the part; clearly the recipient of a cosmopolitan upbringing, she was already a foreign film aficionado and understood this was more than just another acting job. Child welfare authorities in California, however, were not so sure this was a good idea. Only after an evaluation by a UCLA psychiatrist (who reported that Foster had a very high IQ) and the intercession of former California governor Edmund “Pat” Brown (father of the current governor) was she permitted to proceed with the role. As anyone familiar with the story knows – and pretty much everybody over the age of about 40 does – Foster did indeed end up paying a heavy psychic price for playing Iris, though not for a reason easily foreseeable at the time.
Still, Foster has consistently described her experience working on the film as one of the highlights of her career. “When I did Taxi Driver, it was like the first time I ever did a role that was a little out of character,” she told the Los Angeles Times in 1981. “I felt like I had accomplished something.” She has affirmed the importance of working on the film many times in the decades since, perhaps most succinctly in a 2007 interview with Entertainment Weekly: “Taxi Driver was the best thing that ever happened to me.”
A big part of the reason why was the tutoring she received at the hands of De Niro, whose advice – and example – transformed acting from a lark into a vocation. Their major scene together, which takes place in a diner, is truly extraordinary and worthy of the Academy Award nomination Foster garnered, whatever her age. She glides seamlessly from worldly adolescent (“Didn’t you ever hear of women’s lib?” she asks, pouring sugar on top of toast with jelly, in sarcastic reply to the suggestion that she belongs at home) to naïve teenager (she rejects Bickle’s suggestion that Sport is a killer with the assertion that he’s not because he’s a Libra, “which is why we get along so well”). But every once in a while, Bickle says something to her in the form of concern for her welfare, and her face, partially hidden by absurd green sunglasses, is momentarily but unmistakably stricken. There are also a couple of times in their scenes together when Bickle is about to give up and she tugs on her sleeve or reaches out in a way that reveals the child beneath the tough exterior, which makes her subsequent scene with Sport, who smoothly reassures her with effortless lies, all the more appalling.
Indeed, if one makes the slightly unorthodox move of viewing Taxi Driver from the point of view of Iris (whose very name suggests both vision and flowering, or culmination), we see that amid its nihilistic vision of politics and seemingly antiheroic protagonist, it is a movie with a moral vision. In part, that’s because there’s a real villain in it: Sport. Travis Bickle is clearly a very troubled man, and his destructive impulses, which are as likely to turn inward as outward, pull him in some very dangerous directions. But however awkwardly expressed, or mingled with other imperatives, there’s something altruistic about his desire to rescue Iris. There’s nothing redemptive about Sport, by contrast: he lies to and exploits Iris all the more mercilessly because his mild veneer and emotional manipulation make overt violence unnecessary (and because he’s obsessively guarded in dealing with his customers). Screenwriter Paul Schrader, who comes out of the Dutch Reformed Church, has a self-conscious theological vision that he often brings to his work, most obvious in the complicated relationship between Jesus and Judas in his 1988 screenplay The Last Temptation of Christ (also directed by Scorsese). For our purposes, the important thing to note is that while Taxi Driver in some ways seems to reflect a vaguely leftist, relativistic, countercultural critique of American life common in the films of the 1970s, it is animated by a powerful vision of evil – atavistic, unexplained, palpable evil – that suffuses the city like the vapor rising up into the street in its unforgettable establishing shot at the start of the movie. It corrupts Iris; it shadows Travis. But it saturates Sport, and explodes in the climax of the movie.
Though it lacks any formal theological or philosophical framework, this notion of implacable, unexplained malice would ultimately become a fixture of Jodie Foster’s body of work, a vector that presses down on most of her films and gives many of them the melancholic weight that has always made her a bit unusual, even as she went on to become an artist operating in the heart of the Hollywood mainstream.
 Next: Foster's adolescent career


Wednesday, June 22, 2011

Enlightening 'Switch'


Media maven Tim Wu illuminates the course of history -- and its future trajectory -- in The Master Switch: The Rise and Fall of Information Empires

The following review was posted recently on the Books page of the History News Network site. 

If The Master Switch were a piece of software, it would be a killer app: not exactly original, and relatively narrow in function, but terrifically practical and elegant. A bit like an iPad, in fact, though one closer to the spirit of Steve Wozniak than that of Steve Jobs. (Remember Woz? The author would not be surprised if you don’t.) And therein lies the heart of its haunting argument about the way the future of communications will likely be found in the past.

The story Tim Wu tells in this book is a cyclical one. It begins when someone – an Alexander Graham Bell in telephony, a Thomas Edison in film, a Philo Farnsworth in television – emerges at the vanguard of a disruptive new technology, in the Joseph Schumpeter sense of creative destruction. Taken alone, that technology itself is insufficient; it needs an imperial-minded entrepreneur, like Theodore Vail of AT&T or David Sarnoff of RCA, to build a viable legal, political and economic infrastructure for it, and who then goes on to dominate it, whether by vertical integration or the creation of a government-sanctioned cartel. These dominant players then fend off subsequent challenges, including those posed by genuinely better mousetraps in the form of new technologies like FM radio, cable television, or the Internet. But eventually the Old Guard gets felled, sometimes by opponents it failed to see coming, and sometimes by quirky historical circumstances: who would have figured that Richard Nixon would be the patron saint of cable TV? The new empire may include members of the old elite, as in NBC’s transition from radio to television, or in the reconstituted AT&T, whose colossal but shadowy power was glimpsed in the War on Terror, when the company complied with a Bush administration order to tap domestic phone lines. But there are always new players and new rules to be broken.

In its broadest outlines, Wu's story is clear enough – in meta-narrative terms, it’s as old as China – but the drama is in the details. It’s fascinating to learn, for example, that the dawn of radio broadcasting in the United States involved telephones. Farmers ignored by AT&T strung together galvanized and barbed wire to create party lines whereby they could all pick up their phones at the same time and get reports on news and the weather. Or that the venerable Bell Labs had pretty much perfected the technology for the answering machine decades before it appeared on the market, but smothered it in a mistaken belief that the ability to leave messages would destroy the telephone business.

But Wu doesn’t peer down these roads not taken simply because they’re interesting in their own right, or to make the good point that it’s our imaginations and memories that limit us as much as our technical capabilities. He also wants us to understand the decisive power that private enterprise has had on the shape of our communications, a power that goes far beyond the marketplace and plays a major role in what we hear – and what we don’t. As he notes, the government is constitutionally prohibited from limiting free speech. But the Catholic Church isn’t. And in its ability to convince a few (Jewish) movie moguls that it was easier to sidestep than resist it, the Church pretty much determined the artistic limits of the film business for decades. (Wu seems to overlook the tremendous subversive power of genres like film noir in this regime, but the point would be hard to entirely refute.)

Wu is no conspiracy theorist. He understands, and duly honors, the stability and achievements made possible by a sense of scale and freedom from competition – yes, once upon a time, the word "competition" did not have the talismanic status it enjoys in government and business. But he notes that our mania for deregulation in recent decades, one that began by unleashing pent up innovation, has now been perverted to the point that we regard anti-monopolistic measures as themselves onerous. He thus calls for a kind of federalism in the private sector that maintains barriers between hardware, content, and the lines, virtual and otherwise, which connect them. The maintenance of these barriers, provided by the government, should be both vigorous and limited.

As Wu sees it, the contemporary state of play on the electronic frontier is divided between two sides embodied by Google and Apple. The former, operating in a spirit comparable to that of early broadcasting, is open, diverse and resistant to commercial exploitation. Apple, by contrast -- increasingly allied with the entertainment conglomerates -- promotes the walled-garden approach that emphasizes quality and control (not necessarily in that order). Though many observers have criticized Google’s vast reach and overweening tendencies, Wu emphasizes its vulnerability. Google owns virtually nothing in the communications world it mediates, and he suspects it’s only a matter of time before wire operators (like the Chinese government) and content providers (Fox) put on the squeeze by opting out of its search algorithms. If so, the age we’re living in may well be far more golden than we imagine.

A final word must be said about this persuasive book’s graceful execution. Given the scope of its argument, The Master Switch is surprisingly short at just over 300 pages. It’s also written with an exceptionally deft hand. (My favorite line is about the Federal Communications Commission’s reaction to the prospect of cable television in the 1930s, which Wu compares to “a farmer dismayed by a tractor’s lack of horses.”) I ran into the book largely by accident, though my copy comes from the fifth hardcover printing, suggesting it has found an audience. It deserves one, as its argument will likely only grow more relevant.

Sunday, June 19, 2011

The Boss's Brother


By way of remembering the late, great heart of the E St. Band, I'm re-running this post from 2009. Thank you, Big Man. --J.C.

Clarence Clemons tells a bit less than all, with the help of television writer Don Reo, in Big Man: Real Life and Tall Tales


The following review was published yesterday on the books page at the History News Network.

At one point in Clarence Clemons's often amusing new memoir Big Man -- if a book laced with fictional anecdotes that shares space with a co-writer can be called a memoir -- the legendary E Street Band saxophonist imagines himself at Fenway Park in Boston at the 2004 World Series with the great pop singer Annie Lennox. "I loved you guys on the Born to Run cover," Lennox tells Clemons.

"I'm on the back," he says.

"What?"

"I'm not on the front cover. I'm on the back. I'm talking about the album as it came out. You've got to turn it over to see me. That's how they printed it."

"Really?"

"Really."

Really. While we do get a piece of him on the front, we don't see a recognizable Clemons except on the back cover. So he's speaking an uncomfortable fictive truth. With the exceptions of the security and concession staff, and the seemingly inevitable black back-up singers, Clemons was the only non-white person I saw at my last Springsteen show at Giants Stadium last month. (Clemons's co-author, Don Reo, relates running into a self-described black female fan in 2008 who says her favorite Springsteen song is "Jessie's Girl" -- a 1981 gem by Rick Springfield.) In his ever-self-aware way, Springsteen refers to the racial question that lingers over the Born to Run album cover in his one-page foreword to this book, in which he cites "a friendship and a narrative steeped in the complicated history of America." Huck, meet Jim.

I feel a bit uneasy bringing all this up at the start of a review of Clemons's book, not because the racial tension implicit in Springsteen's career isn't real, but because, to use his word, it's complicated. Clemons is now the only featured black member of the E Street Band. But it wasn't always so; indeed, in the early days the band was truly interracial, and one of its members, David Sancious, went on to have a distinguished jazz career. Moreover, a number of important African American performers, among them Donna Summer and Aretha Franklin, have recorded Springsteen's songs.

More to the point, Clemons himself doesn't seem to regard his anomalous presence in Springsteen's career as especially problematic. A native of Norfolk, Virginia, he's spent his entire life living mostly among white people. His co-writer, Reo, a television writer and producer, is white, and they seem quite comfortable talking about matters of race in ways that range from comic to serious. But always in passing.

Actually, the Clemons who emerges in this book is a notably easygoing man whose big appetites -- in every sense of that word -- enlarge the spirit of those around him. He does not appear to be a particularly introspective figure -- one can base this assertion simply on the fact that he's been married five times -- and he exhibits a casual sexism one sometimes sees in his generation. But it's not hard to imagine how he might invite the confidence of a Robert De Niro, for example, who once told Clemons that his legendary "Are you talking to me?" monologue from Taxi Driver came from listening to Springsteen on stage (as such it's a testament to the transformative power of De Niro's artistry that the actor could turn joyous patter into an ominous threat). Indeed, Big Man is a kind of touring travelogue in which we meet a great many celebrities, including some unlikely ones such as Damon Wayans and Kinky Friedman. We get the idea that Clemons and Co. have lived a rarified life for a very long time. And there's some interest in that, even if the book goes on a little longer than it probably should.

That said, there is a Swiss-cheese quality to Big Man, which grazed the bottom reaches of the New York Times bestseller list recently. We get a fairly straight stretch of narrative about Clemons's early life, and a great deal about the Magic tour of 2007-08, for the understandable reason that the star-struck Reo was with the band for much of it. We also come to understand that Clemons has been in a great deal of pain with bad knees and hips that have made performing an ordeal in recent years, which the authors manage to convey in such a way that makes his unaffected persistence and good cheer all the more impressive, even as they are couched in intimations of mortality.

There's surprisingly little on Clemons's role in the making of Springsteen records. To some extent, that's because making studio albums is a much more painstaking and fragmented process than live performance; much of Clemons's work on Born to Run, for example, was dubbed in by producers and engineers. Still, it would have been nice to hear more about the process, and how Clemons interpreted Springsteen's musical instructions in signature performances like "Rosalita" or "The Promised Land." On those rare occasions we do get such glimpses, they're fascinating, as in Clemons's offhand explanation of how Springsteen wrote "Hungry Heart" in the space of ten minutes.

Glimpses of Springsteen's relationship with Clemons are also few and far between. There's a lovely little set piece of the two on the boardwalk in Asbury Park in the early years. There are also allusions to some hard feelings when Springsteen broke up the band in 1989 -- ironically, Clemons was in Japan with the victim of another breakup, Ringo Starr, when he got the phone call -- and the depiction of the aftermath of an argument between Clemons and Springsteen. To a great degree, this is surely because Clemons is temperamentally not inclined to dwell on such events. But I'd also bet my last dollar that it's because Springsteen was closely monitoring what Clemons would be permitted to say. (That foreword has the feel of a stamp of approval.)

Which, in turn, brings us back to the defining core of their relationship. Clemons may be the Big Man, but Springsteen is, well, the Man. Clemons tells a story from the early seventies about how he and drummer Max Weinberg got marooned on the Garden State Parkway when their car broke down and they were in a panic over how to inform what was certain to be an impatient Springsteen. "He was the Boss even back then," Clemons observes. He's probably more like the CEO now. There's very little evidence here that Springsteen treats Clemons as an intellectual, or is much a part of his everyday life offstage. But there's a lot of evidence that he has taken very good care of Clemons, financially and with real affection. After all, Huck really did love Jim.

Thursday, June 16, 2011

Star aborning


Jodie Foster, child prodigy

The following post is part of an ongoing series on Jodie Foster specifically and Hollywood actors as historians generally.

 
Movie stars are a bit like star quarterbacks in the National Football League: it’s very hard to tell ahead of time who they’re going to be. Lots of quarterbacks start early, and show real promise. But sports lore is full of first-round draft picks who disappoint, and virtual afterthoughts who get a chance and unexpectedly set the league on fire. Late blooming of the Kurt Warner or Tom Brady variety is proverbial. Yes, there are the prodigious Elways and Mannings, too. But they’re maddeningly elusive.
Similarly, when you look at the background of an Eastwood, a Washington, or a Hanks, you typically see signs of talent, or at any rate, distinctive elements of character that retroactively get written as premonitions. But they’re rarely at the forefront of their peers. Many show early interest in their craft, and real promise. Then again, so do a lot of people.  When I was in high school, I thought my classmate Edie Falco was truly lovely, and a good actor, too (I had a moment of glory with her in my high school musical production of My Fair Lady.) But she was only one member of the thespian crowd, and not the person I predicted would end up more famous than any of us.  I’m delighted. But I’m also surprised, notwithstanding her outstanding work in The Sopranos and beyond.
Actually, there’s real reason to think that early success is more a liability than an asset, whether because of burnout or simply because what’s cute in a child does not necessarily carry over into adolescence, much less adulthood. Again, there are exceptions – Mickey Rooney and Drew Barrymore come to mind. But most child actors end up like Buddy Foster of Mayberry R.F.D. (1968-71): they get attention in commercials, move on to bona fide celebrity in television and/or films, and are largely finished with show business with the onset of puberty.
Foster’s kid sister, however, turned out to be a different story. A child of Hollywood in a literal as well as figurative sense, she bucked the odds to have a half-century-long career. Indeed, it’s a little shocking to realize that her collaborators stretch from the grande dame of the theater, Helen Hayes, born in 1900, through screen legends like Michael and Robert De Niro, all the way to Kristen Stewart, born in 1990. Not yet fifty, Jodie Foster is already a grande dame herself.
Yet her pedigreed background, real as it is, was marked, as many such backgrounds are, by early struggle and subsequent setbacks. She was born Alicia Christian Foster on November 19, 1962, in Los Angeles, the youngest of four children of Evelyn “Brandy” Foster (née Almond) and Lucius Fisher Foster III, a former Air Force officer turned real estate broker. The Foster ancestry can be traced back to the famed Pilgrim John Alden, reputedly the first man to step off the Mayflower in 1620. Lucius, like his daughter, is a graduate of Yale.
But the most salient fact about Foster’s father is his absence from her life. Her parents’ contentious marriage was over before she was born, and there is little indication that they ever had a functional relationship. Instead, her surrogate second parent appears to have been her mother’s companion, Josephine Dominguez Hill (1930-1984), who was known as “Aunt Jo,” and whose bowdlerized moniker, “Jo-D,” became Foster’s nickname.
But the driving force in her life, without question, was Brandy. Driven, controlling, and forced by her divorce to improvise financially, she networked her son into commercials and television, where he became the family breadwinner, earning the impressive sum of $25,000 a year. Brandy was his manager, professionally supervising his career and actively seeking to expand it.
But it rapidly became apparent that the future lay with Jodie. Walking at six months, talking at nine, she had apparently taught herself to read by the age of three. After Jodie finished a local elementary school, Brandy enrolled her in the prestigious Lycée Français in Los Angeles, where Foster rapidly became fluent in French. (She served as Robert De Niro’s translator at the Paris press conference for Taxi Driver in 1976, has made French films, and dubs her own movies to this day.) Although actors are not typically gifted or committed students, Foster also demonstrated an uncanny ability to manage her time despite her manifold professional commitments, all the way through a Yale career that culminated in an honors thesis on the fiction of Toni Morrison. At one point in her life she seriously considered pursuing a doctorate in literature.
The important thing, though, was her prodigious skill as an actor – among other things, an ability to memorize lines rapidly – which emerged at a very early age. In what remains a kind of pop culture folklore, Foster became the so-called “Coppertone Girl” in advertisements for the suntan lotion, though she is mistakenly considered the inspiration for the iconic illustration of the child whose bikini bottom is tugged by a dog in print ads; in fact, she was the toddler who appeared in the television commercial of 1965. (Asked her name by a casting agent, she reputedly responded “Alexander the Great.”) Over the course of the next few years, she became a staple of the advertising business, appearing in 50 commercials by the time she was ten. I was stunned to come across her in a 1971 GAF Viewmaster commercial with Henry Fonda (!), in which Fonda, in grandfatherly mode, chats with a few precious kids about the virtues of this newfangled toy. Each child has something precocious to say – “extremely interesting,” says one; “the three-dimensional color pictures are extraordinary,” says another – culminating in Foster’s line, delivered with preternatural off-handedness: “I always considered the GAF Viewmaster an ingenious device of great educational value.” (“Gee, I always thought it was just a lot of fun,” Fonda concludes.) Even here, her acting chops were apparent, head and shoulders above her peers.
The commercial work led to a string of appearances in now-forgotten TV shows – The Doris Day Show in 1969, Kung Fu in 1973 – along with others of somewhat more significance, like Julia, starring Diahann Carroll, notable in the history of television as the first show with an African American lead actor (Foster appears in a 1969 episode and arguably does a better job than the actor cast as her father). It also led to what might be termed an apprenticeship with Disney studios, in which Foster was cast in a series of supporting roles: as the junior partner of Johnny Whitaker in Napoleon and Samantha (1972, featuring a very young Michael Douglas); as Becky to Whitaker’s Tom Sawyer in a halfway decent production of Tom Sawyer (1973); and as the daughter of Vera Miles in the 1973 western One Little Indian, starring James Garner (who would team up with Foster 21 years later in a movie version of the old TV show Maverick). Foster also made an appearance as the daughter of Raquel Welch in the 1972 roller derby movie Kansas City Bomber. Her primary job in all these roles was to be cute, and her principal talent was largely a matter of the concentration and consistency necessary to literally and figuratively hit her mark in a professional setting. She also proved to be a trouper amid adversity, surviving a mauling at the paws of a lion during the filming of Napoleon and Samantha.
But by the second half of the seventies, it was increasingly clear to those in the business that Foster had a distinctive character as an actor, and that her emerging persona – bright, confident, impatient with the strictures of traditional authority – was perfect for the post-sixties zeitgeist. “It was just at the beginning of women’s liberation, and she kind of personified that in a child,” Brandy Foster recalled in 1988. “She had a strength and uncoquettishness. Maybe it comes from being raised without a father to say ‘Turn around and show Daddy how pretty you look.’” That last line is arguably self-serving – plenty of fathers are happy enough with tomboys, or at any rate wish for more than daughters who look good in a dress – but the elder Foster was surely right that her daughter’s unselfconscious demeanor would have been filtered widely through an ideological lens in ways that worked in her favor.
One of the first people to recognize this was Martin Scorsese, himself at the beginning of one of the legendary careers in cinematic history. Scorsese cast Foster in a small part as a tough-minded tomboy who befriends Ellen Burstyn’s son (played by Alfred Lutter III) in Alice Doesn’t Live Here Anymore (1974). Foster has only two scenes, one of which involves faking an injury as part of goading Lutter’s character into an act of shoplifting, but she’s a compelling presence, just as she was in real life. “Jodie just walked into our office on the Burbank lot, and she had total command,” Scorsese later remembered. “A total professional, especially at the age of twelve, is totally reassuring.” Kris Kristofferson, who starred in the film, was also impressed. “She came right up and shook my hand, all business,” he said in the commentary that accompanied the DVD release of the film. “She wasn’t like a little girl at all.” This poise allowed her to win an important part in the early Alan Parker movie Bugsy Malone (1976), a mock gangster musical with an all-child cast.

Next: The breakthrough -- Taxi Driver


 

Sunday, June 12, 2011

Foster fears

In this first in a series of posts on the career of Jodie Foster, the author explains why he almost didn’t start on them.

I've always been a little afraid of Jodie Foster.
Like a lot of fears, this one is grounded in attraction. Certainly there are plenty of reasons to be drawn to Foster, among them talent, intelligence, and the skillful deployment of both in a skillfully managed public persona in which genial good manners do not quite cloak a certain steeliness that she’s too smart to think she can entirely hide. I also feel a generational affinity for Foster; born three weeks before I was, she has always been a presence in my life, a yardstick more compelling and immediate than, say, Clint Eastwood, who has also always been around but as a more remote figure. Like a lot of artists who are in it for the long haul, her preoccupations have corresponded to where she has happened to be in the life cycle, and so I can say, not entirely ironically, that we grew up together.
For a long time, the principal source of my Jodie Foster anxiety has been gender difference. Naturally, many a geeky heterosexual white guy could not help being attracted to her, and for that very reason be content to regard her from the anonymous safety of a movie theater. Such is the intimidating allure of a movie star. But as everyone of her generation knows, Foster had the uniquely unhappy experience of being stalked by a presidential assassin, and ever since, I’ve regarded my interest in her with inner suspicion, a nagging fear that it amounts to a manifestation of the dark impulses that lurk in my psyche, whether I’m conscious of them or not. I attribute this fear in part to a Catholic upbringing, and in part to coming of age in a feminist era in which the people who educated me went to great pains to emphasize the dangerously predatory character of masculinity inherent in the male gaze. Too much interest in Foster seemed akin to virtually stalking her, and at one point in planning this book I came close to concluding that even writing about her would be a form of disrespect, and to deciding not to do so.
It didn’t take too long to conclude this was a bit extreme. Perhaps later than I should have, I recognized that I had conflated a fear of Foster with a fear of myself, and while those fears have been understandable, even legitimate, I’ve come to regard them as unduly paralyzing: having dark impulses is not the same thing as acting on them, and having them does not negate all the others. Moreover, insofar as Foster would ever care, I’d have to guess that she’d prefer a good faith, if imperfect, effort to engage her work rather than studied silence about it. After all, she decided to embark on a public career as an actor, long since accepting that the benefits outweigh the costs. It seems rational to conclude that one of those benefits would surely involve attention designed to affirm her significance as an artist in her time, and a claim that this significance will endure after her time, however one may choose to define the term.
And so it is that I have embarked on this chapter. However, in doing so I’ve come up against a different, even more disconcerting, discovery: Jodie Foster does not really love American history. I’m not saying that she’s some kind of radical or traitor or self-conscious incendiary. My guess is that there have been any number of moments in her life, like the aftermath of 9/11 or a not entirely implausible imagined scenario of her feeling an unaccountable surge of affection during the singing of the national anthem at an L.A. Dodgers game, where she could cite patriotic feeling. But I get very little sense in looking at Foster’s body of work that she has tried to engage the American experience as such, or to think of the nation-state as a crucial organizing principle of that experience. Clint Eastwood, Denzel Washington and Tom Hanks have all done this. Even a foreigner like Daniel Day-Lewis has. But Foster? Not so much. Actually, that phrase is important: Foster doesn’t particularly love American history, but she doesn’t really hate it, either. (Would that she did!) It’s just not that important. The subject comes up – she’s made a number of films with historical settings – but her primary preoccupations are not historical per se. To grapple with her history is, in one sense, to grapple with an absence.
It’s tough when someone you like and respect, someone you feel, at some level, you know, or with whom you at least came of age in the same world, apparently just doesn’t care about the things that you do. Foster is not Malcolm X or Emma Goldman, people with strong, clear ideas one can comprehend if not entirely accept, and which any right-thinking liberal would try to understand. I regard her as one of us: for all the obvious differences, she inhabits the same country I do, and was entertaining me long before I ever considered what that might mean. And yet it’s clear that even as she has represented American life in ways I recognize, and said things I comprehend and accept, she feels about it very differently. What does she know that I don’t?
I began this chapter afraid to find out.


[1] My wife has a better, literal, claim of coming of age with Foster, in that they went to Yale together and were distant acquaintances. I’ve always been proud on those occasions when she’s mentioned the smiles exchanged between the courteous star and the equally courteous civilian, which I regard as one more of many vindications of my good luck and judgment in the woman I married.

Thursday, June 9, 2011

Unleashed


In The Dogs of War, Emory Thomas shows how nobody knows anything

The following review was posted recently on the Books page of the History News Network site.

Emory Thomas is the éminence grise of Confederate history. A veteran military biographer, he is best known for his 1979 book The Confederate Nation, which remains the standard history of the subject (and has just been republished). In The Dogs of War: 1861, Thomas zeroes in on a specific moment of the Civil War -- the three-month period between Fort Sumter and the First Battle of Bull Run, April to July of that year -- to emphasize the confusion and ignorance that shaped the mutual perceptions of North and South, which stumbled into a conflict of a scale and an outcome virtually no one imagined.

But that's not really the principal value, or even intent, of this little book. Instead, Thomas takes a moment whose outlines will be familiar to anyone with a passing knowledge of the war and instead uses it as a case study for what might be termed empirical epistemology. To paraphrase William Goldman's famous maxim of the film business, nobody knew anything, even those who were presumed to know, then and since. That included politicians, the professional military, and rank and file volunteers -- who were volunteers to a great extent precisely because they didn't know what they were getting into.

This maxim extends to the respective presidents of the two belligerents. Though this point has been made before, Thomas usefully emphasizes that Abraham Lincoln greatly overestimated Southern Unionism, perhaps because, as a man who was born in the South and married a Southerner, he overestimated his familiarity with the region and assumed that ordinary Southerners outside the elite would think as he did. Lincoln carried this conviction, which shaped his approach to Reconstruction, to the grave. As Thomas notes, it would ultimately be vindicated, but it proved inadequate to the demands of the moment in 1861.

Interestingly, Thomas depicts the oft-maligned Jefferson Davis as having a far more realistic view of the challenges he faced, and a perhaps more rational view of the strategy to take in light of the long odds. That the Confederates lost was less a matter of fuzzy thinking, Thomas suggests, than an unrealized hope that the rebels could experience George Washingtonian luck in outlasting their opponents. In this regard, he's similar to the long-revered Robert E. Lee.

Thomas makes some skillful juxtapositions between the miscalculations of Americans at the outset of the Civil War and those of the Iraq War in 2003. He makes a chilling comparison between a memo from Brigadier General Janis Karpinski, who presided over Abu Ghraib prison, and one from Henry Wirz at Andersonville. The message is clear: almost by definition, going to war means getting blindsided. It should be avoided -- whatever your aims -- at almost all costs.

Because it's so tightly framed and reads something like a well-written lecture, The Dogs of War would fit nicely as a night or two of reading as a prelude to class discussion. It also leads one to wonder whether its utility and future really lie in the electronic realm, where one suspects it could be most efficiently delivered, read, and afforded. Oxford University Press has been issuing a lot of these short Civil War books lately, such as Louis Masur's fine recent 100-page synopsis, The Civil War. In publishing terms, among others, the past may really be prologue to a future that's practically in view.

Monday, June 6, 2011

Lowe's Highs


In Stories I Only Tell My Friends, Rob Lowe relates incidents from a dramatic life (offscreen)

The following review was posted last week on the Books page of the History News Network site. 

Rob Lowe is Forrest Gump with a brain.

Lowe's new memoir, the cleverly titled Stories I Only Tell My Friends, pretty much does what you expect a book of this kind to do: drop lots of names. You read it to hear about people like Tom Cruise, Emilio Estevez, Demi Moore, and various other members of the "Brat Pack" (so dubbed in a 1985 New York magazine profile) who briefly dominated Hollywood in the early 1980s. And you do indeed get stories about youthful excess, backstage romances, and the like. And since Lowe was a star of the hit NBC series The West Wing, you know at the outset that he's going to end up a Friend of Bill and visit President Clinton at the White House.  If you're a big Rob Lowe fan, this will be sufficient to buy the book.

What's really surprising, though, is how many other brushes with fame Lowe has had in his lifetime, in contexts that are totally unexpected. Like the amusing story where Cary Grant gives him soap on a rope. Or the one involving a failed audition by Janet Jackson. Or his adolescent visit to a San Fernando Valley chop shop, where he witnessed the shooting of Star Wars scenes involving the Millennium Falcon and Death Star (George Lucas & Co. needed lots of cheap space). Long before he had achieved any fame himself, Lowe found himself tugging on George McGovern's coat in the 1972 presidential campaign.

Other stories are more grave. Lowe had a well-publicized romance with Princess Stephanie of Monaco, but less well known is the assassination of the security man who escorted him to and from the palace. Lowe had a brief but intense acquaintanceship with John F. Kennedy, Jr. on the eve of his death. And he also had a chilling intersection with the terrorists involved in 9/11. Even for a celebrity who you would expect to meet lots of people, the volume and variety of his contacts are uncanny.

Such anecdotes aside, the narrative core of Stories I Only Tell My Friends amounts to Memoirs of a Former Next Big Thing. Perhaps ironically, the most vivid part of the book is Lowe's childhood; he does a nice job of evoking the confusion and hurt of a child in a broken family in Dayton, Ohio, and his mother and brothers' relocation to California in an age when cheap New Age thinking only seemed to feed his mother's neuroses. His longtime playboy status notwithstanding, Lowe does a persuasive job of portraying himself as a clueless adolescent, and offers these evocative reflections on the first time he was engulfed by a posse of shrieking fans:

On the one hand, how cool is it to be mobbed by a bunch of girls my age? It's any guy's dream, right? And it is part and parcel of being a star. But on the other hand, the whole experience feels a little shitty. And feeling shitty about something that's meant to be exciting makes me feel worse. The girls' reactions seems almost programmed, like they were both the performers and the audience in a teen-angst drama that had nothing to do with me. It certainly wasn't about what a good actor I was. And if I was such a hottie to them, why didn't I have the same effect on those who knew me well at school? And so the first wisps of an idea appear on the horizons of my consciousness. And the idea is this: If you really knew me, you wouldn't like me nearly as much.

About halfway through, the book loses some of its narrative momentum. Lowe spends far too much time on his debut movie, The Outsiders, though the glimpse into director Francis Ford Coppola's working style is intriguing. The sad truth is that on the whole, Lowe did not really live up to the enormous hype he generated, surely one factor in his descent into addiction and the notorious "sex tape" controversy that damaged his image. The great irony of his career is that his movie star looks notwithstanding, Lowe has enjoyed his greatest successes as a supporting player -- particularly in the Austin Powers movies and in The West Wing -- and it's to his credit that he recognized this and deployed his skills where they proved useful.

Stories I Only Tell My Friends is an entertaining read by a reflective man who has tried hard to make sense of his varied experiences. One finishes the book believing in Lowe's core decency, and that his default setting is to assume that you are his friend until you demonstrate otherwise. In a celebrity memoir genre where sympathy can often wear thin amid tedious detail and unwittingly damaging self-revelation, Lowe ends up looking pretty good.

Friday, June 3, 2011

True Blues

In Loyalty: The Vexing Virtue, Eric Felten makes a deft case for a knotty concept

The title is wonderfully concise: Loyalty is indeed a vexing virtue. In this intriguing and elegantly written book, Wall Street Journal writer Eric Felten explores an idea that is difficult to reject in the abstract but almost always proves devilish in the details. Using illustrations that span Greek tragedies to the distasteful deeds of Tiger Woods, Felten wears his learning lightly and yet always instructively in this little gem of a book, cleverly jacketed in true blue with gold lettering and an icon of a dog.

Felten, who champions loyalty, focuses on two core problems with it. As we all understand, any positive virtue -- prudence, piety, or any other, cardinal or otherwise -- has its downsides. What is perhaps peculiar to loyalty is its capacity to enable other vices. The same solidarity among soldiers that wins wars also permits atrocities; the trust we place in princes engenders arrogance that leads to tyranny. Loyalty is an essential lubricant for the social contract, but it also permits the most slippery of conduct.

But what's really rough about loyalty, Felten says, is that even in those cases where it is most justified -- very often because it's justified -- loyalty inevitably leads to conflict. It's easy to fight for God and Country; all too often, it's God or Country. Who among us has not had to choose between friends? And who can say with any certainty that loyalty to a child trumps that of a spouse? Felten notes that a cast of characters ranging from King Agamemnon to Immanuel Kant has insisted that it's possible to square this circle. He's unconvinced, and he makes a compelling case that we shouldn't be, either.

After introductory chapters that lay out these philosophical dilemmas, Felten moves to a set of domains for loyalty: marriage, friendship, business, and politics/institutions. What's perhaps most intriguing about the book is that he increasingly moves beyond simply framing the dilemmas of loyalty and takes positions that are all the more arresting because of their nuanced, self-aware character. Felten understands the power of passion and the insistent waywardness of the human heart, which in effect has a mind of its own. But he favors the rush of endorphins that comes at the end of the marathon of marriage over the euphoria of spring infatuation. He finds that in the world of commerce, loyalty doesn't make much sense, not only because it fails to describe the way consumers or businesses actually behave, but also because loyalty can too easily become a euphemism for dead weight or for taking relationships for granted. In friendship and politics -- two realms Felten believes should remain separate -- loyalty almost always involves condoning behavior with which one disagrees. But he sides with Sir Walter Scott's encomium that "I like a highland friend who will stand by me not only when I am in the right, but when I am a little in the wrong."

Strictly speaking, loyalty has no ideological valence. But in early 21st century public discourse, it skews Right more than Left. One of the more reliable stratagems by which the Left has tried to dislodge the dominant libertarianism of contemporary politics is the embrace of "social justice," a term whose egalitarian overtones resist the individualist accents of a Reaganesque vernacular. But social justice has little room for loyalty. Its great strength is its rejection of privilege; its great weakness is its perceived bloodlessness.  Felten notes that cosmopolitan liberals all too often dismiss patriotism as a pernicious zero-sum ideology, while glibly maintaining that a critical stance toward one's country represents a higher form of loyalty. In many cases, that's surely true. Such people would surely cringe at Theodore Roosevelt's characteristically bombastic 1918 pronouncement that "The man who loves other countries as much as his own stands on a level with the man who loves other women as much as he loves his wife." But it may be no accident that marriage and patriotism have declined in tandem with a broader anti-institutional tendency in U.S. society in recent decades. And maybe that's been a good thing, at least in some respects. But people lacking strong loyalties of their own will always be vulnerable to the terrible loyalties of others.