WW, May 31, 1819
Look: There’s Walt Whitman.
It’s very late – no, make that very early – and he’s walking down
Broadway, right near St. Paul’s Church. It’s now spring, and while the
day was unmistakably warm, there’s a wintry chill in the air tonight.
Walt doesn’t mind. He’s wearing wool pants, a cotton shirt, a wool
jacket and a hat (“a plain, neat fashionable one from Banta’s, 130
Chatham Street, which we got gratis, on the strength of giving him this
puff,” he writes in a recent piece in the Aurora,
a newspaper he’s editing these days). His boots and cane – a little
silly for a hale man in his early twenties to be using one, but then
Walt is a bit of a fop – resound on the paving stones, as do the sounds
of horses’ hooves up ahead, where a carriage crosses his path. Broadway
is unusually quiet tonight and the gas lamps only barely cut the
darkness. He sees two candles in a nearby window on the corner of Fulton
Street.
He’s been to a play. Good company, good seats, good – not great – show.
He’s a little tired now as he heads back to his boardinghouse, but
happy. He thinks of a woman he saw on his way to dinner. He remembers,
and puts aside, a disagreeable task. He hears those boots as they hit
the paving stones. He’d like a pair of new ones.
After
some false starts and missteps, he’s finally beginning to make progress
in his chosen vocation. He’s got big plans – an idea for a novel he
wants to start soon – and inchoate dreams of fame and fortune.
“Strangely enough, nobody stared at us with admiration,” he thinks, with
that plural pronoun he likes to use in his pieces. “Nobody said ‘there goes Whitman, of Aurora!’ – nobody ran after us to take a better, and better look – no ladies turned their beautiful necks and smiled at us – no apple women became pale with awe – no news boys stopped and trembled, and took off their hats.”
Walt smiles, self-mockingly. But he knows we’re watching him. He is
enjoying this.
Friday, May 31, 2013
Tuesday, May 28, 2013
Capitalizing the self-made myth
Cornelius Vanderbilt and the birth of the Robber Barons
The following post is part of a series -- actually, it's the last in a series -- on the emergence of the self-made myth in U.S. history.
The immediate roots of a new order began
to emerge in 1815, when former New Jersey Governor Aaron Ogden bought a license
from the Livingston interests, which he used to go into business with a Georgia
entrepreneur named Thomas Gibbons. Business as usual there. But the two began
squabbling, their dispute reaching the point of Gibbons going to Ogden’s home
to challenge him to a duel. When Ogden had Gibbons arrested, the latter
resorted to a less gentlemanly form of revenge: he launched his own ferry
route. It ran between Elizabethtown, New Jersey and New York City in 1818;
Gibbons argued that he was allowed to do this under a 1793 federal law that
governed coastal (as opposed to, say, Hudson River) trade. Ogden sued Gibbons for violating the state monopoly, and the case snaked its way through the legal system, culminating in the landmark Gibbons v. Ogden Supreme Court ruling, which established the supremacy of federal over state law in regulating commerce. [Stiles 43]
For our purposes, however, what matters
in this story is a footnote that became an epic in its own right. At one point in
his struggle with Ogden, Gibbons hired a young man named Cornelius Vanderbilt
to be the captain of one of his steamships. Every other person in this saga –
Fulton, Livingston, Ogden, Gibbons – came from elite backgrounds and
personified aristocratic privilege. Not so Vanderbilt. Born on Staten Island in
1794 as the child of an entrepreneurial Quaker mother and Dutch ferry captain
father, he had already established himself as an up-and-comer in the industry,
so much so that some of his friends were surprised by his willingness to work
for Gibbons. Gibbons himself was nervous about his protégé. “He is striking at
everything,” he said in 1822, when Vanderbilt was waging his war by fearlessly
plying New York waters, dodging ice, wind, and process servers. “I am afraid of
this man.” [epigraph] Gruff, uneducated, and steely in his discipline,
Vanderbilt represented a new breed of businessman. Parrington described him as
“hard-fisted” and “tough as a burr oak.” A few years later, Matthew Josephson, whose 1934 book The Robber Barons remains a classic even if later observers consider him too severe, called him “a Self-Made Man, for whom the earlier, ruder frontier was the native habitat. At the same time, his industrial conscience was already free of those presumptive, restraining codes, as those of the habitual prudence of Franklin’s age of early capitalism.”
After Thomas Gibbons died in 1826,
Vanderbilt worked for his son William, but soon went off on his own, building a
series of ships, starting a series of companies, and expanding his domain to
include the trans-Atlantic trade. He entered the chaotic struggle to capture a
transcontinental route, which involved steamship travel to Nicaragua, a short
rail trip to its west coast, and another steamer on the Pacific side. For
Vanderbilt and like-minded businessmen, a corporate license was like a marriage
license: something that pretty much should be had for the asking, success a
function of how well you do once you get it rather than a ratification of your
(insider) credentials. Increasingly, local, state, and federal governments
agreed. By the 1830s, the Whig politicians who had superseded the Federalists
were still arguing that corporations should be closely allied to the state, and
that governments in some cases should actually own and operate emerging
businesses like railroads. But the Panic of 1837, which caused a large number
of such enterprises to go bankrupt, undercut these arguments. Though there were
widespread suspicions that turning economic activities ranging from shipping to rail and even postal delivery over to private interests would engender corruption, a consensus formed that this was the lesser evil. So it was that Vanderbilt
emerged as a self-styled corporate populist.
Corporate populist: it sounds like a
contradiction in terms. But it made perfect sense to Vanderbilt – and in some
precincts of this country, particularly those populated by executives who lack
degrees from fancy universities, it still does. Vanderbilt’s recent biographer,
T.J. Stiles, asserts that Vanderbilt “seems to have believed the Jacksonian
rhetoric he so often repeated, a creed of laissez-faire individualism, a vision
of a world in which any man might get ahead by his natural gifts rather than by
government favors. And yet, in pursuing his private interests wherever they
took him, he felt no obligation to act in the public interest; when competition
had served his purpose, he freely sold out or created new monopolies. As he
operated on a vast new scale, he brought to a head the contradiction inherent in
the private ownership of public works – a paradox that would grow starker when
he moved from steamships into railroads in the climactic phase of his life.”
As Stiles makes clear, Vanderbilt’s
reconfiguration of the self-made man was more than a matter of a ruthless
willingness to invest in ships (or railroads), enter price wars, and buy or buy
out. He and an emerging array of collaborators/competitors that included Daniel
Drew, Jim Fisk, and Jay Gould were also true visionaries in understanding that
the corporation was a uniquely powerful instrument for generating wealth that
was literally beyond the imagination of earlier self-made men. “In this age of
the corporation’s infancy, they and their conspirators created a world of the
mind, a world that would last into the twenty-first century. At a time when
many businessmen could not see beyond the physical, the tangible, they embraced
abstractions never before known in daily life. They saw that a group of men
sitting around a table could conjure ‘an artificial being, intangible,’ [Stiles
is quoting John Marshall, who also grasped some of the implications of the
corporation] that would outlive them all. They saw how stocks could be driven
up or dropped in value, how they could be played like a flute to command more
capital than the incorporators could muster on their own. They saw that everything in the economy could be
further abstracted into something that might be bought or sold …the subtle eye
of a boorish boatman saw this invisible architecture, and grasped its
innumerable possibilities."
Others who understood that the rules of
the game were changing were not nearly so sanguine about it, and saw Vanderbilt
as the leading edge of a new unnatural
aristocracy. In a famous 1859 newspaper article, New York Times editor Henry J. Raymond compared Vanderbilt to
“those old German barons who, from their eyries [eagles’ nests] along the Rhine, swooped down upon the
commerce of the noble river and wrung tribute from every passenger who floated
by.” Raymond didn’t actually coin the term “Robber Baron,” but the
term took root among his contemporaries, resonating through the generations
until it got a new lease of life from Josephson, whose book of the same title
became a byword for ensuing generations.
It’s significant that Raymond’s
denunciation of Vanderbilt appeared in 1859, which is to say before the Civil
War. It shows that the outlines of a new capitalist order had already taken
shape before the war radically accelerated it through the sheer scale of the
coming wartime economy. (Vanderbilt patriotically donated his time and
resources to the Union cause, and after it ended founded the university that bears
his name in Tennessee as an act of national reconciliation – he wasn’t purely
selfish.) But the fact that Raymond would accuse Vanderbilt of “competition for
competition’s sake” shows the ongoing difficulty even discerning observers were
having in accepting that the world had changed.
But changed it had. From this point forward, the self-made man would essentially be perceived as a businessman. While this perception was never altogether accurate, it nevertheless persisted, perhaps because it was in the interest of those who benefited from the conflation. We live in the shadow of their success.
Sunday, May 26, 2013
Jim is observing the Memorial Day weekend quietly at home (his wife is off in Chicago visiting their oldest son at UC). The highlight of the weekend: a trip to see Mud, the new movie starring Matthew McConaughey as a desperate fugitive hiding out on an island on the Arkansas side of the Mississippi River, where he is discovered by two young teenaged boys (Tye Sheridan and Jacob Lofland). The story, which carries with it echoes of The Adventures of Huckleberry Finn, turns on whether McConaughey's character, named Mud, is really who he says he is, and what role his supposed girlfriend (Reese Witherspoon) is playing in this drama. Beautifully acted, with a strong, nuanced script by writer/director Jeff Nichols, himself a Little Rock native, Mud will prove to be a durable little gem.
Upon returning home from the movie, Jim surfed cable channels and came across two other McConaughey supporting roles: his turn as an overgrown adolescent in Dazed and Confused (1993), and last year's performance as a strip-club entrepreneur in Magic Mike. McConaughey has had a checkered career with plenty of turkeys (Ghosts of Girlfriends Past, anyone?) but has forged a successful career for himself playing flawed, charismatic people you find yourself rooting for, in part despite/because they're largely losers.
Best to all for a happy and relaxed holiday weekend, one paid for by the blood of soldiers. Notwithstanding mistaken, accidental, or unnecessary wars, of which the United States has had its share, the willingness of soldiers to make sacrifices remains essential glue for our way of life.
Monday, May 20, 2013
Incorporating the self-made man
What we think of as the essence of the capitalist system -- competition -- was anathema to the way corporations were originally created and operated.
The following post is part of an ongoing series on the self-made man in U.S. history.
Nowhere was the emergence of a modern industrial economy in the 19th century clearer than in the new meaning and uses of a longstanding institution: the corporation. Prior to 1800, corporations were created – “chartered,” to use the technical term – by governments for the purpose of promoting economic activity in the name of the public good. This was how, for example, the colonies of Virginia and Massachusetts had been established (the latter smuggling in a religious agenda its directors hoped royal authorities would overlook). Corporations were licensed monopolies granted to aristocrats, natural and otherwise, for what was assumed to be mutual benefit. Members of the corporation received a unique right to trade – and the right to bar anyone else from trading – in a particular commodity, territory, or both. In return, the corporation would support the national interest – in terms of tax revenue and political loyalty – against other imperial rivals or domestic miscreants who attempted to trade without government supervision and permission.
Today we think of economic competition
as the very essence of a healthy economy. That’s because we live in a
capitalist society. But in the mercantilist world of colonial America,
competition was a problem, not a solution – something that fostered conflict
and distraction from the common good. This notion was subject to increasing
pressure – the colonists considered the East India Company a locus of oppression and launched the Boston Tea Party to attack it – and to a growing sense that corporations functioned as instruments of private corruption. But a belief in the viability of, even need for, the traditional corporation survived the Revolution. It was especially prevalent among the old Federalist elite. In
1810, retreating Federalists in Massachusetts succeeded in turning Harvard College
into a private corporation to protect it from what they feared would be the
rising Jeffersonian mob mentality.
One can see both the persistence of this
corporate ideal and its erosion in the career of John Jacob Astor, a merchant who
amassed one of the great private fortunes in American history. In 1808 the
federal government granted him a charter for the American Fur Company. For
Astor, the new corporation was a crucial instrument in consolidating his
growing power in the lucrative fur trade, furnishing him with the basis of a
commercial empire that would span Europe, Asia, and the North American
continent (he established an outpost he called Astoria in what became the
Oregon territory). For the U.S. government, Astor’s company was a vehicle for
contesting British domination of the fur trade and establishing commercial
links with Russia and China.
Astor was a good example of one of those
people Parrington described as “strange figures, sprung from obscure origins.”
Born in Germany in 1763, he migrated to London when he was 16 and began working
for his brother, who manufactured musical instruments. He came to New York
in 1784 and gravitated toward the Hudson fur trade, where his fierce commitment
and sound instincts soon paid dividends. In this sense one might say Astor was
a natural aristocrat, but he had little sense of civic virtue. In the words of
one recent biographer, “He felt no compassion toward the larger community or
for the country that gave him so much. He might be a butcher’s son, but he
scorned Thomas Jefferson’s ideals of equality for white men. He stood outside
the narrow circle of landed families who controlled New York politics, but like
them he believed only members of the gentry and self-made men were capable of
discerning the common good … Liberty, he believed, gave a man of humble birth a
chance to advance himself, but to give the common worker a voice in political
affairs was wrong and fraught with danger.” Astor also
became increasingly impatient with what he regarded as government meddling in
his business. He resisted federal efforts to prevent the exploitation of Indians by establishing non-profit trading posts where the sale of liquor would be banned, because alcohol was his chief bargaining chip in dealing with them.
By the time of Jefferson’s presidency, a
full-scale rebellion against the paternalist premises in the corporate ideal
was underway. As historian Johan Neem explains, “Jefferson, like other
Americans, believed that permitting the spread of voluntary associations and
corporations would threaten civic equality by allowing a small minority, a
cabal, to exercise disproportionate influence over public life.” But given
the growing desire and need for such institutions in a society where the reach
of government was relatively limited – here I’ll pause to note Frenchman Alexis
de Tocqueville’s famous observation in Democracy
in America (1835/40) that Americans were instinctive joiners with a
near-mania for founding associations – the solution to this problem, counterintuitively, was indicated by Jefferson’s great lieutenant James Madison, who argued in Federalist No. 10 that the great bulwark against minority rule was allowing a profusion of interests and factions. If private corporations threatened the state, the goal should be to have more, not fewer, of them, and in so doing dilute their power.
Perhaps ironically, this imperative
intersected with another that was more characteristic of Jeffersonian opponents
like Alexander Hamilton: to affirm the supremacy of the federal government over
that of the states. His goal was to create a gigantic free trade zone with the
same language, law, and financial system. Hamilton of course passed from the scene after his death in a duel in 1804, but his mantle was picked up by another Jeffersonian adversary, Chief Justice John Marshall.
The point where these two imperatives
converged – a convergence so apt because it embodied the spirit of the
quickening industrial revolution – was steamship companies. The turn of the
nineteenth century was truly an epochal moment in the history of seafaring
because it marked the transition from the age of muscle and wind to that of
wood (later coal), burned to generate the steam that could drive propellers and
paddlewheels. The crucial figure here was Robert Fulton, who is not only
credited with developing the first commercially viable steamboat, but who also
was an important figure in the early passenger business. That’s because Fulton
married Harriet Livingston, the niece of Robert Livingston, a commercial
magnate in one of the most powerful New York families of the Revolutionary era.
Fulton and Livingston received a corporate charter that granted them monopoly
control over a series of ferrying routes across various bodies of water in
metropolitan Manhattan (and beyond). In those cases where their hold on a route
was less than secure, the Livingston interests would either buy off competitors
or sell them franchises. Such practices were (and would remain) widespread in an
industry where multiple carriers in a given market were perceived to be more
the exception than the rule. Price wars would erupt, rivals would be ousted,
and equilibrium would be re-established. Again, competition was the problem,
not the solution.
Next: Cornelius Vanderbilt, master of the new order.
Tuesday, May 14, 2013
Mass-manufactured self-made men
The transformation of a myth in the industrial era.
The following post is part of an ongoing series on the self-made man in U.S. History.
“All over the land were
thousands like them, self-made men quick to lay hands on opportunity if it
knocked on the door, ready to seek it out if were slow in knocking, recognizing
no limitations on their powers, discouraged by no shortcomings in their
training.”
–Vernon
Parrington, Main Currents in American
Thought, 1927
It has long been understood, in
economics as in so many other ways, that the Civil War marked a dividing line
in American history. Before the war, the United States was an overwhelmingly
agricultural nation with a small mercantile elite; after the war, it became a
modern industrial society in which the factory steadily displaced the farm from
the center of the nation’s consciousness, and the urban worker steadily
displaced the yeoman as the embodiment of the nation’s working classes.
There was also a transformation within
the world of commerce. Andrew Carnegie, who was born into one world but came of
age in the other, described the difference in a famous 1889 essay that
represented the conventional wisdom of the time – and ever since. “Formerly
articles were manufactured at the domestic hearth or in small shops which
formed part of the household,” he wrote. “The master and his apprentices worked
side by side, the latter living with the master, and therefore subject to the
same conditions. When those apprentices rose to become masters, there was
little or no change in their mode of life, and they, in turn, educated in the
same routine succeeding apprentices. There was, substantially, social equality,
and even political equality, for those engaged in industrial pursuits had then
little or no political voice in the State.” While some were inclined to affirm,
even sentimentalize this vision, Carnegie was not among them. “The inevitable
result of such a mode of manufacture was crude articles at high prices,” he
asserted. Far better was the (inevitable) replacement of this regime with a
more efficient, if less egalitarian, system of mass production.
Not everyone agreed such a system was
better, of course. Indeed, a significant part of the history of the 19th
century involved reckoning with the ravages of this new order, both in terms of the material deprivations it imposed on unskilled labor and in the evisceration of social and political equality. But its reality was rarely seriously questioned; nor was the role of the Civil War in bringing it about. Charles Francis Adams Jr., who had served in the Union army, was struck in 1871 by the “greatly enlarged
grasp of enterprise and the increased facility of combination” that
characterized the U.S. economy in the years following 1865. “The great
operations of war, the handling of large masses of men, the lavish expenditure
of unprecedented sums of money, the immense financial operations, the
possibilities of effective co-operation were lessons not likely to be lost on
men quick to receive and to apply all new ideas.”
But, as Adams perceived, the vast new
sense of scale in the American economy was marked by a paradox: the growing scale
of the economy was managed by a shrinking number of individuals. Nowhere was
this more obvious than in the definitive industry of the 19th
century: railroads, presided over by people with names like Vanderbilt, Drew,
Gould, Fisk, and Huntington. “Single men have controlled hundreds of miles of
railway, thousands of men, tens of millions of revenue, and hundreds of
millions of capital,” he noted. “The strength implied in all this they wielded
in practical independence of control both of governments and of individuals;
much as petty German despots might have governed their little principalities a
century or two ago.”
Railroads, along with other forms of
industrial capitalism, were springing up all over the world in the second half
of the 19th century, bringing with them great disparities of wealth
and power from Brazil to China. But nowhere were such phenomena more obvious,
even glaring, than in the United States, where equality had long been the
hallmark of American society. And
yet this outcome was not simply a commercial coup d'état by the new breed of
industrialists. The fact that they imposed their will raises the question of how they were allowed to do so, and why the oppressions caused by their success, while often loudly protested, never resulted in a successful challenge to their right to run what they considered their business. Which leads us to an important
reality of the post-Civil War order: it was governed by a cultural logic that
took shape much earlier in the century. At the heart of this logic was a
transformation in the understanding of the self-made man in the decades before
the war.
The key to understanding this transformation was a concept
that had guided the Founding Fathers: natural aristocracy. Charles Francis
Adams Jr.’s great-grandfather had used the term in a 1790 letter to Samuel
Adams. “Nobles have been essential parties in the preservation of liberty,
whenever and wherever it has existed,” John Adams wrote to his cousin. “By
nobles, I mean not peculiarly an hereditary nobility, or any particular
modification, but the natural and actual aristocracy among mankind. The
existence of this you will not deny.”
A generation later, Thomas Jefferson invoked the phrase in his own
correspondence with Adams. “I agree with you that there is a natural
aristocracy among men. The grounds of this are virtue and talents,” Jefferson
explained, contrasting it with “artificial aristocracy, founded on wealth and
birth, without either virtue or talents.” This elite was rooted in
accomplishment, not privilege: it was self-made. For all their differences in
temperament, experience, and ideology, Adams, Jefferson, and other Founding Fathers had a deep personal investment in it as the basis of their careers (though Adams, it should be said, cast a skeptical eye on the notion that it could be engineered as easily as Jefferson seemed to think it could be).
But the legitimacy of this self-made
aristocracy went far beyond that: its moral basis was civic. “May we not even
say, that that form of government is the best, which provides the most
effectually for a pure selection of these natural aristoi into the offices of government?” Jefferson asked Adams,
noting that in general, the common people “will elect the really good and
wise.” Adams was not so sure, lamenting the “Stupidity with which the more
numerous multitude” tended to be tricked by fake aristocrats. But he never
doubted the necessity of a natural aristocracy were the republic to survive.
As many subsequent observers have noted,
the Founding Fathers were in an important sense victims of their own success.
In crafting a remarkably tensile Constitution that checked some of the more
venal impulses of their successors, and in bequeathing a nation with relatively
secure boundaries and vast natural resources, they in effect made mediocrity
possible (both Jefferson and Adams were appalled by Andrew Jackson, who wore
his lack of refinement as a badge of honor). Or, to put it more charitably,
they created the possibility for natural aristocracies whose primary impetus
was not civic, the way it had been for Franklin, Clay, and Lincoln. The pursuit
of happiness could take new forms.
Whether as a necessary evil or a
positive good, the Founding Fathers believed that there had to be a place for
the voice of the people in choosing natural aristocrats to be their leaders.
But by the early decades of the nineteenth century, an imperative to create and
maintain that channel of communication – evident in the steady relaxation of
eligibility requirements for voting, especially in the new territories that
rapidly became states – created democratic imperatives that took on a life of
their own, in large measure because allowing cream to rise was an important
premise of natural aristocracy itself. Today we’re very aware of the glaring
limits of this vision – the way it excluded women, African Americans, Native
Americans, and even many immigrants. But the expansion of the electorate,
typified by the abandonment of property qualifications for voting, created a
polity that was striking in its relative scale and in the force of an internal logic that would inexorably lead not only to the absorption of such outsiders, but also to the possibility of liberty experienced and expressed
outside the boundaries of traditional politics.
No one captured these dynamics more
vividly than the early 20th century cultural historian Vernon
Parrington, whose three-volume history, Main
Currents in American Thought, remains among the most lively chronicles of
our national life. “Society of a sudden was become fluid,” he wrote of the
early nineteenth century. “Strange figures, sprung from obscure origins, thrust
themselves everywhere upon the scene. In the reaction from the mean and skimpy,
a passionate will to power was issuing from unexpected sources, undisciplined,
confused in ethical values, but endowed with immense vitality. Individualism
was simplified to the acquisitive instinct.” The hallmark of such
figures, whether in the form of frontiersmen like Davy Crockett or showmen like
P.T. Barnum, was the way their notion of the self-made man operated
independently of – even defied – the logic of natural aristocracy. Mobility,
literal and figurative, was becoming an end unto itself.
Next: the transformation of the corporation in 19th century national life.
Wednesday, May 8, 2013
No. 500
You are reading the 500th post on this blog. It began on February 4, 2009 with a piece about "Outlaw Pete," a new song on Bruce Springsteen's latest album Working on a Dream (which in retrospect looks like one of the Boss's weaker efforts, though the song holds up well). In the years since, a new post for American History Now has gone up on average about once every three days. They can be categorized the following ways:
- Posts that chronicled the lives of fictive students and teachers (The Felix Chronicles, The Maria Chronicles, and the abandoned Horace Chronicles);
- Posts that functioned as excerpts of first drafts of books, principally Sensing the Past, which was published in January 2013, as well as A Brief History of the Modern Media (forthcoming) and an abortive project on the history of the self-made man;
- Book reviews that were cross-posted at the History News Network, where I have been Book Review editor for about as long as I've had this blog;
- Short posts on what I've been reading, watching or listening to on vacation or traveling;
- Some miscellaneous stuff (ranging from tributes to Abraham Lincoln to Billy Joel).
Why did I start the blog in the first place? For two reasons:
- To participate in some of the excitement about new media, and the opportunities for ordinary people to become bloggers and publish work in ways that had previously been limited to those with access to capital, the professional publishing infrastructure, or both;
- To give me a creative outlet at a time when I was between book projects and was unsure what to do next.
And I'm tired. I began the blog four years ago amid a fallow period in my writing career. In the last four years I wrote two books and prepared second editions of two others, but my momentum has stopped: I'm back where I started, unsure where to go next -- or, indeed, whether to go any further, writing-wise. American History Now no longer seems like an exciting new experiment. Actually, it feels like an experiment that's run its course. I have no clear deadline for ending it, but I believe it will be soon. That may not matter much, not only because it won't be missed, but because on any given week it's old posts, not new ones, that get viewed by readers. This is actually what I hoped for; I conceived the blog less as a vehicle for journalism than as a repository for a body of work that would become a very small part of a very large public record.
In closing, I'd like to thank three sets of people: my family, for sustaining (and putting up with) me; Google, whose Blogger platform has been a truly marvelous gift; and you, dear reader, for the privilege of your attention. May you find a lifetime of pleasure in the written word, wherever you may happen to encounter it.
Monday, May 6, 2013
Powerfully obscure
In The Forgotten Presidents: Their Untold Constitutional Legacy, Michael J. Gerhardt tries, not altogether successfully, to make people like Franklin Pierce and Calvin Coolidge interesting
The following review has been posted on the Books page of the History News Network.
In The Forgotten Presidents, University of North Carolina law school professor Michael J. Gerhardt looks at a dozen presidents, beginning with Martin Van Buren and ending with Jimmy Carter, and argues that each had a greater impact than is commonly recognized -- not simply by a public at large that may only be vaguely familiar with their names, but also by professional historians more likely to be interested in more prominent figures. As his subtitle makes clear, Gerhardt is not arguing that these presidents had compelling personalities, or that their political gifts or tactics were especially notable. Instead, he argues that each made essentially administrative decisions that either marked a precedent in the history of the presidency itself or quickened a tendency in the nature of the office. Much of Gerhardt's analysis focuses on topics like presidential appointments, vetoes, and relationships with other branches of government, especially the courts and the U.S. Senate.
Insofar as there's a narrative trajectory in this series of profiles, it's that presidents of all times and parties have tended to guard and strengthen the prerogatives of the office. To be sure, there have been any number who were avowedly in favor of limited government. But, as Gerhardt shows, these figures (Van Buren, Franklin Pierce, Grover Cleveland the first time around) are among the least successful in U.S. history. He also shows that the two Whig presidents elected to office, William Henry Harrison and Zachary Taylor, began their terms avowing deference to the legislative branch, in large measure as a reaction to the perceived high-handedness of Andrew Jackson. But both men, as well as the vice presidents (John Tyler and Millard Fillmore) who succeeded them, found this theory of government wanting. Indeed, even those executives who did profess a federalist approach to governing, from Cleveland to Coolidge, nevertheless fought hard to maintain and extend their power in their own domain when it came to things like removing cabinet officers or naming Supreme Court justices. And others, notably Cleveland the second time around -- he gets a separate chapter for each of his two administrations -- became increasingly convinced of the need for presidential initiative in lawmaking.
Gerhardt is a scrupulous scholar who explores some compelling byways of presidential history; we learn, for example, that Carter, not Barack Obama, was the first president to confront the prospect of government default -- twice. But on the whole, this is pretty dry stuff, rendered in a pretty dry way (Gerhardt tends to enumerate the implications of presidential histories in paragraphs that begin, "First," "Second," and so on). A little more context might also have been nice. For example, Gerhardt draws a series of contrasts between William Howard Taft and his mentor, Theodore Roosevelt, but the effect of his emphasis on Taft's belief in limited government leads one to wonder why Taft wasn't a latter-day Jacksonian Democrat instead of a progressive Republican -- or, if he wasn't really progressive, why Roosevelt was so keen to have him as his successor. We sometimes lose the forest amid the trees.
Perhaps it's useful to end where we began: with Gerhardt's emphasis on law and policy rather than politics and personality. Though the book is organized in such a way that seems to emphasize individuals, the real story here is more about the presidency itself than the people who held the job. As such, The Forgotten Presidents makes a point worth remembering.
Thursday, May 2, 2013
Record profits
In Democracy of Sound: Music Piracy and the Remaking of American Copyright in the Twentieth Century, Alex Sayf Cummings traces churning tides of freedom in the business of distributing music
The following review has been posted on the Books page of the History News Network.
It seems like it was so much easier in aristocratic societies: artists had patrons to support them, and those who paid the fiddlers called, paid for, and owned the tunes. But in capitalist societies, there is apparently no end to the complications of who owns what and what those on the receiving end of art can and cannot do. Nowhere have the issues been more complicated than in music. Copyright was essentially an invention of print culture, and for most of modern history, the written word was a physical object. Not so music. For a while, it seemed its essence could be captured as sheet music. But the advent of recorded sound raised nettlesome -- and, at times, profound -- questions about what the essence of music really is. In Democracy of Sound, Alex Sayf Cummings offers a detailed narrative account of how the issues became so complicated -- and how, in the face of corporate pressure, they're becoming brutally simple.
Cummings begins his story with the wax cylinders and tin foil of early recording in the late nineteenth century (some sound innovations date back to earlier in the century, but their inventors were not in a position to commercially exploit them). The first major piece of legislation to affect recorded music dates from the Copyright Act of 1909, signed by Theodore Roosevelt on his last day in office. Under the law, musical compositions could be copyrighted. But recordings could not. Moreover, anyone could record and sell a rival version of a musical composition, as long as a flat-rate royalty was paid to the composer.
Naturally, record companies were unhappy about this. But they found other things to be unhappy about as well. Bootleggers recorded live shows they sold to the public, which jostled alongside commercially released recordings. Pirates reproduced cheaper copies (in multiple senses of the term) of contractually sanctioned recordings and undercut their sales. Collectors resurrected out-of-print titles and sold them to devotees. Exactly how much damage such practices caused is impossible to calculate. Whatever the estimate, one can also make a decent case that they actually fostered sales by introducing (or re-introducing) music to buyers in ways that might otherwise not have happened.
That said, it became increasingly clear by the second half of the twentieth century that a musical recording was more than a mechanical process and indeed was a source of artistry in its own right. As a piece of songwriting (i.e., a set of chords, a melody, and some hokey lyrics), "Sgt. Pepper's Lonely Hearts Club Band" is not all that remarkable a piece of music. But executed in two versions that bookend a suite of songs whose whole is greater than the sum of its parts, and rendered as an aural experience marked by any number of sound effects, the song forms the core of a landmark work in the history of popular music. By the early 1970s, Congress and the courts were increasingly receptive to such logic, a tendency that crystallized with the passage of the Copyright Act of 1976, which established a new benchmark of protection for records.
This legal turn signaled some ominous developments, however. "American copyright had always been utilitarian in nature, designed to 'promote the Progress of Science and useful Arts,'" Cummings writes, citing the Constitution. "The new way of thinking emphasized protection of capital outlays, of established businesses like record labels, rather than incentives." Earning back investment, not sustaining innovation, was now the point. Corporations needed to exploit hits in order to finance the misses; those who tried to make money any other way were skimming the cream. And amid the strong libertarian currents running through the U.S. and global economy generally, this profit imperative became increasingly insistent.
But it also ran headlong into what may be termed the file-sharing sensibility of the early 21st century. Nowhere has the conflict been more evident than in the world of hip-hop, a quintessentially postmodern idiom whose signal artistic strategy is sampling other musical works. The more the record industry has tried to clamp down on this -- notwithstanding the way it often serves as a kind of farm system for up-and-coming talent -- the more it has strained credibility among its customers. The people caught in the middle are artists, who tend to react ambivalently: they like the exposure, but they'd like money, too. Given the fact that it's the big behemoths, not the little guys, who actually own the rights to most of this music, it's hard to feel that honoring record company wishes is truly the most honorable thing.
Cummings's own sympathies in this conflict are clear enough: he recognizes there are dilemmas, but tends to side with those who think music wants to be free. "Piracy may not kill music," he notes, "but history may record that it killed the twentieth-century music industry." If so, there will clearly be some losers, not all of them big business. But there will always be people around who are willing and able to make sound investments. For such people, Cummings has provided a usable, musical past.