It has now been about forty years since one could plausibly say that landing a job as a professor of History was not especially difficult. The Baby Boom of the 1950s, combined with the massive infusion of government spending in the early postwar decades, made academia a growth industry through the 1960s. Since the late 1970s, however, a relative ebbing of American economic prosperity, gradual demographic shifts, and a receding public-sector commitment to higher education – all three of which, of course, are deeply entwined – have made a career in academia an increasingly unattainable dream. It is a dream, moreover, complicated by the same kind of insidious inequities that have marked
And it’s not likely to stop aspirants even now – indeed, many young people may well regard a graduate program as shelter in the economic storm. What may eventually change this time is not the entrance, completion, or even the placement of history graduate students at the nation’s colleges and universities, but what is expected of them – or, more to the point, what they can rationally expect to be doing. Actually, this has been a source of tension for some time. Young historians are trained in research methodology, educated in the historiographic debates of their fields, and coached to produce original scholarship. They get some pedagogic practice as teaching assistants, and in recent years there has been more sustained attention to what good teaching entails. But the structure of the profession is such that teaching is regarded as the price a good scholar pays for doing what matters most: research. Yet the production of original research, however vital it may be to the future of the academic profession, has always been something of a luxury – supporting it has been the price a good school pays for its reputation. This tension has been productive in some cases, particularly at elite colleges and universities, since research and teaching can have a symbiotic relationship, even if they are not inextricably entwined. But it has also engendered structural frustrations: professors feel impossibly bogged down by the demands of their institutions, while administrators regard professors as cosseted employees seeking to shirk their obligations.
The current economic downturn has put tremendous pressure on this professional model. Should it continue, it will also raise some fundamental questions about what institutions whose primary mission is the education of students can truly afford by way of sustaining a now century-old professorial model based on academic specialization, primary source research, professional conferences, university press publishing – presses that, increasingly subject to market conditions, can no longer easily justify insular scholarship – and other components of an intellectual apparatus that many scholars have long taken for granted. The cultural implications of this economic upheaval do not yet seem widely engaged, much less understood. Can we assume, for example, that the academic monograph will remain the basic unit of intellectual currency in the profession? Will History continue to be primarily a print medium? Will older narrative (as opposed to analytic) strategies reassert themselves? Merely posing these questions may sound heretical. But for those on the margins of the profession, they seem natural, even inevitable.
In my more sardonic moments, I’m reminded of an R.E.M. song that came out in my first year of graduate school: “It’s the End of the World as We Know It (and I Feel Fine).” But I understand that the end of History as we know it, whenever that moment inevitably arrives, will be an occasion for sorrow for more people than just academic historians. Nevertheless, it seems plausible, legitimate, and even necessary to hope that change will create opportunities for the past to be made anew.
I plan to attend the Organization of American Historians conference, which begins tomorrow in Seattle.