The 21st Biennial Symposium on German Language and Culture, hosted by the department, took place March 29–31, 2012.
This year’s event, entitled “Distant Readings/Descriptive Turns: Topologies of German Culture in the Long Nineteenth Century,” was co-organized by Matt Erlin and Lynne Tatlock. Building on recent approaches to literary and cultural criticism developed by Franco Moretti, Bruno Latour, and others working under the rubric of “new sociologies of culture,” fourteen German and North American scholars gave presentations that sought to generate fresh insights into cultural history by adopting and adapting the empirical methods of the natural and social sciences. Topics ranged from the computer-assisted analysis of nineteenth-century German literature, to the reception of German books in America, to the evolving relationship between fiction and non-fiction in nineteenth-century German journals. Throughout, participants addressed what can be gained (and what might be lost) when we move away from intricate rhetorical analyses of individual texts and instead turn our attention toward large bodies of data, drawing on analytical techniques borrowed from disciplines such as statistics, computational science, quantitative history, and the emerging field of digital humanities.
As an age of industrialization and mass markets, the long nineteenth century, conventionally dated from 1789 to 1918, offered fertile terrain for such approaches. The period was characterized by an unprecedented boom in newspaper, magazine, and book production, fostered by breakthroughs in printing technology that made reading material, fiction and non-fiction alike, available to a wider range of consumers than ever before. Precisely this abundance of textual material, however, has long presented scholars with a challenge: how to read it all. No individual scholar can ever acquire knowledge of more than a tiny percentage of the novels and stories actually published in the period. New digital technologies cannot significantly accelerate human reading per se, but they do give us the power to search and analyze a much greater share of this vast corpus. The assembly and management of large databases enable both the testing and re-articulation of hypotheses and the recovery and discovery of patterns and trends. The decision to organize the conference was thus partly inspired by the new availability of historical materials in the digital age. Google Books’ ongoing digitization of library holdings, to name the most prominent such undertaking, gives scholars access to the broad spectrum of German texts as never before and thus pushes us not to limit our understanding of literary production, the circulation of texts, and reading practices to what contemporary presses and the twentieth-century academy have deemed canonical.
The symposium demonstrated the potential of new technologies and increased access to materials not only to open up entirely new areas of inquiry but also to offer a fresh perspective on some of the most venerable topics of literary studies, from genre to the nature of literary realism. The presentations also generated fruitful discussion of what a shift toward more empirical and quantitative analysis might mean for our teaching mission and our professional self-understanding as humanists.
By all accounts, the conference was a resounding success, and the resulting volume, scheduled to appear with Camden House in 2013, promises to have a substantial impact on the shape of German Studies in the years to come.