Quantitative methods have been central to the humanities since scholars began relying on full-text search to map archives. But the intellectual implications of search technology are rendered opaque by humanists’ habit of considering algorithms as arbitrary tools. To reflect more philosophically, and creatively, on the hermeneutic options available to us, humanists may need to converse with disciplines that understand algorithms as principled epistemological theories. We need computer science, in other words, not as a source of tools but as a theoretical interlocutor.
Theorizing Research Practices We Forgot to Theorize Twenty Years Ago
TED UNDERWOOD is Associate Professor of English at the University of Illinois at Urbana-Champaign. He has published two books on nineteenth-century literary history, most recently Why Literary Periods Mattered (Stanford, 2013). Supported by an ACLS Digital Innovation Fellowship and an NEH Digital Humanities Start-Up Grant, he is currently tracing the history of genre in a collection of a million English-language volumes, 1700–1923.
Ted Underwood; Theorizing Research Practices We Forgot to Theorize Twenty Years Ago. Representations 1 August 2014; 127 (1): 64–72. doi: https://doi.org/10.1525/rep.2014.127.1.64