Here at the SSRC we are always looking for innovative ways to collaborate on research; to translate, promote, and disseminate research findings; to offer sound research advice and assistance; and to help our colleagues and constituents venture beyond disciplinary boundaries to seek constructive solutions and ideas pertaining to their research interests. This week we offer food for thought vis-à-vis the “post-modern condition” of knowledge production, acquisition, representation, and circulation.
Stefanie Posavec’s Writing Without Words is a conceptually ingenious visual representation of Jack Kerouac’s On the Road. The project is a good example of a methodology rooted in the “digital humanities”, sometimes called “cultural analytics”: it transforms text into numeric data in order to craft a visual representation of an iconic book. Posavec shows how chapters come together, how themes vary and form different patterns as the literary narrative unfolds, and how sentences are structured; in essence, she shows how Kerouac constructed his classic novel, rendered in a visually and computationally defined presentation. None of the intrinsic value of Kerouac’s masterpiece is diluted by quantification; rather, the method offers a visual analysis of the book’s structure whose implications extend well beyond this particular novel.
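To make the “text into numeric data” step concrete, here is a minimal sketch of the kind of quantification that underlies such a visualization. Posavec’s actual process was manual and far more elaborate; the passages and chapter labels below are invented placeholders, not excerpts from the novel.

```python
import re

# Placeholder passages standing in for chapters of a novel;
# the real project worked from the full text of On the Road.
sample = {
    "Chapter 1": "The trip began in winter. We packed the car and left "
                 "before dawn without telling anyone where we were going.",
    "Chapter 2": "Morning came slowly. The highway unrolled ahead of us.",
}

def sentence_lengths(text):
    """Split a passage into sentences and count the words in each."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return [len(s.split()) for s in sentences]

# The resulting numbers are the raw material for a visual form:
# e.g., one branch per sentence, scaled by its word count.
profile = {ch: sentence_lengths(txt) for ch, txt in sample.items()}
for chapter, lengths in profile.items():
    print(chapter, lengths)
```

Once a book has been reduced to a numeric profile like this, the same structural comparisons can be run on any text, which is what gives the method its reach beyond a single novel.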
In our current age of technology-aided learning, the historic primacy of the brick-and-mortar university—where the traditional professor holds the lone key to knowledge—seems to be waning. Ever-expanding access to information raises the question of how we should proceed as educators and researchers in a milieu of increasing computationality, a condition in which computer code and software now form the basis of many of our interactions. In David Berry’s thought-provoking piece “The Computational Turn: Thinking about the Digital Humanities,” we see how digitized and “born-digital” information has become increasingly central for research and how the cross-disciplinary potential of utilizing computational techniques in the humanities and social sciences is growing exponentially. Berry points out the emergence of rigorous methods for the quantitative analysis of qualitative data (such as literary texts) to capture themes, structure, patterns, and other textual phenomena that traditionalists might consider out of place in literary criticism. He argues that this mixed method of analyzing and presenting texts is necessary and potentially very fruitful in an age of rapidly expanding stores of data. In a world where knowledge production and data streams threaten to drown us in new information, computational techniques provide the opportunity—and even require us—to reconceptualize and resituate the meaning of learning (knowledge) and learning-institutions (universities).
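The “quantitative analysis of qualitative data” Berry describes can be illustrated with a toy example: counting theme-related vocabulary across passages. The passages and the theme lexicon below are invented for this sketch and stand in for a real corpus and a researcher-defined coding scheme.

```python
import re
from collections import Counter

# Invented passages standing in for a digitized corpus.
passages = [
    "The road stretched west, and the travelers drove through the night.",
    "Night fell on the city while the travelers rested by the road.",
]

# A hypothetical lexicon of theme words a researcher might track.
theme_lexicon = {"road", "travelers", "night", "west"}

def theme_counts(text, lexicon):
    """Tokenize a passage and count occurrences of lexicon words."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t in lexicon)

for i, passage in enumerate(passages, start=1):
    print(f"passage {i}:", dict(theme_counts(passage, theme_lexicon)))
```

Even this crude frequency count yields the sort of pattern—which themes cluster where, and how they shift across a text—that a close reader senses but cannot easily tabulate by hand at scale.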
For many humanists and technologists alike, this turn of events is unsettling. Sreevidya Surendran’s essay, “Of Methods and Methodologies in Literary Studies and Humanities”, posted this week to The Sociological Imagination, addresses the discomfort many academics feel when choosing among methodologies. Surendran argues that the expanding potential for mixing methodologies blurs the lines between schools of thought, making the choice of methodology highly charged both politically and ideologically for scholars. However, embracing new methods also increases the potential for new voices to enter the discussion and new insights to be discovered. As Surendran says, “a negotiated model of methodology which places a certain approach at its core while simultaneously employing another approach(es) which may or may not be similar or even of the same school is a viable option. It will allow greater freedom of thought and encourage amalgamation and inclusion of diverse voices into the fabric of academics.” By using a “pastiche of methods”, scholars will be able to release themselves from the political and power-based structures of academia and address new and previously taboo issues with the aim of knowledge for knowledge’s sake. Surendran advocates a mixing not only of methods but also of disciplines, citing eco-criticism as a field in which scientific and humanities researchers must harmonize their methods.
If nothing else, the open-minded scholar will see that the once seemingly untraversable chasm between the humanities and the sciences can be bridged. Bridges won’t be built by humanists grasping at a facade of scientific “process” or by scientists waxing poetic, but rather by scholars willing to see the core work of the academy in a new light. Last month Stephen Ramsay presented a paper at the Digging into Data Conference that addresses reactions to “Data Mining with Criminal Intent”, a project that visualized and explored the stories found in The Proceedings of the Old Bailey, a newly digitized collection from London’s central criminal court spanning 1674 to 1913. Ramsay recalled a colleague’s question at an earlier conference, posed when he presented visualizations of Shakespeare’s works: “Isn’t this just art?” Rather than defending the rigorous analysis behind the visualizations, Ramsay embraces the artistic, and even the playful, aspects of new methods because they lead to new discoveries. On their face, Posavec’s visualizations of On the Road are just pretty pictures, as are any data visualizations. Yet when we look more deeply at these projects’ aims and methodologies, we can see serious scientific, scholarly, and research-oriented efforts. We might, as Berry suggests, need to reexamine the project and structure of the university to reap the benefits of these new ideas. We might have to cross disciplines to work with colleagues on these new projects, and we might even have to mix our methods, not to mention our metaphors.