This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 2.5 License.
In the 19th century, Britain was the world’s superpower, boasting a global empire of 10 million square miles and 400 million royal subjects. And British authors of the era reflected this supremacy, peppering prose with words of command and certainty — ones like always, never and forever.
At the same time in Ireland, writers echoed a different perspective in their books. With the Irish under the thumb of British rule, the nation’s scribes frequently used words that displayed inability or frustration — ones like almost, nearly or perhaps.
Matthew Jockers knows this because his data bears it out: The University of Nebraska-Lincoln assistant professor of English has combined computer programming with digital text mining to produce deep thematic and stylistic analyses of 19th-century literary works. He calls the data-driven process macroanalysis, and it’s opening up new methods for literary theorists to study classic literature.
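The contrast described above rests on comparing how often different word lists appear in different bodies of text. As a rough illustration only, here is a minimal sketch of that kind of word-rate comparison; the word lists come from the examples in the article, while the sample passages, function names, and per-1,000-token normalization are assumptions for the sketch, not Jockers’s actual data or method.

```python
import re
from collections import Counter

# Illustrative word lists drawn from the article's examples.
CERTAINTY = {"always", "never", "forever"}
HEDGING = {"almost", "nearly", "perhaps"}

def rate_per_1000(text, vocab):
    """Occurrences of any word in `vocab` per 1,000 tokens of `text`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    counts = Counter(tokens)
    hits = sum(n for word, n in counts.items() if word in vocab)
    return 1000.0 * hits / len(tokens)

# Invented sample sentences, standing in for corpus texts.
british = "The empire shall always endure, and its flag will never fall."
irish = "We are almost free; perhaps the day is nearly at hand."

print(rate_per_1000(british, CERTAINTY))  # higher for the certainty list
print(rate_per_1000(irish, HEDGING))      # higher for the hedging list
```

In practice such rates would be computed over thousands of books rather than single sentences, and the normalization per 1,000 tokens makes texts of different lengths comparable.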
“But what we don’t know is what happens after the turn of the 20th century,” Jockers said. “The 20th century, as we know, is when the British Empire deteriorates and the Irish gain independence. So do each country’s authors remain as they were in the previous century? Or if they do begin to change their approach, in what ways do they go about it? That’s the kind of question we can address — with access to proper data, that is.”
Now, thanks to an exclusive agreement between UNL and private company BookLamp, Jockers and research collaborators from several U.S. universities have the tools to begin uncovering the answers to that question — and many others. This new research collaboration will ultimately allow scholars to access and analyze book data from the 18th, 19th and 20th centuries.
Read more on the UNL News Blog