Today I want to present a paper which made me think about Digital Humanities. It is called “Algorithmic Criticism” by Stephen Ramsay.
Unlike most other papers, which focus only on new algorithms and new data, this one also asks how the two parts of the digital humanities can be combined. Ramsay wants to develop a form of criticism (normally a method of the humanities) that is based on algorithms.
He argues that even in literary research it could be possible to have approaches that are much more empirical, meaning you run an experiment and use quantitative measurements to support your claims. Another important point he makes is that computers might not yet be ready for that kind of analysis (the paper is from 2005), but may be in the future, so he believes these methods will become available.
One of his central points is that every critic reads a text using their own assumptions and “sees an aspect” (Wittgenstein) in it. So the feminist reader sees a feminist aspect of the text, and likewise the “algorithmic” reader can see the computer’s aspect, or read the text as transformed by a computer. At the end, the paper presents some research applying tf-idf measures to the novel The Waves by Virginia Woolf.
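To make the tf-idf idea a bit more concrete, here is a minimal sketch of what such an “algorithmic reading” could look like in practice: computing, for each of the six narrators of The Waves, the words whose tf-idf weight is highest relative to the other narrators. This is roughly in the spirit of the analysis Ramsay describes, but the code, the scikit-learn setup, and the placeholder texts are my own assumptions, not his actual method or data.

```python
# A minimal sketch of a tf-idf "reading", assuming scikit-learn is available.
# The monologue texts below are toy placeholders standing in for the speeches
# of the six narrators of The Waves; this is not Ramsay's code or data.
from sklearn.feature_extraction.text import TfidfVectorizer

speakers = ["Bernard", "Susan", "Rhoda", "Neville", "Jinny", "Louis"]
monologues = [
    "placeholder text for Bernard's monologues",
    "placeholder text for Susan's monologues",
    "placeholder text for Rhoda's monologues",
    "placeholder text for Neville's monologues",
    "placeholder text for Jinny's monologues",
    "placeholder text for Louis's monologues",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(monologues)
vocab = vectorizer.get_feature_names_out()

# For each speaker, print the words with the highest tf-idf weight,
# i.e. the words most characteristic of that voice relative to the others.
for speaker, row in zip(speakers, tfidf.toarray()):
    top = row.argsort()[::-1][:10]
    print(speaker, [vocab[i] for i in top])
```

With real texts substituted in, the output is a list of the most distinctive words per narrator, which is exactly the kind of machine-transformed view of the text that the paper treats as one more way of “seeing an aspect”.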
I really like this idea of having a certain way of reading a text performed by a machine, and of treating it as similar to a human reader, who is also not completely objective and free of bias. This is also useful for researchers in NLP, because it lets you admit that the judgement the computer gives is not free of bias either, for instance because changing the parameters of your algorithm changes the result.