DHD 2019

Last week DHD 2019, the German digital humanities conference, took place in Frankfurt.

One remarkable discussion I heard was in a panel about 3D modelling and the reconstruction of buildings. The panelists talked about the problems their field faces, and one of them was the lack of standards. As we can all imagine, it is really hard to reconstruct old objects and buildings.

Some buildings have been rebuilt, destroyed, or never built at all. This creates many uncertainties when it comes to questions like: How did the building originally look? Was it built as the architect intended? This is where the importance of standards becomes clear. With a standard, one could see exactly what another researcher wanted to show with their model, and exchanging and sharing work would also be easier.

Another interesting talk was the keynote by Jana Diesner. She talked about her research in computational social science at the University of Illinois. She first urged for better collaboration between computational social science and digital humanities. I also think this is really important, and there are certain fields that are quite close. Actually, I think some of my research falls more into computational social science than digital humanities, because my institute is still focused on social sciences and the experts in my field are also doing qualitative social science research. The other thing I found remarkable were her stories about the iSchool she works at. In the US, there are many iSchools now. The concept (as I understood it) is to bring researchers from different fields like social sciences, information science, computer science, and psychology together to do research about information in the broader sense. This can be a very fruitful combination, because it brings together new methods and ideas, which always helps to open our minds for difficult questions.

I did not attend much of the conference, because it was right next to my office and I had other things to do, but a really cool part was the poster slam and the poster session itself. It is just nice to look at posters and to discuss research directly with the researchers in a personal way; it is also a nicer form of communication than journal articles and presentations.

Algorithmic Criticism

Today I want to present a paper which made me think about Digital Humanities. It is called “Algorithmic Criticism” by Stephen Ramsay.

Unlike most other papers, which focus only on new algorithms and new data, this one also focuses on how the two parts of the digital humanities can be combined. Ramsay wants to develop a criticism (a method normally used in the humanities) that is based on algorithms.

He argues that even in literary research it could be possible to have approaches that are much more empirical, meaning you have an experiment and quantitative measurements to prove your claims. Another important point he makes is that computers might not yet be ready for that kind of analysis (the paper is from 2005, though), but may be in the future, so he believes these methods will become available.

One of his central points is that every critic reads a text using their own assumptions and “sees an aspect” (Wittgenstein) in the text. So the feminist reader sees a feminist aspect of the text, and likewise the “algorithmic” reader can see the aspect of the computer, or can read the text as transformed by a computer. At the end, the paper presents some research applying tf-idf measures to the novel The Waves by Virginia Woolf.
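For readers unfamiliar with the measure: tf-idf weights a term by how often it occurs in one document relative to how many documents it occurs in, so words that are distinctive for a particular text score high. Below is a minimal sketch of a plain tf-idf computation in Python; the function name and the exact weighting formula are my own illustration, not necessarily the variant Ramsay used on The Waves.

```python
import math
from collections import Counter

def tf_idf(documents):
    """Compute tf-idf scores per term for each document.

    documents: a list of token lists (e.g. one list per chapter or speaker).
    Returns a list of dicts mapping term -> tf-idf score.
    """
    n_docs = len(documents)
    # Document frequency: in how many documents each term appears.
    df = Counter()
    for doc in documents:
        df.update(set(doc))
    scores = []
    for doc in documents:
        tf = Counter(doc)
        scores.append({
            # Term frequency (normalised by document length)
            # times inverse document frequency.
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return scores
```

Note that with this formula, a word appearing in every document gets a score of zero, which is exactly the point: common function words are filtered out, and what remains are the words characteristic of each individual document.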

I really like this idea of having a certain way of reading a text by letting a machine do it, and of treating the machine as similar to a human reader, who is also not completely objective and free of bias. This is also useful for researchers in NLP, because it lets you admit that the judgement the computer gives is not free of bias either; for instance, it changes when you change the parameters of your algorithm.