Welcome

About me

I am an assistant professor in the History department at Maastricht University and specialise in the early modern period and digital humanities. One of my research and teaching interests is computational text analysis, which is why I have created this repository.

The distant reading repository

This repository contains tutorials, data samples and code related to distant reading. In the courses and workshops I currently teach, participants do not have previous training in Natural Language Processing (NLP), which is why the text analysis itself is carried out with Voyant Tools, "a web-based reading and analysis environment for digital texts". However, we use Python code for data collection, merging text files and some basic data cleaning. Please check the Data Collection and Data Cleaning sections for details.
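
To give a first impression of the Python part of this workflow, the minimal sketch below merges a folder of plain-text files into a single corpus file and applies very basic cleaning before upload to Voyant Tools. The folder and file names are placeholders for illustration, not the repository's actual scripts; see the Data Collection and Data Cleaning sections for the real materials.

```python
from pathlib import Path

# Placeholder locations -- adjust these to your own project structure.
INPUT_DIR = Path("corpus_raw")        # folder containing the collected .txt files
OUTPUT_FILE = Path("corpus_merged.txt")  # single file to upload to Voyant Tools


def clean(text: str) -> str:
    """Very basic cleaning: strip surrounding whitespace and drop empty lines."""
    lines = (line.strip() for line in text.splitlines())
    return "\n".join(line for line in lines if line)


def merge_corpus(input_dir: Path, output_file: Path) -> None:
    """Concatenate all .txt files in input_dir into one cleaned corpus file."""
    parts = [clean(path.read_text(encoding="utf-8"))
             for path in sorted(input_dir.glob("*.txt"))]
    output_file.write_text("\n\n".join(parts), encoding="utf-8")


if __name__ == "__main__":
    merge_corpus(INPUT_DIR, OUTPUT_FILE)
```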

"Machines of Knowledge" course at the Faculty of Arts and Social Sciences (FASoS)

The course Machines of Knowledge, taught in the MA Digital Cultures programme, introduces students to the transformation of the World Wide Web from an information space with a limited number of content creators to a complex network of dynamic knowledge sites to which all users can (potentially) contribute. In terms of methods, the course introduces students to the basics of distant reading. Students learn how to collect their own text corpus from the web and how to analyse it with text analysis software that highlights word frequencies, word co-occurrences and narrative trends. We explore different data sets to critically reflect on how users interact online, how trending topics arise, and how digital communities are formed. The technical skills acquired in this course prepare students for further studies and research, but are equally useful for professional careers in the media and (social media) marketing.