by Leonor Álvarez Francés
The public research sector is increasingly interested in financing the development of generic software for the digital humanities. An example is the Bilateral Digital Humanities Program, a joint enterprise of the Deutsche Forschungsgemeinschaft and the National Endowment for the Humanities designed to support projects that develop and implement generic tools. A rather unsurprising phenomenon, of course, after so much labour and money have been wasted in recent years, either on coping with tools unable to satisfy the needs of humanists or on developing specific tools that served them for only a few months. It is common practice, for example, that once a project finishes, both software and data enter a hibernation period that risks becoming perpetual. To counter these developments, two sorts of generic tools are being developed: those designed to satisfy the need to store and analyse a database, and those aiming at data integration.
Concerning the first kind, I was happily surprised when I attended the presentation of SALE three weeks ago. SALE is the software developed by PhD candidate Mark Opmeer for the project ‘Sailing Networks: Mapping colonial relations with Suriname’s seventeenth-century Sailing Letters’ (University of Amsterdam, Scheepvaartmuseum Amsterdam, GEODAN). Taking a small set of data from the Sailing Letters archive, he developed this generic tool for humanities researchers, which offers spatial, network and statistical analyses. A prototype can be seen here. The app is planned to launch before the end of 2014 and will include some improvements over the prototype, plus the possibility of using it with your own data and research.
Following this presentation, which I inevitably left with a smile on my face, I met Peter van den Hooff and Wouter Klein, who are involved in the making of a generic tool for integrating cultural heritage databases. The project is called Time Capsule, and even though it employs datasets on the history of pharmacy and botany, it aims to develop an open-licence ontology that will serve to integrate digital heritage material from any research field.
I can hear it already, a very legitimate doubt about these initiatives: who will maintain and upgrade the software? Given the novelty of the phenomenon, I simply do not have an answer to this burning question. Note that I started this post speaking of the public sector. Private companies have, in principle, more resources to address this matter, simply because they are fully dedicated to it. Take Lab1100, for instance, the private partner of the public-private project ‘Mapping Notes and Nodes in Networks’ (Huygens ING, 2014) that I had the pleasure to be involved in. Lab1100’s tool Nodegoat is in many respects precisely what many (including me) wish for: a stunningly flexible, web-based generic software package for the humanities that is constantly being enhanced. For the time being, it is open and free for individual researchers, but research teams need to pay for it, as it is a privately developed and maintained instrument.
There are other stumbling blocks on the path toward open-access generic tools for the digital humanities, such as the way data sharing challenges traditional academic notions of authorship. Another is the difficulty of creating a flexible, generic tool that at the same time responds to individual wishes. However, the digital humanities community is visibly moving toward collaborative enterprises and has started to ‘think big’, two very suitable ingredients for overcoming those difficulties and building more global infrastructures.