This interview was carried out on 10 July 2014 at the Digital Humanities Conference in Lausanne, Switzerland. In it Thaller recalls that his earliest memory of encountering computing in the Humanities dates to c. 1973, when he attended a presentation on the use of computational techniques to map the spatial distribution of medieval coins. The difficulties of handling large, paper-based datasets were impressed upon him as he compiled some 32,000 index cards of excerpts for use in his PhD thesis. When he later encountered standard statistical software at the Institute for Advanced Studies in Vienna he found that such software could not be beneficially applied to historical data without first transforming the historical data in the sources in some way (indeed, the formalisation of historical and cultural heritage data is an issue that recurs in this interview, much as it did in Thaller's research). In light of his experience of the problems of using such software 'out of the box' to work with historical data he went on to teach himself the programming language SNOBOL. Within a few weeks he had joined a project on daily life in the Middle Ages and was building software to manage the descriptions of images that the project compiled and stored on punched cards. Having contributed to various other projects with computational elements, in 1978 he took up a post at the Max Planck Institute for History in Göttingen. As well as discussing the research he carried out there, for example on CLIO/κλειω, a database programming system for History with a command language in Latin, he discusses the immense freedom and access to resources that he benefitted from. He also goes on to discuss some of the later projects he worked on, including those in the wider context of digital libraries, infrastructure and cultural heritage.
This interview was conducted on 11 July 2014 at the Digital Humanities Conference in Lausanne, Switzerland. Huitfeldt recounts that he first encountered computing at the beginning of the 1980s, via the Institute of Continental Shelf Research, when he was a philosophy student at the University of Trondheim. However, it was in connection with a Humanities project on the writings of Wittgenstein that he learned to program. When that project closed he worked as a computing consultant in the Norwegian Computing Centre for the Humanities, and in 1990 he established a new project called the 'Wittgenstein Archives', which aimed to prepare and publish a machine-readable version of Wittgenstein's Nachlass. Here he discusses the context in which he began working on the encoding scheme that he developed for that project, the Multi-Element Code System (MECS). In addition to matters such as the trajectory of DH research and his early encounters with the conference community, he discusses some of the fundamental issues that interest him, such as the role of technology in relation to the written word and the lack of engagement of the Philosophy community with such questions. Ultimately he concludes that he does not view DH as a discipline, but rather as a reconfiguration of the academic landscape resulting from the convergence of tools and methods within and between the Humanities and other disciplines.
This oral history interview was conducted on 3 June 2015 via Skype. In it Mary Dee Harris recalls her early encounters with computing, including her work at the Jet Propulsion Lab in Pasadena, California. Despite these early encounters with computing she had planned to leave it behind when she returned to graduate school to pursue a PhD; however, the discovery of c. 200 pages of a Dylan Thomas manuscript prompted her to rethink this. Her graduate study was based in the English Department of the University of Texas at Austin, which did not have an account with the Computer Centre, and so it was necessary for her to apply for a graduate student grant in order to buy computer time. Her PhD studies convinced her of the merits of using computers in literary research, and she hoped to convince her colleagues of this too. However, her applications for academic jobs were not successful. After working in industry for a time she went on to secure academic positions in Computer Science at various universities. During her career she also held a number of posts in industry and as a consultant. In these roles she worked on a wide range of Artificial Intelligence and especially Natural Language Processing projects. Her interview is a wide-ranging one. She reflects on topics like the peripheral position of a number of those who worked in Computers and the Humanities in the 1970s and her personal reactions to some of the computing systems she used, for example the IBM 360. She also recalls how she, as a woman, was sometimes treated in what tended to be a male-dominated sector, for example, the physics professor who asked “So are you going to be my little girl?”
This oral history interview between Wilhelm Ott and Julianne Nyhan was carried out on 14 July 2015, shortly after 10am, in the offices of pagina in Tübingen. Ott was provided with the core questions in advance of the interview. He recalls that his earliest contact with computing was in 1966, when he took an introductory programming course at the Deutsches Rechenzentrum (German Computing Centre) in Darmstadt. Having become slightly bored with the exercises that attendees of the course were asked to complete, he began working on programs to aid his metrical analysis of Latin hexameters, a project he would continue to work on for the next 19 years. After completing the course in Darmstadt he approached the Classics Department at Tübingen University, among others such as IBM, to gauge their interest in his emerging expertise. Though there was no tradition in the Department of applying computing to philological problems, they quickly grasped the significance and potential of such approaches. Fortunately, this happened just as the Computing Centre, until then part of the Institute for Mathematics, was being transformed into a central service unit for the University. Drawing on initial funding from the Physics Department, a position was created for Ott in the Tübingen Computing Centre. His role was to pursue his Latin hexameters project and, above all, to provide specialised support for computer applications in the Humanities. In this interview Ott recalls a number of the early projects that he supported, such as the concordance to the Vulgate that was undertaken by Bonifatius Fischer, along with the assistance they received from Roberto Busa when it came to lemmatisation. He also talks at length about the context in which his TUSTEP program came about and its subsequent development. The interview strikes a slightly wistful tone as he recalls the University of Tübingen's embrace of the notion of universitas scientiarum in the 1960s and contrasts this with the rather more precarious position of the Humanities in many countries today.
This oral history interview between Willard McCarty (on behalf of Julianne Nyhan), John Burrows and Hugh Craig took place on 4 June 2013 at the University of Newcastle, Australia. Harold Short was also present for much of the interview. Burrows recounts that his first encounter with computing took place in the late 1970s, via John Lambert, who was then the Director of the University of Newcastle's Computing Service. Burrows had sought Lambert out when the card indexes of common words that he had been compiling became too unwieldy and too numerous to manage. Craig's first contact was in the mid-1980s, after Burrows put him in charge of a project that used a Remington word processor. At many points in the interview Burrows and Craig reflect on the substantial amount of time, and, indeed, belief, that they invested not only in the preparation of texts for analysis but also in the learning and development of new processes and techniques (often drawn from disciplines outside English Literature). Much is said about the wider social contexts of such processes: Craig, for example, reflects on the sense of possibility and purposefulness that having Burrows as a colleague helped to create for him. Indeed, he wonders whether he would have had the confidence to invest the time and effort that he did had he been elsewhere. Burrows emphasises the network of formal and informal, national and international expertise that he benefitted from, for example, John Dawson in Cambridge and Susan Hockey in Oxford. They also reflect on the positive effects that the scepticism they sometimes encountered had on their work. As central as computing has been to their research lives, they emphasise that their main aim was to study literature; continuing to publish in core literature journals (in addition to Digital Humanities journals) has been an important aspect of this. Though they used techniques and models that are also used by linguists and statisticians, their focus has remained on questioning rather than answering.