Computing Action

Computing Action. A Narratological Approach

The following gives an outline of the project described in detail in my book ‘Computing Action. A Narratological Approach’.

(1) Based on a survey of philosophical definitions of the commonsense concept of ‘action’, a constructivist notion of represented action was developed. (2) This approach was augmented by a semiological definition of narrated action in the tradition of Greimassian theory, which led to new definitions of the narratological key concepts ‘event’, ‘episode’ and ‘action’ as combinatorial readerly constructs. (3) These narratological concepts were then implemented in a Humanities Computing methodology for the analysis of narrated action. The development of two software tools was essential to the implementation of our theory: EventParser is a markup tool that facilitates the syntactically and conceptually consistent tagging and semantic declaration of ‘events’ in a narrative text; it was developed in VB 6.0. – EpiTest is a program for the combinatorial analysis of the potential for chaining these pre-defined events into episodes, and episodes into complex actions, in accordance with the semiological criteria set down in the theoretical part of the study; it was written in PROLOG. – The software (including numerous protocol files and installation instructions) is available for download.
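The division of labour between the two tools can be conveyed by a minimal sketch (in Python rather than VB 6.0 or PROLOG, purely for illustration; the event structure and the state-matching criterion used below are simplified stand-ins for the semiological model developed in the book, not the actual EventParser/EpiTest data model): events tagged in the text are tested pairwise for their potential to chain into episodes.

```python
# Illustrative sketch only: a toy model of tagged events and of the
# combinatorial test for chaining them into episode candidates. The
# chaining criterion here (shared subject, matching states) is a
# deliberately simplified stand-in for the book's semiological criteria.

from itertools import combinations

# An 'event' is modelled here as a change of state ascribed to a subject.
events = [
    {"id": "e1", "subject": "refugees", "before": "at home", "after": "in flight"},
    {"id": "e2", "subject": "refugees", "before": "in flight", "after": "in exile"},
    {"id": "e3", "subject": "baroness", "before": "silent", "after": "speaking"},
]

def can_chain(a, b):
    """Two events chain into an episode candidate if they share a subject
    and the second takes up the state in which the first left it."""
    return a["subject"] == b["subject"] and a["after"] == b["before"]

# Exhaustively test all event pairs in text order; EpiTest likewise works
# through the full combinatorics of the events tagged with EventParser.
episodes = [(a["id"], b["id"])
            for a, b in combinations(events, 2) if can_chain(a, b)]

print(episodes)  # -> [('e1', 'e2')]
```

The point of the sketch is the architecture, not the criterion: tagging (the list of event records) and combinatorial analysis (the exhaustive pairwise test) are separate, tool-supported steps.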

Introduction to ‘Computing Action. A Narratological Approach’

Table of Contents for the book.


The following introductory pages contain a brief outline of the three major themes of this book. They are, as the title suggests, the themes of computing, narratology, and action. After considering computing, we shall put the abstract before the concrete by commenting on narratology and action in that order. Although this contrasts with the word order in the title of our study, the book itself will, I hope, make up for the imbalance—a good third of it is dedicated to a philosophical criticism of the concept of action before we move on to a discussion of its application in a narratological and computational context.

First, however, a few words should be said about the spirit in which the book has been translated from the original German version. At each stage, the overall aim was to make the text as accessible as possible to English-speaking readers, while always maintaining the necessary theoretical clarity and accuracy. This is particularly apparent in our treatment of quotations from authors and critics who write in a language other than English. Whenever possible, we have provided the equivalent passage from a reliable English translation. When, however, no translation has been published, or the existing translation was unobtainable or felt to be potentially misleading, we have provided our own translation. In those cases where the wording of the original German text is of paramount critical importance, it has been reproduced alongside the translation. This will allow readers who are familiar with German to compare the translation with the original and discover the nuances and allusions that escape even the best translation.


The realities of the twenty-first century and the age of information technology might lead one to believe that literary theory must soon, if it has not already done so, succumb entirely to the accuracy and objectivity supposedly promised by numerical data analysis. But such an analysis of the contemporary situation is a distorted exaggeration—although computational approaches to parsing and analysing texts are no longer the anathema they once were, there has not been a change in mentality such as would cause a full-scale paradigmatic shift in the critical community.

It is more appropriate to speak of coexistent paradigms that complement one another. This mutual tolerance is reflected in the wider context of the developing framework of humanities computing, a new methodology which has its own distinctive theoretical and technological features but is nonetheless firmly anchored inside the humanities. Literary computing, the processing and analysis of literary texts by computational means, is one subsector of this new field. The methodological and epistemological status of humanities computing is still emerging as I write. Is it a discipline? A methodology? A method? Or no more than a tool? The same issues and the same questions apply to literary computing, or, as it is known in the German-speaking world, computational philology (Computerphilologie), the terminological counterpart of computational linguistics (Computerlinguistik). Even if literary computing has not yet been properly defined, however, it is clear that the pragmatic uses (and restrictions) of computer-aided textual analysis have been sufficiently well-documented to allay even the most traditional critic’s misgivings about the use of computers.

Indeed, the employment of computers in the scholarly analysis of texts and languages is anything but new. For a long time, linguistics, corpus linguistics, and stylistics in particular have all employed computational methods to handle the characteristics and phenomena of written and spoken language which lend themselves to formal definition and description. These approaches include the analysis of metre and the calculation of morpheme and word frequencies. The computer-aided analysis of formally representable textual characteristics such as these began some fifty years ago, and its methods have increased in popularity and interest in the recent past as hardware resources have become more and more accessible to those outside the scientific community where they originated.

The project presented in this book, however, has little in common with the above approaches apart from its technological foundations and the abstract principles with which it is implemented. My primary concern is a philological one in the traditional sense of the word. For our purposes, empirical textual features that can be described and analysed are of interest only in so far as they have a function in producing the synthetic meaning of a text. Meaning is a high-level concept and not to be confused with denotational reference; it defines the generalized anthropological and semiotic function of the text as a whole rather than the abstract linguistic or concrete pragmatic functions of the individual elements and sequences of elements in it. Meaning is the product of the complex and extensively recursive interaction of empirical textual features on the one hand and idiosyncratically and historically predefined structures and knowledge on the other. When we talk and speculate about meaning, when we analyse and describe how it emerges in a textual representation, we must treat language in terms of dynamic, multidimensional processing models rather than unidirectional functions and two-dimensional structures.

The question of how meaning is created when humans process texts—more specifically, when humans process narratives (texts which contain temporally indexed representations of events)—is not the exclusive concern of hermeneutically orientated disciplines. In our case, for example, the approaches of artificial intelligence are obviously of particular interest. When, however, the literary theorist ventures into this latter field, it becomes painfully obvious that there is a fundamental problem inherent in employing computer-aided analysis to investigate symbolic systems. If we are to preserve the paramount importance of the concept of meaning in criticism, we must recognize that literary computing will, by definition, be required to reconcile two conflicting epistemologies: that of the quantitative, numerical paradigm and that of the qualitative, hermeneutic paradigm. In so far as it employs computational methods, literary computing draws on the quantitative paradigm, where data is processed by explicitly defined algorithms which transform input data into uncontradictory, unambiguous output data. In general, the algorithms themselves are semantically neutral and not influenced by the data they process. Nothing could be less true of the hermeneutic paradigm, in which it is equally important that the processor is context-sensitive and self-aware. In the hermeneutic paradigm, we have to ensure that it is possible to access dynamically redefined frames of reference. Perhaps more importantly, the whole purpose of processing a text in this paradigm is precisely not to transform input into output in an unambiguous, predictable, and experimentally verifiable manner. It is rather to produce a unique and completely new interpretation that has never occurred before. 
From a theoretical perspective, a piece of information is meaningful in the emphatic sense of the word if it is a singular phenomenon which is more than just a mechanically reproducible result, more than the result of a process that simply encodes or decodes signifiers.

However, the divide between the two paradigms can, in theory, be bridged. To a certain extent, computational algorithms can be programmed to consider the data they transform from input into output and the way in which they do so. This makes them self-aware and context-sensitive and thus able to emulate human intelligence. Furthermore, if we design them to perform recursive combinatorial transformational operations rather than unidirectional ones, we enable them to emulate the capability of the human mind to explore the potential meanings of textual data. The result is that the use of computers in literary criticism can enable us to overcome two of the most fundamental shortcomings of traditional theories: first, the inability to base individual interpretations of a text on properly consistent empirical descriptions of the phenomena in that text; second, the failure to analyse sufficiently large corpora in the methodologically consistent way that is necessary if generalizations about works, epochs, or genres are to be based on inductive inferences rather than impressionistic observations and normative declarations.
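The contrast drawn above between unidirectional functions and recursive combinatorial operations can be made concrete with a small sketch (Python for illustration; the toy ‘lexicon’ of candidate readings below is an invented placeholder, not part of the book's model): where a unidirectional function maps an input to a single output, a recursive combinatorial procedure enumerates the whole space of alternative construals.

```python
# Illustrative sketch only: recursive combinatorial exploration of the
# potential readings of a token sequence. The lexicon entries are
# invented placeholders used solely to show how the output space grows.

def construals(tokens, lexicon):
    """Recursively enumerate every combination of candidate readings."""
    if not tokens:
        return [[]]
    head, *rest = tokens
    tails = construals(rest, lexicon)
    return [[reading] + tail
            for reading in lexicon[head]
            for tail in tails]

lexicon = {
    "bank": ["riverbank", "financial institution"],
    "left": ["departed", "what remains"],
}

readings = construals(["bank", "left"], lexicon)
for reading in readings:
    print(reading)
# Two ambiguous tokens yield four construals: unlike a unidirectional
# input-output mapping, the procedure returns a space of possibilities.
```

Each additional ambiguous element multiplies the number of construals, which is precisely why exploring potential meanings calls for combinatorial rather than one-pass processing.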


It is at this point that narratology comes into play. As the reader will recall, the accurate description of the formal features of literary texts was the goal of the programme of the Russian Formalists and the structuralist narratologists who later adopted and refined their theories and concepts. The high-level semantic and aesthetic functions of literary texts, it was thought, could eventually be traced back to formal features of the symbolic material in the text and the structures in which it appeared. The narratological theories and concepts that preceded the post-structuralist movement are steeped in formalist methodology and therefore particularly suited to being modelled, explored, and applied using computational means. In the context of this observation, ‘computational’ need not necessarily be understood in the technical sense of a concrete computing device. Granted, the computational modelling of a narratological concept should eventually mean implementing the model in a concrete piece of software. More important, however, is the reformulation of the concept itself as something that can be computed in the first place. Textual phenomena of narratological interest are rarely located below the level of sentences (or rather propositions). Because of this, they will always be phenomena which can occur only when the sentences (or propositions) are cognitively processed in the human mind. This, essentially, is why scholars like David Herman argue that narratology should be fundamentally redefined as a subdiscipline of cognitive theory. After all, the processing of textual information, particularly the temporally indexed events represented in narrative texts, is arguably the most complex approximation we have of how humans process information in the real world in real time.

The reorientation of narratological theory and methodology as part of cognitive theory is bound to have significant consequences. Human cognitive processing never takes place in a context-free environment. If narratologists are to take this insight seriously, they will have to rethink some of the most basic concepts—such as function, event, and action—which they have inherited from the formalists. And so, we suddenly find the narratologist and the computing humanist in the same boat. But this is no reason to go overboard and suddenly redefine everything in terms of scripts and frames alone, rejecting the formalist paradigm altogether. In cognitive processing, scripts, frames, and knowledge interact with logical universals; the same applies to the processing of narrative representations. If this were not the case, it would be a complete mystery how we can arrive at even remotely identical interpretations of identical narrative data. The approach of literary computing towards a specifically narratological phenomenon must, therefore, be capable of relating the abstract logical model of the phenomenon to a process model of how instances of the phenomenon are read.

What’s in it for narratology? Recall that even the most radically structuralist approaches to narratology have, so far at least, failed to supply studies in which a well-defined methodological and taxonomical system has been consistently applied to an adequately large corpus of narrative texts. Propp’s morphological and functional analysis of one hundred Russian folk-tales seems to have remained a singular example. As this book will attempt to show, computational technology may allow us to overcome the impasse. Although the practical application of the theory described in this study will be restricted to six rather short narratives, its empirical methodology can easily be applied to much larger corpora without further modification. Two software tools—EventParser, a markup tool, and EpiTest, an analytical tool—were developed to accompany this book and are a crucial part of our project. Both applications, as well as extensive documentation and record files, are available for download.

However, the above utilitarian argument does not do full justice to the way in which literary computing approaches narratological problems. The true relevance of literary computing is methodological rather than empirical in nature. One of the biggest problems for scholars in the humanities is establishing the validity of their hypotheses—contrast the situation in the natural sciences, where we can perform experiments to test whether a hypothesis is valid or not. Of course the matter becomes appreciably more difficult in both contexts when we start dealing with abstract and hypothetical entities, and even worse when we consider the fundamental relativity of observation that is an inescapable part of all perception and cognition. On the other hand, Occam’s razor and the appeal to logical consistency are universal criteria that can, irrespective of discipline, be applied to almost any theoretical construct. But we should not be under any illusions—more often than not, it proves extremely difficult to apply these criteria to the soft and impressionistic theorems that abound in the humanities in general and literary studies in particular and which hardly ever satisfy the Popperian test of falsifiability.

Against this background, the decisive methodological role of the computer proves to be that of a ‘meta-instrument’ (Orlandi 2002:53) rather than a tool. In other words, the computer helps us to simulate an objective test-bed for our ideas about how narrative processing works. Algorithms are unforgiving not only when it comes to dealing with obvious contradictions but also when it comes to processing ill- or under-defined input. It may require considerable effort to express narratological concepts in the form of program code that can actually run on a machine, but the investment of time and energy is rewarded each and every time the program malfunctions, crashes, or descends into an infinite loop, for, by doing so, it highlights a flaw in the reasoning behind our theory as it stands.


As the title of the book suggests, ‘action’ is the central narratological concept which is investigated in this study. Since Aristotle’s Poetics, scholars have discussed the question of what narrated action is, what its elements are, and how we manage to read and narrate it. The first part of our study will discuss these questions from the perspectives of philosophy and narrative theory before presenting a possible new theoretical definition of action. In the second part, I shall attempt to reformulate the preliminary definition in the context of narratology and apply it in the context of literary computing.

In the third and final part of the book, I shall present a demonstration analysis of a concrete literary text. I have chosen Goethe’s Conversations of German Refugees of 1795 as my example text for two reasons. First, this cycle of novellas has generally been considered interesting precisely because it is very difficult to establish the coherence of its overall action and the actions contained in its component novellas. Second, I intend to show, against the background of a detailed analysis that will test the concepts and tools developed in the preceding parts of the book, that the Conversations do not merely contain or represent action and actions, but are in fact a discourse on the problem of how we narrate and read action itself. In order to show this, I shall interpret the data generated by the computational analysis of the text in the light of Goethe’s ideas on the epistemological difference between how we perceive natural phenomena on the one hand and social and aesthetic phenomena on the other.

Table of Contents

List of Tables


List of Figures


Foreword by Marie-Laure Ryan


Part 1 Concepts of Action


1.1 Action as Project and Construct


1.2 Action: From Word to Concept


  • The Etymology of the German Word Handlung


  • Action as a Concept of Poetics


  • Towards a Concept of Aesthetic Action


  • Action I: Singular Agential Activity


  • Action II: Complex (Multi-agential) Sequence of Events


  • Action III: Discursive Meta-Activity


1.3 Philosophical Definitions of Action


  • The Commonsense Concept of Action


  • Analytical Definitions of Activity and Action


  • Transcendental Definitions of Action


  • The Constructivist Concept of Action


  • Hegel: The Philosophical Value of Aesthetic Action


  • Conclusions


1.4 The Elements of the Action Construct


  • Lotman’s Definition of the Event


  • Structuralist Approaches: Event, Transformation, Move


  • The Event as a Modal Change of State


  • The Event as a Construct of Reception


  • Event, Property and Matter


  • Conditions of an EVENT


  • Class-Homogeneous EVENTS


  • Class-Heterogeneous EVENTS


  • The Status of the Translation Rule


  • World Knowledge and the EVENT


  • Constructing events: An Example


  • Constructing an object EVENT


  • Constructing a discourse EVENT


  • The EVENT Matrix


  • The Referential Nature of the EVENT


  • Definitions of the Episode


  • The Formal Definition of the Episode


  • The Semiotic Square: A Reappraisal


  • The EPISODE: From Semiotic to Episodic Square


  • The EPISODE as a Three-dimensional Construct


Part 2 The Computer-Aided Analysis of Narrated Action


2.1 Narratology and Literary Computing


  • Narratology and the Cognitive Sciences


  • The Limits of Practical Implementation


2.2 The EventParser Program


2.3 Constructing episodes and actions


  • The Syntagmatic Link


  • The Ontic Link


  • The Semantic Link


  • Closure


  • The episode Matrix


2.4 The EpiTest Program


2.5 Action Potential and action Product


2.6 Practical Analysis Using EpiTest


Part 3 An Experiment in Action: Conversations of German Refugees


3.1 Critical Approaches to the Conversations


  • The Compositional Logic of Goethe’s Conversations


  • Philosophical Discourses in Goethe’s Conversations


3.2 Describing the Conversations


  • The Isochronous Constructs


  • The Isochronous and Anisochronous Constructs


3.3 Explaining the Conversations


  • Goethe and the Unfathomability of the Weltbegebenheiten


  • The Logic of ACTION in the Conversations


3.4 Conclusion




Author Index


Subject Index

