3. A short history of early natural language systems.

3.1. The semantic information-processing era.

Out of the rubble of machine translation work grew an effort that is closely associated with artificial intelligence. One of the more notable ideas of this era, and one which has persisted, is the use of limited domains for language-understanding systems. Rather than attempting to understand all language, the limited-domain approach is to design a system that is 'knowledgeable' in one specific domain of language, but which may know nothing at all about any other domain.

Up to now I have been using words such as 'understanding' and 'know' in a rather fast and loose manner. As we shall see below, words such as these should be treated as highly figurative, if not downright misleading, when used to describe the early dialogue systems of the 1960s.

3.2. 'Engineering approaches'.

There was a proliferation of dialogue and question-answering systems in the 1960s and early 1970s, of which a representative sample is:

BASEBALL (1963)
ELIZA (1966)
STUDENT (1968)
SIR (1968)
PARRY (1971)

each of which illustrates the 'limited domain' or 'micro-world' trend in dialogue systems. I call these 'engineering approaches' to natural language understanding for two related reasons: (i) they attempted merely to mimic human (verbal) behaviour in specific problem domains, and not to embody whatever psychological reality lies behind our ability to use language, and (ii) they paid very little attention to specifically linguistic insights into the nature of language.

Of the two main trends in the engineering approach towards natural language -- database question-answering systems and key-word systems, exemplified respectively by the early systems BASEBALL and ELIZA -- we shall focus chiefly on the latter.

3.3. ELIZA.

You may already have had the opportunity to play around with the ELIZA program, and perhaps you now have some intuitions as to how the system works.

ELIZA was written by Joseph Weizenbaum, in 1966, as a program which could 'converse' in English. Weizenbaum chose the name ELIZA because, like George Bernard Shaw's Eliza Doolittle, it could be taught to 'speak' increasingly well. The program consisted of two parts: a language analyzer (or recognizer), and a 'script' (not to be confused with the more recent use of that term by Schank and Abelson), a set of rules used to generate appropriate replies in a specific knowledge domain, say cooking eggs or managing a bank account.
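To make the keyword-matching character of such a 'script' concrete, here is a minimal sketch in Python. Nothing here corresponds to Weizenbaum's original MAD-SLIP implementation; the rule format, patterns, and function names are illustrative assumptions only. The mechanism, however -- match a keyword pattern, reflect first- and second-person words, and slot the remainder of the input into a canned template -- is the one ELIZA relies on.

    import random
    import re

    # A hypothetical, much-simplified 'script': keyword patterns paired
    # with response templates. Weizenbaum's original rules were
    # considerably more elaborate; these are assumptions for illustration.
    REFLECTIONS = {
        "i": "you", "me": "you", "my": "your", "am": "are",
        "you": "I", "your": "my", "are": "am",
    }

    RULES = [
        (re.compile(r"i need (.*)", re.I),
         ["Why do you need {0}?", "Would it really help you to get {0}?"]),
        (re.compile(r"i am (.*)", re.I),
         ["How long have you been {0}?", "Why do you think you are {0}?"]),
        (re.compile(r"(.*)", re.I),  # catch-all: reflect the whole input
         ["Please tell me more.", "Why do you say that {0}?"]),
    ]

    def reflect(fragment):
        # Swap first- and second-person words so a statement can be
        # turned back on the speaker: 'my job' becomes 'your job'.
        return " ".join(REFLECTIONS.get(word.lower(), word)
                        for word in fragment.split())

    def respond(sentence):
        # Try each keyword rule in order; fill the first matching
        # template with the reflected remainder of the user's sentence.
        sentence = sentence.strip().rstrip(".!?")
        for pattern, templates in RULES:
            match = pattern.match(sentence)
            if match:
                return random.choice(templates).format(
                    *(reflect(group) for group in match.groups()))
        return "Please go on."

    print(respond("I am unhappy about my job"))
    # -> e.g. "Why do you think you are unhappy about your job?"

Even this toy version makes the crucial point: at no stage does the program represent what the input means; it merely matches and transforms strings, which is why, as we shall see, its apparent 'understanding' is wholly illusory.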

The most famous 'script' used by ELIZA is that of a non-directive (Rogerian) psychotherapist, relatively easy to imitate because much of the therapist's technique consists of drawing the patient out by reflecting the patient's statements back at him. It is this version of the program, dubbed DOCTOR, that you will probably have been playing with. ELIZA/DOCTOR's performance -- as a demonstration vehicle -- was extremely impressive, so much so that it produced some unanticipated and, for its designer at least, unwanted results:

  1. A number of practicing psychiatrists seriously believed that DOCTOR could grow into a nearly completely automatic form of psychotherapy. The psychoanalyst Kenneth Colby, who, with his co-researchers, was working on the computer simulation of neurosis at the same time as Weizenbaum was developing ELIZA, wrote of the DOCTOR program that
    If the method proves beneficial, then it would provide a therapeutic tool which can be made widely available to mental hospitals and psychiatric centres suffering a shortage of therapists. Because of the time-sharing capabilities of modern and future computers, several hundred patients an hour could be handled by a computer system designed for this purpose. The human therapist, involved in the design and operation of this system, would not be replaced, but would become a much more efficient man since his efforts would no longer be limited to the one-to-one patient-therapist ratio as now exists.
    [Weizenbaum, 1984:5].
  2. Weizenbaum was startled to see how quickly and how deeply people conversing with DOCTOR became emotionally involved with the computer and how unequivocally they anthropomorphized it. Once, his secretary, though aware that DOCTOR was merely a computer program, started conversing with it and, after only a few exchanges, asked its designer to leave the room. Other users felt that DOCTOR 'really understood' them, and resented Weizenbaum's suggestion that he examine their interactions with it, accusing him of spying on their personal and private conversations.
  3. Another surprising reaction to ELIZA was the widespread belief that it demonstrated a general solution to the problem of computer understanding of natural language. (As we shall see, however, ELIZA's apparent knowledge of English is as wholly illusory as its knowledge of Rogerian psychotherapy; more of that later).

The significance of these reactions may become clearer if we make a slight digression to talk about two topics that have become important in AI: 'intentionality' and 'the imitation game'.