
Inside Micro Tale-Spin: Symbolic Computing with Lisp applied to storytelling.


Micro Tale-Spin World

Often it is claimed that macros are an important part of Lisp. Lisp was originally developed for Symbolic Computing with Recursive Functions. Macros are just one way in which Symbolic Computing is applied to Lisp itself. The first Lisp had no macros; it had symbolic expressions, recursive functions, an evaluator and list processing.

The early application areas of Lisp were typically in symbolic Artificial Intelligence: theorem proving, computer algebra, knowledge representation, natural language processing and more. Lisp has since been applied to many other areas; here we look at computer-based storytelling, using an example from the 1970s.

The excellent book Inside Computer Understanding: Five Programs Plus Miniatures by Roger Schank and Christopher K. Riesbeck gives an example: Tale-Spin. I would generally recommend the book to anyone who is interested in how early symbolic AI research was done and how Lisp was used. The book looks at five Lisp programs in detail.

In Chapters 9 and 10 James Meehan presents his program TALE-SPIN (and the source for a simpler version called Micro TALE-SPIN) from 1976. TALE-SPIN writes simple stories.

The program is widely cited in the literature as an important early example of computer-based storytelling using AI techniques such as knowledge representation, plan generation, forward and backward chaining for inference, simulation of possible worlds and generation of natural language.

Story Generator Algorithms - From the living handbook of narratology, University of Hamburg, Germany.

TALE-SPIN takes a description of the world and a goal. It then creates a plan, a sequence of actions by various actors, to fulfill that goal; this is a problem-solving component. The plan is then converted into English.

In the following we use a Common Lisp version of Micro Tale-Spin (translated by Warren Sack). Micro Tale-Spin is written in the simple pedagogical style typical of the 1970s, using lists for data representation. It is also mostly free of macros, but it generates Lisp code at runtime and executes it using Lisp's evaluator: a plan is generated as a Lisp data structure, and that plan is actually Lisp code, which is then evaluated in order to execute it. The code is documented in the source and is described more extensively in the above-mentioned book by Schank/Riesbeck, which also describes the implementation of the Conceptual Dependency theory in Lisp.
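
To illustrate this code-as-data idea, here is a minimal sketch with hypothetical helper functions (GOTO-PLACE and DRINK are made up for this example and are not part of Micro Tale-Spin): a plan is ordinary list data that also happens to be valid Lisp code, so handing it to EVAL runs it.

(defun goto-place (actor place)
  (format t "~A went to the ~A.~%" actor place))

(defun drink (actor object)
  (format t "~A drank the ~A.~%" actor object))

;; The plan is built as plain list data ...
(defparameter *plan*
  '(progn (goto-place 'joe 'river)
          (drink 'joe 'water)))

;; ... and, being valid Lisp code, it is executed with EVAL:
(eval *plan*)
;; prints:
;; JOE went to the RIVER.
;; JOE drank the WATER.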

micro-talespin.lisp - The Common Lisp source code for Micro Tale-Spin, with some fixes and minor modifications.

The program makes use of symbolic descriptions for the world and for plans. It can generate English sentences from those symbolic descriptions. These descriptions are based on the Conceptual Dependency theory by Roger Schank.

Conceptual Dependency - A short overview by Jan Wiebe.

A symbolic description of the world in Lisp looks like the following S-Expression. It describes what the various actors know about the locations of various things in the world. For example, the first line states that Joe, the bear, is near the cave; the second line states that Joe knows where he is. Joe also knows where Irving, the bird, is. Note that an actor typically knows only about a subset of the objects and their locations.

((WORLD  (LOC (ACTOR JOE)    (VAL CAVE)))
 (JOE    (LOC (ACTOR JOE)    (VAL CAVE)))
 (WORLD  (LOC (ACTOR IRVING) (VAL OAK-TREE)))
 (IRVING (LOC (ACTOR IRVING) (VAL OAK-TREE)))
 (JOE    (LOC (ACTOR IRVING) (VAL OAK-TREE)))
 (WORLD  (LOC (ACTOR WATER)  (VAL RIVER)))
 (JOE    (LOC (ACTOR WATER)  (VAL RIVER)))
 (WORLD  (LOC (ACTOR HONEY)  (VAL ELM-TREE)))
 (IRVING (LOC (ACTOR HONEY)  (VAL ELM-TREE)))
 (WORLD  (LOC (ACTOR WORM)   (VAL GROUND)))
 (JOE    (LOC (ACTOR WORM)   (VAL GROUND)))
 (IRVING (LOC (ACTOR JOE)    (VAL CAVE)))
 (WORLD  (LOC (ACTOR FISH)   (VAL RIVER)))
 (IRVING (LOC (ACTOR FISH)   (VAL RIVER))))

So far we have two actors, Joe and Irving. We also have trees, a cave, a river, some honey, water, a fish and a worm.
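
Each entry pairs a knower (WORLD, JOE or IRVING) with a fact. Such a fact list can be queried with ordinary list operations; the following is a minimal sketch over a reduced copy of the data (the helper KNOWN-LOCATION is hypothetical and not part of micro-talespin.lisp):

(defparameter *facts*
  '((world  (loc (actor joe)   (val cave)))
    (joe    (loc (actor joe)   (val cave)))
    (joe    (loc (actor water) (val river)))
    (irving (loc (actor honey) (val elm-tree)))))

(defun known-location (knower thing)
  "Return where KNOWER believes THING to be, or NIL if unknown."
  (loop for (who fact) in *facts*
        when (and (eq who knower)
                  (eq (first fact) 'loc)
                  (eq (second (second fact)) thing))
          return (second (third fact))))

;; (known-location 'joe 'water)  => RIVER
;; (known-location 'joe 'honey)  => NIL, Joe does not know where the honey is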

The program creates an English description:

Once upon a time ...
JOE WAS NEAR THE CAVE.
JOE KNEW THAT JOE WAS NEAR THE CAVE.
IRVING WAS NEAR THE OAK-TREE.
IRVING KNEW THAT IRVING WAS NEAR THE OAK-TREE.
JOE KNEW THAT IRVING WAS NEAR THE OAK-TREE.
THE WATER WAS NEAR THE RIVER.
JOE KNEW THAT THE WATER WAS NEAR THE RIVER.
THE HONEY WAS NEAR THE ELM-TREE.
IRVING KNEW THAT THE HONEY WAS NEAR THE ELM-TREE.
THE WORM WAS NEAR THE GROUND.
JOE KNEW THAT THE WORM WAS NEAR THE GROUND.
IRVING KNEW THAT JOE WAS NEAR THE CAVE.
THE FISH WAS NEAR THE RIVER.
IRVING KNEW THAT THE FISH WAS NEAR THE RIVER.

We can generate a simple story: Joe is thirsty. He needs some water to drink. For that he needs to go to the river.

(joe thirsty)
One day,
JOE WAS THIRSTY.
JOE WANTED NOT TO BE THIRSTY.
JOE WANTED TO BE NEAR THE WATER.
JOE WENT TO THE RIVER.
JOE WAS NEAR THE RIVER.
JOE DRANK THE WATER.
JOE WAS NOT THIRSTY.
The end.
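
As shown later with (micro-talespin-demo *story4*), stories are run by passing such a specification, the main character plus a goal and optional extra facts, to micro-talespin-demo. Assuming the same interface for this simpler case (the variable name *thirsty-story* is made up):

(defparameter *thirsty-story* '(joe thirsty))
(micro-talespin-demo *thirsty-story*)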

A more complex story: Joe is hungry. We also state deceive, like and dominate relationships between Joe and Irving.

(joe hungry
  (world  (hungry   (actor irving) (mode (pos))))
  (joe    (like     (actor irving) (to joe)    (mode (pos))))
  (joe    (deceive  (actor irving) (to joe)    (mode (neg))))
  (joe    (like     (actor joe)    (to irving) (mode (pos))))
  (irving (like     (actor irving) (to joe)    (mode (pos))))
  (irving (dominate (actor irving) (to joe)    (mode (neg))))
  (irving (deceive  (actor irving) (to joe)    (mode (neg)))))

In English:

CL-USER> (micro-talespin-demo *story4*)
JOE THOUGHT THAT IRVING LIKED JOE.
JOE THOUGHT THAT IRVING DID NOT DECEIVE JOE.
JOE THOUGHT THAT JOE LIKED IRVING.
IRVING THOUGHT THAT IRVING LIKED JOE.
IRVING THOUGHT THAT IRVING DID NOT DOMINATE JOE.
IRVING THOUGHT THAT IRVING DID NOT DECEIVE JOE.

The story generated:

JOE WAS HUNGRY.
JOE WANTED NOT TO BE HUNGRY.
JOE WANTED TO HAVE THE HONEY.
JOE WANTED TO KNOW WHERE THE HONEY WAS.
JOE WANTED IRVING TO TELL JOE WHERE THE HONEY WAS.
JOE DECIDED THAT IF JOE WOULD GIVE IRVING THE WORM
  THEN IRVING MIGHT TELL JOE WHERE THE HONEY WAS.
JOE WANTED IRVING TO THINK THAT IRVING WOULD TELL JOE
  WHERE THE HONEY WAS IF JOE GAVE IRVING THE WORM.
JOE WANTED TO BE NEAR IRVING.
JOE WENT TO THE OAK-TREE.
JOE WAS NEAR THE OAK-TREE.
JOE ASKED IRVING WHETHER IRVING WOULD TELL JOE
  WHERE THE HONEY WAS IF JOE GAVE IRVING THE WORM.
IRVING TOLD JOE THAT IF JOE WOULD GIVE IRVING THE WORM
  THEN IRVING WOULD TELL JOE WHERE THE HONEY WAS.
IRVING DECIDED THAT IF JOE WOULD GIVE IRVING THE WORM
  THEN IRVING WOULD TELL JOE WHERE THE HONEY WAS.
JOE WANTED TO HAVE THE WORM.
JOE WANTED TO BE NEAR THE WORM.
JOE WENT TO THE GROUND.
JOE WAS NEAR THE GROUND.
JOE TOOK THE WORM.
THE WORM WAS NEAR JOE.
JOE HAD THE WORM.
JOE WANTED TO BE NEAR IRVING.
JOE WENT TO THE OAK-TREE.
JOE WAS NEAR THE OAK-TREE.
JOE GAVE IRVING THE WORM.
JOE DID NOT HAVE THE WORM.
THE WORM WAS NEAR IRVING.
IRVING HAD THE WORM.
IRVING TOLD JOE THAT THE HONEY WAS NEAR THE ELM-TREE.
JOE WANTED TO BE NEAR THE HONEY.
JOE WENT TO THE ELM-TREE.
JOE WAS NEAR THE ELM-TREE.
JOE TOOK THE HONEY.
THE HONEY WAS NEAR JOE.
JOE HAD THE HONEY.
JOE ATE THE HONEY.
JOE WAS NOT HUNGRY.
The end.

An example of a sub-plan in its symbolic form. MTRANS is a mental transfer of information. ATRANS is a transfer of an abstract relationship such as possession of a physical object: somebody gives a thing to another person and the ownership changes. Here we describe that Joe wants the information where the honey is, if he gives Irving the worm. The ATRANS then causes an MTRANS in the reverse direction.

(AND (DPROX 'JOE 'JOE 'IRVING)
     (DOIT (MTRANS 'JOE
                   '(CAUSE (MODE (QUES))
                           (ANTE
                            (ATRANS (ACTOR JOE)
                                    (OBJECT WORM)
                                    (TO IRVING)
                                    (FROM JOE)))
                           (CONSEQ
                            (MTRANS (TIME FUTURE)
                                    (ACTOR IRVING)
                                    (OBJECT
                                     (LOC
                                      (ACTOR HONEY)
                                      (VAL ?UNSPECIFIED)))
                                    (TO (CP (PART JOE)))
                                    (FROM IRVING))))
                   'IRVING
                   'JOE)))

The English version of that plan:

JOE WANTED IRVING TO THINK THAT IRVING WOULD
 TELL JOE WHERE THE HONEY WAS IF JOE GAVE IRVING THE WORM.

An example of the text generation for a wanted location:

> SAY  (WANT (ACTOR JOE)
             (OBJECT (LOC (ACTOR JOE)
                          (VAL WORM))))

JOE WANTED TO BE NEAR THE WORM.

Another example: noting a location change of a physical object:

> SAY  (MLOC (CON (PTRANS (ACTOR JOE)
                          (OBJECT JOE)
                          (TO GROUND)
                          (FROM OAK-TREE)))
             (VAL (CP (PART WORLD))))

JOE WENT TO THE GROUND.

The original Tale-Spin program was a bit more complex and used a sentence generator called MUMBLE, as described in the book. I have not seen the source of the original Tale-Spin; it seems to be lost. For its time, the 1970s, it was a relatively large Lisp program. It is also notable that the original Tale-Spin was written in MLisp, using an algebraic notation and not the now more familiar prefix notation. The simpler Micro Tale-Spin is used here in its Common Lisp version, with only a few bug fixes done by me. Newer versions of the full MUMBLE sentence generator are also still available in Common Lisp.

Larger Lisp programs were the main cause for the development of special Lisp Machines at that time: on conventional mainframes such programs competed with possibly hundreds of users for all of the available main memory. A simple story generation like the one above now runs in 20 milliseconds using Clozure Common Lisp on a tiny $59 ARM computer.
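
A timing like that could be reproduced with the standard TIME macro around the demo call shown earlier (the exact output depends on the Lisp implementation):

(time (micro-talespin-demo *story4*))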

Today more sophisticated stories are possible. One can imagine that story generation has real-world applications.