Creativity by Case-Based Reasoning (CBR): SWALE Project Home Page

[Photo of Swale]
The racehorse Swale running the Belmont Stakes.
Project Description

The SWALE project explores case-based reasoning (CBR) as a basis for creativity. In the CBR model of creativity, creativity comes from retrieving knowledge that is not routinely applied to a situation, and using it in a new way. In this view, the key issues for creativity are how to retrieve appropriate knowledge for novel uses and how to adapt it to fit novel circumstances. Depending on the retrieval and adaptation processes used, CBR can provide solutions anywhere along a spectrum of creativity, ranging from straightforward reapplications of old knowledge all the way to highly novel views.

The task in which the SWALE project studies creativity is the creative explanation of anomalous events. SWALE is a story-understanding program that detects anomalous events and uses CBR to explain them. Its explanation process is based on retrieving cases that store prior explanations, called explanation patterns (XPs), from its memory and applying them to the new situation. This case-based reasoning process can take place at any point along the spectrum of creativity, from the completely non-creative application of a perfectly appropriate XP to the very novel use of an inappropriate XP that must be extensively revised to be usable.
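To make the notion of an explanation pattern concrete, the following Python fragment sketches an XP as a small data structure with an applicability test. It is purely illustrative: the field names and the matching test are assumptions made for this sketch, not SWALE's actual representation (the system itself was written in Lisp, with a pedagogical version in Scheme).

```python
from dataclasses import dataclass

@dataclass
class XP:
    # An explanation pattern (XP): a stored prior explanation.
    # Field names here are illustrative assumptions, not SWALE's representation.
    name: str
    anomaly_type: str        # the kind of anomaly this XP accounts for
    preconditions: list      # facts that must hold for the XP to apply directly
    explanation: str         # the causal account the XP offers

    def applies_to(self, anomaly_type, facts):
        # A "perfectly appropriate" XP matches the anomaly's type and has all
        # of its preconditions satisfied; anything less requires revision
        # (tweaking) before the XP can be used.
        return (self.anomaly_type == anomaly_type
                and all(p in facts for p in self.preconditions))

# A toy XP in the spirit of SWALE's death-explanation examples.
fixx_xp = XP(
    name="Fixx XP",
    anomaly_type="premature-death",
    preconditions=["strenuous exercise", "hidden heart defect"],
    explanation="exertion plus a hidden defect caused heart failure",
)

print(fixx_xp.applies_to("premature-death",
                         ["strenuous exercise", "hidden heart defect"]))  # True
print(fixx_xp.applies_to("premature-death", ["strenuous exercise"]))      # False
```

An XP whose preconditions are only partially met, like the second call above, is exactly the kind of near miss that tweaking is meant to repair.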

The following illustrations of SWALE's reasoning are excerpted and adapted from Schank, Roger C., and Leake, David B., "Creativity and Learning in a Case-Based Explainer," Artificial Intelligence 40(1-3):353-385, 1989; reprinted in Carbonell, J., ed., Machine Learning: Paradigms and Methods, MIT Press, Cambridge, MA, 1990. A simplified version of the SWALE code, developed in Scheme for pedagogical purposes, can be downloaded and run to illustrate this process.

The Range of Explanations Generated by SWALE

To give an idea of the range of explanations that a system using the case-based approach can generate, below are some of the remindings the SWALE system has, and a few of the explanations it generates from them for its namesake example, the story of the racehorse Swale:

The SWALE system detects the anomaly in Swale's premature death (see (Leake, 1992) for a discussion of the system's anomaly detection process). It then builds explanations of the death by retrieving and "tweaking" remindings of XPs for other episodes of death. It picks the best of these candidate explanations, and adds it to its XP library for future use. Some are reasonable explanations; others are quite fanciful or can be ruled out on the basis of other knowledge. However, they show that a memory-based explanation system, even if it has a limited range of XPs and of retrieval and tweaking strategies, can come up with a variety of interesting explanations.
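The cycle just described (detect an anomaly, retrieve remindings of XPs, tweak near misses, pick the best candidate, and store it for future use) can be sketched in a few lines. The Python toy below is an assumption-laden illustration: SWALE itself uses knowledge-rich retrieval, tweaking, and evaluation strategies, not the crude matching and counting used here.

```python
def explain(anomaly, library):
    # Sketch of the case-based explanation cycle; all scoring and
    # tweaking here is a toy stand-in for SWALE's knowledge-rich versions.
    # 1. Retrieval: remindings of XPs indexed under the anomaly's type.
    remindings = [xp for xp in library if xp["type"] == anomaly["type"]]
    candidates = []
    for xp in remindings:
        unmet = [p for p in xp["preconditions"] if p not in anomaly["facts"]]
        if not unmet:
            candidates.append((0, xp))          # XP applies directly
        else:
            # 2. Tweaking: revise a near-miss XP by dropping unmet
            # preconditions (a crude stand-in for real tweaking strategies).
            tweaked = dict(xp)
            tweaked["preconditions"] = [p for p in xp["preconditions"]
                                        if p not in unmet]
            candidates.append((len(unmet), tweaked))
    if not candidates:
        return None
    # 3. Evaluation: prefer the candidate that needed the least revision.
    cost, best = min(candidates, key=lambda c: c[0])
    # 4. Learning: add the chosen XP to the library for future use.
    if best not in library:
        library.append(best)
    return best

# Toy XP library, loosely in the spirit of SWALE's death-explanation XPs.
library = [
    {"name": "Fixx XP", "type": "premature-death",
     "preconditions": ["strenuous exercise", "hidden heart defect"],
     "explanation": "exertion plus a hidden defect caused heart failure"},
    {"name": "Joplin XP", "type": "premature-death",
     "preconditions": ["star lifestyle", "drug use"],
     "explanation": "recreational drug overdose"},
]
swale = {"type": "premature-death",
         "facts": ["racehorse", "strenuous exercise", "in peak condition"]}

best = explain(swale, library)
print(best["name"])  # Fixx XP (needed the fewest tweaks)
```

Here the Fixx XP wins because only one of its preconditions is unmet, while the Joplin XP would need both of its preconditions revised away; the tweaked winner is then stored, so the library grows with each explained anomaly.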

Requirements for Creative Case-Based Explanation

The above examples show that interesting explanations arise when we try to use an XP that does not quite apply. To obtain creative explanations, then, an explainer might intentionally misapply XPs: interesting ideas can arise from using old explanations in situations where those explanations were never intended to be used.

While using an XP that doesn't apply gives a fresh perspective on a situation, the idea of building a system to intentionally misapply XPs raises many issues: Which XPs do you retrieve? Which tweaks should be applied? How long should the tweaking process be continued? As our research progresses, we hope to be able to answer these questions. However, we can suggest some of the things that are needed to build creative case-based explainers:

More details on the SWALE project and system can be found in the references below.

Code On-Line

A simplified version of the SWALE code, developed for pedagogical purposes and published in the book Inside Case-Based Explanation, is available to illustrate SWALE's explanation process.

Sample References

Project Team

Research on the SWALE project was conducted by Alex Kass, David Leake, and Chris Owens, advised by Roger Schank and Chris Riesbeck.


This page is maintained by David Leake, Computer Science Department, Indiana University.