A Text Realization System to Transform Shallow Semantic Forms to FDs in an Intelligent Tutoring System*

Murugan Kannan Martha W. Evens
Department of CSAM Department of CSAM
Illinois Institute of Technology Illinois Institute of Technology
45 W. 59th St. Apt# 2G 10 W. 31st St. 236-SB
Westmont, IL 60559 Chicago, IL 60616
kannan@steve.iit.edu mwe@schur.math.nwu.edu

* This work was supported by the Cognitive Science Program, Office of Naval Research, under Grant No. N00014-94-1-0338 to Illinois Institute of Technology. The content does not reflect the position or policy of the government and no official endorsement should be inferred.


Abstract

Unification-based formalisms, and Functional Unification Grammars in particular, have been very popular in text generation. The input to this type of generation system is called a Functional Description. In this paper we present a method for transforming shallow semantic forms, the output of our discourse generator and lexical chooser, into Functional Descriptions, which are then fed into FUF, the surface generation system developed by Elhadad and McKeown.


Introduction

Ever since Martin Kay [Kay, 1979] developed the influential Functional Unification Grammar as a grammar for computational use, it has been very popular in text generation. FUF [Elhadad, 1991] implements the formalism of Functional Unification Grammars. FUF takes two Functional Descriptions (FDs) as input and generates surface-level sentences. One FD contains semantic and structural information about the message we would like to generate; the other contains the grammar. FUF unifies these FDs to generate sentences.

In this paper we present a method for constructing Functional Descriptions from the specifications supplied by the planner, the discourse generator, and the lexical chooser. This work is part of the text generator in our intelligent tutoring system, CIRCSIM-Tutor V. 3. CIRCSIM-Tutor is designed to help medical students learn to reason about and explain the negative feedback loop that maintains a steady blood pressure in the human body. We are using the SURGE 2.0 grammar [Elhadad, 1992]. SURGE 2.0 is a systemic grammar, based originally on work by Halliday [Halliday, 1976], which has been used in several generation systems.

Functional Description

A Functional Description can be described as a set of attribute-value pairs, called features, where an attribute is the name of a property and a value is either an atomic symbol or a nested FD. Here is the FD for the sentence "TPR is controlled by the nervous system."

	((cat clause)
	 (proc ((type material) (voice passive) (agentless no) (lex "control"))) 
	 (partic ((agent  ((cat np) (head === "nervous system")))
		 (affected  ((cat np)  (head === "TPR") (countable no))))))

The feature (cat clause) indicates that the constituent is a verbal phrase or a sentence. The proc list describes the gross structure of the entire sentence; the word material in this list is a systemic grammar term for a subcategory of verbs. The next list, partic, describes the thematic roles in the sentence. Functional Descriptions like these are generated by our module.
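FUF's unification of the input FD with the grammar FD can be illustrated with a minimal sketch, here in Python with FDs as nested dictionaries. The simplifications (no feature paths, no special attributes such as ===) are ours, not FUF's.

```python
# Minimal sketch of FD unification: atoms must match exactly, nested FDs
# merge recursively. This is our own illustration, not FUF's implementation.

def unify(fd1, fd2):
    """Unify two FDs; return the merged FD, or None on an atomic conflict."""
    result = dict(fd1)
    for attr, val in fd2.items():
        if attr not in result:
            result[attr] = val                    # new feature: just add it
        elif isinstance(result[attr], dict) and isinstance(val, dict):
            merged = unify(result[attr], val)     # recurse into nested FDs
            if merged is None:
                return None
            result[attr] = merged
        elif result[attr] != val:
            return None                           # conflicting atoms: fail
    return result

input_fd = {"cat": "clause",
            "proc": {"type": "material", "voice": "passive", "lex": "control"}}
grammar_fd = {"cat": "clause", "proc": {"type": "material"}}
print(unify(input_fd, grammar_fd))
```

The grammar FD here only constrains the process type; unification succeeds and the input features survive in the result.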

Shallow Semantic Forms

Shallow semantic forms contain the ordinary notions that are expressed in the sentences we want to generate. They are derived from discourse and instructional goals. Here is an example of such a form: (control NS TPR). With this form, the instructional and discourse planners want to generate "TPR is controlled by the nervous system." Slight modifications of this form generate different sentences; for example, (control ? TPR) leads us to emit the question "How is TPR controlled?"
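Shallow forms are Lisp-style lists. For illustration, a minimal reader that turns their textual notation into nested tuples might look like this; the reader and its conventions are our own sketch, not a component of the system.

```python
# Hypothetical reader for shallow forms, assuming whitespace-separated atoms.

def read_form(text):
    """Parse a shallow form such as "(control NS TPR)" into nested tuples."""
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()

    def parse(pos):
        if tokens[pos] == "(":
            items, pos = [], pos + 1
            while tokens[pos] != ")":
                item, pos = parse(pos)        # recurse for nested forms
                items.append(item)
            return tuple(items), pos + 1
        return tokens[pos], pos + 1           # a bare atom

    form, _ = parse(0)
    return form

print(read_form("(control (increase inflation_rate) economy)"))
# ('control', ('increase', 'inflation_rate'), 'economy')
```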

Features of the System

Some of the most important features of the system are:

1. Handles nested shallow semantic forms: In order to generate embedded clauses, our realization module should be able to handle nested shallow semantic forms like:

(control (increase inflation_rate) economy)

This shallow form is then transformed into an FD representation of "the economy is controlled by an increase in the inflation rate." Nesting of shallow forms often entails nominalizing a verb: in isolation, the form (increase inflation_rate) would generate "the inflation rate increases" or "the inflation rate rises" rather than the nominal "the increase in the inflation rate."
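The nominalization step can be sketched as follows. The nominalization table and helper function are hypothetical; in the system itself this information would come from the lexicon.

```python
# Hypothetical sketch: rendering a nested shallow form as a noun phrase.
# The verb-to-noun table is invented for illustration only.

NOMINALIZATIONS = {"increase": "increase", "decrease": "decrease"}

def realize_np(form):
    """Render a shallow form in argument position as a noun phrase,
    nominalizing nested forms such as (increase inflation_rate)."""
    if isinstance(form, tuple):
        verb, arg = form
        noun = NOMINALIZATIONS[verb]          # nominalize the embedded verb
        return f"the {noun} in the {arg.replace('_', ' ')}"
    return f"the {form.replace('_', ' ')}"

print(realize_np(("increase", "inflation_rate")))
# the increase in the inflation rate
```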

2. Overrides default attributes: Many of our generated FDs have default attributes that can be overridden in the input shallow forms. For example, the shallow form for control defaults to the passive voice. This can be changed by including voice information in the shallow form: (control NS TPR (voice active)) generates "the nervous system controls TPR." Similarly, we assume that all sentences are in the present tense unless a tense is specified in the shallow form.

Whether a noun takes a definite article is derived from lexical information if possible. Otherwise we keep a table of the nouns that have been recently used: as soon as a noun is introduced it goes into the table, and whenever a noun is found in the table it is treated as definite.

3. Provides local semantic information: Like Callaway's FARE system [Callaway and Lester, 1995], our text realization module resolves conflicts between the information provided by the shallow form and the information retrieved from the FD skeletons table (described below).
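The default-overriding behavior of feature 2 and the recently-used-noun table can be sketched together as follows. The helper names, the particular defaults, and the table's first-mention policy are our own illustrative assumptions.

```python
# Illustrative sketch of feature 2 (defaults overridden by the shallow form)
# and the recently-used-noun table deciding definiteness. Names and defaults
# are assumptions, not the system's actual code.

def control_fd(agent, affected, *overrides):
    """Build the FD for (control agent affected); passive, present by default."""
    fd = {"cat": "clause",
          "proc": {"type": "material", "lex": "control",
                   "voice": "passive", "tense": "present"}}
    for attr, value in overrides:             # e.g. ("voice", "active")
        fd["proc"][attr] = value
    fd["partic"] = {"agent": {"cat": "np", "head": agent},
                    "affected": {"cat": "np", "head": affected}}
    return fd

class DefinitenessTable:
    """Nouns already introduced are treated as definite on later mentions."""
    def __init__(self):
        self.seen = set()

    def article(self, noun):
        if noun in self.seen:
            return "the"
        self.seen.add(noun)                   # first mention enters the table
        return "a"

print(control_fd("nervous system", "TPR", ("voice", "active"))["proc"]["voice"])
table = DefinitenessTable()
print(table.article("baroreceptor"), table.article("baroreceptor"))
```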

The components of the text realization module and the ways they interact are shown in Figure 1.

Shallow Form Processor

This module is the heart of the realization system. It receives the shallow forms from the instructional planner and the discourse planner, retrieves the appropriate FD skeletons from the FD skeletons hash table, and constructs the thematic roles of the sentence by mapping between the proc list of the FD skeleton and the syntax table. Nested shallow forms are processed recursively. If more than one FD skeleton is found for a single shallow form, the lexical chooser [Ramachandran and Evens, 1995] is consulted to pick one. For example, the shallow form (increase TPR) could be realized as either "TPR increases" or "TPR goes up"; in such cases the shallow form processor consults the lexical chooser.
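The processor's retrieve-and-recurse loop might be sketched like this. The skeleton table contents and the stand-in lexical chooser are illustrative assumptions, not the system's actual data.

```python
# Simplified sketch of the shallow form processor: look up skeletons in a
# hash table, recurse into nested forms, and consult a (stubbed) lexical
# chooser when several skeletons match.

SKELETONS = {
    "increase": [{"cat": "clause", "proc": {"type": "material", "lex": "increase"}},
                 {"cat": "clause", "proc": {"type": "material", "lex": "go up"}}],
    "control":  [{"cat": "clause", "proc": {"type": "material", "lex": "control",
                                            "voice": "passive"}}],
}

def choose(skeletons):
    """Stand-in for the lexical chooser; always picks the first candidate."""
    return skeletons[0]

def process(form):
    """Turn a (possibly nested) shallow form into a partial FD."""
    head, *args = form
    candidates = SKELETONS[head]
    fd = dict(choose(candidates) if len(candidates) > 1 else candidates[0])
    # Arguments may themselves be shallow forms; process them recursively.
    fd["args"] = [process(a) if isinstance(a, tuple) else a for a in args]
    return fd

fd = process(("control", ("increase", "inflation_rate"), "economy"))
print(fd["proc"]["lex"], fd["args"][1])
# control economy
```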

If a parameter in the shallow form is a question mark, this module inserts appropriate mood and scope information into the FD skeleton, which is then used to construct an FD that asks a question such as "How is TPR controlled?"
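The question-mark case can be sketched as follows. The mood and scope feature names only loosely follow SURGE conventions and should be treated as assumptions.

```python
# Sketch: when an argument is "?", add interrogative mood and scope features
# to the FD. Feature names here are assumptions, not SURGE's exact inventory.

def add_question_features(fd, missing_role):
    """Mark the FD as a wh-question scoped over the role being asked about."""
    fd = dict(fd)                             # leave the skeleton untouched
    fd["mood"] = "wh"
    fd["scope"] = {"path": missing_role}
    return fd

fd = {"cat": "clause", "proc": {"lex": "control", "voice": "passive"}}
fd = add_question_features(fd, ("partic", "agent"))
print(fd["mood"])
# wh
```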

FD Skeletons

FD skeletons are stored in a hash table. The shallow form processor is designed in such a way that FD skeletons can be nested inside FD skeletons to any depth. For example, the skeleton for the shallow form increase can contain as many clauses and NPs as needed. This facility gives many more choices of words, and FD skeletons can be added or removed without modifying the code or the other modules. An example FD skeleton is given below.

	(increase       (((cat clause) 
			   (proc ((type material) (lex  ))))
			 ((cat np) (head === )
			  (qualifier ((cat pp)
				      (prep ((lex )))
				      (np ((cat np) (head === )))))))) 

Syntax Table

This is another important data structure in the realization system. It holds information about the thematic roles associated with our shallow forms. The data structure is very flexible: we can add any number of proc types and their associated thematic roles. The shallow form processor recursively matches the shallow form against the syntax table lists until it finds a perfect match.
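A syntax table mapping proc types to ordered thematic roles might be sketched as follows; the role inventories shown are illustrative, not the system's full table.

```python
# Sketch of the syntax table: each proc type lists its thematic roles in the
# order the shallow form supplies its arguments. Contents are illustrative.

SYNTAX_TABLE = {
    "material":   ("agent", "affected"),      # e.g. (control NS TPR)
    "ascriptive": ("carrier", "attribute"),
}

def assign_roles(proc_type, args):
    """Map positional shallow-form arguments onto thematic roles as NPs."""
    roles = SYNTAX_TABLE[proc_type]
    return {role: {"cat": "np", "head": arg} for role, arg in zip(roles, args)}

partic = assign_roles("material", ["nervous system", "TPR"])
print(partic["agent"]["head"])
# nervous system
```

With this mapping in place, the partic list of the example FD earlier in the paper falls out directly from the argument order of (control NS TPR).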

Conclusion and Future Work

We are designing and implementing a flexible text realization system that transforms shallow semantic forms into Functional Descriptions. By flexible we mean that future extensions can easily be accommodated. The shallow form processor is capable of handling a wide variety of shallow forms and producing Functional Descriptions from them. The FDs are then realized as text by FUF using the SURGE grammar. Our studies suggest that the text realization system is flexible and capable of working with the other modules of CIRCSIM-Tutor.

Encouraged by these results, we are continuing to make our system more sophisticated. Since FUF and SURGE are research tools, they take a lot of time to generate a sentence. One goal is to minimize the processing time by building our own LFG grammar suitable for our sublanguage.


Acknowledgments

We would like to thank Michael Glass, our project mate, for his valuable suggestions and encouragement. We also thank Michael Elhadad for providing FUF and SURGE and for assisting us with them.


References

Charles B. Callaway & James C. Lester. (1995). Robust Natural Language Generation from Large-Scale Knowledge Bases. Proceedings of the 4th Bar-Ilan Symposium on the Foundations of Artificial Intelligence, pp. 96-105, Tel-Aviv & Jerusalem, Israel.

M. Elhadad. (1991). FUF: The Universal Unifier, User Manual Version 5.0. Technical Report CUCS-038-91, Department of Computer Science, Columbia University.

M. Elhadad. (1992). Using Argumentation to Control Lexical Choice: A Functional Unification Implementation. Ph.D. thesis, Columbia University.

M. Halliday. (1976). System and Function in Language. Oxford University Press, Oxford.

M. Kay. (1979). Functional Grammar. In Proceedings of the Berkeley Linguistics Society.

K. McKeown. (1985). Text Generation: Using Discourse Strategies and Focus Constraints to Generate Natural Language Text. Cambridge University Press.

Kumar Ramachandran & Martha Evens. (1995). Lexical Choice for an Intelligent Tutoring System. Proceedings of the Midwest Artificial Intelligence and Cognitive Science Conference.