Report on the
21st Century Engineering Consortium Workshop
a forum on formal methods education

Prepared by Steven D. Johnson (Indiana University), Warren P. Alexander (University of Cincinnati), Shiu-Kai Chin (Syracuse University), and Ganesh Gopalakrishnan (University of Utah).

Comments: See Appendix A

§0.  Summary
§1.  Introduction
§2.  The central question
§3.  Charting the formal methods area
§4.  Constituencies
§5.  What is needed in formal methods education?
§6.  What would a formal methods consortium do?
§7.  Next steps
§8.  References
§A. Genesis of this report
§B. Position papers and statements
§C. XXIEC workshop announcement
§D. List of attendees


0    Summary

This report is one outcome of the 21st Century Engineering Consortium Workshop (XXIEC) held March 18-19, 1998, in Melbourne, Florida. The meeting was sponsored by the Air Force Rome Laboratory, working with DARPA's Information Technology Office at the initiative of Mike Nassiff of Rome Lab. One impetus was a series of high-level meetings on high confidence systems in 1995-98 [CCIC98]. To attain the capability to build high confidence systems in significant numbers and at greater scale, a much larger number of students must be trained in engineering applications of formal methods. The workshop's main purpose was to explore the idea of establishing a consortium to advance formal methods education.

Although there was general encouragement to take steps toward establishing a consortium, one did not spontaneously materialize at this meeting. A wait-and-see attitude toward individual participation prevailed. The primary interest groups in industry, government, and academia are each looking to the others for direction, leadership, and support.

A specific goal was to assess the status of formal methods in academia and practice and make recommendations for education and training. Formal methods is evolving with technology and maturing in practice. There is considerable debate about its scope and role in the computer science and engineering (CS/CE) curriculum and about strategies for advancing its pedagogy. A core set of topics is emerging, but treatments exist in great variety, each inspiring its own notation and set of tools. All the while, the need for better design methods grows increasingly urgent as technology pervades all aspects of modern life. In short, the status of formal methods is that both its importance and the awareness of that importance are increasing. Within academia, however, faculty expertise is in short supply, and formal methods competes for attention with other more visible, glamorous, and lucrative subjects.

In technology enterprise today, the balance between quality and feature enhancement favors the latter. As our economy, our safety, and our way of life grow ever more dependent on information systems, quality will inevitably assert itself. Formal methods is a significant avenue in the pursuit of higher quality through better design methods. The XXIEC workshop contemplated ways to reflect this pursuit in education. Issues of general agreement include (among others):

Specific recommendations emerged, although there was no clear priority. These recommendations centered on the following issues:

1    Introduction

Our society is increasingly dependent on software, digital systems, and networks. Our attention is captured by newsworthy problems like "The Millennium Bug;" it is focused by daily experiences with balky PCs, overcharges at the checkout line, and the list goes on; it is galvanized by occasional "glitches," some with tragic consequences. Technology is overwhelming human design capabilities. The vast capacities of VLSI, computer memories, and network bandwidth greatly favor expansion over consolidation, resulting in a proliferation of features but a decline in reliability. Similarly, the rapid evolution of electronic commerce and global communications raises concerns about security, specifically, how to design secure systems. A recent Presidential Commission report makes dire warnings about the vulnerability of our information infrastructure [PCCIP].

One of six recommendations in the 1995 MSTC/CCIC report, America in the Age of Information [CCIC95] is to "Broaden the emphasis on high confidence systems beyond security" to address safety, reliability, real-time performance, and other aspects. In the succeeding three years, a series of meetings was held to set a high confidence systems (HCS) research agenda [CCIC97] [CCIC98]. Throughout these reports, formal methods is seen as fundamental in the effort to raise the quality and assurance of critical hardware and software systems. Although the HCS agenda focuses on research, there is a clear realization that education is a major issue in achieving its goals:

The time is ripe [for] the development of educational programs for safety-critical systems. There is agreement in this panel that the area of high-confidence systems is an intellectually coherent body of knowledge. Areas that were considered separate before (such as safety, security, functional correctness, performance and real time, and fault tolerance) are parts of the emerging new area. This is the right time for industry, government, and academia to get together to organize curricula, case studies, electronic textbooks on the Internet, shared software, and so on. [CCIC95, Part III.D]

Formal methods is centrally concerned with safety-critical systems. The agencies involved in the HCS agenda see a critical shortage of people to carry advances in formal methods into practice and, to make matters worse, a shortage of educators available to fill the pipeline. The emerging "intellectually coherent body of knowledge" is not being incorporated into existing curricula, at least not rapidly enough to meet HCS needs. The XXIEC workshop aimed to create a movement addressing these shortages. Its goal was to assemble knowledgeable people from all sectors to begin outlining an educational plan and to plant the seeds for a continuing effort.

In the broad sense, "formal methods" refers to the frontier of engineering practice, rather than to any delineated technique or technology. The overwhelming pace of system design makes this frontier an ever-receding horizon. The rapid evolution of digital technologies, and, in particular, the exploding complexities brought about by expanding scale, generate a mounting sense of urgency in the search for more unified and systematic approaches to design. At the same time, commercial competition increases the pressure to design more efficiently, but above all, more quickly. In some sectors of the design industry, such as high-volume VLSI and telecommunications, formalized analysis is already a competitive necessity. In a recent roundtable on formal verification, Robert E. Damiano of Synopsys is quoted as saying

`We might disagree on whether the underlying technology is the right technology, whether the engine needs a lot more work, or if a breakthrough is necessary for widestream adoption. . . . But not a single one of us disagrees that formalizing design methodology and the design flow is the correct thing to do.' [CD98]

It is clear that changing design paradigms are encountering significant problems in education and reeducation. According to anecdotal observations by the XXIEC workshop organizers, there is widespread agreement that education and training have become the priority in advancing system design practice. A participant at the 1995 Workshop on Industrial-strength Formal Specification Techniques [WIFT95] reports that almost every technical discussion concluded with the mention of needs in education. At the 1998 Workshop on Formal Methods in Software Practice (FMSP98), all three members of the panel session entitled "If you could spend $1M extra this year on improving software quality, how would you allocate it?" nominated education. The FMSP98 keynote speaker focused on the creative expertise needed to use automatic verifiers effectively.

However, specific recommendations are rare. Pointed questions about what should be taught to undergraduates, or what constitutes a professional skill in formal methods, or even what preparation someone should have to do research, are usually answered with a shrug or the slogan "teach more mathematics." Even people involved in developing formal methods courses are uncertain about what background they would like to see in their incoming students. While most agree that the needed pedagogy must include experience in the use of computer assisted reasoning tools, it is also acknowledged that gaining enough experience to teach these tools is daunting.

This report consolidates the viewpoints of a group of individuals actively involved in the formal methods area at this time. Although the XXIEC participants included a number of field leaders, attendance was not a representative cross section of interested groups. Most of the attendees were academics, with just a few representatives from national laboratories and agencies (NASA, NSA, AFOSR), and only one or two representing industry. The participants might also be characterized as "biased toward hardware" with a disproportionate number representing higher-level approaches (i.e. "theorem prover types"). Nearly all the participants work in the United States, whereas the formal methods area is and has been quite international in character.

The remainder of this report has five parts. The following section briefly takes up the central issue arising in almost every discussion of formal methods education: whether the emphasis should be on broad-based, integrated education and training or on producing a limited pool of experts for critical applications. Next, an overview of the formal methods area looks at divisions of research, approach, and emphasis that are inevitably reflected in educational discussions. Third, we look at the constituencies involved in shaping and implementing a formal methods curriculum. Fourth, we review and consolidate educational recommendations, and describe courses under development by area leaders. Finally, we list some of the possible contributions a consortium effort might make, concluding with a short list of steps to take next.

2    The central question

Even allowing for the great diversity of opinion about what constitutes "formal methods," views vary greatly about how, and indeed whether, to build a pedagogy for it. As an illustration, we present excerpts of two statements responding to preliminary announcements of the XXIEC workshop. Both individuals chose not to participate. Comparing their reasons illuminates a fundamental debate about formal methods in CS/CE education.

In declining an invitation to participate as a panelist, a prominent academic wrote,

I must tell you that when I first saw this announcement, . . . I was turned off. It seemed to me to be a conference arguing that formal methods were an add-on feature of (software) engineering education, whereas I believe that they should be integrated into software engineering education in the way that calculus is integrated into other engineering, i.e. used in every single course. I also don't like talking about "formal methods" because I think that what is needed is something old, mathematics, not something new (. . . some committee reinvention of old mathematics in ad hoc ways). . . . My concern is that the workshop format might not provide a suitable occasion for a full discussion of a 4 year independent [program].

A contrasting point of view is seen in the following review of a draft workshop description. The reviewer is affiliated with a formal methods group at a government laboratory.

A fundamental premise underlying [the 21st Century Engineering Consortium] is that education is the solution to our problems in developing high assurance systems. This is false, because it presupposes that someone knows how to build such systems, and that everyone can do so if only they are taught how. This is false because (1) no one knows how to build high integrity software yet, and (2) even if someone did, not everyone will be able to do it. I believe the 21st Century Engineering Consortium is likely to do far more harm than good.

The first statement reflects a view that "formal methods" is simply a label for the maturation of a new engineering discipline. It should not be confused with basic mathematical preparation, nor should it displace it. As a workshop participant observed, "Calculus is a formal method" (although perhaps not the only formal method needed in CS/CE); other disciplines already provide models for incorporating methodological rigor, and they did not need consortia to accomplish it. The point is that the necessary pedagogy should be widespread and incorporated into the general curriculum of a broad undergraduate population.

The second statement associates "formal methods" with a more targeted need in highly demanding design applications. While brusquely dismissing education as the solution, the reviewer would probably support more exposure to formalisms and to reasoning tools. The key point is that the pedagogy is intended for a more limited pool of people with special expertise.

This debate between broad-based and specialized education reverberates in almost every forum and discussion. Is it better, as one workshop participant put it, to focus on the carpenter's methods or those of the architect? Individual educators will be guided by their personal philosophies and professional circumstances, as will advocates in government and industry. Those responsible for highly critical design applications need more experts, sooner, while those engaged in commercial design activities usually advocate incremental advances, taught to a large number of new engineers. It is vital in any discussion of formal methods to know where each discussant stands, or risk talking at cross purposes.

It may also be noted that both of these statements project an adverse intent on the notion of a consortium. Apprehensions about political and scientific implications were widely shared among those who attended the workshop. If these suspicions are not overcome, then a consortium would, indeed, become a splintering influence rather than a nurturing one. It is also widely believed that faculty resources to build formal methods pedagogy are extremely limited, so collegiality is of great importance.

3    Charting the formal methods area

The issues and perspectives reflected in this report are snapshots of a dynamic area of applied research. "Formal methods" is a widely used term but many regard it as nondescriptive, at best, and, at worst, as a somewhat derogatory reference to the academic idealization of real-world processes. The area remains something of a battleground of academic, practical, commercial, and philosophical perspectives. That the clashes are often intense and passionate is testimony to the community's sense of the area's importance. It also reflects the broad range of views on what formal methods is about and how it is applied.

This section surveys this diversity and summarizes the principal aspects shaping the formal methods area. Within this outline (it is hoped) it can be seen that apparent points of debate are more like points on a compass, reflecting not differences in judgment, but a full spectrum of approaches. In curriculum planning, it is important to understand that the intellectual foundation must eventually transcend transitory distinctions. In the interim, however, it is likely to be governed by them.

3.1    Published definitions

The XXIEC workshop announcement defined formal methods as:

that body of theory, technique, pedagogy, and tools that enable rigorous reasoning in system design practice. [XXIEC]

In a highly regarded treatise on formal methods, John Rushby writes,

`Formal methods' are the use of mathematical techniques in the design and analysis of computer hardware and software; in particular, formal methods allow properties of a computer system to be predicted from a mathematical model of the system by a process akin to calculation. [Ru93]

The series foreword of the Academic Press Series in Formal Methods says,

formal methods refers to the application of mathematical techniques for the specification, analysis, design, implementation, and subsequent maintenance of complex computer software and hardware. [DeHi96]

A 1996 Computing Surveys article says formal methods are

mathematically based languages, techniques, and tools for specifying and verifying [reliable] systems. [ClWi96]

These three recent definitions are generally consistent. Though broadly accommodating, each invokes three elements that here are taken to characterize the area:

Formal methods is founded on such areas of theoretical computer science as automata, complexity, semantics, and logic. However, it centers on concrete problems. As a consequence, formal methods researchers often serve as intermediaries between theory and practice (and are often perceived in either community as representatives of the other). In this respect, they carry information both ways, seeking both to advance the state of practice and to steer theoretical research toward pragmatic concerns.

3.2    Formal versus mathematical

In this context, the word "formal" invokes the notion of a formal system, or a calculus for the manipulation of syntax, graphs, and other structures, according to a determined set of semantically sound inference rules. Clearly, formalization is a prerequisite to automation, since computers are symbol manipulators. The creation of sound tools requires mathematical insight. The use of such tools sometimes requires an understanding of the underlying formalism, which, in turn, requires a certain depth and kind of mathematical background.
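
To make the notion concrete (a textbook illustration, not drawn from the workshop materials), an inference rule manipulates formulas purely by their shape; the familiar modus ponens rule is the simplest example:

    \[
      \frac{P \qquad P \rightarrow Q}{Q} \quad \textit{(modus ponens)}
    \]

A formal proof is a chain of such rule applications, which is exactly what makes mechanical checking possible.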

"Formal methods" does not mean "more mathematical methods" and, in particular, does not proclaim this area to be superior to conventional practices by virtue of its formality. Formal methods are rigorous because reasoning is symbolic, but this also greatly constrains their flexibility. Constructing a formalized argument is a different activity from proving a theorem or performing numerical analysis.

For example, the term "formal verification" is sometimes used to distinguish the use of decision procedures from conventional verification techniques that use simulation modeling. Of course, simulation also entails formalization (a hardware description language or physical description, but in any case a formal expression), and provides a supporting tool for reasoning symbolically about a design. Whether simulation classifies as a formal method depends on, well, the method with which the simulator is used. Formal and conventional verification are not alternatives; they are alternates. A practitioner will use both, as needs dictate; thus, one task of the formal methodist is to discover effective ways to integrate the two techniques.
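
The contrast can be sketched in a few lines of code (our own toy construction; the two-bit adder and the function names are invented for illustration). Random simulation samples the input space of a formal model, while a brute-force check of every case stands in for a decision procedure over the same model:

    import itertools, random

    def adder2(a, b):
        """Formal model of a two-bit adder: the design description."""
        return (a + b) & 0b111              # three-bit result

    def spec(a, b):
        """Intended arithmetic behavior: the specification."""
        return a + b

    def simulate(trials=20):
        """Conventional verification: sample the input space."""
        for _ in range(trials):
            a, b = random.randrange(4), random.randrange(4)
            assert adder2(a, b) == spec(a, b), (a, b)

    def verify():
        """Formal verification (here, by exhaustion of a finite space)."""
        for a, b in itertools.product(range(4), repeat=2):
            assert adder2(a, b) == spec(a, b), (a, b)

    simulate()
    verify()
    print("simulation and exhaustive check both pass")

Both functions operate on the same formal expression of the design; what differs is the method, which is precisely the point made above.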

3.3    Software versus hardware

A traditional demarcation in formal methods activity is the so-called "hardware-software boundary." Historically, there has been relatively little communication between the hardware- and software-oriented communities, which maintain nearly disjoint sets of conferences and archival media. However, if this mythical distinction ever made sense, it makes less every day. The difference is a matter of emphasis rather than substance. Furthermore, the continuing fusion of hardware and software -- as manifested in distributed computation, embedded software, codesign, configurable technologies, and chip level systems -- is unifying present methods of programming and design.

In software dominated applications, many factors impede formal methods adoption in the mainstream. The complexity of abstractions, the prevalence of dynamic memory allocation, and similar qualities make both theorem proving and model checking approaches difficult. It remains economically feasible and socially acceptable to release faulty systems and then "patch" errors in situ. Furthermore, users are tolerant of, or at least accustomed to, products that ship with errors. Ironically, the increasing presence of the World Wide Web perpetuates this situation by making incremental software distribution easier.

The corresponding factors are more favorable in hardware dominated domains. Data and control abstractions are concrete and lend themselves to formal analysis. Time to market is of paramount importance in the hardware community, while correcting a design error in the field is either impossible or prohibitively expensive; eliminating one design iteration is therefore worth a great deal of effort. The engineering culture accepts the need to design with testing and verification in mind. System architects are increasingly willing to sacrifice resources (i.e. chip area) to achieve system verification. Consumers are highly intolerant of design flaws.

In the meantime, technology is transcending any residual distinction. Software is becoming highly distributed, complex, and mission critical. Hardware is becoming more configurable and programmable. Hardware description languages and their associated design tools give hardware design a software feel, and allow design to be initiated at higher levels of abstraction. The integrated circuit industry increasingly uses software artifacts, such as simulation models and cores, to market its intellectual property.

Formal verification has a foothold in hardware design practice. Reasoning tools are commercially available and incorporated into a growing number of implementation processes. As practice adapts to these advances, new applications are found. Recently, there has been a substantial buildup of verification groups in the integrated circuit and hardware systems industries, prompting the CAD industry to offer competing capabilities. The relationship is one of positive feedback.

Formal software verification has been far less pervasive. Few commercial offerings are available. Outside of highly critical applications, little formal verification is performed. At FMSP98, for example, virtually all attendees (with the notable exception of the keynote speaker) were academic researchers, government representatives, or government contractors responsible for highly critical systems such as aircraft control. The trend in many software applications is toward using formal techniques in the specification phases, particularly in specifying hardware-like control and monitoring systems. It is often reported that the activity of formalization is itself beneficial and can only become more so as analysis tools are incorporated.

Further opportunistic expansion can be expected where algorithmic capabilities fit with analysis needs. As more software systems are seen as critical, the current "release and patch" approach cannot be sustained. Thus, we can expect to see a widening frontier at which some kind of formal analysis becomes mandatory.

3.4    Implementation versus properties

In a keynote talk at the XXIEC workshop, Amir Pnueli described two kinds of formal analysis. He observed that in property list verification, the specification to be satisfied is usually expressed in a different formal notation than the implementation to be analyzed. There is a sense in which a specification is a cumulative collection of orthogonal attributes. Qualities such as safety, liveness, security, fault tolerance, timing, and so on, are common targets of property verification.

Implementation verification addresses the primary design intent -- its "function" or "behavior." Here, two descriptions of the same design object, at different levels, are compared. The engineering process passes through several such levels, each implementation serving as the specification for the next level down. The relationship is often described as resolving the "what" and the "how" of a design. Typically, all levels are expressed in the same formalism, and verification establishes some semantic correspondence (e.g. equivalence, implication, containment, refinement) between the two expressions.
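
A toy sketch (ours, with an invented traffic-light example) of the two notions side by side: a safety property stated apart from the implementation, and an equivalence check between two descriptions of the same design:

    # A toy traffic-light controller as a finite transition system.
    STATES = ["red", "green", "yellow"]

    def step(state):
        """Reference description of the controller."""
        return {"red": "green", "green": "yellow", "yellow": "red"}[state]

    def step_v2(state):
        """A second, 'optimized' description of the same design."""
        return STATES[(STATES.index(state) + 1) % 3]

    # Property verification: a safety property stated apart from the design
    # ("the light is never green immediately after yellow").
    for s in STATES:
        assert not (s == "yellow" and step(s) == "green")

    # Implementation verification: the two descriptions agree on every state.
    for s in STATES:
        assert step(s) == step_v2(s)

    print("safety property holds; the two descriptions are equivalent")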

These distinctions are hard to pin down mathematically. Differences in syntax are often resolved by an embedding in a common semantic theory, or by translation to a common data representation. A specification is, by definition, partial. Behavior can be regarded as one of a system's properties. There is, moreover, significant interplay between the two notions of verification. For example, an isolated property such as deadlock freedom is often verified for a reduced specification in which irrelevant details have been abstracted away. Thus, implementation verification may be involved in demonstrating that the original specification is a genuine instance of the abstracted one.

While formal verification is often defined as "proving something correct," it is more realistically described as a process of repeated refutation. In whatever form they take, proof attempts fail far more often than they succeed, and the practical value of verification is measured in terms of the errors it exposes. A proof failure may be due to a design error, or it may disclose a fallacy in specification, or it may be resolved by constraining how the design object is used. Properties thus isolated are byproducts analogous to the data sheet of a hardware device or a product-use disclaimer for software. Increasingly, these artifacts are recognized as value-adding when packaged with the product.

3.5    Requirements versus design

Designs are situated in the physical world and are conceived to solve physical problems. Design formalizations, then, must be situated in the appropriate mathematical models. Domain modeling refers to the construction of these models. This can be a means of communication between domain experts and design specifiers, or at least a means of recording that communication. Discussions of mathematics in formal methods education sometimes confuse the background needed to model physical domains with that needed to understand formal manipulations.

The bridge between problem definition and solution specification is often called requirements analysis. A number of pilot studies have reported that the process of formalizing a specification is worthwhile, whether or not automated reasoning is applied to the formalized object. Indeed, this is sometimes what is meant by the term "formal methods," contrary to the definitions given in this report. In software domains, tools for constructing specifications and performing static analysis tasks (e.g. type checking, dependence analysis, consistency and completeness checking, overflow analysis) are now of proven value in critical applications.

Implementation verification, or checking for conformance between a source and a target design description, is what most people think of first when the subject of formal methods comes up. As was mentioned earlier, there may be several strata at which verification is applied in the design process. In hardware applications, a distinction is made between verification, which applies to descriptions of design objects, and testing, which applies to the objects themselves. This distinction is not so sharp in software, even though it is still valid.

Formal methods also have applications in retrospective analysis. One example is failure investigation. Another is reengineering, where an obsolete device must be retargeted to newer technology. Hardware retargeting is a substantial problem in military systems, which may have unusually long shelf lives, and in safety critical systems, which have protracted approval processes. However, these applications are not seen routinely.

3.6    Automated versus interactive

Formal reasoning always involves a tradeoff between automation and expressiveness. Algorithmic approaches emphasize automation; they are based on highly developed decision procedures applied at concrete levels of description. Expressive power is emphasized in interactive formalisms; the reasoning tool is there to assist a creative user in performing a formal analysis or reduction.

There is a logical hierarchy, including decidable theories (propositional logic, boolean algebra, finite extensions, weak arithmetics, etc.); equational logics over either fixed theories (arithmetic, usually) or abstract data types; function domains; process logics; first order and higher order predicate logics; and metalogics (logics for describing and comparing other logics). One may add temporal or modal extensions providing a form of implicit quantification (e.g. "eventually" means "there exists some time in the future," time being implicit). Model checking involves temporal extensions to propositional logic; program logics can be viewed as modal extensions, or temporal extensions, or both.
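
The remark about implicit quantification can be made precise. Over a linear trace \sigma (one standard rendering among several; nothing here is specific to any particular tool), the basic temporal operators unfold to explicit quantification over time:

    \[
    \begin{aligned}
    \sigma \models \Diamond p &\iff \exists\, t \ge 0.\ \sigma(t) \models p
      && \text{(``eventually $p$'')}\\
    \sigma \models \Box p &\iff \forall\, t \ge 0.\ \sigma(t) \models p
      && \text{(``always $p$'')}\\
    \sigma \models \Box(\mathit{req} \rightarrow \Diamond\,\mathit{ack})
      &\iff \forall\, t \ge 0.\ \bigl(\sigma(t) \models \mathit{req}
        \Rightarrow \exists\, t' \ge t.\ \sigma(t') \models \mathit{ack}\bigr)
      && \text{(response pattern)}
    \end{aligned}
    \]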

At every level there is a choice of inference mechanism. Model checking can be seen as a symbolic form of simulation. Verification is usually cast as the deductive proof of satisfaction between implementation and specification, but it can alternatively be performed as a derivation that reduces a specification to an implementation by means of refinements and transformations.
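
As a sketch of the first point (an explicit-state toy of our own; production model checkers are symbolic and vastly more sophisticated), checking an invariant amounts to exhaustive exploration of the reachable states, a kind of all-paths simulation:

    from collections import deque

    def check_invariant(initial, successors, invariant):
        """Breadth-first reachability: return a violating state, or None."""
        seen, frontier = set(initial), deque(initial)
        while frontier:
            s = frontier.popleft()
            if not invariant(s):
                return s                    # counterexample state
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return None

    # Toy example: a modulo-8 counter must never reach the value 13.
    bad = check_invariant(initial=[0],
                          successors=lambda n: [(n + 1) % 8],
                          invariant=lambda n: n != 13)
    print("invariant holds" if bad is None else "violated at %s" % bad)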

Higher level formalisms admit innumerable ways to model design objects, just as one may choose among many mathematical formulations of a given phenomenon. One prevalent example is the choice between functional and relational representations of processes in logic. Some treatments translate design objects directly into logical expressions, called shallow embedding, while others introduce a design language and interpreter, called deep embedding. The choice is not determined by the formalism, but by the task for which formalization is used.
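
A small sketch of the two styles (our own; the gate names are invented). In a shallow embedding, design objects are ordinary functions of the host language; in a deep embedding, they are explicit syntax that an interpreter, or any other analysis, can traverse:

    # Shallow embedding: circuits are just host-language functions.
    def nand_shallow(a, b):
        return not (a and b)

    # Deep embedding: circuits are syntax trees plus an interpreter.
    class Var:
        def __init__(self, name): self.name = name
    class Not:
        def __init__(self, e): self.e = e
    class And:
        def __init__(self, l, r): self.l, self.r = l, r

    def evaluate(c, env):
        if isinstance(c, Var): return env[c.name]
        if isinstance(c, Not): return not evaluate(c.e, env)
        if isinstance(c, And): return evaluate(c.l, env) and evaluate(c.r, env)
        raise TypeError(c)

    nand_deep = Not(And(Var("a"), Var("b")))

    # Both describe the same design; only the deep embedding can be
    # inspected, transformed, or translated by other tools.
    for a in (False, True):
        for b in (False, True):
            assert nand_shallow(a, b) == evaluate(nand_deep, {"a": a, "b": b})
    print("shallow and deep embeddings agree")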

3.7    Is there common ground?

Whatever the application domain, whatever the technology, whatever the design phase, and to whatever degree the reasoning is automated, judgment, technique, and ingenuity are required to perform the complex analysis tasks in system design. It is accepted that different design aspects require different approaches; there is no universal "formal method." The expertise lies in the ability to use the appropriate techniques in concert. Although heterogeneous reasoning environments are advancing all the time, integration must take place in the mind of the engineer, either during design or in establishing processes in which other specialists can use the tools effectively. These intellectual skills lie at the heart of formal methods education.

A foundation of intellectual preparation is being distilled from current research and exploratory practice. The sense of the XXIEC workshop was that we have progressed far enough to define this common ground. This definition process would be greatly facilitated by creating forums devoted to that purpose, assuming that all interest groups participate. It must be recognized, however, that this is the beginning of a consensus building effort that will, in the best case, take several years of involvement.

4    Constituencies

This section describes five overlapping groups involved in advancing formal methods---educators, researchers, practitioners, developers, and advocates. Of course, these must be very general characterizations, since in each group one will find a diverse range of perspectives. The focus is on what each group brings to and takes from the formal methods movement, and how this might impact educational initiatives.

4.1    Educators

An educator's main product is students who graduate with the knowledge and skills to carry the field forward. Pedagogy, in the form of textbooks, course notes, laboratory projects, and so forth, is a marketable byproduct. In applied disciplines, the educator's job is to distill decades of practical experience into a few years of relevant preparation. Since the primary concern is preparation for a lifetime career, teachers tend to take a long view, favoring principles and preparing their students to "learn how to learn."

They also want (and need) their students to find jobs, of course. Curriculum development in CS/CE must ultimately navigate by the job market, which is the indicator of technological trends. Formal methods is an exacting discipline and not superficially glamorous. To attract students into the classroom and convince administrators that new pedagogy is justified, educators must be able to point to demand from industry and government.

To implement the broader objectives of engineering education, we must also entice more educators to become involved in formal methods. It is a difficult area to "come up to speed" in. The few textbooks that are available vary greatly from one to another in both notation and topics covered. Publishers do not see a sufficient market to become involved, and this also influences authors. The greater challenge, though, is becoming proficient enough in the use of reasoning tools to teach with them. Occasionally, workshops and tutorials are offered through conferences or by research organizations, but a short course can only provide a jump start. Even with a consultant, it can take several months of work to gain simple proficiency.

4.2    Researchers

Researchers drive the advancement and deployment of reasoning technology. Notwithstanding some recent inroads, formal methods remains an exploratory endeavor. Progress occurs in two ways: by spreading out from established beachheads and by making incursions at new points of application. Both entail the kind of motivation and risk taking that is only found in research.

Researchers' needs include material resources to conduct their work and access to examples appropriate to their investigations. Above all, they need students prepared to advance research. In other words, research is not only a contributor to but also a consumer of formal methods education. As the intermediary between academia and industry, the research community must set the agenda for undergraduate education. With this primary role comes an implied mandate to present research experience in both practical and academic venues.

In areas such as formal verification, research is already migrating from the academic to the commercial sector. This will, of course, change the patterns and modes of dissemination. Some academics are concerned about how this trend will impact curriculum development, since there is a risk that communication channels between industry and educational institutions, already considered inadequate, may become further restricted.

4.3    Practitioners

Practitioners are directly engaged in system design activities or with evaluating formal methods in the industrial setting. Among this diverse population, common themes arise in discussions of the formal methods challenge. A list of concerns found in [ClWi96] includes such phrases as quantifiable benefit, conceptual compatibility, stability, and integratability. When pressed for educational recommendations, industrial practitioners tend to advocate broad-based curricula and technical training. As an XXIEC participant pointed out, his company has just a handful of formal verification experts, but hires hundreds of engineers.

Commercial system design is a vastly complex undertaking, and formalized analysis is just one element. To make inroads in practice, innovation must be minimally disruptive and must carry a high likelihood of benefit. In addition, there is an understandable emphasis on the critical path, that is, on applications to the most complex designs using the most aggressive technologies. Such tests under fire are not ideal for systematic evaluation, but should be viewed as a means of gaining credibility to justify longer term investment.

There is evidence of gain in the integrated circuit industry, where some large companies are expanding their hiring of formal methods experts. This would suggest that curriculum developers can get correspondingly more substantive feedback about needed skills. However, it is unclear how to gather this feedback, since there are few opportunities to consult with industry about this issue.

4.4    Developers

Developers govern the transfer of design technology, and hence also methodology, from research to practice. The hardware industry's early adoption of formal verification has fostered a growing commercial enterprise, which nurtures fundamental changes in engineering processes. Commercial tool development, either in-house or in the CAD market, is where hiring demands are now concentrated. Naturally, CAD vendors are interested in expanding their market and customer base. Hence, they tend to favor technologies that readily integrate with existing design environments and training programs. This generally translates into strong advocacy for tools that can be broadly deployed and for incremental advancements in methodology.

At the same time, the CAD industry has been, until recently perhaps, the principal consumer of formal methods expertise, hiring those skills necessary to create competitive products and establish new markets. Now a formal methods consulting enterprise appears to be emerging, with a few companies specializing in integrated verification tools and in providing on-site consulting services for system specification and analysis. Software vendors are breeding expertise vendors, which forecasts greater demand for education in the use of reasoning tools rather than in their implementation.

CAD researchers have declared a grand challenge in the design of full systems on silicon [NSF96]. Although verification and test are critical problems, formal methods, per se, is not stated as a solution path at this time. Certainly, present design technology, formal or not, is inadequate for systems of present complexity, but addressing this inadequacy involves not just new methods, but new paradigms unifying software and hardware design.

4.5    Advocates

Formal methods advancement has been strongly influenced by government subagencies responsible for design-critical applications. These agencies point to both a shortage of expertise and inadequate technology as critical issues to be addressed. In particular, research at higher levels of reasoning finds interest here, often through sponsorship of commercial pilot studies. Typically, these projects are quite different from the explorations in the CAD and integrated circuit industries. There is more emphasis on the use of general reasoning tools, and on focusing special expertise on low-volume design components with extremely high costs associated with failure. This includes safety-critical systems, of course, but also missions to distant or hostile environments, systems involving the communication of money or intelligence, and so forth.

Government funding patterns in formal methods have diminished or become disorganized in the past five years. It is perceived that DARPA, which has been the principal source of support in the United States, regards its job as done. What remains is seen as primarily a matter of technical transfer to be taken care of by commercial interests. The National Science Foundation evidently does not identify formal methods as a targeted area of support. This may partly be due to ambiguities discussed in Section 3. At NSF, technical understanding remains at the project director level, with little explicit acknowledgment at division or directorate levels. Similarly, the National Security Agency, which has historically overseen long-term research activities, seems to have abandoned or marginalized its interests.

The individual formal methods lead groups in government seem to be getting more isolated, even while they enjoy a sense of success in the past decade's initiatives. The progress would seem to justify further investment in, rather than abandonment of, research guidance. Positions may be reconstituting themselves through the HCS computing agenda [CCIC98]; however, these initiatives have not yet translated to actual resources. If a critical mass is forming, it has not yet materialized.

5    What is needed in formal methods education?

5.1    Education versus training.

An FMSP98 panelist suggested that in-house or computer-based industrial training was the most cost-effective approach to deploying formal methods and would, in addition, lead to more direct measures of benefit. On-site training may also be a tactic to engage educators with practitioners. Left to themselves, most academics prefer not to promote themselves as trainers, but those in applied fields do understand the need for professional tool skills and exposure to practical problems. The following statement from a recent NSF document is applicable here.

Broadly speaking, education usually refers to courses and similar didactic activities that equip students with an understanding of basic facts and concepts, and with facility in use of basic methods such as those provided by mathematics. Training generally refers to activities that prepare students to use the methodology and technology needed for original research in particular areas. Obviously, instruction can play a significant role in both education and training activities. Thus, and especially at the graduate level, it can be difficult at times to categorize a particular activity as either education or training. [NSF98-116]

This distinction is nowhere more blurred than in formal methods. Incorporating technology in courses is always difficult, but it is also commonplace in mainstream computer science and engineering education. The problem for formal methods is that neither the methodology nor the technology is very advanced compared to the compilers, editors, operating systems, browsers, and office tools that students already must know. Most people who have developed courses in this area emphasize either methodology or tool use at the expense of the other.

5.2    Formal methods in the core curriculum

There was nearly universal agreement at the XXIEC workshop against fixing a curriculum for formal methods content. It would be highly premature and wrong to impose an approach or treatment, when what is really needed is competition among them. In engineering, setting accreditation standards is simply out of the question on both pragmatic and intellectual grounds, even if there were an adequate number of educators to implement them -- which there is not.

A substantial number of workshop participants agreed with the statement, quoted earlier, that conventional mathematics provides a better foundation than some arbitrary collection of ad hoc formalisms. Mathematical insight is essential in formal methods practice. At the same time, there are topics such as structural induction, automata, semantics, boolean analysis, and others, that could be presented with a formal-methods slant.

Relevant topics and threads could be nonintrusively incorporated in existing core courses. Courses in programming and digital design should present engineering methods incorporating specification, implementation, verification, and testing. In software, attention to testing and verification methods should be more in proportion to what is seen in practice. The concept of algorithm should be extended by examples of nonterminating processes and embedded reactive software. Students should be introduced to properties of security, reliability, and safety throughout the core courses. Decision procedures for verification offer illuminating examples of advanced algorithms and data structures. Performance analysis courses should include discussions of timing analysis and real-time constraints.
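
As one illustration of the last two points (a classroom-sized sketch of our own, not a recommendation of any particular tool), even a minimal backtracking satisfiability check exercises recursion, data structures, and complexity themes that already belong in the core:

    def sat(clauses, assignment=None):
        """Tiny backtracking satisfiability check for CNF formulas.

        clauses: list of clauses, each a set of nonzero integers,
        where literal -v is the negation of variable v.
        Returns a satisfying assignment (dict) or None.
        """
        assignment = dict(assignment or {})
        simplified = []
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                                # clause already true
            rest = {l for l in clause if abs(l) not in assignment}
            if not rest:
                return None                             # clause falsified
            simplified.append(rest)
        if not simplified:
            return assignment                           # every clause true
        var = abs(next(iter(simplified[0])))            # branch on a variable
        for value in (True, False):
            result = sat(simplified, {**assignment, var: value})
            if result is not None:
                return result
        return None

    # (a or b) and (not a or b) and (not b or c)
    print(sat([{1, 2}, {-1, 2}, {-2, 3}]))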

Another point of general agreement is the need for both notational and mathematical coherency. Students need exposure to many levels of abstraction, but the greater need lies in a unifying set of mathematical principles and concepts. Several approaches exist to building this foundation, and each has a distinct notational framework reflected in a distinct tool set. In developing courses, it is tempting to emphasize breadth and comparative expressiveness, as, for example, between two kinds of model checkers. However, it is more valuable to emphasize depth in early courses. Notation and formalization are secondary to problem analysis and complexity management, the intellectual abilities that the student must develop first.

5.3    Formal methods in the upper division undergraduate curriculum

In upper division undergraduate courses, the objective is preparation for further graduate study or for a professional concentration. Since it is often the case that only one or two faculty members in a department are formal methodists, one or two courses at this level may be the only practical strategy for pedagogy development. Upper division courses are also the proving ground for proposed enhancements to the core. At this level, educators see exposure to tools and case studies as worthy goals, if difficult to achieve. Interinstitutional collaboration, together with industrial involvement, offers the best prospects for rapid progress.

Michael Gordon and Amir Pnueli, both invited speakers at the XXIEC workshop, outlined course work they are developing at this level. These offerings can serve as models for course design elsewhere.

Michael Gordon's formal methods course at Cambridge is roughly equivalent to a two-semester course sequence in the U.S., at either a senior undergraduate or first-year graduate level. A key aspect of the course is its basis in a uniform formal framework---higher order logic---encompassing hardware and software. The emphasis is on formal representation methods, with some rigor but without involving tools. Building tool-use proficiency, according to Gordon, would probably take a third course.

Gordon's general curriculum recommendations are pragmatic [presentation visuals]. Progress in formal methods depends on its integration with, rather than displacement of, existing techniques. He suggests working toward meeting the expressed needs of the design industry. In the case of digital system engineering, this includes looking at verification methods in practice (e.g. use of simulators to explore design behavior) and introducing alternatives (e.g. symbolic execution and model checking). Doing this requires introduction to boolean logic, temporal logic, the basics of test vector development, automata based specification of control, simulation based verification, and a selection of topics in modeling and analysis.

Amir Pnueli's proposed curriculum, also a subject of his 1996 Turing Award lecture, harmonizes with Gordon's. He is developing a two-course early graduate sequence which could conceivably migrate to undergraduate levels. An outline of representative topics includes: specification using automata and temporal logic; "enumerative" model checking; "symbolic" model checking; abstraction/decomposition methods; deductive temporal verification; verifying refinements, algorithmically and deductively; specification and verification of complex data structures. Example follow-up topics mentioned were real-time systems and digital/analog hybrids.

Pnueli strongly endorses the development and use of tools to ground this pedagogy. However, he cautions against descriptive surveys comparing several tools of similar functionality: one must pick an approach and explore it in depth. The most daunting challenge facing someone creating a course is making this choice. There is a multitude of alternatives at each level of reasoning, but few present a uniform interface to the user.

5.4    Formal methods at the graduate level

There is sufficient material for a professional degree program in formal methods. It is useful to consider what the actual content of such a program might be, if only to clarify what skills someone who claims to be a specialist should have. However, very few institutions have the resources to implement a professional degree, and those that do are more likely to prioritize advanced research. One solution may be to encourage multi-institutional collaborations among the more isolated research groups. Such a program would include exposure to the mathematical theory in logic, computation, and complexity; surveys of the area's structure along the lines of Section 3; application to a variety of problem domains; experience in large scale system design (possibly through internships); and implementation issues for reasoning systems.

Another approach to postgraduate study is to develop special programs in reeducation, designed for engineers taking three- or six-month educational leaves from industry. Many universities offer credentials for limited programs of focused study. This seems like an avenue that could foster industrial-academic interaction, but no such programs in CS/CE were known to the XXIEC workshop participants.

6    What would a formal methods consortium do?

The two main directions identified at the XXIEC workshop were better orchestrated promotion of the field and laying the groundwork for efficient global collaboration. Both reflect the view that the formal methods community, though significant in number, is diffuse in character.

6.1    Build support

A broader cross section of the formal methods community should be engaged. The best way to do this is to hold additional forums devoted to education. Most people agree that there are already abundant research meetings in this area, owing to the many orthogonal aspects discussed previously. So subsequent meetings on formal methods education must be held in conjunction with existing forums. On the other hand, experience has shown that it is unproductive to address education topics in subsidiary sessions within a research conference. Such sessions are often seen as afterthoughts and treated as peripheral. There is also an issue of momentum; government and industry must continue to express direct interest in education if participation is to grow.

6.2    Consolidate advocacy

The National Science Foundation's MOSIS facility is often mentioned as a model for effective infusion of a target technology into academia. Conditions do not exist for a similar initiative in formal methods, but it is a useful exercise to contemplate what a "MOSIS for formal methods" might look like. If there were a consensus in government and industry that formal methods and the associated technologies were important in maintaining leadership in the Information Age, what incentives could be created to increase their exposure in degree programs?

Almost certainly, one ingredient would be the creation of a tool set, analogous to the University of California at Berkeley's CAD tools and Mississippi State's cell libraries for VLSI in the 1980s. There is no "foundry" service associated with the teaching of formal methods, but realistic problems require computational resources that undergraduate institutions and smaller research groups do not have. Administering servers for large scale verification exercises would increase the pool of students with exposure to these topics.

Government funding for formal methods research is diminishing. If this trend cannot be reversed, then those isolated agency advocates must work collectively to raise its visibility. They must speak with a common voice about targeting formal methods in educational funding. Cooperation within commercial, governmental, and educational sectors is a prerequisite to collaboration among them.

6.3    Maintain a repository

Obviously, any consortial activity will manifest itself as a site on the World Wide Web. One way to appraise the prospects for a consortium is simply to create such a site and see how it is frequented. However, attention drawn by such a site depends, in part, on the effort put into it. It must also be noted that there are already a number of excellent web sites for formal methods research, so that creating a repository is an integrative activity.

Projects, threads, lectures, and modules. Whether the goal is to infuse standard courses with formal-methods content or to create courses on the topic, the start-up costs are high. A collection of tutorial examples would help an educator get over the threshold of using a particular tool set and developing a particular line of topics. A monitored repository could go further, cataloging contributions according to complexity, problem class, and reasoning technology. Taking this a step further, the people involved in formal methods have the technical competency to make portions of their material interactive. Provided the support infrastructure were available, these materials could be opened for network access. Springer's STTT/ETI journal and software repository is an example of this kind of collaborative initiative.

Bibliographies and archival reference. Much of the publication history in formal methods has been in workshop and conference proceedings, and even in limited distribution reports. Although it would be quite an undertaking to create a comprehensive list of readings from scratch, several bibliographies already exist. A consolidation would be of considerable value to someone building courses, and to students interested in exploring the area. It might also help in consolidating subtopics.

Documentation, manuals, and notes. Often, the existing documentation needed to learn and use a reasoning tool is marginal or out of date. In addition, manuals are often highly application oriented, with broader uses appearing over time in the research literature. Educators and their students can participate directly in advancing formal methods by augmenting the documentation now in use. Many CAD companies maintain university programs with this purpose in mind. At present, however, most reasoning tools are already in the public domain or are under development by start-up companies with no reserves to engage with educational efforts. A more central distribution point would be helpful.

Communication and collaboration facilities. Surely, a primary function of a consortium would be to create communication channels among all groups with an interest in formal methods advancement. It can put people in touch with each other, matching interests and needs. Centralized lists of financial-aid opportunities, program announcements, postdoctoral positions, and job openings would lend a sense of critical mass. Similarly, a referrals listing would put those with problems in specific domains in contact with qualified academics interested in gaining field experience.

Training materials and short courses. It is symptomatic that educators are now attending training courses offered by industry and government groups. Why is it not the reverse? The situation reinforces beliefs that the research has outpaced academia to become the exclusive domain of commercial enterprise. While this misconception ignores the true nature of the symbiosis, it does reflect the real difficulties of matching the rates of methodological and technological progress. Over the longer term there will be feedback; but the response time can be improved by progressive programs in reeducation, distance education, and short courses.

Cookbooks, engineering handbooks. A universal aspect of formal methods is understanding how to represent properties and specifications in the given formalism. At more abstract levels, there are usually many alternative ways to structure a model. At more concrete levels, a commensurate effort may go into expressing a particular property. While understanding a particular representation may be a matter of education, exercising this knowledge is also a matter of experience and recall. As in other aspects of engineering mathematics, this activity should result in a compendium of common techniques and representations. (See, for example, the FMSP98 paper by Dwyer et al. on patterns in temporal logic.)
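
For illustration (standard linear-time renderings of two commonly cataloged patterns; not an excerpt from that paper), entries in such a compendium might look like:

    \[
    \begin{aligned}
    \textbf{Absence (globally):}\quad & \Box\,\neg p
      && \text{``$p$ never occurs''}\\
    \textbf{Response (globally):}\quad & \Box\,(p \rightarrow \Diamond q)
      && \text{``every $p$ is eventually followed by a $q$''}
    \end{aligned}
    \]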

Componentware, special configurations, and front ends. In discussing the inclusion of reasoning tools in courses, some people call for "pedagogically appropriate" versions of tools while others want "the real thing." The choice depends on several factors, including not only allowances for the relatively short learning window, but also whether the intent is to show how a tool is used or how it works. To aid this process, developers of research prototypes--especially at more abstract levels--should strive for an open, object-oriented software architecture in the hope that components of their work can be reused.

Maintained and integrated libraries. The term "library" is used for collections of formalized theories and facts. Common examples include binary vectors, arithmetic algebra, graph theory, streams, and so forth. Researchers addressing a particular problem class often must recreate theories for their own tools. Efforts to consolidate and share libraries are often marginal because there is no direct source of support for this labor-intensive activity. This is not just a data management problem; the maintainers must be experts, not only in the systems involved, but also in the underlying mathematics.

Benchmarks and formalized objects. Within the past five years, experimentation and case studies have approached realistic scale for designs in widespread use. The artifacts of this research may be useful in commercial designs that are not pushing technology too hard, or as starting points for other case studies. If the research community is committed to higher quality and reliability, then one way to achieve it is to make these artifacts available for general review and possible use.

Serve as a continuing forum. A consortium must be a clearinghouse for issues and ideas. In an area as potentially contentious as formal methods, an organization perceived as exclusionary will defeat its own purposes. On the other hand, there is general (though not universal) agreement that education is the priority, and this is the glue that should hold a consortium together. In addition to newsgroups and mailing lists (of which there are already several related to research), other forms of participation include position statements, accounts of historical background, promotion of panels or sessions at research conferences, and so forth.

7    Next steps

It remains to be seen whether the existing interests, which are by no means uniform in their goals, will congeal into a consortium. Despite the apparent lack of "critical mass" at present, government, commercial, and academic sectors each have a voice in the formal methods community. Awareness of the need for advances in this area is clearly rising. This is a moment of opportunity to consolidate efforts and accelerate progress in system design curricula. Formal methods is hardly a "done deal," nor is its deployment merely a question of technical transfer. At all levels its use involves new concepts and techniques that are only beginning to distill from practice. Advancement remains a synergy of research, experience, and preparation. Methods will surely improve with time, but we can influence the rate of that improvement. With regard to education, proactive steps include the following, among others.

Identify formal methods as a funding target in educational programs. Education is receiving unprecedented support in U.S. government funding. However, formal methods is not yet recognized as a "coherent body of knowledge" [CCIC95, Part III.D] on which to base an educational project. Moreover, educational funding in science and engineering, at NSF in particular, greatly emphasizes multidisciplinary undertakings, and reviewers often do not regard formal methods as fitting the multidisciplinary mold.

The agenda for formal methods education includes work targeted at both undergraduate and graduate levels, not to mention infusion of research at both levels. This is a broader agenda than most educational programs conceive. A single project aimed at both undergraduate and graduate course development is unlikely to fare well in current NSF educational program reviews. It must also be acknowledged that the goals in formal methods education are not as clearly delineated as in other areas of announced emphasis. Target student populations are not as clearly identified as they are in science and medicine, for example. Outcomes and assessments are impossible to detail sharply. The tradeoff between tool skills and conceptual skills is not well understood. Commercial partnerships are tentative.

The big picture is that education is clearly in the critical path for the advancement of formal methods, and formal methods is a crucial avenue for achieving greater assurance, higher quality, and economic competitiveness in system design. With this realization, agencies traditionally chartered for research support should look for ways to subsidize educational aspects, possibly through training initiatives and pilot studies. At the very least, they should lobby granting agencies, informing them that progress depends on some fundamental shifts in the character of CS/CE education.

More and better focused research funding. The best people to teach formal methods are people who are actively doing research in it. They are also the prime producers of new educators and developers of new tools a broader population of people can use. People cannot and will not do research on formal methods unless they can obtain grants, contracts, and fellowships for this purpose. Thus, the most direct way to stimulate activity is to provide these incentives. Unfortunately, the funding trends have been regressive in recent years. In DARPA, a rebudgeting in 1996 devastated five major formal methods research projects and sent a very ominous message to new researchers. Two of the most prominent formal methods research enterprises in the United States have closed their doors in recent years because of dwindling support.

Attract students and convince university administrators. Formal methods practice involves a set of highly demanding skills requiring a relatively strong background in mathematics and computer systems. In a period when CS/CE enrollment is not meeting demand, formal methods competes for students with areas such as scientific computation, graphics and visualization, human-computer interaction, and so on. These areas are not only enticing, but also have well-established advocacies in administrative circles. It is fair to say that advancement in high confidence systems depends on raising awareness of its importance to design practice, in both legislative circles and academic administrations. For the latter, the clamor of interests in high performance computing, human-computer interaction, instrumentation, electronic information bases (the list goes on and on) drowns out smaller advocacies, even if they are well focused.

Nurture industrial interactions. Industrial pilot studies involving academics, often subsidized by sabbatical leaves, government incentives, and so forth, have made significant contributions in research and to the spread of acceptance for formal methods. They don't always succeed, but when they do, they produce engineers with new skills and academics with fresh insights. Among the problems for starting a pilot study are differences in outlook ("time-to-benefit" expectations) and culture, as well as a fair degree of intimidation both ways. Encouragement on the part of contract givers can be influential. In some recent cases, a government agency has actually served as a "dating service," finding people, putting them together and subsidizing their preliminary work. It has also been suggested that providing incentives to formalization in government contract solicitations is more likely to spark competition than mandating it.

Conduct an outreach survey of educational needs. As noted earlier in this report, fully specifying a formal methods curriculum is out of the question. However, there should be more concrete discussion about educational needs. Given a forum to do it, industries and research enterprises can undoubtedly compile a set of topics and skills that they would like to see in new graduates. Such a list is the raw material that academics can refine into a curriculum. The most effective approach may be to gather information at the source. Several industrial invitees to the XXIEC workshop reported problems convincing their supervisors (or themselves) that the trip would be measurably worthwhile. As a prelude to gaining greater involvement outside the educational establishment, a campaign of direct contact should be undertaken to survey people in the field about specific educational recommendations. This would be, in effect, a continuing workshop in which a team of academics makes site visits for half-day intensive discussions with interested groups.

Engage more faculty. Significantly increasing the number of students practicing formal methods entails significantly increasing the number of educators teaching it. One participating educator put it this way:

Formal methods are hard. I feel that most mortals like me just need more help in it than in other subjects. This points to the need for good teaching methodologies and techniques, good class examples, good homework problems and suggestions, etc. The web site [FMEd] you have set up is definitely the kind of help we are looking for.

Subsidize a repository. The World Wide Web link cited above was created in response to suggestions made at the XXIEC workshop. It is being maintained on a voluntary basis, and is essentially an information hub. Proactive interests should contemplate a more actively developed resource in which materials are consolidated. The creation of pedagogy through textbook publication is a traditional source of income for educators; hence, a quality repository on the web will need to provide an alternative reward structure for its contributors.


Higher quality systems are essential to progress in the Information Age. Formal methods is not the only route to higher quality, but it is certainly a component of achieving larger, safer, more secure, and more robust systems. Ultimately, this claim will be substantiated by economic realities, and educational institutions will eventually respond. The degree and rate of responsiveness, however, depend on initiatives taken by concerned interests in industry, government, and academia. The central mission of a 21st Century Engineering Consortium would be to promote those interests by providing a better orchestrated voice.

8    References

[DeHi96] C. Neville Dean and Michael G. Hinchey (eds.). Teaching and Learning Formal Methods. International Series in Formal Methods, Academic Press, San Diego, 1996. A collection of papers by people building curricula in (or related to) formal methods.
[CD98] Formal verification: essential for complex designs. Computer Design 37(6) (June 1998). A roundtable on trends in formal verification, primarily from the hardware system perspective.
[ClWi96] Edmund M. Clarke and Jeannette M. Wing, et al. Formal Methods: State of the Art and Future Directions. ACM Computing Surveys 28(4):626-643 (December 1996). A review of recent case studies and results in requirements specification and formal verification at both high and low levels of abstraction. A strategic agenda for research, and how it relates to needs in government and industry.
[CCIC95] America in the Age of Information: A Forum. Report of a meeting sponsored by NIST/CCIC. A forum of members of the research community, "designed to refine and further detail the six Strategic Focus Areas introduced in the CIC Strategic Implementation Plan." One of these areas is High Confidence Design.
[CCIC97] Research Challenges in High Confidence Systems. Proceedings of the Committee on Computing, Information, and Communications Workshop, August 6-7, 1997. Follows up on [CCIC95] with perspectives offered by various government agencies, leading to [CCIC98]. Formal methods is presented as one of several approaches explored.
[CCIC98] Setting an Interagency High Confidence Systems Research Agenda. Proceedings of the Interagency High Confidence Systems Workshop, March 25, 1998. The third in a series of meetings on high-confidence systems. Included in the report are research overviews from the Federal Aviation Administration, Federal Railroad Administration, Federal Transit Administration, Food and Drug Administration, National Library of Medicine, Department of Treasury, Secret Service, National Aeronautics and Space Administration, Department of Energy, Nuclear Regulatory Commission, National Institute of Standards and Technology, and National Institutes of Health.
[PCCIP] Critical Foundations---Protecting America's Infrastructures. The Report of the President's Commission on Critical Infrastructure Protection, October 1997. A broad discussion of the increasing societal dependence on and vulnerability to computer information systems. Includes policy recommendations.
[XXIEC] The 21st Century Engineering Consortium Workshop, sponsored by Rome Labs and ITO, held March 18-19, 1998 at Melbourne, Florida. This report is an outcome of the workshop.
[FMEd] Formal Methods Education Resources. A web page created in response to suggestions made at the XXIEC workshop.
[Sa96] H. Saiedian, et al. An invitation to formal methods. Computer 29(4):16-32 (April 1996). A collection of essays on formal methods, primarily from a software perspective.
[WIFT95] IEEE Workshop on Industrial-Strength Formal Specification Techniques. Group reports of the 1995 workshop, cited herein, are accessible through the workshop's website.
[FMSP98] Mark Ardis (ed.), Second ACM SIGSOFT Workshop on Formal Methods in Software Practice, March 4-5, 1998.
[NSF96] Gaetano Borriello (ed.), Future Research Directions in CAD for Electronic Systems. Workshop sponsored by NSF/CISE/MIPS, Seattle, May 1996. Primarily focused on renewed efforts in the area of design automation for systems. Discusses the need for higher-risk research looking for methodological breakthroughs rather than incremental advances. The appended survey gives insight from the perspective of CAD system developers.
[STTT/ETI] B. Steffen and W.R. Cleaveland (eds.), International Journal on Software Tools for Technology Transfer, and the Electronic Tool Integration platform. Springer, from September 1997. A journal and software repository intended "to provide a forum for issues on the development, deployment and use of tools for the construction and analysis of correct and reliable systems."
[NSF98-116] National Science Foundation, Integrative Graduate Education and Research Training (IGERT) Program, Frequently Asked Questions. NSF document nsf98116. Contains a statement about "education" versus "training".
[Ru93] John Rushby, Formal Methods and Digital Systems Validation for Airborne Systems. SRI International Computer Science Laboratory Technical Report SRI-CSL-93-7. Contains an overview of formal methods and a primer on logic. The report is "primarily written for certifiers and assumes no previous exposure to formal methods."


A    Genesis of this report

Comments on this report are solicited.

Unless otherwise requested, edited commentary may be linked to the online version. Send responses to Steven D. Johnson. Commenters are invited to use the report questionnaire, which provides a framework for responses following the report's outline.

There were four stages in the creation of this report.

  1. June and July, 1998: initial drafting by Steven D. Johnson subsequent to the workshop in March 1998.
  2. August and September 1998: review and redrafting by a task group consisting of Steve Johnson, Shiu-Kai Chin (Syracuse University), Perry Alexander (University of Cincinnati), and Ganesh Gopalakrishnan (University of Utah). A review meeting was held August 10-11.
  3. September and October 1998: circulation for comment to workshop participants. Approximately one-third of the participants responded in some form.
  4. Ongoing revision and dissemination to formal methods address lists, starting in March 1999.
Suggestions for archival publication are solicited.

B    Position papers and statements

Statements are invited and will be added to this list.

C    Workshop description

Extracted from the announcement of the event held March 18-19, 1998 in Melbourne, Florida. The workshop was sponsored by the Defense Advanced Research Projects Agency (DARPA) and the U.S. Air Force's Rome Laboratory Information Institute.

Focus The focus of the workshop, formal methods, refers to that body of theory, technique, pedagogy, and tools that enable rigorous reasoning in system design practice. It is a formative area with many approaches developing, but none yet prevailing. At the same time, a rising need for methodological advances has become manifest, not only in such aspects of high-assurance design as formal specification, safety, security, and reliability, but also in the increasingly critical demands on verification, analysis, and testing in the greater computer industry.

Goals The two specific goals of the workshop are:

  1. Outline a report assessing the status of formal methods in academia and practice, and making recommendations for education and training. From industry and government we must learn about practical needs: what skills must be developed and how broadly? From commercial and academic research we seek feedback about tools and methodologies: what skills must be developed to advance the state of the art?
  2. Form a consortium, if deemed relevant, to act as an advocacy group among all levels of college and university education, and as a working group to facilitate collaboration among institutions.

Featured Speakers: Michael J.C. Gordon and Amir Pnueli. Mike Gordon of the University of Cambridge Computer Laboratory is a widely recognized pioneer in formal methods and automated reasoning. He is a leader in the development of texts and course notes on these topics. Amir Pnueli of the Weizmann Institute is known for his work in temporal logic and its applications in verification, work which led to the 1996 Turing Award.

Organization of the Workshop The two-day workshop is structured to build consensus and content for the resulting report. The first morning will consist of plenary panel sessions of leading representatives from industry, government, and academia to review what is needed and what is currently being done in the development of formal methods curricula. Participants will then assemble a number of small discussion groups to address recommendations for undergraduate and graduate education. The moderators of these groups will report to the whole workshop at the end of the day.

[At the event, panels included individual presentations by Albert Camilleri (Hewlett-Packard Corp.), Mandayam Srivas (SRI Computer Science Laboratory), Edmund M. Clarke (Carnegie Mellon University), J Strother Moore (University of Texas at Austin), and Moshe Vardi (Rice University).]

On the second day, the workshop will reconfigure itself to reflect the prevailing issues and ideas emerging on the first day. A set of topical breakout sessions will be organized. As on Day One, these sessions will be followed by reports by the session moderators to the workshop as a whole. In the afternoon, there will be a plenary meeting to determine whether a consortium should be formed and how to organize it, and a consensus will be sought on the main points of the report and how to produce it.

Participation The principal mission of this workshop is to collect and consolidate input as widely as possible. All forms of input are solicited, including (but not limited to)

The workshop organizers accept responsibility for producing the report as a means of last resort. We regard our function as that of a facilitator in forming the consortium, rather than any kind of de facto official body of the yet-to-be-formed entity. The format of the meeting may mandate limits on direct participation. If you are interested in attending, please contact Mike Nassif, or any member of the organizing group listed below. If you cannot attend the workshop but are interested in its outcomes, or in consortium involvement, please contact a member of the organizing group and consider contributing a position paper or letter.

Organizing Committee

Perry Alexander, University of Cincinnati
Bruno Andriamanalimanana, SUNY Institute of Technology
Ahmed E. Barbour, Georgia Southern University
Shiu-Kai Chin, Syracuse University
Ganesh Gopalakrishnan, University of Utah
Steven D. Johnson, Indiana University
Mike Nassif (Ch.), Rome Laboratory
Bob Paragi, Rome Laboratory
Douglas R. Troeger, City University of New York
Phil Windley, Brigham Young University

D    List of attendees

  1. Ken Abernethy, Furman University,
  2. Warren Perry Alexander, The University of Cincinnati,
  3. Bruno Andriamanalimanana, State University of New York,
  4. Ahmed E. Barbour, Georgia Southern University,
  5. Albert Camilleri, Hewlett Packard,
  6. Victor Carreno, NASA Langley Research Center,
  7. Shiu-Kai Chin, Syracuse University,
  8. Edmund M. Clarke, Carnegie Mellon University,
  9. Kathi Fisler, Rice University, kfisler@cs.rice.edu
  10. Robert France, Florida Atlantic University,
  11. Yuri Gurevich, University of Michigan,
  12. Ganesh Gopalakrishnan, University of Utah,
  13. Mike Gordon, Cambridge University, UK,
  14. David Hemmendinger, Union College,
  15. Steven D. Johnson, Indiana University, sjohnson@cs.indiana.ed
  16. James Lynch, Clarkson University,
  17. Brad Martin, National Security Agency,
  18. J Strother Moore, University of Texas at Austin,
  19. David Morgan, Kennesaw State University,
  20. Michael P. Nassif, AFRL Rome Site,
  21. David Naumann, Stevens Institute of Technology, naumann@cs.stevens-tech.edu
  22. Luke Olzewski, Georgia Southern University,
  23. Amir Pnueli, Weizmann Institute of Science (Israel),
  24. Gerhard Ritter, University of Florida,
  25. Beverly Sanders, University of Florida,
  26. Mandayam Srivas, SRI International,
  27. Edward P. Stabler, Syracuse University,
  28. Douglas R. Troeger, City College of New York, dtroeger@csfaculty.engr.ccny.cuny.ed
  29. Moshe Y. Vardi, Rice University,
  30. Ranga Vemuri, University of Cincinnati,
  31. Phillip J. Windley, Brigham Young University,
  32. Mark Woodcock, National Security Agency,
  33. Frank Young, AFRL,