Comments: See Appendix A
§2. The central question
§3. Charting the formal methods area
§5. What is needed in formal methods education?
§6. What would a formal methods consortium do?
§7. Next steps
§A. Genesis of this report
§B. Position papers and statements
§C. XXIEC workshop announcement
§D. List of attendees
This report is one outcome of the 21st Century Engineering Consortium Workshop (XXIEC) held March 18-19, 1998, at Melbourne, Florida. The meeting was sponsored by the Air Force Rome Laboratory, working with DARPA's Information Technology Office at the initiative of Mike Nassiff of Rome Lab. One impetus was a series of high level meetings on high confidence systems in 1995-98 [CCIC98]. In order to attain the capability to build high confidence systems in significant numbers and at greater scale, it is essential that a much larger number of students be trained in engineering applications of formal methods. The workshop's main purpose was to explore the idea of establishing a consortium to advance formal methods education.
Although there was general encouragement to take steps toward establishing a consortium, one did not spontaneously materialize at this meeting. A wait-and-see attitude toward individual participation prevailed. The primary interest groups in industry, government, and academia are each looking to the others for direction, leadership, and support.
A specific goal was to assess the status of formal methods in academia and practice and make recommendations for education and training. Formal methods is evolving with technology and maturing in practice. There is considerable debate about its scope and role in the computer science and engineering (CS/CE) curriculum and about strategies for advancing its pedagogy. A core set of topics is emerging, but treatments exist in great variety, each inspiring its own notation and set of tools. All the while, the need for better design methods grows increasingly urgent as technology pervades all aspects of modern life. In short, the status of formal methods is that both its importance and the awareness of that importance are increasing. Within academia, however, faculty expertise is in short supply, and formal methods competes for attention with other more visible, glamorous, and lucrative subjects.
In technology enterprise today, the balance between quality and feature enhancement favors the latter. As our economy, our safety, and our way of life grow ever more dependent on information systems, quality will inevitably assert itself. Formal methods is a significant avenue in the pursuit of higher quality through better design methods. The XXIEC workshop contemplated ways to reflect this pursuit in education. Issues of general agreement include (among others):
Specific recommendations emerged, although there was no clear priority. These recommendations centered on the following issues:
Our society is increasingly dependent on software, digital systems, and networks. Our attention is captured by newsworthy problems like "The Millennium Bug"; it is focused by daily experiences with balky PCs, overcharges at the checkout line, and the list goes on; it is galvanized by occasional "glitches," some with tragic consequences. Technology is overwhelming human design capabilities. The vast capacities of VLSI, computer memories, and network bandwidth greatly favor expansion over consolidation, resulting in a proliferation of features but a decline in reliability. Similarly, the rapid evolution of electronic commerce and global communications raises concerns about security, specifically, how to design secure systems. A recent Presidential Commission report makes dire warnings about the vulnerability of our information infrastructure [PCCIP].
One of six recommendations in the 1995 MSTC/CCIC report, America in the Age of Information [CCIC95] is to "Broaden the emphasis on high confidence systems beyond security" to address safety, reliability, real-time performance, and other aspects. In the succeeding three years, a series of meetings was held to set a high confidence systems (HCS) research agenda [CCIC97] [CCIC98]. Throughout these reports, formal methods is seen as fundamental in the effort to raise the quality and assurance of critical hardware and software systems. Although the HCS agenda focuses on research, there is a clear realization that education is a major issue in achieving its goals:
The time is ripe [for] the development of educational programs for safety-critical systems. There is agreement in this panel that the area of high-confidence systems is an intellectually coherent body of knowledge. Areas that were considered separate before (such as safety, security, functional correctness, performance and real time, and fault tolerance) are parts of the emerging new area. This is the right time for industry, government, and academia to get together to organize curricula, case studies, electronic textbooks on the Internet, shared software, and so on. [CCIC95, Part III.D]
Formal methods is centrally concerned with safety-critical systems. The agencies involved in the HCS agenda see a critical shortage of people to carry advances in formal methods into practice and, to make matters worse, of educators available to fill the pipeline. The emerging "intellectually coherent body of knowledge" is not being incorporated into existing curricula, at least not rapidly enough to meet HCS needs. The XXIEC workshop was aimed at creating a movement addressing these shortages. Its goal was to assemble knowledgeable people from all sectors to begin outlining an educational plan and to plant the seeds for a continuing effort.
In the broad sense, "formal methods" refers to the frontier of engineering practice, rather than to any delineated technique or technology. The overwhelming pace of system design makes this frontier an ever-receding horizon. The rapid evolution of digital technologies, and, in particular, the exploding complexities brought about by expanding scale, generate a mounting sense of urgency in the search for more unified and systematic approaches to design. At the same time, commercial competition increases the pressure to design more efficiently, but above all, more quickly. In some sectors of the design industry, such as high-volume VLSI and telecommunications, formalized analysis is already a competitive necessity. In a recent roundtable on formal verification, Robert E. Damiano of Synopsys is quoted as saying
`We might disagree on whether the underlying technology is the right technology, whether the engine needs a lot more work, or if a breakthrough is necessary for widestream adoption. . . . But not a single one of us disagrees that formalizing design methodology and the design flow is the correct thing to do.' [CD98]
It is clear that changing design paradigms are encountering significant problems in education and reeducation. According to anecdotal observations by the XXIEC workshop organizers, there is widespread agreement that education and training have become the priority in advancing system design practice. A participant at the 1995 Workshop on Industrial-strength Formal Specification Techniques [WIFT95] reports that almost every technical discussion concluded with the mention of needs in education. At the 1998 Workshop on Formal Methods in Software Practice (FMSP98), all three members of the event's panel session, entitled "If you could spend $1M extra this year on improving software quality, how would you allocate it?", nominated education. The FMSP98 keynote speaker focused on the creative expertise needed for the effective use of automatic verifiers.
However, specific recommendations are rare. Pointed questions about what should be taught to undergraduates, or what constitutes a professional skill in formal methods, or even what preparation someone should have to do research, are usually answered with a shrug or the slogan "teach more mathematics." Even people involved in developing formal methods courses are uncertain about what background they would like to see in their incoming students. While most agree that the needed pedagogy must include experience in the use of computer assisted reasoning tools, it is also acknowledged that gaining enough experience to teach these tools is daunting.
This report consolidates the viewpoints of a group of individuals actively involved in the formal methods area at this time. Although the XXIEC participants included a number of field leaders, attendance was not a representative cross section of interested groups. Most of the attendees were academics, with just a few representatives from national laboratories and agencies (NASA, NSA, AFOSR), and only one or two representing industry. The participants might also be characterized as "biased toward hardware" with a disproportionate number representing higher-level approaches (i.e. "theorem prover types"). Nearly all the participants work in the United States, whereas the formal methods area is and has been quite international in character.
The remainder of this report has five parts. The following section is a brief discussion of the central issue arising in almost every discussion of formal methods education: should the emphasis be on broad-based, integrated education and training, or on producing a limited pool of experts for critical applications? Next, an overview of the formal methods area looks at divisions of research, approach, and emphasis that are inevitably reflected in educational discussions. Third, we look at the constituencies involved in shaping and implementing a formal methods curriculum. Fourth, we review and consolidate educational recommendations, and describe courses under development by area leaders. Finally, we list some of the possible contributions a consortium effort might make, concluding with a short list of steps to take next.
Even allowing for the great diversity of opinion about what constitutes "formal methods," views differ widely about how, indeed whether, to build a pedagogy for it. As an illustration, we present excerpts of two statements responding to preliminary announcements of the XXIEC workshop. Both individuals chose not to participate. Comparing their reasons illuminates a fundamental debate about formal methods in CS/CE education.
In declining an invitation to participate as a panelist, a prominent academic wrote,
I must tell you that when I first saw this announcement, . . . I was turned off. It seemed to me to be a conference arguing that formal methods were an add-on feature of (software) engineering education, whereas I believe that they should be integrated into software engineering education in the way that calculus is integrated into other engineering, i.e. used in every single course. I also don't like talking about "formal methods" because I think that what is needed is something old, mathematics, not something new (. . . some committee reinvention of old mathematics in ad hoc ways). . . . My concern is that the workshop format might not provide a suitable occasion for a full discussion of a 4 year independent [program].
A contrasting point of view is seen in the following review of a draft workshop description. The reviewer is affiliated with a formal methods group at a government laboratory.
A fundamental premise underlying [the 21st Century Engineering Consortium] is that education is the solution to our problems in developing high assurance systems. This is false, because it presupposes that someone knows how to build such systems, and that everyone can do so if only they are taught how. This is false because (1) no one knows how to build high integrity software yet, and (2) even if someone did, not everyone will be able to do it. I believe the 21st Century Engineering Consortium is likely to do far more harm than good.
The first statement reflects the view that "formal methods" is simply a name attached to the maturation of a new engineering discipline. It should not be confused with basic mathematical preparation, nor should it displace it. As a workshop participant observed, "Calculus is a formal method" (although perhaps not the only formal method needed in CS/CE); other disciplines already provide models for incorporating methodological rigor, and they did not need consortia to accomplish it. The point is that the necessary pedagogy should be widespread and incorporated into the general curriculum of a broad undergraduate population.
The second statement associates "formal methods" with a more targeted need in highly demanding design applications. While brusquely dismissing education, the reviewer would probably support more exposure to formalisms and to reasoning tools. The key point is that the pedagogy is intended for a more limited pool of people with special expertise.
This debate between broad-based and specialized education reverberates in almost every forum and discussion. Is it better, as one workshop participant put it, to focus on the carpenter's methods or on the architect's? Individual educators will be guided by their personal philosophies and professional circumstances, as will advocates in government and industry. Those responsible for highly critical design applications need more experts, sooner, while those engaged in commercial design activities usually advocate incremental advances taught to a large number of new engineers. It is vital in any discussion of formal methods to know where each discussant stands; otherwise the parties risk talking at cross purposes.
It may also be noted that both of these statements project an adverse intent on the notion of a consortium. Apprehensions about political and scientific implications were widely shared among those who attended the workshop. If these suspicions are not overcome, then a consortium would, indeed, become a splintering influence rather than a nurturing one. It is also widely believed that faculty resources to build formal methods pedagogy are extremely limited, so collegiality is of great importance.
The issues and perspectives reflected in this report are snapshots of a dynamic area of applied research. "Formal methods" is a widely used term but many regard it as nondescriptive, at best, and, at worst, as a somewhat derogatory reference to the academic idealization of real-world processes. The area remains something of a battleground of academic, practical, commercial, and philosophical perspectives. That the clashes are often intense and passionate is testimony to the community's sense of the area's importance. It also reflects the broad range of views on what formal methods is about and how it is applied.
This section surveys this diversity and summarizes the principal aspects shaping the formal methods area. Within this outline (it is hoped) it can be seen that apparent points of debate are more like points on a compass, reflecting not differences in judgment, but a full spectrum of approaches. In curriculum planning, it is important to understand that the intellectual foundation must eventually transcend transitory distinctions. In the interim, however, it is likely to be governed by them.
The XXIEC workshop announcement defined formal methods as:
that body of theory, technique, pedagogy, and tools that enable rigorous reasoning in system design practice. [XXIEC]
In a highly regarded treatise on formal methods, John Rushby writes,
`Formal methods' are the use of mathematical techniques in the design and analysis of computer hardware and software; in particular, formal methods allow properties of a computer system to be predicted from a mathematical model of the system by a process akin to calculation. [Ru93]
The series foreword of the Academic Press Series in Formal Methods says,
formal methods refers to the application of mathematical techniques for the specification, analysis, design, implementation, and subsequent maintenance of complex computer software and hardware. [DeHi96]
A 1996 Computer Surveys article says formal methods are
mathematically based languages, techniques, and tools for specifying and verifying [reliable] systems. [ClWi96]
These three recent definitions are generally consistent. Though broadly accommodating, each invokes three elements that here are taken to characterize the area:
Formal methods is founded on such areas of theoretical computer science as automata, complexity, semantics, and logic. However, it centers on concrete problems. As a consequence, formal methods researchers often serve as intermediaries between theory and practice (and often are perceived in either community as representatives of the other). In this respect, they carry information both ways, seeking both to advance the state of practice and influence theoretical research to be pragmatic.
In this context, the word "formal" invokes the notion of a formal system, or calculus for the manipulation of syntax, graphs, and other structures, according to a determined set of semantically sound inference rules. Clearly, formalization is a prerequisite to automation, since computers are symbol manipulators. The creation of sound tools requires mathematical insight. The use of such tools sometimes requires an understanding of the underlying formalism, which, in turn, requires a certain depth and kind of mathematical background.
"Formal methods" does not mean "more mathematical methods" and, in particular, does not proclaim this area to be superior to conventional practices by virtue of its formality. Formal methods are rigorous because reasoning is symbolic, but this also greatly constrains their flexibility. Constructing a formalized argument is a different activity from proving a theorem or performing numerical analysis.
For example, the term "formal verification" is sometimes used to distinguish the use of decision procedures from conventional verification techniques that use simulation modeling. Of course, simulation also entails formalization (a hardware description language or physical description, but in any case a formal expression), and provides a supporting tool for reasoning symbolically about a design. Whether simulation classifies as a formal method depends on, well, the method with which the simulator is used. Formal and conventional verification are not alternatives; they are alternates. A practitioner will use both, as needs dictate; thus, one task of the formal methodist is to discover effective ways to integrate the two techniques.
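The complementarity of simulation and formal checking can be sketched on a toy design. In the Python fragment below (all names invented for illustration, not drawn from any tool discussed in this report), two descriptions of a three-input majority function are compared by random simulation and by exhaustive enumeration; exhaustion stands in for formal proof, and is feasible only because the input space is tiny.

```python
import itertools
import random

# Two descriptions of a three-input majority function (names invented).
def majority_spec(a, b, c):
    # "Specification": true when at least two inputs are high.
    return (a + b + c) >= 2

def majority_impl(a, b, c):
    # Gate-level "implementation" using AND/OR structure.
    return (a and b) or (b and c) or (a and c)

def simulate(trials=100):
    """Conventional verification: sample random input vectors."""
    for _ in range(trials):
        a, b, c = (random.randint(0, 1) for _ in range(3))
        if bool(majority_spec(a, b, c)) != bool(majority_impl(a, b, c)):
            return False          # counterexample found
    return True                   # no counterexample found -- not a proof

def verify():
    """'Formal' check: exhaust the (tiny) finite input space."""
    return all(bool(majority_spec(a, b, c)) == bool(majority_impl(a, b, c))
               for a, b, c in itertools.product([0, 1], repeat=3))
```

For realistic input spaces, exhaustive enumeration is infeasible; that is where decision procedures and symbolic representations come in, while simulation remains the pragmatic complement.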
A traditional demarcation in formal methods activity is the so-called "hardware-software boundary." Historically, there has been relatively little communication between hardware and software oriented communities. A nearly disjoint set of conferences and archival media exists. However, if this mythical distinction ever made sense, it makes less every day. The difference is a matter of emphasis rather than substance. Furthermore, the continuing fusion of hardware and software -- as manifested in distributed computation, embedded software, codesign, configurable technologies, and chip level systems -- is unifying present methods of programming and design.
In software dominated applications, many factors impede formal methods adoption in the mainstream. The complexity of abstractions, the prevalence of dynamic memory allocation, and similar qualities make both theorem proving and model checking approaches difficult. It remains economically feasible and socially acceptable to release faulty systems and then "patch" errors in situ. Furthermore, users are tolerant of, or at least accustomed to, products containing errors. Ironically, the growing presence of the World Wide Web perpetuates this situation by making incremental software distribution easier.
The corresponding factors are more favorable in hardware dominated domains. Data and control abstractions are concrete and lend themselves to formal analysis. Time to market is of paramount importance in the hardware community, while correcting a design error in the field is either impossible or prohibitively expensive. Eliminating one design iteration is worth a lot of effort. The engineering culture accepts the need to design with testing and verification in mind. System architects are increasingly willing to sacrifice resources (i.e. chip area) to achieve system verification. Consumers are highly intolerant of design flaws.
In the meantime, technology is transcending any residual distinction. Software is becoming highly distributed, complex, and mission critical. Hardware is becoming more configurable and programmable. Hardware description languages and their associated design tools give hardware design a software feel, and allow design to be initiated at higher levels of abstraction. The integrated circuit industry increasingly relies on software artifacts, such as simulation models and cores, to market its intellectual property.
Formal verification has a foothold in hardware design practice. Reasoning tools are commercially available and incorporated into a growing number of implementation processes. As practice adapts to these advances, new applications are found. Recently, there has been a substantial buildup of verification groups in integrated circuit and hardware systems industries, prompting the CAD industry to compete with these capabilities. It is a positive feedback relationship.
Formal software verification has been far less pervasive. Few commercial offerings are available. Outside of highly critical applications, little formal verification is performed. At FMSP98, for example, virtually all attendees (with the notable exception of the keynote speaker) were academic researchers, government representatives, or government contractors responsible for highly critical systems such as aircraft control. The trend in many software applications is toward using formal techniques in specification phases, particularly in specifying hardware-like control and monitoring systems. It is often reported that the activity of formalization is itself beneficial and can only improve as analysis tools are incorporated.
Further opportunistic expansion can be expected where algorithmic capabilities fit with analysis needs. As more software systems are seen as critical, the current "release and patch" approach cannot be sustained. Thus, we can expect to see a widening frontier at which some kind of formal analysis becomes mandatory.
In a keynote talk at the XXIEC workshop, Amir Pnueli described two kinds of formal analysis. He observed that in property verification, the specification to be satisfied is usually expressed in a different formal notation than the implementation to be analyzed. There is a sense in which a specification is a cumulative collection of orthogonal attributes. Qualities such as safety, liveness, security, fault tolerance, timing, and so on, are common targets of property verification.
Implementation verification addresses the primary design intent -- its "function" or "behavior." Two levels of describing the same design object are compared. The engineering process passes through several levels, each implementation serving as the specification for the next level. It is often described as resolving the "what" and the "how" of a design. Typically, all levels are expressed in the same formalism, and verification establishes some semantic correspondence (e.g. equivalence, implication, containment, refinement) between the two expressions.
These distinctions are hard to pin down mathematically. Differences in syntax are often resolved by an embedding in a common semantic theory, or by translation to a common data representation. A specification is, by definition, partial. Behavior can be regarded as one of a system's properties. There is, moreover, significant interplay between the two notions of verification. For example, an isolated property such as absence of deadlock is often verified for a reduced specification in which irrelevant details have been abstracted away. Thus, implementation verification may be involved in demonstrating that the original specification is a genuine instance of the abstracted one.
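A minimal sketch may help fix the two notions. In the hypothetical Python fragment below (systems and names invented for illustration), "property verification" checks that a predicate holds in every reachable state of a small transition system, while "implementation verification" checks that every transition of an implementation is permitted by its specification, a simple form of containment that assumes shared state names.

```python
# Transition systems as dicts mapping a state to its set of successors
# (both systems invented for illustration).
SPEC = {'idle': {'busy'}, 'busy': {'idle', 'done'}, 'done': {'idle'}}
IMPL = {'idle': {'busy'}, 'busy': {'done'}, 'done': {'idle'}}

def reachable(ts, start):
    """All states reachable from `start` -- a simple graph search."""
    seen, frontier = set(), [start]
    while frontier:
        s = frontier.pop()
        if s not in seen:
            seen.add(s)
            frontier.extend(ts[s])
    return seen

def check_property(ts, start, prop):
    """Property verification: `prop` holds in every reachable state."""
    return all(prop(s) for s in reachable(ts, start))

def check_refinement(impl, spec):
    """Implementation verification: every step of `impl` is allowed by
    `spec` (set containment of successor sets, state names shared)."""
    return all(succ <= spec[s] for s, succ in impl.items())
```

Note that the containment check is directional: IMPL refines SPEC here, but not conversely, since SPEC allows a busy-to-idle transition that IMPL omits.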
While formal verification is often defined as "proving something correct," it is more realistically described as a process of repeated refutation. In whatever form they take, proof attempts fail far more often than they succeed, and the practical value of verification is measured in terms of the errors it exposes. A proof failure may be due to a design error, or it may disclose a fallacy in specification, or it may be resolved by constraining how the design object is used. Properties thus isolated are byproducts analogous to the data sheet of a hardware device or a product-use disclaimer for software. Increasingly, these artifacts are recognized as value-adding when packaged with the product.
Designs are situated in the physical world and are conceived to solve physical problems. Design formalizations, then, must be situated in the appropriate mathematical models. Domain modeling refers to the construction of these models. This can be a means of communication between domain experts and design specifiers, or at least a means of recording this communication. Discussions of mathematics in formal methods education sometimes confuse the background needed to model physical domains with that needed to understand formal manipulations.
The bridge between problem definition and solution specification is often called requirements analysis. A number of pilot studies have reported that the process of formalizing a specification is worthwhile, whether or not automated reasoning is applied to the formalized object. Indeed, this is sometimes what is meant by the term "formal methods," contrary to the definitions given in this report. In software domains, tools for constructing specifications and performing static analysis tasks (e.g. type checking, dependence analysis, consistency and completeness checking, overflow analysis) are now of proven value in critical applications.
Implementation verification, or checking for conformance between a source and a target design description, is what most people think of first when the subject of formal methods comes up. As was mentioned earlier, there may be several strata at which verification is applied in the design process. In hardware applications, a distinction is made between verification, which applies to descriptions of design objects, and testing, which applies to the objects themselves. This distinction is not so sharp in software, even though it is still valid.
Formal methods also have applications in retrospective analysis. One example is failure investigation. Another is reengineering, where an obsolete device must be retargeted to newer technology. Hardware retargeting is a substantial problem in military systems, which may have unusually long shelf lives, and in safety critical systems, which have protracted approval processes. However, these applications are not seen routinely.
There is a logical hierarchy, including decidable theories (propositional logic, boolean algebra, finite extensions, weak arithmetics, etc.); equational logics over either fixed theories (arithmetic, usually) or abstract data types; function domains; process logics; first order and higher order predicate logics; and metalogics (logics for describing and comparing other logics). One may add temporal or modal extensions providing a form of implicit quantification (e.g. "eventually" means "there exists some time in the future," time being implicit). Model checking involves temporal extensions to propositional logic; program logics can be viewed as modal extensions, or temporal extensions, or both.
At every level there is a choice of inference mechanism. Model checking can be seen as a symbolic form of simulation. Verification is usually cast as the deductive proof of satisfaction between implementation and specification; but it can alternatively be performed as a derivation that reduces specification to implementations by means of refinements and transformations.
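The implicit quantification over time mentioned above can be made concrete. The Python sketch below (operators, state names, and trace all invented for illustration) evaluates "eventually" and "always" over a single finite trace; a model checker performs the analogous check symbolically over all behaviors of a system, not one recording.

```python
# Two temporal operators evaluated over a finite trace of states.
# Time is the implicit quantified variable: "eventually p" hides an
# existential over positions, "always p" hides a universal.

def eventually(p, trace):
    return any(p(s) for s in trace)   # there exists t such that p(trace[t])

def always(p, trace):
    return all(p(s) for s in trace)   # for all t, p(trace[t])

# A single recorded behavior: a request is raised, then acknowledged.
trace = [{'req': 1, 'ack': 0},
         {'req': 1, 'ack': 0},
         {'req': 0, 'ack': 1}]
```

On this trace, `eventually(lambda s: s['ack'] == 1, trace)` holds while `always` of the same predicate does not, illustrating how the two operators quantify over the hidden time index.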
Higher level formalisms admit innumerable ways to model design objects, just as one may choose among many mathematical formulations of a given phenomenon. One prevalent example is the choice between functional and relational representations of processes in logic. Some treatments translate design objects directly into logical expressions, called shallow embedding, while others introduce a design language and interpreter, called deep embedding. The choice is not determined by the formalism, but by the task for which formalization is used.
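The shallow/deep distinction can be illustrated in a few lines. In the Python sketch below (representation and names invented for illustration), the same circuit appears once as a host-language function and once as syntax plus an interpreter; only the deep embedding exposes structure for inspection and transformation.

```python
# Shallow embedding: the design object is translated directly into
# host-language expressions -- the circuit is simply a Python function.
def circuit_shallow(a, b, c):
    return (a and b) or c

# Deep embedding: the design language is itself a data structure, with a
# separate interpreter giving it meaning.
AST = ('or', ('and', 'a', 'b'), 'c')

def interpret(node, env):
    if isinstance(node, str):         # a variable reference
        return env[node]
    op, left, right = node
    if op == 'and':
        return interpret(left, env) and interpret(right, env)
    return interpret(left, env) or interpret(right, env)

def size(node):
    """Structural analysis like this is possible only on the deep embedding."""
    return 1 if isinstance(node, str) else 1 + size(node[1]) + size(node[2])
```

The shallow form is convenient to evaluate; the deep form pays an interpreter's overhead but makes the syntax itself an object of reasoning, which is why the choice turns on the task for which the formalization is used.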
Whatever the application domain, whatever the technology, whatever the design phase, and to whatever degree the reasoning is automated, judgment, technique, and ingenuity are required to perform the complex analysis tasks in system design. It is accepted that different design aspects require different approaches; there is no universal "formal method." The expertise lies in the ability to use the appropriate techniques in concert. Although heterogeneous reasoning environments are advancing all the time, integration must take place in the mind of the engineer, either during design or in establishing processes in which other specialists can use the tools effectively. These intellectual skills lie at the heart of formal methods education.
A foundation of intellectual preparation is being distilled from current research and exploratory practice. The sense of the XXIEC workshop was that we have progressed far enough to define this common ground. This definition process would be greatly facilitated by creating forums devoted to that purpose, assuming that all interest groups participate. It must be recognized, however, that this is the beginning of a consensus building effort that will, in the best case, take several years of involvement.
This section describes five overlapping groups involved in advancing formal methods: educators, researchers, developers, practitioners, and overseers. Of course, these must be very general characterizations, since in each group one will find a diverse range of perspectives. The focus is on what each group brings to and takes from the formal methods movement, and how this might impact educational initiatives.
An educator's main product is students who graduate with the knowledge and skills to carry the field forward. Pedagogy, in the form of textbooks, course notes, laboratory projects, and so forth, is a marketable byproduct. In applied disciplines, the educator's job is to distill decades of practical experience into a few years of relevant preparation. Since the primary concern is preparation for a lifetime career, teachers tend to take a long view, favoring principles and preparing their students to "learn how to learn."
They also want (and need) their students to find jobs, of course. Curriculum development in CS/CE must ultimately navigate by the job market, which is the indicator of technological trends. Formal methods is an exacting discipline and not superficially glamorous. To attract students into the classroom and convince administrators that new pedagogy is justified, educators must be able to point to demand from industry and government.
To implement the broader objectives of engineering education, we must also entice more educators to become involved in formal methods. It is a difficult area to "come up to speed" in. The few textbooks that are available vary greatly from one to another in both notation and topics covered. Publishers do not see a sufficient market to become involved, and this also influences authors. The greater challenge, though, is becoming proficient enough in the use of reasoning tools to teach with them. Occasionally, workshops and tutorials are offered through conferences or by research organizations, but a short course can only provide a jump start. Even with a consultant, it can take several months of work to gain simple proficiency.
Researchers drive the advancement and deployment of reasoning technology. Notwithstanding some recent inroads, formal methods remains an exploratory endeavor. Progress occurs in two ways: by spreading out from established beachheads and by incursions at new points of application. Both entail the kind of motivation and risk taking that is only found in research.
Researchers need material resources to conduct their work and access to examples appropriate to their investigations. Above all, they need students prepared to advance research. In other words, research is not only a contributor to but also a consumer of formal methods education. As the intermediary between academia and industry, the research community must set the agenda for undergraduate education. With this primary role comes an implied mandate to present its experiences in both practical and academic venues.
In areas such as formal verification, research is already migrating from the academic to the commercial sector. This will, of course, change the patterns and modes of dissemination. Some academics are concerned about how this trend will affect curriculum development, since communication channels between industry and educational institutions, already considered inadequate, may become further restricted.
Practitioners are directly engaged in system design activities or with evaluating formal methods in the industrial setting. Among this diverse population, common themes arise in discussions of the formal methods challenge. A list of concerns found in [ClWi96] includes such phrases as quantifiable benefit, conceptual compatibility, stability, and integratability. When pressed for educational recommendations, industrial practitioners tend to advocate broad-based curricula and technical training. As an XXIEC participant pointed out, his company has just a handful of formal verification experts, but hires hundreds of engineers.
Commercial system design is a vastly complex undertaking, and formalized analysis is just one element. To make inroads in practice, innovation must be minimally disruptive and must carry a high likelihood of benefit. In addition, there is an understandable emphasis on the critical path, that is, applications to the most complex designs using the most aggressive technologies. Such tests under fire are not ideal for systematic evaluation, but should be viewed as a means of gaining credibility to justify longer term investment.
There is evidence of gain in the integrated circuit industry, where some large companies are expanding their hiring of formal methods experts. This would suggest that curriculum developers can get correspondingly more substantive feedback about needed skills. However, it is unclear how to gather this feedback, since there are few opportunities to consult with industry about this issue.
Developers govern the transfer of design technology, hence also methodology, from research to practice. The hardware industry's early adoption of formal verification has fostered a growing commercial enterprise, which nurtures fundamental changes in engineering processes. Commercial tool development, either in-house or in the CAD market, is where hiring demands are now concentrated. Naturally, CAD vendors are interested in expanding their market and customer base. Hence, they tend to favor technologies that readily integrate with existing design environments and training programs. This generally translates into strong advocacy for tools that can be broadly deployed and in incremental advancements in methodology.
At the same time, the CAD industry has been, until recently perhaps, the principal consumer of formal methods expertise, hiring those skills necessary to create competitive products and establish new markets. Now, a formal methods consulting enterprise appears to be emerging, with a few companies specializing in integrated verification tools and providing on-site consulting services for system specification and analysis. Software vendors are breeding expertise vendors, which forecasts a greater demand in education on the use of reasoning tools rather than their implementation.
CAD researchers have declared a grand challenge in the design of full systems on silicon [NSF96]. Although verification and test are critical problems, formal methods, per se, is not stated as a solution path at this time. Certainly, present design technology, formal or not, is inadequate for systems of present complexity, but addressing this inadequacy involves not just new methods, but new paradigms unifying software and hardware design.
Formal methods advancement has been strongly influenced by government subagencies responsible for design-critical applications. These agencies point to both a shortage of expertise and inadequate technology as critical issues to be addressed. In particular, research at higher levels of reasoning finds interest here, often through sponsorship of commercial pilot studies. Typically, these projects are quite different from the explorations in the CAD and integrated circuit industries. There is more emphasis on the use of general reasoning tools, and on focusing special expertise on low-volume design components with extremely high costs associated with failure. This includes safety-critical systems, of course, but also missions to distant or hostile environments, systems involving the communication of money, intelligence, and so forth.
Government funding patterns in formal methods have diminished or become disorganized in the past five years. It is perceived that DARPA, which has been the principal source of support in the United States, regards its job as done. What remains is seen as primarily a matter of technical transfer to be taken care of by commercial interests. The National Science Foundation evidently does not identify formal methods as a targeted area of support. This may partly be due to ambiguities discussed in Section 3. At NSF, technical understanding remains at the project director level, with little explicit acknowledgment at division or directorate levels. Similarly, the National Security Agency, which has historically overseen long-term research activities, seems to have abandoned or marginalized its interests.
The individual formal methods lead groups in government seem to be getting more isolated, even while they enjoy a sense of success in the past decade's initiatives. The progress would seem to justify further investment in, rather than abandonment of, research guidance. Positions may be reconstituting themselves through the HCS computing agenda [CCIC98]; however, these initiatives have not yet translated to actual resources. If a critical mass is forming, it has not yet materialized.
Broadly speaking, education usually refers to courses and similar didactic activities that equip students with an understanding of basic facts and concepts, and with facility in use of basic methods such as those provided by mathematics. Training generally refers to activities that prepare students to use the methodology and technology needed for original research in particular areas. Obviously, instruction can play a significant role in both education and training activities. Thus, and especially at the graduate level, it can be difficult at times to categorize a particular activity as either education or training. [NSF98-116]
This distinction is nowhere more blurred than in formal methods. Incorporating technology in courses is always difficult, but doing so is commonplace in mainstream computer science and engineering education. The problem for formal methods is that neither the methodology nor the technology is very advanced compared to the compilers, editors, operating systems, browsers, and office tools that students must already know. Most people who have developed courses in this area emphasize either methodology or tool use at the expense of the other.
There was nearly universal agreement at the XXIEC workshop against fixing a curriculum for formal methods content. It would be highly premature and wrong to impose an approach or treatment, when what is really needed is competition among them. In engineering, setting accreditation standards is simply out of the question on both pragmatic and intellectual grounds, even if there were an adequate number of educators to implement them -- which there is not.
A substantial number of workshop participants agreed with the statement, quoted earlier, that conventional mathematics provides a better foundation than some arbitrary collection of ad hoc formalisms. Mathematical insight is essential in formal methods practice. At the same time, there are topics such as structural induction, automata, semantics, boolean analysis, and others, that could be presented with a formal-methods slant.
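Structural induction, for instance, takes on a formal-methods slant when the induction is over a datatype and the property is one a verification tool could discharge. A standard classroom exercise (our illustration, not an example from the workshop) is to prove that appending lists adds their lengths:

```latex
\forall xs,\, ys.\;\;
  \mathit{length}(\mathit{append}(xs, ys))
  \;=\;
  \mathit{length}(xs) + \mathit{length}(ys)
```

The base case ($xs = \mathit{nil}$) follows directly from the definition of $\mathit{append}$, and the inductive step for $\mathit{cons}(x, xs)$ reduces to the induction hypothesis plus arithmetic, which is exactly the proof shape that mechanized theorem provers consume.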
Relevant topics and threads could be nonintrusively incorporated in existing core courses. Courses in programming and digital design should present engineering methods incorporating specification, implementation, verification, and testing. In software, attention to testing and verification methods should be more in proportion to what is seen in practice. The concept of algorithm should be extended by examples of nonterminating processes and embedded reactive software. Students should be introduced to properties of security, reliability, and safety throughout the core courses. Decision procedures for verification offer illuminating examples of advanced algorithms and data structures. Performance analysis courses should include discussions of timing analysis and real-time constraints.
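As a concrete instance of that last suggestion, a decision procedure can be taught as an algorithms topic in its own right. The sketch below is our illustration, not workshop material: a minimal version of the Davis-Putnam-Logemann-Loveland (DPLL) procedure for boolean satisfiability, the engine at the core of many verification tools. Unit propagation and backtracking search make it a natural companion to standard coursework on recursion and data structures.

```python
# Minimal DPLL satisfiability checker. A formula is a list of clauses;
# each clause is a list of nonzero integers, negative meaning negated
# (so [[1, 2], [-1, 2]] encodes (x1 or x2) and (not x1 or x2)).

def simplify(clauses, lit):
    """Simplify the clause set under the assumption that `lit` is true."""
    out = []
    for clause in clauses:
        if lit in clause:
            continue                      # clause satisfied; drop it
        out.append([l for l in clause if l != -lit])
    return out

def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict: var -> bool) or None."""
    if assignment is None:
        assignment = {}
    # Unit propagation: a one-literal clause forces that literal's value.
    units = [c[0] for c in clauses if len(c) == 1]
    while units:
        lit = units[0]
        assignment[abs(lit)] = lit > 0
        clauses = simplify(clauses, lit)
        units = [c[0] for c in clauses if len(c) == 1]
    if not clauses:
        return assignment                 # every clause satisfied
    if any(c == [] for c in clauses):
        return None                       # empty clause: contradiction
    # Branch on some unassigned variable, trying true then false.
    var = abs(clauses[0][0])
    for lit in (var, -var):
        result = dpll(simplify(clauses, lit), {**assignment, var: lit > 0})
        if result is not None:
            return result
    return None
```

Even in this toy form, the procedure exercises list manipulation, recursion, and search-space pruning, precisely the "advanced algorithms and data structures" payoff mentioned above.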
Another point of general agreement is the need for both notational and mathematical coherency. Students need exposure to many levels of abstraction, but the greater need lies in a unifying set of mathematical principles and concepts. Several approaches exist to building this foundation, and each has a distinct notational framework reflected in a distinct tool set. In developing courses, it is tempting to emphasize breadth and comparative expressiveness, as, for example, between two kinds of model checkers. However, it is more valuable to emphasize depth in early courses. Notation and formalization are secondary to problem analysis and complexity management, the intellectual abilities that the student must develop first.
In upper division undergraduate courses the object is preparation for further graduate study or toward professional concentration. Since it is often the case that only one or two faculty members in a department are formal methodists, one or two courses at this level may be the only practical strategy for pedagogy development. Upper division courses are also the proving ground for proposed enhancements to the core. At this level, educators see exposure to tools and case studies as worthy goals, if difficult to achieve. Interinstitutional collaboration, together with industrial involvement, offer the best prospects for rapid progress.
Michael Gordon and Amir Pnueli, both invited speakers at the XXIEC workshop, outlined course work they are developing at this level. These offerings are models for course design elsewhere.
Michael Gordon's formal methods course at Cambridge is roughly equivalent to a two-semester U.S. course sequence at the senior undergraduate or first-year graduate level. A key aspect of the course is its basis in a uniform formal framework---higher order logic---encompassing hardware and software. The emphasis is on formal representation methods, with some rigor but without involving tools. Building proficiency in tool use, according to Gordon, would probably take a third course.
Gordon's general curriculum recommendations are pragmatic [presentation visuals]. Progress in formal methods depends on its integration with, rather than displacement of, existing techniques. He suggests working toward meeting the expressed needs of the design industry. In the case of digital system engineering, this includes looking at verification methods in practice (e.g. use of simulators to explore design behavior) and introducing alternatives (e.g. symbolic execution and model checking). Doing this requires introduction to boolean logic, temporal logic, the basics of test vector development, automata based specification of control, simulation based verification, and a selection of topics in modeling and analysis.
Amir Pnueli's proposed curriculum, also a subject of his 1996 Turing Award lecture, harmonizes with Gordon's. He is developing a two-course early graduate sequence which could conceivably migrate to undergraduate levels. An outline of representative topics includes: specification using automata and temporal logic; "enumerative" model checking; "symbolic" model checking; abstraction/decomposition methods; deductive temporal verification; verifying refinements, algorithmically and deductively; specification and verification of complex data structures. Example follow-up topics mentioned were real-time systems and digital/analog hybrids.
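To give a flavor of the first topic in that outline, "enumerative" model checking of a safety property is, at bottom, explicit graph search over reachable states. The sketch below is our own minimal illustration (the toy counter model and all names are assumptions, not drawn from Pnueli's course materials).

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state (enumerative) model checking of a safety property:
    breadth-first search over the reachable states, testing the invariant
    at each one. Returns (True, None) or (False, counterexample_state)."""
    seen = set(initial)
    queue = deque(initial)
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return False, state           # invariant violated here
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None                     # invariant holds on all reachable states

# Toy model: a counter over {0, 1, 2, 3} that wraps from 2 back to 0,
# so state 3 is claimed unreachable.
def successors(s):
    return [0] if s == 2 else [(s + 1) % 4]

ok, bad = check_invariant([0], successors, lambda s: s != 3)
```

Replacing `successors` with one that does reach state 3 makes the checker return that state as a counterexample, which is the pedagogically valuable half of the exercise.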
Pnueli strongly endorses the development and use of tools to ground this pedagogy. However, he cautions against descriptive surveys comparing several tools of similar functionality. One must pick an approach and explore it in depth. The most daunting challenge in creating a course is making this choice. There is a multitude of alternatives at each level of reasoning, but few that present a uniform interface to the user.
There is sufficient material for a professional degree program in formal methods. It is useful to consider what the actual content of such a program might be, if only to clarify what skills someone who claims to be a specialist should have. However, very few institutions have the resources to implement a professional degree, and those that do are more likely to prioritize advanced research. One solution may be to encourage multi-institutional collaborations among the more isolated research groups. Such a program would include exposure to the mathematical theory in logic, computation, and complexity; surveys of the area's structure along the lines of Section 3; application to a variety of problem domains; experience in large scale system design (possibly through internships); and implementation issues for reasoning systems.
Another approach to postgraduate study is to develop special programs in reeducation, designed for engineers taking three- or six-month educational leaves from industry. Many universities offer credentials for limited programs of focused study. This seems like an avenue which could foster industrial-academic interactions, but no such programs in CS/CE were known to exist at the XXIEC workshop.
The National Science Foundation's MOSIS facility is often mentioned as a model for effective infusion of a target technology in academia. Conditions do not exist for a similar initiative in formal methods, but it is a useful exercise to contemplate what a "MOSIS for formal methods" might look like. If there were a consensus in government and industry that formal methods and the associated technologies were important in maintaining leadership in the Information Age, what incentives could be created to increase their exposure in degree programs?
Almost certainly, one ingredient would be the creation of a tool set, analogous to the University of California at Berkeley's CAD tools and Mississippi State's cell libraries for VLSI in the 1980s. There is no "foundry" service associated with the teaching of formal methods, but realistic problems require computational resources that undergraduate institutions and smaller research groups lack. Administering servers for large-scale verification exercises would increase the pool of students with exposure to these topics.
Government funding for formal methods research is diminishing. If this trend cannot be reversed, then those isolated agency advocates must work collectively to raise its visibility. They must speak with a common voice about targeting formal methods in educational funding. Cooperation within commercial, governmental, and educational sectors is a prerequisite to collaboration among them.
Projects, threads, lectures, and modules. Whether the goal is to infuse standard courses with formal-methods content or to create courses on the topic, the start-up costs are high. A collection of tutorial examples would help an educator get over the threshold of using a particular tool set and developing a particular line of topics. A monitored repository could go further, cataloging contributions according to complexity, problem class, and reasoning technology. Taking this a step further, the people involved in formal methods have the technical competency to make portions of their material interactive. Provided the support infrastructure were available, these materials could be opened for network access. Springer's STTT/ETI journal and software repository is an example of this kind of collaborative initiative.
Bibliographies and archival reference. Much of the publication history in formal methods has been in workshop and conference proceedings, and even in limited distribution reports. Although it would be quite an undertaking to create a comprehensive reading list from scratch, several bibliographies already exist. A consolidation would be of considerable value to someone building courses, and to students interested in exploring the area. It might also help consolidate subtopics.
Documentation, manuals, and notes. Often, the existing documentation needed to learn and use a reasoning tool is marginal or out of date. In addition, manuals are often highly application oriented, with broader uses appearing over time in the research literature. Educators and their students can participate directly in advancing formal methods by augmenting the documentation now in use. Many CAD companies maintain university programs with this purpose in mind. At present, however, most reasoning tools are already in the public domain or are under development by start-up companies with no reserves to engage with educational efforts. A more central distribution point would be helpful.
Communication and collaboration facilities. Surely, a primary function of a consortium would be to create communication channels among all groups with an interest in formal methods advancement. It can put people in touch with each other, matching interests and needs. Centralized lists of financial-aid opportunities, program announcements, postdoctoral positions, and job openings would lend a sense of critical mass. Similarly, a referrals listing would connect those with problems in specific domains to qualified academics interested in gaining field experience.
Training materials and short courses. It is symptomatic that educators are now attending training courses offered by industry and government groups. Why is it not the reverse? The situation reinforces beliefs that the research has outpaced academia to become the exclusive domain of commercial enterprise. While this misconception ignores the true nature of the symbiosis, it does reflect the real difficulties of matching the rates of methodological and technological progress. Over the longer term there will be feedback; but the response time can be improved by progressive programs in reeducation, distance education, and short courses.
Cookbooks, engineering handbooks. A universal aspect of formal methods is understanding how to represent properties and specifications in the given formalism. At more abstract levels, there are usually many alternative ways to structure a model. At more concrete levels, a commensurate effort may go into expressing a particular property. While understanding a particular representation may be a matter of education, exercising this knowledge is also a matter of experience and recall. As in other aspects of engineering mathematics, this activity should result in a compendium of common techniques and representations. (See, for example, the FMSP98 paper by Dwyer, et al., on patterns in temporal logic.)
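The patterns cataloged by Dwyer et al. suggest what a compendium entry might look like. For example, the "response" pattern, "every $p$ is eventually followed by a $q$," is rendered in linear temporal logic as

```latex
\Box\,(p \rightarrow \Diamond q)
```

while the "absence" pattern, "$p$ never occurs," is simply $\Box\,\neg p$. The value of such a compendium is that a designer can select the pattern and fill in $p$ and $q$ without rediscovering the formula.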
Componentware, special configurations, and front ends. In discussing the inclusion of reasoning tools in courses, some people call for "pedagogically appropriate" versions of tools while others want "the real thing." The choice depends on several factors, including not only allowances for the relatively short learning window, but also whether the intent is to show how a tool is used or how it works. To aid this process, developers of research prototypes--especially at more abstract levels--should strive for an open, object-oriented software architecture in the hope that components of their work can be reused.
Maintained and integrated libraries. The term "library" is used for collections of formalized theories and facts. Common examples include binary vectors, arithmetic algebra, graph theory, streams, and so forth. Researchers addressing a particular problem class often must recreate theories for their own tools. Efforts to consolidate and share libraries are often marginal because there is no direct source of support for this labor-intensive activity. This is not just a data management problem; the maintainers must be experts, not only in the systems involved, but also in the underlying mathematics.
Benchmarks and formalized objects. Within the past five years, experimentation and case studies have approached realistic scale for designs in widespread use. The artifacts of this research may be useful in commercial designs that are not pushing technology too hard, or as starting points for other case studies. If the research community is committed to higher quality and reliability, then one way to achieve it is to make these artifacts available for general review and possible use.
Serve as a continuing forum. A consortium must be a clearinghouse for issues and ideas. In an area as potentially contentious as formal methods, an organization perceived as being exclusionary will defeat its own purposes. On the other hand, there is general (though not universal) agreement that education is the priority, and this is the glue that should hold a consortium together. In addition to newsgroups and mailing lists (of which there are already several related to research), other forms of participation include position statements, accounts of historical background, promoting panels or sessions at research conferences, and so forth.
Identify formal methods as a funding target in educational programs. Education is receiving unprecedented support in U.S. government funding. However, formal methods is not yet recognized as a "coherent body of knowledge" [CCIC95, Part III.D] on which to base an educational project. Moreover, educational funding in science and engineering, at NSF in particular, greatly emphasizes multidisciplinary undertakings, and reviewers often do not regard formal methods as fitting the multidisciplinary mold.
The agenda for formal methods education includes work targeted at both undergraduate and graduate levels, not to mention infusion of research at both levels. This is a broader agenda than most educational programs conceive. A single project aimed at both undergraduate and graduate course development is unlikely to fare well in current NSF educational program reviews. It must also be acknowledged that the goals in formal methods education are not as clearly delineated as in other areas of announced emphasis. Target student populations are not as clearly identified as they are in science and medicine, for example. Outcomes and assessments are impossible to detail sharply. The tradeoff between tool skills and conceptual skills is not well understood. Commercial partnerships are tentative.
The big picture is that education is clearly in the critical path for advancement of formal methods and formal methods is a crucial avenue for achieving greater assurance, higher quality, and economic competitiveness in system design. With this realization, agencies traditionally chartered for research support should look for ways to subsidize educational aspects, possibly through training initiatives and pilot studies. At the very least they should lobby to inform granting agencies that progress depends on some fundamental shifts in the character of CS/CE education.
More and better focused research funding. The best people to teach formal methods are people who are actively doing research in it. They are also the prime producers of new educators and developers of new tools that a broader population can use. People cannot and will not do research on formal methods unless they can obtain grants, contracts, and fellowships for this purpose. Thus, the most direct way to stimulate activity is to provide these incentives. Unfortunately, the funding trends have been regressive in recent years. At DARPA, a rebudgeting in 1996 devastated five major formal methods research projects and sent a very ominous message to new researchers. Two of the most prominent formal methods research enterprises in the United States have closed their doors in recent years because of dwindling support.
Attract students and convince university administrators. Formal methods practice involves a set of highly demanding skills requiring a relatively strong background in mathematics and computer systems. In a period when CS/CE enrollment is not meeting the demand, there is competition for students with areas such as scientific computation, graphics and visualization, computer-human interface, and so on. These areas are not only enticing, but also have well established advocacies in administrative circles. It is fair to say that advancement in high confidence systems depends on raising awareness of its importance to design practice, in both legislative circles and academic administrations. For the latter, the clamor of interests in high performance computing, human computer interaction, instrumentation, electronic information bases --the list goes on and on -- drowns out smaller advocacies, even if they are well focused.
Nurture industrial interactions. Industrial pilot studies involving academics, often subsidized by sabbatical leaves, government incentives, and so forth, have made significant contributions in research and to the spread of acceptance for formal methods. They don't always succeed, but when they do, they produce engineers with new skills and academics with fresh insights. Among the problems for starting a pilot study are differences in outlook ("time-to-benefit" expectations) and culture, as well as a fair degree of intimidation both ways. Encouragement on the part of contract givers can be influential. In some recent cases, a government agency has actually served as a "dating service," finding people, putting them together and subsidizing their preliminary work. It has also been suggested that providing incentives to formalization in government contract solicitations is more likely to spark competition than mandating it.
Conduct an outreaching survey of educational needs. As noted earlier in this report, fully specifying a formal methods curriculum is out of the question. However, there should be more concrete discussion about educational needs. Given a forum to do it, industries and research enterprises can undoubtedly compile a set of topics and skills that they would like to see in new graduates. Such a list is the raw material that academics can refine into a curriculum. The most effective approach may be to gather information at the source. Several industrial invitees to the XXIEC workshop reported problems convincing their supervisors (or themselves) that the trip would be measurably worthwhile. As a prelude to gaining greater involvement outside the educational establishment, a campaign of direct contact should be undertaken to survey people in the field about specific educational recommendations. This would be a continuing workshop in which a team of academics makes site visits for half-day intensive discussions with interested groups.
Engage more faculty. Significantly increasing the number of students practicing formal methods entails significantly increasing the number of educators teaching it. One participating educator put it this way:
Formal methods are hard. I feel that most mortals like me just need more help in it than in other subjects. This points to the need for good teaching methodologies and techniques, good class examples, good homework problems and suggestions, etc. The web site [FMEd] you have set up is definitely the kind of help we are looking for.
Subsidize a repository. The World Wide Web link cited above was created in response to suggestions made at the XXIEC workshop. It is being maintained on a voluntary basis, and is essentially an information hub. Proactive interests should contemplate a more actively developed resource in which materials are consolidated. The creation of pedagogy through textbook publication is a traditional source of income for educators; hence, a quality repository on the web will need to provide an alternative reward structure for its contributors.
|[DeHi96]||C. Neville Dean and Michael G. Hinchey (eds.). Teaching and Learning Formal Methods International Series in Formal Methods, Academic Press, San Diego, 1996. A collection of papers by people building curricula in (or related to) formal methods.|
|[CD98]||Formal verification: essential for complex designs. Computer Design 37(6) (June 1998). A roundtable on trends in formal verification primarily from the hardware system perspective|
|[ClWi96]||Edmund M. Clarke and Jeannette M. Wing, et. al. Formal Methods: State of the Art and Future Directions. ACM Computing Surveys 26(4):626-643. A review of recent case studies and results in requirements specification and formal verification at both high and low levels of abstraction. A strategic agenda for research, and how it relates to needs in government and industry.|
|[CCIC95]||America in the Age of Information: A Forum Report of a meeting sponsored by NIST/CCIC. http://www.hpcc.gov/ccic/cic_forum_v224/cover.html A forum of members of the research community, "designed to refine and further detail the six Strategic Focus Areas introduced in the CIC Strategic Implementation Plan." One of these areas is High Confidence Design.|
|[CCIC97]||Research Challenges in High Confidence Systems. Proceedings of the Committee on Computing, Information, and Communications Workshop, August 6-7, 1997. http://www.hpcc.gov/pubs/hcs-Aug97/ Follows up on [CCIC95] with perspectives offered by various government agencies, leading to [CCIC98]. Formal methods is presented as one of several approaches explored.|
|[CCIC98]||Setting an Interagency High Confidence Systems Research Agenda. Proceedings of the Interagency High Confidence Systems Workshop, March 25, 1998. The third in a series of meetings on high-confidence systems. Included in the report are research overviews from the Federal Aviation Administration, Federal Railroad Administration, Federal Transit Administration, Food and Drug Administration, National Library of Medicine, Department of Treasury, National Aeronautics and Space Administration, Department of Energy, Nuclear Regulatory Commission, National Institute of Standards and Technology, and National Institutes of Health.|
||Critical Foundations---Protecting America's Infrastructures. The Report of the President's Commission on Critical Infrastructure Protection, October 1997. A broad discussion of the increasing societal dependence on, and vulnerability to, computer information systems. Includes policy recommendations.|
|[XXIEC]||The 21st Century Engineering Consortium Workshop, sponsored by Rome Labs and ITO, held March 18-19, 1998, at Melbourne, Florida. This report is an outcome of the workshop.|
|[FMEd]||Formal Methods Education Resources. A web page created in response to suggestions made at the XXIEC workshop.|
||H. Saiedian, et al. An invitation to formal methods. Computer 29(4):16-32 (April 1996). A collection of essays on formal methods, primarily from a software perspective.|
||IEEE Workshop on Industrial Strength Formal Specification Techniques. Group reports of the 1995 workshop, cited herein, are accessible through this website.|
|[FMSP98]||Mark Ardis (ed.), Second ACM SIGSOFT Workshop on Formal Methods in Software Practice, March 4-5, 1998.|
Gaetano Borriello (ed.),
Future Research Directions in CAD for Electronic Systems.
Workshop sponsored by NSF/CISE/MIPS, Seattle, May 1996.
Primarily focused on renewed efforts in the
area of design automation for systems.
Discusses the need for higher-risk research looking
for methodological breakthroughs rather than incremental advances.
The appended survey gives insight from the perspective of
CAD system developers.
B. Steffen and W.R. Cleaveland (eds.),
International journal on
Software Tools for Technology Transfer
Electronic Tool Integration
platform. Springer, from September 1997.
A journal and software repository intended
"to provide a forum for issues on the development, deployment and
use of tools for the construction and analysis of correct and reliable
National Science Foundation,
Integrative Graduate Education and Research Training (IGERT) Program
-- Frequently Asked Questions.
Contains a statement about "education" versus "training".
Formal Methods and Digital Systems, Validation for Airborne Systems.
SRI International Computer Science Laboratory Technical Report
Contains an overview of formal methods and a primer on logic. The report
"primarily written for certifiers and assumes no previous exposure to
Unless otherwise requested, edited commentary may be linked to the the online version. Send responses to Steven D. Johnson at email@example.com. Commentors are invited to use the report questionnaire which provides a framework for responses following the report's outline.
There were four stages in the creation of this report.
Focus The focus of the workshop, formal methods, refers to that body of theory, technique, pedagogy, and tools that enable rigorous reasoning in system design practice. It is a formative area with many approaches developing, but none yet prevailing. At the same time, a rising need for methodological advances has become manifest, not only in such aspects of high-assurance design as formal specification, safety, security, and reliability, but also in the increasingly critical demands on verification, analysis, and testing in the greater computer industry.
Goals The two specific goals of the workshop are:
Featured Speakers: Michael J.C. Gordon and Amir Pnueli. Mike Gordon of the University of Cambridge Computer Laboratory is a widely recognized pioneer in formal methods and automated reasoning. He is a leader in the development of texts and course notes on these topics. Amir Pnueli of the Weizmann Institute is known for his work in temporal logic and applications in verification, work which lead to the 1996 Turing Award.
Organization of the Workshop The two-day workshop is structured to build consensus and content for the resulting report. The first morning will consist of plenary panel sessions of leading representatives from industry, government, and academia to review what is needed and what is currently being done in the development of formal methods curricula. Participants will then assemble a number of small discussion groups to address recommendations for undergraduate and graduate education. The moderators of these groups will report to the whole workshop at the end of the day.
[At the event, panels included individual presentations by Albert Camilleri (Hewlitt Packard Corp.), Mandayam Srivas (SRI Computer Science Laboratory), Edmund M. Clarke (Carnegie Mellon University), J Strother Moore (University of Texas at Austin), and Moshe Vardi (Rice University).]
On the second day, the workshop will reconfigure itself to reflect the prevailing issues and ideas emerging on the first day. A set of topical breakout sessions will be organized. As on Day One, these sessions will be followed by reports by the session moderators to the workshop as a whole. In the afternoon, there will be a plenary meeting to determine whether a consortium should be formed and how to organize it, and a consensus will be sought on the main points of the report and how to produce it.
Participation The principal mission of this workshop is to collect and consolidate input as widely as possible. All forms of input are solicited, including (but not limited to)
The workshop organizers accept responsibility for producing the report as a means of last resort. We regard our function as that of a facilitator in forming the consortium, rather than any kind of de facto official body of the yet-to-be-formed entity. The format of the meeting may mandate limits on direct participation. If you are interested in attending, please contact Mike Nassif, or any member of the organizing group listed below. If you cannot attend the workshop but are interested its outcomes, or in consortium involvement, please contact a member of the organizing group and consider contributing a position paper or letter.
University of Cincinnati
SUNY Institute of Technology
Georgia Southern University
University of Utah
Mike Nassif (Ch.)
City University of New York
Brigham Young University