Formal Methods in Safety-Critical Standards

Jonathan Bowen
Oxford University Computing Laboratory
11 Keble Road, Oxford OX1 3QD, UK
Email: [email protected]
Abstract

There is great interest in ensuring the correctness of safety-critical embedded systems since, on the one hand, the use of software gives greatly increased functionality and flexibility and, on the other hand, it provides unprecedented possibilities for errors. Formal methods are one technique that could improve the situation. Their use is now being suggested by an increasing number of standards in the safety-critical area. This paper compares the recommendations given by a number of important existing and emerging standards and tries to identify future trends in this area. A bibliography of standards and related publications is included.
"The nice thing about standards is that you have so many to choose from; further, if you do not like any of them, you can just wait for next year's model."
Andrew Tanenbaum
1 Background

Computers are being used increasingly in safety-critical systems because of the added flexibility and decreased costs that they can engender. However, much concern has been raised as to their suitability in such systems [27]. Many accidents have been blamed on the use of computers, and especially on the software in them. Deaths due to software have occurred (e.g., the Therac-25 radiotherapy machine, where the dosage editor was poorly designed [24]). For one of the most comprehensive lists of the problems brought about by the use of computers, see [32], which has been updated annually, together with the associated electronic RISKS newsgroup and mailing list from which the list is derived. The industrial use of formal methods in safety-critical systems is on the increase, and it is generally accepted that formal methods have potential benefits and are likely to be used increasingly in this field [40].
Formal techniques have now been applied successfully on significant systems of the order of 100 man-years' effort (e.g., see [19]). [6, 12, 13] provide surveys of some examples of large safety-critical system development. However, the use of formal methods, even in the area of safety-critical systems, is still very much the exception rather than the rule. [1] is a recent survey on the use of formal methods in industry and academia. The exact meaning of formal methods and its related terms is often misunderstood and sometimes misused, both within the formal methods community and by others [8]. As an example, consider the following two alternative definitions of formal specification taken from a glossary issued by the IEEE [IEEE91]:
1. A specification written and approved in accordance with established standards.
2. A specification written in a formal notation, often for use in proof of correctness.
The latter is the accepted sense for those concerned with formal methods, but the former is presumably considered by the IEEE to be more widespread in industrial and other circles since it is placed first in the definitions above. Formal methods may be characterized at a number of levels of use, and these provide different levels of assurance for the software developed by such methods [44]. Here the term is assumed to mean any technique with a mathematical basis that is intended for use in the development of computer-based systems. This could mean merely formal specification of the system, which even alone gives benefits by allowing a mathematical model to be developed and reasoning conducted using this model; or it could mean formal verification of the system, allowing the implementation to be checked against the specification. The process may be undertaken manually, with simple tools such as word processing support, or it may be
mechanized (at a high cost) for the highest confidence in its correctness. It should be noted that despite the mathematical basis of these methods, errors are still possible because of the fallibility of humans and, for mechanical verification, computers. However, the methods have been demonstrated to reduce errors (and even costs and time to market) if used appropriately [6]. In general, though, formal development does increase costs [7], possibly decreasing code productivity by around a half [8].

There is still a great lack of education about formal methods in industry, and this is likely to take a while to change. However, many computer science courses now contain a significant amount of formal methods content, particularly in Europe. There is perhaps less training about safety-critical systems issues in many courses, since this is often still seen as a specialist area. The safety-critical software engineer must be aware of many more issues, and must be able to interface to many more disciplines than his/her counterpart in most other areas of software engineering (e.g., control engineers, hardware designers, safety engineers, etc.). Some authors of standard software engineering textbooks are now realising the increasing importance of safety-critical systems, as the use of computers becomes more and more common in such applications, and are including sections on this topic (e.g., [39]). Professional bodies are also providing guidance for their members (e.g., see [22] issued by the UK Institution of Electrical Engineers).

A significant, and perhaps even the main, motivating force for the use of formal methods is likely to be standards. Many safety-critical standards now mention formal methods as one of the techniques that should be used when the highest integrity of software is required. The main part of this paper surveys some current and emerging standards that are applicable to safety-critical systems, particularly with regard to their suggested use of formal methods, or otherwise. A list of some relevant standards and a bibliography are included at the end of the paper to allow interested readers to investigate the issues further.
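To make the distinction drawn above between formal specification and formal verification more concrete, the following fragment is a purely hypothetical illustration: it is not drawn from any of the standards surveyed, and the names dose and MAX are invented for exposition. In a model-based notation in the style of Z or VDM, a specification might define a system state constrained by a safety invariant, together with an operation on that state:

  State      ==  { dose : N | dose <= MAX }
  Deliver(d) ==  dose' = dose + d  /\  dose + d <= MAX

Formal verification would then discharge proof obligations such as the requirement that the operation preserves the invariant:

  dose <= MAX  /\  Deliver(d)  =>  dose' <= MAX

Such an obligation may be argued rigorously by hand or, at greater cost, proved with mechanical support, corresponding to the levels of assurance mentioned above.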
2 Standards

There are a wide variety of standards bodies, perhaps too many, throughout the world concerned with software development. For example, during the December 1991 ACM SIGSOFT conference on safety-critical software, Mike De Walt of the US Federal Aviation Administration (FAA) stated that a count revealed 146 different standards relevant to software
safety. The UK IED (Information Engineering Directorate) DTI/SERC funded SMARTIE project (no. 2160) is investigating a standards assessment framework [15]. [21, 43] give an overview of existing software engineering standards and indicate the major national and international bodies involved. [35] outlines the standardization efforts specifically in the area of safety-related systems. Many standards have emerged or are currently emerging in the area of software safety, because this area is now of such widespread importance and public concern; thus any survey, including that given in this paper, can only give a snapshot of the situation.

Formal methods are being increasingly mentioned in some safety-related standards as a possible method of improving dependability. This section gives some examples of such standards. An important trigger for the exploitation of formal methods could be the interest of regulatory bodies or standardization committees (e.g., the International Electrotechnical Commission [IEC91, IEC92], the European Space Agency [ESA91], and the UK Health and Safety Executive [HSE87a, HSE87b]). Many emerging standards are at the discussion stage (e.g., [RIA91, IEEE93]). A major impetus has already been provided in the UK by promulgation of the Ministry of Defence interim standard 00-55 [MoD91a], which mandates the use of formal methods and languages with sound formal semantics. Previous guidelines [EWICS88/89] have been influential in the contents of safety standards, and a standards framework [SafeIT90b] may help to provide a basis for future standards. There are many terms associated with safety-critical systems, and considerable international effort has been expended to standardize some of these [25]. In addition, formal specification languages and their semantics are themselves being standardized by ISO/IEC (e.g., LOTOS [ISO89], VDM [ISO91] and Z [ISO93]). Of these, the LOTOS standard is the only one that has completed the acceptance process so far.

Formal notations are also becoming increasingly accepted in standards as it is realized that many existing standards using informal natural language descriptions alone (e.g., for programming language semantics) are ambiguous and can easily be (and often are) misinterpreted. [3] argues the case, although there are still significant differences of opinion in this area. Even when describing a language as formal as the logic programming language Prolog, there is disagreement between the different countries involved as to whether a formal description of the language should be included in the proposed international ISO standard [ISO92],
despite the reduction of ambiguity that this would bring [14]. In this case, the UK (BSI) and France (AFNOR) are strongly for including a formal semantics, and Germany (DIN) and the USA (ANSI) are against. Conformity between specification languages and the tools intended for their support raises a new set of issues compared with those for programming languages, since specification languages are not in general executable (e.g., see [33] concerning the specification language VDM-SL). [26] provides further information in the area of standards for non-executable specification languages.
3 Related issues

Until relatively recently there have been few standards concerned specifically with software in safety-critical systems. Often software quality standards such as the ISO 9000 series have been used instead, since these were the nearest relevant guidelines. Now a spate of standards in this area has been, or is about to be, issued. [38] gives a good overview (as of 1989) and also covers a number of formalisms such as VDM, Z and OBJ. Many standards do not mention a formal approach specifically (e.g., MIL-STD-882B [DoD84]), although most are periodically updated to incorporate recent ideas. For example, a new version of this standard, MIL-STD-882C, is due to be distributed in 1993, although unlike many new versions of such standards, it still makes no mention of formal methods. The software engineering community became acutely aware of the introduction of formal methods in standards in the late 1980s, and particularly since the introduction of the UK MoD Def Stan 00-55 [MoD91a], which is commented upon later. Although the debate on the exact formal methods content of standards like 00-55 is bound to continue, there are certain aspects, such as formal specification, which cannot sensibly be ignored by standardizing bodies. A number of issues relate to the motivating forces for the recommendations included in standards, and these are touched upon in the rest of this section.
3.1 Legislation
Governmental legislation is likely to provide increasing motivation to apply appropriate techniques in the development of safety-critical systems. For example, a new piece of European Commission legislation, the Machine Safety Directive, came into effect on 1st January 1993 [DTI93]. This encompasses software, and if there is an error in the machine's logic that results in injury then a claim can be made under civil
law against the supplier. If negligence can be proved during the product's design or manufacture then criminal proceedings may be taken against the director or manager in charge. There is a maximum penalty of three months in jail or a large fine [30]. Suppliers have to demonstrate that they are using best working practice. This could include, for example, the use of formal methods. However, the explicit mention of software in [DTI93] is very scant. Subsection 1.2.8 on Software in Annex B on p. 21 is so short that it can be quoted in full here: "Interactive software between the operator and the command or control system of a machine must be user-friendly." Software correctness, reliability and
risk are not covered as separate issues.

Care should be taken not to overstate the effectiveness of formal methods. In particular, the term formal proof has sometimes been used quite loosely, and this has even led to litigation in the law courts over the VIPER microprocessor, although the case ended before a court ruling was pronounced [28]. If extravagant claims are made, it is quite possible that a similar case could occur again. 00-55 differentiates between formal proof and rigorous argument (informal proof), preferring the former, but sometimes accepting the latter with a correspondingly lower level of design assurance. Definitions in such standards could affect court rulings in the future.
3.2 Education and certification

Standards are a motivating force that pulls industry to meet a minimum level of safety in the development of critical systems. Another, complementary, force that could be seen to push industry is the education of engineers in the proper techniques that should be applied to such systems. [46] includes a report on education and training with respect to safety-related software. Currently a major barrier to the acceptance of formal methods is the fact that many engineers and programmers do not have the appropriate training to make use of them, and many managers do not know when and how they may be applied. This is gradually being alleviated as the necessary mathematics (typically set theory and predicate calculus) is increasingly being taught in computing science curricula and on industrial courses [18]. Educational concerns in the UK are reflected in the SafeIT strategy document [SafeIT90a]. The UK Department of Trade and Industry has commissioned a special study to stimulate the development of education and training in the area [11]. In addition, the British Computer Society (BCS) and the Institution of Electrical Engineers (IEE) have established working groups which are aiming to produce
proposals on the content of courses aimed at safety-critical systems engineers. Some standards and draft standards are now recognizing these problems and recommending that appropriate personnel should be used on safety-critical projects. There are suggestions that some sort of certification of developers should be introduced. This is still an active topic of discussion [31], but there are possible drawbacks as well as benefits in introducing such a 'closed shop', since suitably able engineers may be inappropriately excluded (and vice versa). The education/accreditation debate has been particularly active in the UK, in the wake of Def Stan 00-55. The MoD, having commissioned a report on training and education to support 00-55 [49], has chosen to withdraw from the consequent controversy, stating that it is beyond the remit of the standard to set a national training agenda. Perhaps the central issue here is not formal methods education per se, but the identity of the whole software engineering profession; in other words, what precisely a software engineer is remains a question that will no doubt be debated for some time to come. [42] includes a discussion on the drift of many kinds of professionals into software engineering.
3.3 Technology transfer
Industrial awareness of applicable techniques and relevant standards is very important, particularly in an area like safety-critical systems where the reduction of errors by any means is especially important. However, technology transfer is a delicate process that takes time and effort and can easily fail at any stage [10]. The UK Department of Trade and Industry (DTI) has provided a lead by instituting the 'SafeIT' initiative in 1991 to establish a unified approach to the assurance of software-based safety-critical systems by encouraging and financing collaborative projects in the area [SafeIT90a]. This sponsors industrial (and to a lesser extent academic) organizations to undertake collaborative projects in this area, and a second phase of the initiative has been started. An associated Safety-Critical Systems Club (SCSC) has been formed and interest is very strong, with around 1,600 members and several meetings each year, including some concerning standards [37]. A regular club newsletter includes articles on the application of mathematical methods to safety-critical systems as well as information relevant to standards (e.g., on pages 4 and 8 of [36]). The SafeIT initiative is particularly interested in safety standards and has produced a framework for them [SafeIT90b]. Other regular meetings such as COMPASS [23], SAFECOMP
[16], FME [47] and ZUM [5] also discuss issues relevant to this area.

The debate concerning the use of software in safety-critical applications, the appropriate techniques for development, and even whether the use of software increases or decreases safety [34], is set to continue for some time.
4 A survey of some standards

This section introduces the recommendations concerning the use of formal methods in a number of software safety standards. The selection, which is summarized in Table 1, is somewhat eclectic, but demonstrates the range of areas and organizational bodies that are involved. Overviews of standards concerned with software safety from an American point of view are provided by [2, 45, 48]. The US and Europe are the major sources of software safety standards, guidelines and research in this area. In Canada, [AECB91, AECB92, OH90, OH93] (amongst others) have been produced in relation to the nuclear industry. Standards Australia is recommending adoption of the IEC Draft Document 65A [IEC91]. [29] gives a rare overview of dependability and safety issues in Japan, including details of an abortive attempt, sponsored by MITI, to produce their own JIS standard in this area, although guideline reports do exist.
RTCA DO-178

The US Radio Technical Commission for Aeronautics (RTCA) produced a guideline on software considerations in airborne systems and equipment certification (DO-178A) in 1985. This does not explicitly recognize formal methods as part of accepted practice. However, a completely rewritten guideline for the newly named Requirements and Technical Concepts for Aviation (still RTCA), DO-178B [RTCA92], was approved on 1st December 1992. This includes a very brief subsection 12.3.1 on pp. 61-62 entitled Formal Methods under a section entitled Alternative Methods. It gives a general introduction to formal methods and mentions three levels of rigour: formal specification with (1) no proofs, (2) manual proofs and (3) automatically checked or generated proofs. It is now possible for a manufacturer following the DO-178B guideline to make use of formal methods in the context of aircraft certification, although it is incumbent on the manufacturer to justify their use. This is likely to be negotiated on a case-by-case basis. This less than enthusiastic endorsement is despite significant lobbying for a more rigorous approach to be adopted for the most critical software in aircraft systems.
Country   Body   Sector    Name           FMs content   FMs mandated   Status      Year
Europe    IEC    Nuclear   IEC880         No            N/A            Standard    1986
UK        HSE    Generic   PES            No            N/A            Guideline   1987
Europe    IEC    Generic   IEC65A WG9     Yes           No             Draft       1989
Europe    IEC    Generic   IEC65A 122     Yes           No             Proposed    1991
Europe    ESA    Space     PSS-05-0       Yes           No             Standard    1991
UK        MoD    Defence   00-55          Yes           Yes            Interim     1991
UK        RIA    Railway   -              Yes           No             Draft       1991
US        RTCA   Aviation  DO-178A        No            N/A            Guideline   1985
US        RTCA   Aviation  DO-178B        Yes           No             Guideline   1992
Canada    AECB   Nuclear   2.234.1        Yes           Yes            Report      1992
Canada    OH     Nuclear   982C-H69002    Yes           Yes            Procedure   1993
US        DoD    Defence   MIL-STD-882B   No            N/A            Standard    1984
US        DoD    Defence   MIL-STD-882C   No            N/A            Due         1993
US        IEEE   Generic   P1228          No            N/A            Draft J     1993

Table 1: Summary of software-related standards and guidelines.
UK HSE
The UK Health and Safety Executive issued an introductory guide [HSE87a] and some general technical guidelines [HSE87b] concerning Programmable Electronic Systems (PES) in safety-related applications in 1987. Two pages are devoted to software development (pp. 31-32) and a further two to software change procedures (pp. 32-33). No mention is made of formal methods; it is simply stated that software should be of high quality, well documented, match its specification and be maintainable. The guidelines do list the necessary phases of software development, including requirements specification, software specification, design, coding and testing, and system testing. They go on to state that modifications to the software should be strictly controlled. The efforts of the HSE are now mainly concentrated on the IEC standards mentioned below.
IEC
The International Electrotechnical Commission has issued two standards in the area of safety-critical system development [IEC91, IEC92]. These documents were originally issued in 1989, but have subsequently been updated and reissued. The former deals
specifically with software for computers in the application of industrial safety-related systems, while the latter is concerned with the functional safety of programmable electronic systems in general. These are generic international standards designed to be applied in many different industrial sectors. An example of a particular instantiation of the IEC65A WG9 standard is the proposed RIA standard [RIA91] included below. The "formal methods" CCS, CSP, HOL, LOTOS, OBJ, Temporal Logic, VDM and Z are specifically mentioned in [IEC91] (with a brief description and bibliography for each) as possible techniques to be applied in the development of safety-critical systems, in an extensive section (B.30, pp. B-14 to B-18) under a Bibliography of Techniques. A shorter section on "formal proof" (B.30, p. B-18) is also included. It is currently proposed that three IEC standards on Functional Safety of Safety-Related Systems will be developed (see page 8 of [36]): (1) generic requirements; (2) requirements for electrical/electronic/programmable electronic systems; and (3) software requirements. The next drafts are planned to be available by the end of 1993.
ESA
The European Space Agency has issued guidelines for software engineering standards [ESA91]. These suggest that "Formal methods [such as Z or VDM] should be considered for the specification of safety-critical systems" in the Software Requirements Document (p. 1-27). The emphasised word "should"
indicates strongly recommended practices in the document, whereas "shall" is used for mandatory practices. The guidelines also state that a natural language description should accompany the formal text. A short section on formal proof (p. 2-25) suggests that proof of the correctness of the software should be attempted if practicable. Because of the possibility of human error, proofs should be checked independently. Methods such as formal proof should always be tried before testing is undertaken. Thus the use of formal methods is strongly recommended, but not mandated, by the document.
UK RIA
The Railway Industry Association, consisting of a number of interested organizations and industrial companies in the UK, has produced a consultative document on safety-related software for railway signalling [RIA91]. It is a draft proposal for an industry-specific standard that has yet to be ratified. It makes extensive reference to the IEC65A WG9 standard [IEC91]. Formal methods are mentioned briefly in several places in the document. Rigorous correctness argument is mentioned as a less detailed and less formal method of demonstrating the correctness of a program by simply outlining the main steps of the proof. In general, formal techniques such as proofs of programs and mathematical modelling are only recommended where the two highest of the integrity levels (graded from 0 to 4) are required, which means they are considered "a useful technique which may or may not be used or called for in a particular application (as per IEC65A WG9)" (see p. A1 of [RIA91]).
MoD 00-55 and 00-56
The UK Ministry of Defence has published two interim standards concerning safety. 00-55 [MoD91a], on the procurement of safety-critical software in defence equipment, is split into two parts, on requirements (Part 1) and guidance (Part 2). Issue 1 was made available in 1991. 00-55 may be revised in 1994, although only Part 2 is likely to change. The 00-56 standard [MoD91b] is concerned with hazard analysis and safety classification of the computer and programmable electronic system elements of defence equipment. The latest draft version of Issue 2 [MoD93] is divided into two parts like 00-55, on requirements and guidance. Comments are requested and meetings are held to discuss such revisions (e.g., see page 4 of [36]). Annex C of Part 2 provides a Z specification of some of the tables in Part 1. Most of the procedural elements of this revised standard are also being
specified in Z by Formal Systems (Europe) Ltd [17], subcontracted by Safety and Reliability Consultants Ltd, but this has not (yet) been included in the standard itself. These standards, and particularly 00-55, mention and mandate formal methods extensively and have, therefore, caused much discussion and argument in the defence software industry as well as the software engineering community in the UK. [41] gives an interesting account of the evolution of 00-55 and the associated debate in the UK. The standards are currently in interim form. The MoD set 1995 as the goal date for the introduction of fully mandatory standards [9], but has now withdrawn a specific introduction date.

00-55 mandates the production of safety-critical module specifications in a formal notation. Such specifications must be analyzed to establish their consistency and completeness in respect of all potentially hazardous data and control flow domains. A further fundamental requirement is that all safety-critical software must be subject to validation and verification to establish that it complies with its formal specification over its operating domain. This involves static and dynamic analysis as well as formal proofs and informal but rigorous arguments of correctness. Of the rest of the 00 series of MoD standards, the very large 00-970 standard is intended to provide a whole set of requirements for aircraft, although the section on safety-critical software is not yet available.
AECB, Canada
The Atomic Energy Control Board (AECB) in Canada commissioned a proposed standard for software for computers in the safety systems of nuclear power stations [AECB91]. This was prepared by David Parnas, who is well known in the fields of both software safety and formal methods. His report formalizes the notions of the environment ('nature'), the behavioural system requirements, and their feasibility with respect to the environment. It is based on the IEC Standard 880 [IEC86]. Since then, a new report, which departs further from IEC880 and is more focused on documentation, has been prepared by Parnas [AECB92] and is due to appear as an AECB "INFO" report shortly. This report is not itself used as a standard by the AECB, but is used in the evaluation of standards and procedures submitted by Canadian licensees. In particular, Ontario Hydro have developed a number of such standards and procedures (e.g., [OH90, OH93]) and further procedures are under development. Standards and procedures developed by Canadian licensees mandate the use of formal
methods and, apart from 00-55, are still some of the few to do so. A guideline, based on Parnas' and other ideas, is in preparation by the AECB.
IEEE P1228

The P1228 Software Safety Plans Working Group, under the Software Engineering Standards Subcommittee of the IEEE Computer Society, is preparing a standard for software safety plans [IEEE93]. This is an unapproved draft that is subject to change. The appendix of an early draft dated July 1991 includes headings of "Formal/Informal Proofs" and "Mathematical Specification Verification" under techniques being discussed for inclusion. However, a more recent version of the draft (Draft J of 11th February 1993), as cited above, omits all mention of formal methods, so it is quite likely that the standard will make no specific recommendations concerning formal methods. P1228 should be accepted as a standard in 1993.
5 Discussion

The role of standards for safety-related software has critical implications for the industry. For example, the MoD Def Stan 00-55 has had a great impact both in terms of research and development, and education in the United Kingdom [41, 42]. The current level of standardization activity is encouraging. It should be noted, however, that the proliferation of standards is not in itself sufficient to ensure the production of safer software. These standards need to be used, and their impact on software safety assessed and quantified. Moreover, research is needed in order to establish precisely what standards should contain and how various sector-specific standards interact when they are used simultaneously on a system. Work in this direction is being undertaken by the SMARTIE project [15].

It is important that standards should not be over-prescriptive, or at least that prescriptive sections are clearly separated and identified as such (perhaps as an appendix or even as a separate document). These parts of a standard are likely to date much more quickly than its goals, and thus should be monitored and updated more often. Ideally, dependability goals should be set and the onus should be on the software supplier to ensure that the methods used achieve the required level of confidence. Although such practice is well established for hardware, the equivalent techniques for software are still debatable and further research in this area is desirable and necessary.
If particular methods are recommended or mandated in a standard, it is possible for the supplier to assume that the method will produce the desired results and to blame the standards body if it does not. This reduces the responsibility and accountability of the supplier and may also result in a decrease in safety. Any recommendations in standards concerning the use of particular techniques should be regularly checked and updated in the light of recent advances and experience.

The trend for the future is more standards in this area, given the number that are at the discussion stage now. Many are likely to be industry-specific standards, often based on more generic standards like [IEC91, IEC92]. A significant number mention formal methods now, and more are likely to do so in the future. However, there is significant market resistance to the use of formal methods, even in the area of safety-critical systems, perhaps due to many preconceived ideas about the difficulty of using such methods, whether founded or not [20]. Most standards recommend formal methods rather than mandate them, with the notable exception of 00-55.

Standards have the dual effect of reflecting current best practice and normalizing procedures to the highest commonly acceptable denominator. As such, a significant number of software safety standards (at least half in this study) reflect the importance and relative maturity of formal methods. This trend looks set to continue, and standards will increasingly provide more motivation for the research, teaching and use of formal methods. Hopefully this will eventually lead to some improvement in the safety of people and resources that depend upon computer software.
Acknowledgements

I am particularly indebted to Victoria Stavridou, Royal Holloway College, University of London, who co-authored a technical report on which part of this paper is based [6]. Many other people have supplied information and standards used as input to this original survey; a list is included in [6]. Victor Carreño, Ben Di Vito, John Elliott, Paul Gardiner, Kevin Geary, Don Good, David Levan, David Parnas, Victoria Stavridou, Martyn Thomas, Lyne Tougas and Tony Zawilski, as well as the anonymous reviewers, supplied information and advice which specifically aided this paper. The UK Science and Engineering Research Council (SERC) provided financial support under the Information Engineering Directorate safemos project (IED3/1/1036).
Bibliography

Standards, draft standards and guidelines
[AECB91] Proposed Standard for Software for Computers in the Safety Systems of Nuclear Power Stations. Final Report for contract 2.117.1 for the Atomic Energy Control Board, Canada, March 1991 (By David L. Parnas, now at Communications Research Laboratory, Department of Electrical and Computer Engineering, McMaster University, Hamilton, Ontario L8S 4K1, Canada. Based on IEC Standard 880 [IEC86].)

[AECB92] Documentation of Computerised Safety Systems of Nuclear Power Stations. AECB Project No. 2.234.1, December 1992 (By David L. Parnas, Communications Research Laboratory, Department of Electrical and Computer Engineering, McMaster University, Hamilton, Ontario L8S 4K1, Canada. To be published by the Atomic Energy Control Board as an INFO report.)

[DoD84] Military Standard: System Safety Program Requirements. MIL-STD-882B, Department of Defense, Washington DC 20301, USA, 30 March 1984

[DTI93] Product Standards: Machinery. Department of Trade and Industry, 3rd floor, 151 Buckingham Palace Road, London SW1W 9SS, UK, April 1993 (Covers The Supply of Machinery (Safety) Regulations 1992 (S.I. 1992/3073). HMSO Publications Centre, PO Box 276, London SW8 5DT, UK.)

[ESA91] ESA Software Engineering Standards. European Space Agency, 8-10 rue Mario-Nikis, 75738 Paris Cedex, France, ESA PSS-05-0 Issue 2, February 1991

[EWICS88/89] Redmill, F. (Ed.): Dependability of Critical Computer Systems 1 & 2. European Workshop on Industrial Computer Systems Technical Committee 7 (EWICS TC7), Elsevier Applied Science, London, 1988/1989

[HSE87a] Programmable Electronic Systems in Safety Related Applications: 1. An Introductory Guide. Health and Safety Executive, HMSO Publications Centre, PO Box 276, London SW8 5DT, UK, 1987

[HSE87b] Programmable Electronic Systems in Safety Related Applications: 2. General Technical Guidelines. Health and Safety Executive, HMSO Publications Centre, PO Box 276, London SW8 5DT, UK, 1987

[IEC86] Software for Computers in the Safety Systems of Nuclear Power Stations. International Electrotechnical Commission, IEC 880, 1986

[IEC91] Software for Computers in the Application of Industrial Safety Related Systems. International Electrotechnical Commission, Technical Committee no. 65, Working Group 9 (WG9), IEC 65A (Secretariat) 122, Version 1.0, 1 August 1991

[IEC92] Functional Safety of Programmable Electronic Systems: Generic Aspects. International Electrotechnical Commission, Technical Committee no. 65, Working Group 10 (WG10), IEC 65A (Secretariat) 123, February 1992
[IEEE91] IEEE Standard Glossary of Software Engineering Terminology, in IEEE Software Engineering Standards Collection, Elsevier Applied Science, 1991

[IEEE93] Standard for Software Safety Plans. Draft J, P1228, Software Safety Plans Working Group, Software Engineering Standards Subcommittee, IEEE Computer Society, 345 East 47th Street, New York, NY 10017, USA, 11 February 1993 (Unapproved IEEE Standards Draft, subject to change)

[ISO87] JTC1 Statement of Policy on Formal Description Techniques. ISO/IEC JTC1 N145 and ISO/IEC JTC1/SC18 N13333, International Standards Organization, Geneva, Switzerland, 1987

[ISO89] ISO 8807: Information Processing Systems - Open Systems Interconnection - LOTOS - A Formal Description Technique Based on the Temporal Ordering of Observational Behaviour. First edition, International Organization for Standardization, Geneva, Switzerland, 15 February 1989

[ISO91] VDM Specification Language Proto-Standard. Draft, ISO/IEC JTC1/SC22/WG19 IN9, 1991 (Available from D. Andrews, Dept. of Computing Studies, University of Leicester, University Road, Leicester LE1 7RH, UK.)

[ISO92] Draft Prolog Standard. ISO/JTC1/SC22/WG17 N92, International Standards Organization, Geneva, Switzerland, 1992

[ISO93] Nicholls, J.E., and Brien, S.M. (Eds.): Z Base Standard. ISO/IEC JTC1/SC22, 1993, ZIP Project Technical Report ZIP/PRG/92/121, SRC Document 132, Version 1.0, 30 November 1992 (Available from the Secretary, ZIP Project, Oxford University Computing Laboratory, PRG, 11 Keble Road, Oxford OX1 3QD, UK.)

[MoD91a] The Procurement of Safety Critical Software in Defence Equipment (Part 1: Requirements, Part 2: Guidance). Interim Defence Standard 00-55, Issue 1, Ministry of Defence, Directorate of Standardization, Kentigern House, 65 Brown Street, Glasgow G2 8EX, UK, 5 April 1991

[MoD91b] Hazard Analysis and Safety Classification of the Computer and Programmable Electronic System Elements of Defence Equipment. Interim Defence Standard 00-56, Issue 1, Ministry of Defence, Directorate of Standardization, Kentigern House, 65 Brown Street, Glasgow G2 8EX, UK, 5 April 1991

[MoD93] Safety Management Requirements for Defence Systems Containing Programmable Electronics (Part 1: Requirements, Part 2: General Application Guidance). Draft, Interim Defence Standard 00-56, Issue 2, Ministry of Defence, Directorate of Standardization, Kentigern House, 65 Brown Street, Glasgow G2 8EX, UK, 18 February 1993

[OH90] Standard for Software Engineering of Safety Critical Software. Doc. 982 C-H 69002-0001, rev. 0, Ontario Hydro, 700 University Avenue, Toronto, Ontario M5G 1X6, Canada, 21 December 1990 (Issued for one year trial use.)

[OH93] Procedure for the Systematic Design Verification of Safety Critical Software. Doc. 982 C-H 69002-0006, rev. 0, Ontario Hydro, 700 University Avenue, Toronto, Ontario M5G 1X6, Canada, 5 April 1993 (Issued for one year trial use.)

[RIA91] Safety Related Software for Railway Signalling. BRB/LU Ltd/RIA technical specification no. 23, Consultative Document, Railway Industry Association, 6 Buckingham Gate, London SW1E 6JP, UK, 1991

[RTCA92] Software Considerations in Airborne Systems and Equipment Certification. DO-178B, RTCA Inc., Suite 1020, 1140 Connecticut Avenue NW, Washington DC 20036, USA, 1 December 1992 (Prepared by RTCA SC-167 and EUROCAE WG-12. Also available as document ED-12B from EUROCAE, 17 Rue Hamelin, Paris Cedex 75783, France.)

[RTCA90] Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance System (TCAS) Airborne Equipment - Consolidated Edition. DO-185, RTCA, One McPherson Square, 1425 K Street N.W., Suite 500, Washington DC 20005, USA, 6 September 1990

[SafeIT90a] Bloomfield, R.E. (Ed.): SafeIT1 - The Safety of Programmable Electronic Systems. Safety-Related Working Group (SRS-WG), Interdepartmental Committee on Software Engineering (ICSE), Department of Trade and Industry, ITD7a, Room 840, Kingsgate House, 66-74 Victoria Street, London SW1E 6SW, UK, June 1990

[SafeIT90b] Bloomfield, R.E., and Brazendale, J. (Eds.): SafeIT2 - A Framework for Safety Standards. Safety-Related Working Group (SRS-WG), Interdepartmental Committee on Software Engineering (ICSE), Department of Trade and Industry, ITD7a, Room 840, Kingsgate House, 66-74 Victoria Street, London SW1E 6SW, UK, June 1990
References
[1] Austin, S., and Parkin, G.I.: Formal methods: A survey, National Physical Laboratory, Queens Road, Teddington, Middlesex TW11 0LW, UK, March 1993

[2] Bhansali, P.V.: Survey of software safety standards shows diversity, Computer, January 1993, pp. 88-89

[3] Blyth, D., Boldyreff, C., Ruggles, C., and Tetteh-Lartey, N.: The case for formal methods in standards, IEEE Software, September 1990, pp. 65-67

[4] Bowen, J.P. (Ed.): Towards verified systems (Elsevier, Real-Time Safety Critical Systems series, 1993) In preparation

[5] Bowen, J.P., and Nicholls, J.E. (Eds.): Z User Workshop, London 1992 (Springer-Verlag, Workshops in Computing, 1993)
[6] Bowen, J.P., and Stavridou, V.: Safety-critical systems, formal methods and standards, Programming Research Group Technical Report PRG-TR-5-92, Oxford University Computing Laboratory, 1992 (Revised versions to appear in [4] and the BCS/IEE Software Engineering Journal)

[7] Bowen, J.P., and Stavridou, V.: Formal methods and software safety, in [16], 1992, pp. 93-98

[8] Bowen, J.P., and Stavridou, V.: The industrial take-up of formal methods in safety-critical and other areas: A perspective, in [47], 1993, pp. 183-195

[9] Brown, M.J.D.: Rationale for the development of the UK defence standards for safety-critical computer software, Proc. 5th Annual Conference on Computer Assurance (COMPASS '90), Washington DC, USA, June 1990

[10] Buxton, J.N., and Malcolm, R.: Software technology transfer, Software Engineering Journal, January 1991, 6, (1), pp. 17-23

[11] Coopers & Lybrand: Safety related computer controlled systems market study, A review for the Department of Trade and Industry by Coopers & Lybrand in association with SRD-AEA Technology and Benchmark Research (HMSO, London, 1992)

[12] Craigen, D., Gerhart, S., and Ralston, T.J.: An international survey of industrial applications of formal methods, Atomic Energy Control Board of Canada, U.S. National Institute of Standards and Technology, and U.S. Naval Research Laboratories, 1993

[13] Craigen, D., Gerhart, S., and Ralston, T.J.: Formal methods reality check: Industrial usage, in [47], 1993, pp. 250-267

[14] Deransart, P.: Prolog standardisation: the usefulness of a formal specification, on usenet comp.lang.prolog, comp.software-eng and comp.specification electronic newsgroups, October 1992

[15] Devine, C., Fenton, N., and Page, S.: Deficiencies in existing software engineering standards as exposed by SMARTIE, in [37], 1993

[16] Frey, H.H. (Ed.): Safety of computer control systems 1992 (SAFECOMP'92), Computer Systems in Safety-critical Applications, Proc. IFAC Symposium, Zurich, Switzerland, 28-30 October 1992 (Pergamon Press, 1992)

[17] Gardiner, P.H.B.: Defence Standard 00-56: Development and support project formal specification of procedural elements, Draft, FSEL/SRC/RDS-PRC/1 Version 3.2, Formal Systems (Europe) Ltd, 17 May 1993 (Available from John Elliott, Safety and Reliability Consultants Ltd (SRC), 89 High Street, Alton, Hampshire GU34 1LG, UK.)

[18] Garlan, D.: Formal methods for software engineers: Tradeoffs in curriculum design, in Sledge, C. (Ed.): Software Engineering Education, SEI Conference, San Diego, California, USA, October 1992 (Springer-Verlag, Lecture Notes in Computer Science 640, 1992) pp. 131-142
[19] Guiho, G., and Hennebert, C.: SACEM software validation, Proc. 12th International Conference on Software Engineering (ICSE) (IEEE Computer Society Press, March 1990) pp. 186-191

[20] Hall, J.A.: Seven myths of formal methods, IEEE Software, September 1990, pp. 11-19

[21] Hall, P.A.V., and Resnick, M.: Standards, in McDermid, J.A. (Ed.): Software Engineer's Reference Book, part II, chapter 50 (Butterworth-Heinemann, 1991)

[22] IEE: Safety-related systems: A professional brief for the engineer, The Institution of Electrical Engineers, Savoy Place, London WC2R 0BL, UK, January 1992

[23] IEEE: Software Safety Standards session, Proc. 7th Annual Conference on Computer Assurance (COMPASS '92), Gaithersburg, Maryland, USA, 15-18 June 1992

[24] Jacky, J.: Safety-critical computing: Hazards, practices, standards and regulation, in Dunlop, C., and Kling, R. (Eds.): Computerization and controversy, chapter 5 (Academic Press, 1991) pp. 612-631

[25] Laprie, J.C. (Ed.): Dependability: Basic concepts and terminology (Springer-Verlag, 1991)

[26] Larsen, P.G., and Plat, N.: Standards for non-executable specification languages, The Computer Journal, December 1992, 35, (6), pp. 567-573

[27] Leveson, N.G.: Software safety in embedded computer systems, Communications of the ACM, February 1991, 34, (2), pp. 34-46

[28] MacKenzie, D.: Computers, formal proof, and the law courts, Notices of the American Mathematical Society, November 1992, 39, (9), pp. 1066-1069

[29] Natsume, T., and Hasegawa, Y.: A view on computer systems and their safety in Japan, in [16], 1992, pp. 45-49

[30] Neesham, C.: Safe conduct, Computing, 12 November 1992, pp. 18-20

[31] Neumann, P.G. (Ed.): Subsection on certification of professionals, ACM SIGSOFT Software Engineering Notes, January 1991, 16, (1), pp. 24-32

[32] Neumann, P.G.: Illustrative risks to the public in the use of computer systems and related technology, ACM SIGSOFT Software Engineering Notes, January 1992, 17, (1), pp. 23-32

[33] Parkin, G.I., and Wichmann, B.: Conformity clause for VDM-SL, in [47], 1993, pp. 501-520

[34] Potocki de Montalk, J.P.: Computer software in civil aircraft, Microprocessors and Microsystems, in Cullyer, W.J. (Ed.): Special Issue on Safety Critical Systems, January 1993, 17, (1)

[35] Rata, J.M.: Standardization efforts worldwide, in Bennett, P.: Safety aspects of computer control, chapter 4 (Butterworth-Heinemann, 1993) pp. 56-75

[36] Redmill, F. (Ed.): Safety Systems: The Safety-Critical Systems Club Newsletter, Centre for Software Reliability, University of Newcastle upon Tyne, Newcastle NE1 7RU, UK, May 1993, 2, (3)
[37] Redmill, F., and Anderson, T.: Safety-critical systems: current issues, techniques and standards (Chapman and Hall, 1993)

[38] Smith, D.J., and Wood, K.B.: Engineering quality software: a review of current practices, standards and guidelines including new methods and development tools, 2nd edition (Elsevier Applied Science, 1989)

[39] Sommerville, I.: Software engineering, 4th edition (Addison Wesley, 1992)

[40] Thomas, M.C.: The industrial use of formal methods, Microprocessors and Microsystems, in Cullyer, W.J. (Ed.): Special Issue on Safety Critical Systems, January 1993, 17, (1)

[41] Tierney, M.: The evolution of Def Stan 00-55 and 00-56: An intensification of the "formal methods debate" in the UK, Proc. Workshop on Policy Issues in Systems and Software Development, Science Policy Research Unit, Brighton, UK, July 1991

[42] Tierney, M.: Some implications of Def Stan 00-55 on the software engineering labour process in safety critical developments, Research Centre for Social Sciences, Edinburgh University, 1991

[43] Tripp, L.L.: Future of software engineering standards, ACM SIGSOFT Software Engineering Notes, January 1992, 17, (1), pp. 58-61

[44] Wallace, D.R., Kuhn, D.R., and Cherniavsky, J.C.: Report of the NIST workshop on standards for the assurance of high integrity software, NIST Special Publication 500-190, Computer Systems Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899, USA, August 1991 (Available from the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402, USA.)

[45] Wallace, D.R., Kuhn, D.R., and Ippolito, L.M.: An analysis of selected software safety standards, IEEE AES Magazine, August 1992, pp. 3-14 (Also presented in [23].)

[46] Wichmann, B.A. (Ed.): Software in safety-related systems (Wiley, 1992) Also published by the BCS (Includes a copy of the 00-55 standard [MoD91a].)

[47] Woodcock, J.C.P., and Larsen, P.G. (Eds.): FME '93: Industrial-strength formal methods, First International Symposium of Formal Methods Europe, Odense, Denmark, 19-23 April 1993 (Springer-Verlag, Lecture Notes in Computer Science 670, 1993)

[48] Wright, C.L., and Zawilski, A.J.: Existing and emerging standards for software safety, The MITRE Corporation, Center for Advanced Aviation System Development, 7525 Colshire Drive, McLean, Virginia 22102-3481, USA, MP-91W00028, June 1991 (Presented at the IEEE Fourth Software Engineering Standards Application Workshop, San Diego, California, USA, 20-24 May 1991.)

[49] Youll, D.P.: Study of the training and education needed in support of Def Stan 00-55, Cranfield IT Institute Ltd, UK, September 1988 (Also included as an appendix of the April 1989 00-55 draft.)