The Ethics of Safety-Critical Systems

Jonathan P. Bowen

The University of Reading, Department of Computer Science
Whiteknights, PO Box 225, Reading, Berks RG6 6AY, UK
Email: [email protected]
URL: http://www.cs.reading.ac.uk/people/jpb/

May 1997

Abstract

Safety-critical systems require the utmost care in their specification and design to avoid errors in their implementation, using state-of-the-art techniques in a responsible manner. To do otherwise is at best unprofessional and at worst can lead to disastrous consequences. An inappropriate approach could lead to loss of life, and will almost certainly result in financial penalties in the long run, whether because of loss of business or because of the imposition of fines. Legislation and standards impose external pressures, but education and ethical considerations should help provide more self-imposed guidelines for all those involved in the production of safety-critical systems. This paper considers some of the issues involved, with pointers to material providing greater depth in particular areas, especially with respect to the use of formal methods.

1 Prologue (πρόλογος)

Plato is dear to me, but dearer still is truth.

– Aristotle (384–322 B.C.)

The use of computers in safety-critical systems is increasing rapidly. Computers are probably used in at least an order of magnitude more safety-critical applications than a decade ago. The decision about the use of software is often taken on economic rather than safety grounds. For example, the employment of computers in fly-by-wire aircraft can make significant fuel cost savings, since the computer can be programmed to fly the optimal route with little pilot intervention if all goes well. The safety implications are more difficult to assess. Some in industry claim that increased use of software increases the safety of the system [31]. However, since it is so difficult to measure software reliability, this claim is difficult to justify in practice. In fact, the number of jet-airliner crashes continues to rise alarmingly [16].

In the aircraft industry a figure of 10⁻⁹ accidents per hour is often quoted. Software testing can result in failure rates of around 10⁻⁴ [12]. Software diversity (N-version programming) could add an extra order of magnitude to this. Further reliability could be added with software verification and validation, but measurements on the effectiveness of such approaches are hard to come by. In any case, relying solely on software seems to be a foolhardy approach given these figures. Despite this, some calculations assume that software is totally reliable, since software does not wear out in the same way that hardware does. Instead, its errors occur in a much more random and unpredictable manner. Software is a digital rather than analogue artifact. As such, techniques such as interpolation and extrapolation, used by many hardware engineers in calculations, do not apply. Changing a single bit in a computer program can have a very unpredictable effect on its operation, or may be completely benign in some circumstances. Given the above problems with the inclusion of software in safety-critical systems, the decision to depend on software to ensure the safety of a system should be taken with great caution. Hardware backup is normally recommended where possible.
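To make the arithmetic concrete, the following sketch (in Python, using the illustrative values quoted above; the one-order-of-magnitude credit for diversity is the text's figure, not a measured constant, and true independence between versions rarely holds) shows how far testing plus diversity falls short of the aviation target:

    # Rough arithmetic for the reliability figures quoted above; values are
    # illustrative, not the output of any real reliability assessment.
    target = 1e-9   # accidents per hour, the often-quoted aviation figure
    tested = 1e-4   # failure rate plausibly demonstrable by testing [12]

    # N-version programming is credited above with roughly one extra order
    # of magnitude; the naive independent product tested * tested is
    # deliberately not used, since common-mode specification errors defeat
    # the independence assumption.
    diverse = tested / 10

    print(f"shortfall versus target: {diverse / target:.0e}")  # ~1e+04

Even on these optimistic assumptions, the software alone remains some four orders of magnitude short of the target, which is one reason hardware backup is recommended.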

2 Seven Deadly Sins

How about `Cloudcuckooland'?

– Aristophanes (c.444–c.380 B.C.)

Any scientifically based activity requires a level of responsibility, and it is important that those involved understand the moral questions involved [33]. Science has developed technologies such as nuclear fission and fusion which have the potential for great harm as well as good. When developing safety-critical systems, it is even more important than in normal software development to consider the ethical questions involved. In this section, a number of undesirable practices are presented which should be avoided in order to help ensure success in a safety-critical software project [22]. While avoiding these practices cannot ensure favourable results, it may help to avoid failure, which is all too easy an outcome in software-based projects, especially those attempting to use a more formal approach.

1. Epideictic (ἐπιδεικτικός) – used for effect

Particular techniques and notations should not be applied merely as a means of demonstrating a company's ability. Similarly, they should not be used to satisfy management whim, or merely as a result of peer pressure [13]. For example, many firms have adopted object-oriented approaches and programming languages, perhaps without great thought in some cases [20]. Before a particular approach is applied, it should be determined whether it is really necessary. Potential reasons include increasing confidence in the system, satisfying a particular standard required by procurers, aiding in tackling complexity, etc. There are, however, occasions where some techniques such as formal methods are not only desirable, but positively required. A number of standards bodies have not only used formal specification languages in making their own standards unambiguous, but have strongly recommended, and in the future may mandate, the use of formal methods in certain classes of applications [4, 5, 12].

2. Hyperbole (ὑπερβολή) – exaggeration

Sometimes exaggerated claims for the benefits of certain methods are made, but no particular method should ever be seen as a panacea. For example, a formal method may be just one of a number of techniques that, when applied judiciously, may result in a system of high integrity. Complementary methods should not be dismissed lightly [6, 7]. Despite the mathematical basis of formal methods, they are no guarantee of correctness; they are applied by humans, with all the potential for error that this brings. Tools may be available to support a particular method, but again their value may be overstated by the vendor. Most methods depend on the quality and suitability of the personnel involved in applying the techniques, since system development is essentially a human activity.

3. Pistic (πιστικός) – too trusting

Some place too much faith in techniques and tools, as opposed to the reasoning power of humans. A tool may return an answer (possibly `true' or `false'), but this depends both on the quality of the input to the tool and on the correctness of the tool itself. Similarly, no proof should be taken as definitive. Hand proofs are notorious not only for introducing errors at each proof step, but also for making assumptions which may be unfounded. From experience of machine-checking proofs, most theorems are correct, but most hand proofs are wrong [32]. Even the use of a proof checker does not guarantee the correctness of a proof. While it does aid in highlighting unsubstantiated reasoning and avoidable errors, proof should never be considered an alternative to testing, although it may reduce the amount of testing required.
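The point can be illustrated with a tiny machine-checked proof (a sketch in Lean 4, not drawn from the paper): the checker guarantees that every inference step is justified, but it cannot tell us whether the theorem stated is the property we actually needed, nor whether the checker itself is correct.

    -- A trivially machine-checked theorem (Lean 4). The kernel accepts
    -- this proof because each step is sound; acceptance says nothing about
    -- whether commutativity of addition was the right property to prove,
    -- and it does not remove the need for testing the final system.
    theorem add_comm' (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b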

4. Oligarchy (ὀλιγαρχία) – rule by a few

Many communities associated with a particular software engineering approach tend to be rather introspective, with a few experts preaching their particular wares. However, there has been considerable investment in existing software development techniques and it would be foolhardy to replace these en masse with any new method. Instead it is desirable to integrate new methods into the design process in a cost-effective manner. Technology transfer from the oligarchy of experts to the potential users is an extremely important and time-consuming process, critical to the success of any new technology [11]. In the early application of a new method, it is often helpful to have an expert on call or, even better, working directly with the team involved. Ready access to education and training are important aspects of the transfer process. University education allows long-term inculcation of ideas and attitudes. Training courses allow engineers to remain up to date with new developments.

5. Ephemeral (ἐφήμερος) – transitory

Some techniques are short-lived and others are deemed more useful in the longer term. It is easy to be bamboozled into using a new technique which may not be helpful for the future. It is especially important for safety-critical systems to ensure that any techniques used are well established and produce reliable results that improve the overall development process. Designing with reuse in mind is important for long-term cost benefits, but can have serious safety implications if not done with care. An extremely expensive example of the potential problems of reuse is provided by the Ariane 5 disaster, where the inclusion of existing software in a new product resulted in catastrophe [27]. Here the old code was not sufficiently well tested in the new environment, since it had always worked in the original environment without problems.

6. Epexegesis (ἐπεξήγησις) – additional words

How many things I can do without!

– Socrates (469–399 B.C.)

Documentation is a very important aid to understanding, but should not include unnecessary detail. Too much documentation, obscuring the real information content, is as bad as too little documentation. Abstraction can be used to avoid verbosity, but arriving at the right level of abstraction is something that only comes with experience. If formal specifications are included, these should always be accompanied by informal explanation to clarify the meaning and place it in context. Formal specifications can be made intelligible and acceptable to non-technical personnel [19], but only provided that they are augmented with sufficient amounts of informal explanation, diagrams, etc.
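As a small example of this advice (a sketch only: the schema, its names and the fuzz-style LaTeX markup for Z are invented here for illustration, not taken from the paper), a formal fragment and its informal gloss might be paired as follows:

    % A Z free type and schema, assuming the fuzz LaTeX style for Z.
    \begin{zed}
      Status ::= on | off
    \end{zed}

    \begin{schema}{AlarmSystem}
      limit, pressure : \nat \\
      alarm : Status
    \where
      pressure > limit \implies alarm = on
    \end{schema}

Informally: the state records a fixed safe limit and the current pressure reading, and the invariant requires the alarm to be on whenever the reading exceeds the limit. The informal sentence carries the intuition; the schema makes the requirement precise enough to analyse and to prove properties about.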

7. Maiandros¹ (Μαίανδρος) – meandering

Taking too long in the development of software is a widespread phenomenon [18]. Estimating the time for software projects is difficult since, even given a specification, the complexity of the software to implement it is very hard to assess in any reliable and scientific way. In a typical project, natural language and diagrams are used to specify the desired system. These delay the use of formalism until the programming stage. Many errors are only discovered once some sort of formalism is introduced, be it a mathematically based notation for specification or a programming language for implementation.

¹A winding river in Phrygia, Turkey.

A specification is typically an order of magnitude smaller than its matching implementation, and is thus much cheaper to produce. It also makes the design of the matching program easier, since many issues that would otherwise have been discovered later have already been resolved. Thus the early use of formalism for specification allows the discovery of many errors more quickly, and thus also at a reduced overall cost.

3 Ethical Considerations

But, my dearest Agathon, it is truth which you cannot contradict; you can without any difficulty contradict Socrates.

– Plato (c.429–347 B.C.)

The origins of western philosophy, ethics and scientific thinking can be traced back to the Greeks. Indeed, as quoted in [40], it has been said by John Burnet in his book on Early Greek Philosophy that:

It is an adequate description of science to say that it is "thinking about the world in the Greek way." That is why science has never existed except among peoples who came under the influence of Greece.

This may be overstating the case somewhat, but certainly ancient Greece provided the catalyst for much early advancement of knowledge. Socrates (469–399 B.C.), then Plato (c.429–347 B.C.) and Aristotle (384–322 B.C.) developed philosophical research in a manner far advanced for their time. This was made possible by the Greek way of life, which left time to ponder more abstract questions, at least for a proportion of the population. Their influence is still very much with us today, and their ideas are still under active discussion in philosophical research circles (e.g., see [14] discussing false beliefs). Aristotle was a great teacher and left many notes. Amongst these were those for his lectures on ethics, developed over the course of several years. Whilst these were never finished in the form of a complete book, they are of sufficient interest to be the subject of continued study, and to have been reworked into book form by much more recent researchers [2]. Aristotle's Nicomachean Ethics has had a profound and lasting effect on western philosophical thinking since its origins in the 4th century B.C. It has been both supported and rejected subsequently, but never ignored. It is inevitably flawed by its very nature, being working notes for lectures. It starts with the object of life (aiming at some good), and proceeds via morals, justice, intellectual virtues, pleasure and friendship to happiness, the ultimate goal, even if it is difficult to define. Here we concentrate on Book VI concerning Intellectual Virtues [2]. We use some of the headings to discuss ethics in the context of safety-critical systems.

What is the right principle that should regulate conduct?

Ethical considerations are a source of much debate, in general as well as with respect to computing applications [24]. Utilitarianism considers the consequences of action, and deontological theories, in response to problems with utilitarianism, consider the actions themselves. A third set of approaches are social contract theories, allowing individuals to act as rational free agents constrained by social rules. There is no general agreement about the best approach, but the maximization of happiness is normally an aim. Of course there is individual happiness and overall happiness, which can be in conflict. Risk of death and injury generally results in a decrease in happiness, but in the case of war it may be considered a necessary evil.

The development of a safety-critical system should aim to avoid the loss of human life or serious injury by reducing the risks involved to an acceptable level. This is an overriding factor. The system should always aim to be safe, even if this affects its availability adversely. It is the responsibility of the software engineering team and the management of the company involved to ensure that suitable mechanisms are in place and are used appropriately to achieve this goal for the lifetime of the product.

It is sensible to follow certain guidelines when developing any software-based artifact, and especially if there are safety concerns. Most professional bodies such as the ACM (Association for Computing Machinery, USA), IEEE (Institute of Electrical and Electronics Engineers, USA), BCS (British Computer Society, UK) and IEE (Institution of Electrical Engineers, UK) provide codes of conduct which it is wise for their members to follow. Textbooks dealing with ethical issues in computing (e.g., see [3, 36]) normally cover at least some of these. Most such codes coming from an engineering background place a very high priority on safety aspects. For example, the IEEE Code of Ethics (1990) commits its members to the following, listed first out of ten points of ethical and professional conduct:

1. to accept responsibility in making engineering decisions consistent with the safety, health and welfare of the public or the environment.

Codes of conduct from a computer science background tend to place slightly less emphasis on safety and consider other losses, such as loss of information, as well, although safety is still an important factor. The first two General Moral Imperatives for ACM members are:

1.1 Contribute to society and human well-being.
1.2 Avoid harm to others.

`Harm' is considered to mean injury or other negative consequences such as undesirable loss of information. The points are elaborated further. For example, 1.1 goes on to say `An essential aim of computing professions is to minimize negative consequences of computing systems, including threats to health and safety.' In the UK, there is guidance specifically for engineers and managers working on safety-related systems (see Figure 1, extracted from [21]). This code of practice is intended for use by professional engineering institutions such as the IEE, who may wish to augment the code further with sector-specific guidance on particular techniques. For example, in 1995 the IEE held a workshop on the role of formal methods in the development of safety-related systems, as reported in [37]. The results of this workshop and other similar discussion forums could form the basis for more detailed guidance.

Figure 1: Code of Practice for Engineers and Managers (produced by the UK Hazards Forum [21], reproduced with permission)

Engineers and managers working on safety-related systems should:

• at all times take all reasonable care to ensure that their work and the consequences of their work cause no unacceptable risk to safety

• not make claims for their work which are untrue, or misleading, or which are not supported by a line of reasoning which is recognised in the particular field of application

• accept personal responsibility for all work done by them or under their supervision or direction

• take all reasonable steps to maintain and develop their competence by attention to new developments in science and engineering relevant to their field of activity; and encourage others working under their supervision to do the same

• declare their limitations if they do not believe themselves to be competent to undertake certain tasks, and declare such limitations should they become apparent after a task has begun

• take all reasonable steps to make their own managers, and those to whom they have a duty of care, aware of risks which they identify; and make anyone overruling or neglecting their professional advice formally aware of the consequent risks

• take all reasonable steps to ensure that those working under their supervision or direction are competent; that they are made aware of their own responsibilities; and that they accept personal responsibility for work delegated to them

Anyone responsible for human resource allocation should:

• take all reasonable care to ensure that allocated staff will be competent for the tasks to which they will be assigned

• ensure that human resources are adequate to undertake the planned tasks

• ensure that sufficient resources are available to provide the level of diversity appropriate to the intended integrity.

Contemplative and calculative intellect

There are two types of thought process, the irrational and the rational. Obviously the professional engineer should aim to use rational lines of thought in the reasoning about and development of a safety-critical system. Unfortunately development can depend on the personal, possibly unfounded, preferences of the personnel involved, especially those in a management role. Within the rational part of the mind, Aristotle makes a further distinction. In science, theory and practice are two very important aspects, each of which can help to confirm and support the other. Without a firm theoretical basis, practical applications can easily flounder; and without practical applications, theoretical ideas can be worthless. To quote from Christopher Strachey, progenitor and leader of the Programming Research Group at Oxford:

It has long been my personal view that the separation of practical and theoretical work is artificial and injurious. Much of the practical work done in computing, both in software and hardware design, is unsound and clumsy because the people who do it do not have any clear understanding of the fundamental principles underlying their work. Most abstract mathematical and theoretical work is sterile because it has no point of contact with real computing.

It is highly desirable to ensure that the separation between theory and practice is minimized. This is certainly extremely important in the area of safety-critical systems. A good theoretical and mathematical underpinning is essential to ensure the maximum understanding of the system being designed. If this understanding is not achieved, serious problems can easily ensue [30].

Both kinds of intellect aim at truth, but the calculative faculty aims at truth as rightly desired by the exercise of choice

In practical applications, there is nearly always a choice of which techniques to use. The decision may be influenced by many considerations, including ethical issues. It is important to recognize the choices available, and assess their relative merits in as objective a manner as possible.

Five modes of thought or states of mind by which truth is reached

The mind of man – how far will it reach? Where will its daring impudence find limits?

– Euripides, Hippolytus (480–406 B.C.)

There are various ways to help ensure the correctness of a system. The background and expertise of the engineers involved is an extremely important contributing factor to the success of a project. Here five important aspects of the personnel involved are considered.

1. Science or scientific knowledge (ἐπιστήμη – episteme)

It is important to have the basic theoretical groundwork on which to base practical application. This is typically achieved through initial education, topped up with specialist training courses. Modern comprehensive software engineering textbooks and reference books normally include a section or chapter on safety-critical systems (e.g., see [29, 34]). Subsequent professional development is also crucial to keep up to date with changes in the state of the art. There are now a number of courses available specifically for safety-critical systems, especially at Master of Science level. For example, the University of York in the UK runs a modular MSc in Safety Critical Systems Engineering [39] and Diploma/Certificate short courses. The teaching is split into intensive one-week taught modules, which has been found to be much more convenient for industrial employees taking the course part-time. It is easier to take a week off work every so often than to be committed to a particular day of the week for a much longer period. The course content is agreed and monitored by a panel of experts, many from industry, to ensure the relevance of the material. Introductory modules comprise an Introduction to Safety, Mathematics for Safety and Introduction to Project Management. Mandatory modules consist of Requirements Engineering, Formal Methods, Safety and Hazard Analysis, Dependable Systems Analysis and Management of Safety Critical Projects. There are a number of other optional modules. The course includes a significant mathematical content, especially in the areas of probability and discrete mathematics for software development. The MSc normally lasts 1 year if done full-time, and 3 years for part-time students, including a project for half the time.

2. Art or technical skill (τέχνη – techne)

The life so short, the craft so long to learn.

– Hippocrates, Aphorisms (5th century B.C.)

Once the educational groundwork has been covered, much further expertise is gained in the actual application of the techniques learned. Applying mathematical skills can be even harder than acquiring them in the first place for some. One of the most difficult aspects is learning to model reality with sufficient accuracy [8]. Abstraction is a skill that only comes with practice. Unfortunately many existing programmers have to un-learn their natural tendency to become bogged down in implementation detail when considering the system as a whole. Instead, only the pertinent aspects should be included at any given level of abstraction.

3. Prudence or practical wisdom (φρόνησις – phronesis)

It is essential to take great care in safety-critical applications in order to avoid undesirable consequences. New techniques should be used with great caution and introduced gradually. It is best for new approaches to be used on non-safety-critical systems first to gain confidence, even if they look promising theoretically and are understood by the development team. Once sufficient experience has been gained, and the benefits have been assessed, then a technique may be recommended and adopted for use in safety-critical areas. For this reason, any prescriptive recommendations in standards should be updated regularly to ensure they remain as up to date as possible. Every engineer has their limitations, and it is important that individuals recognize their own limitations, as well as those of others, and keep within their bounds of expertise. However competent a person is, there will always be tasks which cannot reasonably be tackled. If the constraints on the development of a safety-critical application are impossible to meet, there should be mechanisms to allow this to be expressed at all levels in a company. A well-known example where such mechanisms were not effective is the Therac-25 radiation therapy machine, where several fatal radiation overdoses occurred due to an obscure malfunction of the equipment, the first of its type produced by the company involved to include software [26]. It should be remembered that software by itself is not unsafe; it is the combination with hardware and its environment to produce an entire system that raises safety issues [25].

4. Intelligence or intuition (νοῦς – nous)

I have hardly ever known a mathematician who is capable of reasoning.

– Plato (c.429–347 B.C.)

The highest quality of personnel should be employed in the development of safety-critical applications. The engineers involved should be capable of absorbing the required toolkit of knowledge and also of accurately apprehending the required operation of computer-based systems. Specialist techniques, including the use of mathematics, are important in producing systems of the highest integrity. Formal specification helps reduce errors at low cost, or even with a saving in overall cost. Formal verification, while expensive, can reduce errors even further, and may be deemed cost-effective if the price of failure is extremely high (e.g., involving loss of life). For the ultimate confidence, machine support should be used to mechanize the proof of correctness of the software implementation with respect to its formal specification. For example, the B tool has been used to support the B method [1] in a number of safety-critical applications, especially in the railway industry [15].

For mathematicians, proofs are a social process, taking years, decades or even centuries to be generally accepted. The automation of proof is regarded with suspicion by many traditional mathematicians [28]. A critical aspect of proofs is seen by many to be their surveyability, leading to understanding of why a proof is valid, and explaining why the theorem under consideration is true. An automated proof may have millions of (very simple) steps, and be extremely difficult to follow as a result. In addition, any `proofs' undertaken in the software engineering world must be performed several orders of magnitude faster than in traditional mathematics. However, they are far shallower than most proofs of interest to professional mathematicians. Hence intuition about why a program is correct with respect to its specification is difficult to obtain in all but the simplest of cases. Unfortunately this problem will not disappear easily and much research remains to be done. Collaborations such as the European ProCoS project on `Provably Correct Systems' [10] have attempted to explore formal techniques for relating a formal requirements specification, through various levels of design, programming, compilation and even into hardware, using transformations based on algebraic laws. Unifying theories for these different levels are needed to ensure consistency. However, it is often difficult to guarantee compatibility of the models used at the various levels without restricting flexibility unduly.

Quod erat demonstrandum. (Translated from the Greek.) Which was to be proved.

– Euclid (c.300 B.C.)

5. Wisdom (σοφία – sophia)

With experience comes wisdom. It is safest to stick to very modest ambitions if possible to ensure success. Unfortunately software tends to encourage complexity because of its flexibility. This should be resisted in safety-critical applications. The safety-critical aspects of the software should be disentangled from less critical parts if possible. Then more effort can be expended on ensuring the correctness of the safety-critical parts of the system.
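One way of acting on this advice (a hypothetical sketch; the names, numbers and clamping policy are invented for illustration) is to route every demand from the complex control logic through a small safety kernel that can be verified separately:

    # Hypothetical illustration of disentangling the safety-critical part.
    # The controller may be arbitrarily complex and only lightly assured;
    # the kernel is kept small enough to bear the full verification effort.

    MAX_SAFE_DEMAND = 10.0  # engineering limit, assumed for illustration

    def controller(sensor_value: float) -> float:
        # Complex, less-trusted control law (placeholder).
        return 1.7 * sensor_value - 3.0

    def safety_kernel(demand: float) -> float:
        # The only code allowed to reach the actuator: clamps any demand
        # into the safe envelope, whatever the controller computed.
        return min(max(demand, 0.0), MAX_SAFE_DEMAND)

    actuator_demand = safety_kernel(controller(8.2))

The design intent is that safety depends on the correctness of the few lines in the kernel, not on the whole controller.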

Resourcefulness or good deliberation (εὐβουλία – euboulia)

Impulses should be avoided in safety-critical system development and all choices should be carefully considered. This is why formalizing the system early is recommended, since this allows much more informed and reasoned choices to be made, and the consequences to be assessed.

Understanding (σύνεσις – sunesis)

A formal specification significantly aids in the comprehension of a system. The process of producing the formalization is as important as, or perhaps even more important than, the resulting specification itself in helping with this understanding [19]. The construction of proofs for theorems concerning the system can bring an even greater depth of knowledge about why the system works the way it does. Modelling, animation and rapid prototyping are also complementary aids to understanding [8].

Judgement (γνώμη – gnome) and consideration

Choosing appropriate techniques requires good judgement, which can be built up with experience, but also requires a decisive managerial approach. Choices also occur in the design process itself; otherwise it could all be done by computer. It may be necessary to refine a non-deterministic specification to a deterministic implementation by gradually reducing the design space until just one choice is left. It is important to ensure that the number of choices is not reduced to zero (i.e., no implementation), which can easily happen by accident using some formal approaches.
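The zero-choice trap can be made concrete (a Python sketch with invented names; refinement proper is a relation between formal descriptions, not between executable sets, so this is an analogy only):

    # A non-deterministic specification admits a set of acceptable outputs;
    # a deterministic implementation must pick one of them. Narrowing the
    # set to emptiness leaves nothing to implement.

    def spec(lo: int, hi: int) -> set[int]:
        # Any value between lo and hi (inclusive) is acceptable.
        return set(range(lo, hi + 1))

    def implementation(lo: int, hi: int) -> int:
        # One particular resolution of the specification's choice.
        return lo

    def refines(lo: int, hi: int) -> bool:
        acceptable = spec(lo, hi)
        # An empty acceptance set is unimplementable: zero choices remain.
        return bool(acceptable) and implementation(lo, hi) in acceptable

    assert refines(1, 5)        # a genuine refinement
    assert not refines(5, 1)    # over-constrained: no implementation exists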

General comments on the various states of mind

The programming profession has traditionally had many of the attributes of a craftsman, such as artistry, but has often lacked foundational knowledge, such as mathematical underpinning of the concepts involved [23]. This is not so important for programs which are not critical for the operation of a system, where errors will not produce disastrous results. However, if safety is of prime importance, such an approach is no longer practical, and in fact is downright unethical. Personnel involved in safety-critical application development should possess a balance of high-quality skills with regard to all the aspects outlined earlier in this section. Those who cannot demonstrate the right mix of expertise should be restricted to non-critical applications. Currently no special qualifications are required for personnel developing or maintaining software for safety-critical systems. This contrasts with the more established engineering professions, where regulations, certification and accreditation often apply more strictly. Standards can apply to both hardware and software, but are often more au fait with the problems and solutions associated with hardware rather than software. However, standards for software are being introduced across a wide range of industrial sectors with an interest in safety-critical applications, such as aviation, the nuclear industry, railways, etc. Some standards are prescriptive, recommending particular approaches such as formal methods [5, 12]. As well as standards, legislation is likely to play an increasing role in safety-critical systems, as and when accidental death and injury caused by computer-controlled systems do occur.

The faculty of cleverness (δεινότης – deinotes)

Finally, it is advisable not to be too `clever' in safety-critical applications. This is best left to researchers developing new ideas. It is more sensible to use the simplest and most obviously correct solution possible to reduce the chances of error.

4 Epilogue (ἐπίλογος)

The unexamined life is not worth living.

– Socrates (469–399 B.C.)

Some of the words in the section headings of this paper may be unfamiliar to some readers. The same problem occurs when new methods are used, especially ones employing mathematical symbols, such as formal methods. In the past, the necessary grounding for the use of sound techniques like formal methods has not been taught adequately on computer science courses, resulting in many software engineers, and even more managers, shying away from their use because they feel they are out of their depth. Fortunately many undergraduate courses (at least in Europe, and particularly in the UK) do now teach the requisite foundations of software engineering. Courses also normally give a grounding in ethical issues, often at the behest of professional bodies who may not accredit the course otherwise. However, the teaching of both ethical and formal aspects is often rather poorly integrated with the rest of the syllabus. More coordinated courses (e.g., see [17]), in which the mathematical foundations are subsequently applied to practical problems, will help to produce more professional software engineers for the future. Further general information on safety-critical systems maintained by the author of this paper, including links to many on-line publications, is available as part of the World Wide Web Virtual Library under the following URL (Uniform Resource Locator): http://www.comlab.ox.ac.uk/archive/safety.html

I've got it! (εὕρηκα – Eureka!)

– Archimedes (287–212 B.C.)

Acknowledgements

Many colleagues on the ESPRIT ProCoS project and Working Group on `Provably Correct Systems' have provided much inspiration over the years [10]. Thank you to Prof. Mike Hinchey (New Jersey Institute of Technology, USA, and University of Limerick, Ireland) for collaborating on the seven sins [22]! Finally, many thanks to the organizers of the ENCRESS'97 conference for choosing such an inspiring location.

References

[1] J.-R. Abrial. The B-Book: Assigning Programs to Meanings. Cambridge University Press, 1996.
[2] Aristotle. Ethics. Penguin Books, London, 1976. Translated by J. A. K. Thomson.
[3] S. Baase. A Gift of Fire: Social, Legal and Ethical Issues in Computing. Prentice Hall, 1997.
[4] J. P. Bowen. Formal methods in safety-critical standards. In Proc. 1993 Software Engineering Standards Symposium, pages 168–177. IEEE Computer Society Press, 1993.
[5] J. P. Bowen and M. G. Hinchey. Formal methods and safety-critical standards. IEEE Computer, 27(8):68–71, August 1994.
[6] J. P. Bowen and M. G. Hinchey. Seven more myths of formal methods. IEEE Software, 12(4):34–41, 1995.
[7] J. P. Bowen and M. G. Hinchey. Ten commandments of formal methods. IEEE Computer, 28(4):56–63, April 1995.
[8] J. P. Bowen and M. G. Hinchey. Formal models and the specification process. In Tucker, Jr. [38], chapter 107, pages 2302–2322.
[9] J. P. Bowen, M. G. Hinchey, and D. Till, editors. ZUM '97: The Z Formal Specification Notation, 10th International Conference of Z Users, Reading, UK, 3–4 April 1997, Proceedings, volume 1212 of Lecture Notes in Computer Science. Springer-Verlag, 1997.
[10] J. P. Bowen, C. A. R. Hoare, H. Langmaack, E.-R. Olderog, and A. P. Ravn. A ProCoS II project final report: ESPRIT Basic Research project 7071. Bulletin of the European Association for Theoretical Computer Science (EATCS), 59:76–99, June 1996.
[11] J. P. Bowen and V. Stavridou. The industrial take-up of formal methods in safety-critical and other areas: A perspective. In Woodcock and Larsen [41], pages 183–195.
[12] J. P. Bowen and V. Stavridou. Safety-critical systems, formal methods and standards. IEE/BCS Software Engineering Journal, 8(4):189–209, July 1993.
[13] J. P. Bowen and V. Stavridou. Formal methods: Epideictic or apodeictic? IEE/BCS Software Engineering Journal, 9(1):2, January 1994.
[14] P. Crivelli. The argument from knowing and not knowing in Plato's Theaetetus (187E5–188C8). Proceedings of the Aristotelian Society, XCVI:177–196, 1996.
[15] B. Dehbonei and F. Mejia. Formal development of safety-critical software systems in railway signalling. In M. G. Hinchey and J. P. Bowen, editors, Applications of Formal Methods, chapter 10, pages 227–252. Prentice Hall International Series in Computer Science, 1995.
[16] The Economist. Fasten your safety belts. The Economist, pages 69–71, 11–17 January 1997.
[17] D. Garlan. Making formal methods effective for professional software engineers. Information and Software Technology, 37(5–6):261–268, May–June 1995.
[18] W. W. Gibbs. Software's chronic crisis. Scientific American, 271(3):86–95, September 1994.
[19] J. A. Hall. Seven myths of formal methods. IEEE Software, 7(5):11–19, September 1990.
[20] J. A. Hall. Taking Z seriously. In Bowen et al. [9], pages 89–91.
[21] The Hazards Forum. Safety-related systems: Guidance for engineers. The Hazards Forum, 1 Great George Street, London SW1P 4AA, UK, 1995. URL: http://www.iee.org.uk/PAB/safe rel.htm.
[22] M. G. Hinchey and J. P. Bowen. Seven deadly sins. In Practical Application of Formal Methods, number 1995/109 in IEE Colloquium, Computing and Control Division, pages 4/1–3, Savoy Place, London WC2R 0BL, UK, 19 May 1995. The Institution of Electrical Engineers. Extended abstract.
[23] C. A. R. Hoare. Programming is an engineering profession. In P. J. L. Wallis, editor, Software Engineering, volume 11, number 3 of State of the Art Report, pages 77–84. Pergamon/Infotech, 1983. Also issued as Oxford University Computing Laboratory Technical Monograph PRG-27, May 1982.
[24] D. G. Johnson and K. Miller. Ethical issues for computer scientists and engineers. In Tucker, Jr. [38], chapter 2, pages 16–26.
[25] N. G. Leveson. Safeware: System Safety and Computers. Addison-Wesley, 1995.
[26] N. G. Leveson and C. S. Turner. An investigation of the Therac-25 accidents. IEEE Computer, 26(7):18–41, July 1993.
[27] J. L. Lyons. ARIANE 5: Flight 501 failure. European Space Agency, 19 July 1996. URL: http://www.esrin.esa.it/htdocs/tidc/Press/Press96/ariane5rep.html. Report by the Inquiry Board.
[28] D. MacKenzie. The automation of proof: A historical and sociological exploration. IEEE Annals of the History of Computing, 17(3):7–29, Fall 1995.
[29] J. A. McDermid, editor. Software Engineer's Reference Book. Butterworth-Heinemann, 1991.
[30] P. G. Neumann. Computer Related Risks. Addison-Wesley, 1995.
[31] J. P. Potocki de Montalk. Computer software in civil aircraft. Microprocessors and Microsystems, 17(1):17–23, January/February 1993.
[32] J. Rushby. Formal methods and their role in the certification of critical systems. Technical Report SRI-CSL-95-1, SRI International, Menlo Park, California, USA, March 1995. Shorter version of Formal Methods and the Certification of Critical Systems, Technical Report SRI-CSL-93-7, November 1993.
[33] C. Sagan. When scientists know sin. In The Demon-Haunted World: Science as a Candle in the Dark, chapter 16, pages 282–291. Ballantine Books, New York, 1997.
[34] I. Sommerville. Safety-critical software. In Software Engineering [35], chapter 21, pages 419–442. Part 4: Dependable Systems.
[35] I. Sommerville. Software Engineering. Addison-Wesley, 5th edition, 1996.
[36] R. A. Spinello. Case Studies in Information and Computer Ethics. Prentice Hall, 1997.
[37] M. Thomas. Formal methods and their role in developing safe systems. High Integrity Systems, 1(5):447–451, 1996. Workshop report 20/3/95, IEE, London, UK. URL: http://www.iee.org.uk/PAB/Safe rel/wrkshop1.htm.
[38] A. B. Tucker, Jr., editor. The Computer Science and Engineering Handbook. CRC Press, 1997.
[39] The University of York. Modular MSc in safety critical systems engineering: Prospectus 1997/98. Department of Computer Science, The University of York, Heslington, York YO1 5DD, UK, 1 March 1997. URL: http://www.cs.york.ac.uk/SC.html.
[40] C. van Doren. A History of Knowledge: Past, Present and Future. Ballantine Books, New York, 1992.
[41] J. C. P. Woodcock and P. G. Larsen, editors. FME '93: Industrial-Strength Formal Methods, volume 670 of Lecture Notes in Computer Science. Formal Methods Europe, Springer-Verlag, 1993.
