Safety Critical Systems
Class Notes

Example Safety Critical Systems

Related Definitions

  1. safety critical system - "The software used in the design of physical streams and structures, whose failure can have massive, life-threatening impact." (Bowyer, 151)
  2. hazard - "An intrinsic property or condition that has the potential to cause an accident." (Bowyer, 151)
  3. reliability - "The probability that a system, subsystem, or component will perform its intended function for a specified period of time under normal condition." (Bowyer, 151)
  4. risk - "The combination of the probability of an abnormal event or failure and the consequences of that event or failure to a system's operator, users, or its environment." (Bowyer, 151)
  5. accident - "An undesired consequence inflicting injury to person or damage to property in the process" (Bowyer, 151)
  6. uncertainty - "Measure of knowledge limits in a technical area" (Bowyer, 151)

Safety critical systems are designed to prevent accidents and establish reliability by minimizing risk and eliminating hazards and uncertainties.

Ethics and Safety Critical Systems

Making the right ethical decision is essential when building safety critical systems. The effectiveness and safety of the system are judged by the programmer, and an incorrect assessment can result in the loss of human life. Decisions about quality and safety shouldn't be set aside because of time constraints or profit. The following passages from professional codes of ethics relate to safety critical systems:

Item One, IEEE Code of Ethics – Designers should "accept responsibility in making engineering decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment." This makes the designer responsible for the final product.

Item Three, IEEE Code of Ethics – Designers should "be honest and realistic in stating claims or estimates based on available data." Money or time shouldn’t be a substitute for human life.

Item Six, IEEE Code of Ethics – Designers should "maintain and improve our technical competence and to undertake technological tasks for others only if qualified by training or experience or after full disclosure of pertinent limitations." This passage warns against modifying a safety critical system you are not qualified to work on.

Professional Responsibilities, Item One, ACM Code of Ethics – Designers should "strive to achieve the highest quality in both the process and product of professional work."

Professional Responsibilities, Item 5.1, ACM Code of Ethics – Designers should "give comprehensive and thorough evaluations of computer systems and their impacts, including analysis of possible risks."

Planning, Development and Maintenance of Safety Critical Systems

Planning, development, and maintenance are the three phases in the life of a safety critical system. Two tools are commonly used during planning and development to increase the system's reliability. A fault tree analysis starts with a system-level failure and works backward to deduce its causes; logical operators (AND and OR gates) define how lower-level events combine to produce the failure. Another tool is Failure Mode and Effect Analysis (FMEA), which identifies all the ways equipment can fail and prioritizes the effects of each failure so the design can be improved.
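The fault tree idea can be sketched in a few lines of code. This is only an illustration: the pump and power events, their failure probabilities, and the gate structure are all made up for the example, and the events are assumed to be independent.

```python
# Fault tree sketch for a hypothetical cooling system.
# The top-level failure is deduced from basic events combined
# with logical AND/OR gates, as in a fault tree analysis.

def and_gate(*probs):
    # Output fails only if ALL inputs fail (independent events).
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    # Output fails if ANY input fails (independent events).
    p = 1.0
    for x in probs:
        p *= (1.0 - x)
    return 1.0 - p

# Made-up basic event failure probabilities.
p_pump_a = 0.01   # primary pump fails
p_pump_b = 0.01   # backup pump fails
p_power  = 0.001  # shared power supply fails

# "Loss of coolant flow" requires both redundant pumps to fail,
# or the common power supply to fail.
p_both_pumps = and_gate(p_pump_a, p_pump_b)
p_top = or_gate(p_both_pumps, p_power)
print(f"P(loss of coolant flow) = {p_top:.6f}")
```

Working backward from the top event like this shows why redundancy (the AND gate) shrinks risk multiplicatively, while a shared single point of failure (the power supply under the OR gate) dominates the total.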

The developers of the program are responsible for testing to reduce accidents and eliminate uncertainties in the program. Data entry and storage are checked for errors and information loss. Finally, the maintenance of safety critical systems corrects problems that arise over the life of the system, creating a more stable, certain environment.
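One common way to check stored data for information loss is to keep a checksum alongside each record and verify it on read. The sketch below is hypothetical (the record contents and function names are invented for the example), but the technique itself is standard.

```python
# Storage integrity check: a SHA-256 checksum is saved with each
# record and re-verified on read, so silent corruption or data
# loss is detected instead of propagating through the system.
import hashlib

def store(record: bytes) -> tuple[bytes, str]:
    # Return the record together with its checksum for storage.
    return record, hashlib.sha256(record).hexdigest()

def load(record: bytes, checksum: str) -> bytes:
    # Recompute the checksum and refuse to return corrupted data.
    if hashlib.sha256(record).hexdigest() != checksum:
        raise ValueError("stored record failed integrity check")
    return record

data, digest = store(b"valve_open=1")
assert load(data, digest) == b"valve_open=1"  # intact record passes
```

A corrupted record (for instance, a flipped byte) makes `load` raise instead of silently handing bad data to the rest of the system, which is the behavior a safety critical system needs.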

Justin Latimer
Group #7, CS3604, Fall 1997