The operative software safety certification standard for commercial avionics (and increasingly for military avionics as well) is DO-178B[1]. This standard categorizes software failure conditions by their potential effect on safety, ranging from the lowest (Level E – no effect) to the highest (Level A – catastrophic, with probable loss of the aircraft), with a corresponding set of objectives that must be met at each level. Different systems on an aircraft are certified at different levels.
In its nearly 20-year history, DO-178B has proven to be a successful safety standard: Although there have been some “close calls,” there has never been an aircraft fatality attributed to a failure of DO-178B-certified code. Still, technology has changed since the early 1990s, and work is in progress on a revision known as DO-178C. The new standard will correct a few errors and ambiguities, clarify the requirements for “qualifying” a tool that automates a process that would otherwise need to be performed manually (for example, determining a program’s maximum stack usage), and add new content in the form of special technology supplements: 1) model-based development; 2) object-oriented and related technology; and 3) formal methods.
Neither DO-178B nor DO-178C calls out specific objectives for security. This does not imply that a DO-178B/C-certified system is necessarily insecure. However, as advanced networking and communications facilities become available, such safety-critical systems can be accessed by external systems and equipment. In many cases this is a benefit, but it brings obvious security (and therefore safety) risks: a breach, whether intentional or inadvertent, can lead to loss of life. Security therefore cannot be added to the system as an afterthought.
Security objectives should be addressed by adding security-related requirements to the software’s overall safety requirements – and by using appropriate deployment platforms and development technologies.
Evaluation Assurance Levels
In the security area, the principal certification specification is the Common Criteria standard[2], which catalogs and defines two sets of requirements:
• Security Functional Requirements (SFRs): Services that perform security-related tasks
• Security Assurance Requirements (SARs): Evaluation steps that check whether a product’s security objectives are met
Similar to the safety objectives of DO-178B, the SARs are grouped into Evaluation Assurance Levels (EALs), ranging from 1 (lowest) to 7 (highest) (see Figure 1). Achieving higher EALs takes additional effort (for example, formal methods are required at EAL 7), but is justified when the value of protected assets is high.
Different application domains have different kinds of security requirements. To bring some consistency to evaluating prospective products, the Common Criteria defines the concept of a Protection Profile. A Protection Profile identifies – in an implementation-independent manner – the assets that need to be protected, the SFRs that need to be implemented, the SARs (that is, the EAL) that must be met, and the operational environment/attacker sophistication that is assumed. A product with a higher EAL is not necessarily more secure than a product with a lower EAL; it depends on their respective Protection Profiles.
Unlike DO-178B’s, the Common Criteria’s track record has been mixed. One issue is whether the effort required to take a product through a successful evaluation (with respect to a given Protection Profile) produces a commensurate benefit in security assurance. Nevertheless, the Common Criteria’s catalog of SFRs and SARs can be extremely useful when considering the security objectives of a safety-critical system. Based on a component’s functions and safety level, the developer can determine which SFRs and SARs are relevant and add them as DO-178B requirements. Through such an analysis and selection of SFRs and SARs, the developer can achieve appropriate levels of assurance for both safety and security.
Deployment platforms for safety and security
As mentioned, modern systems with safety and security requirements share a common attribute: the need to have different components, at possibly different safety levels and/or different security levels, operate jointly (and possibly communicate with each other) without interference. That is, assurance is needed that no component can jeopardize the safety or security of the overall system or other components. Architectures have been designed to meet these requirements through partitioning: ARINC-653[3] for safety and MILS[4] for security.
ARINC-653 is an operating system architecture that supports multiple applications running at potentially different safety levels. A small real-time kernel controls all time and space usage for an arbitrary number of applications, each running in its own isolated partition and invoking services from the APplication EXecutive (APEX), for example, to multithread within a partition. Each application is allocated a fixed amount of processor time per execution cycle and a fixed amount (and location) of memory. This model guarantees that the operation of any one application cannot adversely affect another, and it simplifies testing.
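To make the time-partitioning model concrete, the following minimal Ada sketch pictures a cyclic partition schedule as a fixed table. The types and names here are hypothetical illustrations, not the ARINC-653 APEX interface:

   --  Illustrative sketch only: hypothetical types, not the APEX API.
   --  Each partition owns fixed windows of processor time within a
   --  repeating "major frame" enforced by the kernel.
   procedure Schedule_Sketch is
      type Partition_Id is range 1 .. 3;

      type Time_Window is record
         Partition   : Partition_Id;
         Duration_Ms : Positive;  --  window length in milliseconds
      end record;

      --  A 50 ms major frame: partition 1 runs for 20 ms, then
      --  partition 2 for 10 ms, then partition 3 for 20 ms, after
      --  which the frame repeats. No partition can consume time
      --  allotted to another.
      Major_Frame : constant array (1 .. 3) of Time_Window :=
        ((Partition => 1, Duration_Ms => 20),
         (Partition => 2, Duration_Ms => 10),
         (Partition => 3, Duration_Ms => 20));
   begin
      null;  --  a real kernel would dispatch partitions per this table
   end Schedule_Sketch;

Memory partitioning works analogously: each partition’s code and data occupy a fixed, kernel-enforced region, so a fault in one partition cannot corrupt another’s state.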
The Multiple Independent Levels of Security (MILS) operating system architecture is similar to ARINC-653, but for applications running at potentially different security levels (Figure 2). An additional consideration for MILS is managing interpartition communication in a secure, policy-based manner, for example, to ensure that an unclassified partition cannot read classified data.
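A minimal sketch of such a policy, again with hypothetical names (each MILS separation kernel enforces its policy in its own way), is the classic “no read up” rule:

   --  Hypothetical sketch of a "no read up" flow-policy check; the
   --  actual enforcement mechanism is specific to each MILS kernel.
   package Flow_Policy is
      type Security_Level is
        (Unclassified, Confidential, Secret, Top_Secret);

      --  A partition may read an object only at or below its own
      --  level, so an Unclassified partition can never read
      --  Secret data.
      function Read_Allowed (Subject, Object : Security_Level)
        return Boolean is (Subject >= Object);
   end Flow_Policy;

The kernel would consult such a check before delivering any interpartition message, so the policy is enforced centrally rather than trusted to each application.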
Development technologies for safety and security
Achieving high levels of safety or security means finding bugs and potential vulnerabilities early in the software life cycle. Typical static analysis tools enter the process too late: by the time they run, the error is already in the code. A preferable approach is to use programming languages and associated tools that prevent the errors from being introduced in the first place. Languages such as Ada, with strong typing and extensive compile-time checking, can help. For example, in a language such as C, adding an integer to a pointer can easily result in a “buffer overrun” error, where data is written to a location outside the bounds of the intended target data structure. Ada prevents this error: the compiler rejects pointer arithmetic outright, and array accesses are bounds-checked.
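As a small illustration, the following self-contained Ada program attempts an out-of-range array store whose index is computed at run time; instead of silently overwriting adjacent memory, the check raises Constraint_Error:

   with Ada.Command_Line;
   --  Minimal illustration of Ada's run-time bounds checking.
   procedure Overrun_Demo is
      type Buffer is array (1 .. 10) of Integer;
      B : Buffer := (others => 0);
      --  Index computed at run time (from the argument count, so the
      --  compiler cannot fold it away); 11 when run with no arguments.
      I : constant Integer := Ada.Command_Line.Argument_Count + 11;
   begin
      B (I) := 42;  --  raises Constraint_Error: 11 is outside 1 .. 10
   end Overrun_Demo;

Had the out-of-range index been a compile-time constant, the compiler would have flagged it before the program ever ran.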
For applications that need to achieve high levels of safety and/or security, assurance backed by formal mathematical reasoning might be necessary. In such situations it is appropriate, and also cost-effective – as evidenced by practical experience in projects such as the NSA-sponsored Tokeneer[5] – to use a language that supports proofs of correctness of developer-specified program properties. The SPARK language[6] takes this approach. SPARK is a deterministic and verifiable Ada subset augmented with a notation for expressing a program’s “contracts” – for example, the preconditions, postconditions, and invariants of a subprogram, or a program unit’s data dependencies and information flows. Tools that complement the compiler apply proof techniques to verify the specified contracts. The resulting analysis can demonstrate, for example, that the program is free of runtime errors.
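A small sketch of such a contract, using SPARK’s comment-based annotation syntax (the notation details vary across SPARK versions):

   --  The --# annotations are SPARK contracts, statically verified
   --  by the SPARK proof tools rather than checked at run time.
   procedure Increment (X : in out Integer);
   --# derives X from X;       --  X's final value depends only on X
   --# pre  X < Integer'Last;  --  caller must rule out overflow
   --# post X = X~ + 1;        --  X~ denotes X's value on entry

Given this specification, the tools can prove that the body of Increment establishes the postcondition and that no call can overflow, discharging the corresponding runtime checks by proof rather than by testing.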
One of the issues that has complicated the job of maintaining and enhancing a safety- or security-critical system is the “big freeze”: Changing the source code, or even upgrading the compiler, has required a large effort in regenerating the certification artifacts to ensure that the change has not introduced any regressions. A promising approach that addresses this issue is the Open-DO initiative[7], which is using methods from the Agile and Lean communities, coupled with qualifiable open-source tools.
Melding safety and security
Designing a safe system requires accounting for security; this is difficult, especially since safety standards such as DO-178B do not explicitly address security issues. However, a combination of sound processes and appropriate technologies can make it manageable.
References:
[1] RTCA SC-167/EUROCAE WG-12. RTCA/DO-178B – Software Considerations in Airborne Systems and Equipment Certification, December 1992.
[2] Common Criteria Portal, www.commoncriteriaportal.org
[3] Avionics Application Software Standard Interface, ARINC Specification 653, January 1997.
[4] W. S. Harrison, N. Hanebutte, P. Oman, and J. Alves-Foss. “The MILS Architecture for a Secure Global Information Grid,” CrossTalk 18 (10): 20–24, October 2005. www.stsc.hill.af.mil/CrossTalk/2005/10/0510Harrisonetal.html
[5] Praxis High Integrity Systems. Tokeneer ID Station EAL5 Demonstrator: Summary Report, August 2008. www.adacore.com/multimedia/tokeneer/Tokeneer_Report.pdf
[6] M. Croxford and R. Chapman. “Correctness by Construction: A Manifesto for High-Integrity Software,” CrossTalk, December 2005. www.stsc.hill.af.mil/crossTalk/2005/12/0512CroxfordChapman.html
[7] Open-DO: Towards a cooperative and open framework for the development of certifiable software. www.open-do.org
Greg Gicca is Director of Safety and Security Product Marketing at AdaCore. He can be contacted at [email protected].
Ben Brosgol is a member of the senior technical staff at AdaCore. Contact him at [email protected].
AdaCore
212-620-7300