VME: Can you please remind our readers what LDRA does, what its focus is?
STCLAIR: LDRA provides tools and services to help software developers achieve zero-defect software in safety-, security-, and business-critical software systems. This is achieved by applying best-practice guidelines that begin with requirements traceability and extend to static and dynamic analysis and unit testing. Traditionally, LDRA customers have included both commercial and military avionics companies, as well as the nuclear power, automotive, and transportation industries. However, other sectors such as factory automation, consumer electronics, medical devices, and security are now recognizing that software verification and testing help reduce programming errors.
VME: Static analysis seems to be a trend these days, so why does LDRA offer dynamic analysis? Is it really necessary?
STCLAIR: Dynamic analysis includes code coverage at both the source code and object code levels. It determines whether the code is actually doing what it was specified to do, and it allows a relationship to be established between the code that is executed and the high- or low-level requirement that defines that portion of functionality. This is called “structural coverage” in DO-178B.
Dynamic analysis can also be used at very early stages of development to verify the consistency and integrity of design requirements. This activity, called Design Verification Testing (DVT), has been established over the years as the most effective way of rooting out runtime errors and defects that are revealed at the implementation level but linked to faulty design practices. DVT requires a test harness that automatically simulates the design environment; creating that harness or test framework manually makes DVT impractical.
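To make that concrete, here is a minimal sketch of structural coverage and a harness-style test; the function, limit, and requirement ID are hypothetical and used only for illustration.

    #include <cassert>

    // Hypothetical low-level requirement LLR-42: report an overspeed
    // condition when measured speed exceeds the configured limit.
    bool isOverspeed(int measuredSpeed, int limit)
    {
        if (measuredSpeed > limit)   // decision: both outcomes must be executed
            return true;             // branch taken when the condition holds
        return false;                // branch taken in the nominal case
    }

    int main()
    {
        // Minimal harness: one test case per branch yields 100 percent
        // statement and branch (decision) coverage of this function, and
        // each case traces back to the hypothetical LLR-42.
        assert(isOverspeed(260, 250) == true);   // exercises the true branch
        assert(isOverspeed(200, 250) == false);  // exercises the false branch
        return 0;
    }

In DO-178B terms, the link from each test case back to the requirement, plus evidence that every statement and branch was exercised, is what constitutes structural coverage.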
VME: How has static analysis revolutionized software development?
STCLAIR: Static analysis is all about defect prevention. Such prevention techniques have proven to be the most cost-effective and quality-enhancing, which is why they are known as best practices.
Defect prevention offers tremendous cost advantages because it catches errors at an early stage of development. The cost of defects increases exponentially through the software life cycle. Our customers report that over 70 percent of defects are requirements related. So by integrating requirements traceability into the overall software development, verification, analysis, and testing process, customers realize enormous savings in errors, cost, and resources.
In contrast, defect detection, the approach of some test tool vendors, relies on later-stage analysis, is usually performed on a complete system, and focuses on finding runtime errors. This kind of defect detection is inherently simpler than defect prevention (and real testing) because it flags defects before products are shipped without requiring conformance to best practices. However, relying on runtime error checking by itself is a risky approach to assuring software quality and the integrity of your system. Trend analyses typically indicate that problems found at this late stage are part of a larger pattern of defects attributable to bad design practices.
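A simple, hypothetical example of the distinction (all names are illustrative): the bounds defect below is visible to static analysis the moment the code is written, while runtime error checking only catches it if some execution happens to supply the out-of-range value.

    static int channelReadings[4] = {0, 0, 0, 0};

    int readChannelUnchecked(int channel)
    {
        // Defect: no range check, so channel == 4 reads past the end of the
        // array. A static analyzer can report a possible bounds violation
        // here at analysis time; runtime error detection only sees it when a
        // test or a fielded system actually passes the out-of-range value.
        return channelReadings[channel];
    }

    int readChannelChecked(int channel)
    {
        // Prevention: conformance to a bounds-checking guideline removes the
        // defect before any test is ever run.
        if (channel < 0 || channel >= 4)
            return -1;
        return channelReadings[channel];
    }

    int main()
    {
        return readChannelChecked(4);   // returns -1 instead of reading past the array
    }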
VME: You’re a Green Hills Software partner on MULTI. Will you be working with them on any future programs – or providing them any tools in conjunction with their new EAL6+ certified INTEGRITY-178 RTOS?
STCLAIR: LDRA is RTOS agnostic. We partner with several RTOS vendors and have integrated offerings with Wind River’s VxWorks, Workbench, and Tornado; DDC-I’s Deos; and Green Hills’ INTEGRITY and MULTI. So we actually have many common customers with Green Hills. Typically, certification bodies require a third-party objective tool chain to verify safety-, mission-, and business-critical systems. We happily play that role in a number of accounts.
That said, LDRA also recognizes that most compiler and RTOS companies also offer their own tools for both static and dynamic analyses. This is certainly the case with Green Hills. But to accommodate the rapidly expanding market for traceability and requirements-based testing, LDRA will be offering its patent-pending requirements verification middleware products for integration with their respective IDEs in the near future. This capability would certainly complement Green Hills’ INTEGRITY-178 product.
VME: In light of today’s sinking economy, are companies tending more toward legacy code reuse?
STCLAIR: Without question, legacy code reuse has become more common, especially among commercial software vendors. With project and capital budgets being sliced like salami these days, software that has already been tested and proven is frequently incorporated if it conforms to current requirements. At the same time, the longer-term goals of design for reuse and component-based architecture are viewed with increased importance because of the economic advantages they offer.
VME: Should the U.S. DoD reuse legacy code in its apps? How long can legacy code keep up with today’s ever-more-sophisticated critical apps?
STCLAIR: If source code components are properly modularized with such things as well-defined interfaces and data models, they can be successfully reused. Unfortunately, most legacy systems are “procedural” and were not designed for reuse. That said, as long as the legacy software conforms to the current requirements and has gone through thorough testing, why not reuse it? Software doesn’t deteriorate or disintegrate like hardware.
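As a hypothetical sketch of what “well-defined interfaces” means in practice, a procedural legacy routine can be placed behind a small, documented contract so the rest of the system never depends on its internals; all names below are illustrative.

    // Hypothetical interface (think of it as sensor_if.h): the only contract
    // the rest of the system depends on.
    struct SensorReading {
        double value;   // reading in engineering units
        bool   valid;   // false when the underlying channel has faulted
    };
    SensorReading readPressureSensor(int channel);

    // Stand-in for an assumed procedural legacy routine (illustrative only);
    // negative return values are taken to signal a fault.
    extern "C" int legacy_read_raw(int channel) { return (channel >= 0) ? 4200 : -1; }

    // Adapter: puts the legacy routine behind the well-defined interface, so
    // it can be reused today and replaced later without touching callers.
    SensorReading readPressureSensor(int channel)
    {
        const int raw = legacy_read_raw(channel);
        SensorReading r;
        r.valid = (raw >= 0);
        r.value = r.valid ? raw * 0.01 : 0.0;   // illustrative unit conversion
        return r;
    }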
VME: Which programming language is most user-friendly – and best able to keep up with today’s evolving life-critical systems technologies?
STCLAIR: User-friendliness of programming languages is rarely a factor in life-critical systems. This is because such programs are usually well documented and fully traceable to the software requirements. Also, such systems are inherently complex and warrant development by highly skilled programmers. Consequently, today’s software suppliers must be committed to supporting C/C++, Ada 83/95/2005, Java, assembly languages, and other technologies used in the safety-, mission-, and security-critical space.
More valuable than user-friendliness are the self-documenting characteristics of object-oriented languages such as Java, Ada 2005, and C++. These languages and their class hierarchies allow the architect to organize the system around recognizable concepts directly traceable to requirements.
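A small, hypothetical fragment shows the idea: the class names document the concepts, and each class can be annotated with the requirement it traces to (the requirement IDs are illustrative, not from any real project).

    // Traces to HLR-12 (hypothetical): the system shall monitor cabin pressure.
    class PressureMonitor {
    public:
        virtual ~PressureMonitor() = default;
        virtual double currentPressureKpa() const = 0;
    };

    // Traces to LLR-12.1 (hypothetical): readings come from the primary sensor.
    class PrimaryPressureMonitor : public PressureMonitor {
    public:
        double currentPressureKpa() const override { return lastReadingKpa_; }
    private:
        double lastReadingKpa_ = 101.3;   // placeholder value for illustration
    };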
VME: What effect will the upcoming release of the DO-178C avionics safety standard have on the established software validation/verification tools and methods?
STCLAIR: The DO-178C standard will add guidance for applicants and certification authorities to facilitate the certification of advanced software technologies such as object-oriented software (C++, Java, Ada 2005 and others), application modeling tools, and formal methods. Each of these technologies will have its own supplement that provides technology-specific perspectives. By the way, DO-178C will be released by 2010 at the earliest.
VME: What changes do you predict in the safety-critical software arena within the next 5 to 10 years?
STCLAIR: The changes in the safety-critical arena will be directly driven by DO-178C. Since this new standard includes technologies such as object-oriented software, I think it’s realistic to expect that application modeling tools and formal methods will begin to replace the traditional C and Ada (83 and 95) implementations.
Another major trend will be a movement toward Application Lifecycle Management (ALM) tools, which support all phases of requirements engineering, development, verification, and production of software systems. ALM tools have existed in the host application space for the past five years or so, but none have been directly usable for safety-critical embedded applications.
LDRA Technology 650-583-8880 www.ldra.com