Software Engineering
Design and Implementation with Ada
Design of software systems is an art learned by
practice. Tools can support the capture of design ideas, and many tools
can check the consistency of separately designed elements, but the translation
of textual requirements to software design is still a creative process.
General concepts such as object definition, system layering, software reuse,
and designed-in maintenance and test are easy to talk about but hard to
realize. Working with people who have successfully performed this process
before saves time and provides apprenticeship for inexperienced designers.
Effective use of the Ada language for interface
design and implementation is also developed through experience. Techniques
for partitioning the system into packages, defining layered typing, using
tasking even in real-time systems, reusing and/or defining reusable components,
and using target-dependent features can greatly affect project success.
Time-critical systems are often affected by the compiler's implementation
of Ada constructs. Understanding which issues are critical and how to evaluate
their correct usage within a software system can save future redesign and
implementation effort.
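As a brief sketch of what layered typing and package partitioning can look like (all names here are hypothetical, not from any particular project), a low-level package can export a private type whose representation is hidden, and higher layers then depend only on the abstract interface:

```ada
--  Hypothetical sketch of system layering with a private type.
--  The representation of Degrees is confined to the private part,
--  so it can evolve without changing the interface clients see.
package Units is
   type Degrees is private;
   Zero : constant Degrees;
   function To_Degrees (Value : Float) return Degrees;
   function To_Float   (Value : Degrees) return Float;
private
   type Degrees is new Float range 0.0 .. 360.0;
   Zero : constant Degrees := 0.0;
end Units;

--  A higher layer builds on the abstraction, not the representation.
with Units;
package Navigation is
   function Current_Heading return Units.Degrees;
end Navigation;
```

This is an interface-level sketch; the corresponding bodies would supply the conversions and heading computation.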
Software Analysis and Verification
Existing software can be analyzed to determine its
adherence to correctness or quality metrics. Visual inspection or code
reviews have been shown to identify more problems with less time investment
than explicit testing or debugging. Although explicit testing is essential,
design and code reviews can greatly improve system correctness and consistency,
especially in early design and development phases. Employing external reviewers
can have the added benefit of avoiding preconceptions developed early in
the design and the potential internal conflicts associated with criticism.
My analysis is also supported by tools that traverse the software and collect
locations that exhibit certain characteristics. Typically, tools of this
kind reduce their analysis to a single quality-factor number that offers
little guidance on where the code could be improved or how. The Ada Analyzer
performs traversal and analysis, displays an overview of the results in
a hypertable format, and allows direct traversal into the code for final
determination of the problem and possible improvements. The tools analyze
code in the following ways:
-
General analysis: The tools collect and condense
all areas of the code that exhibit a particular trait so that an overall
picture of the code can be seen. Tools in this group locate critical Ada
constructs such as tasking and generics, count lines and Ada constructs,
and analyze typing, object declaration, and subprogram usage.
-
Design analysis: The tools support structural
and design analysis with partitioning, interunit dependency, and call-tree
analysis.
-
Performance: The tools illuminate areas of
code that have potentially high impact on performance. For example, it
is possible to identify all locations where inlining would reduce code
size and execution time. The locations of large objects and dynamically
allocated objects, as well as the presence or absence of parameter-passing
overhead, can be identified.
-
Code correctness: The tools identify areas
of the code that have a high likelihood for errors. This can be especially
helpful when code is written quickly in a prototype phase of the project.
Examples include set/used analysis, static constraint checking, use of
Ada others clauses, and other dangerous Ada constructs.
-
Compatibility and portability: The tools
identify implementation-dependent features such as representation specifications,
address clauses, attributes, and pragmas and analyze them for correct
and/or portable usage.
-
Programming standards and style: The tools
verify that local coding standards are not violated and that the code maintains
consistent usage patterns. Examples of violations include inconsistent
use of naming, redundant typing, use of Ada use clauses, and other
constructs held to negatively affect readability, maintainability, and
testability.
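As one concrete illustration of the code-correctness category above (a hypothetical fragment, with all names invented), an Ada others choice in a case statement is flagged as dangerous because it silently absorbs enumeration literals added later, whereas enumerating every alternative forces a compile-time error when the type grows:

```ada
procedure Dispatch_Demo is
   type Command is (Start, Stop, Reset);

   procedure Dispatch (C : Command) is
   begin
      case C is
         when Start => null;   --  start-up handling would go here
         when Stop  => null;   --  shutdown handling would go here
         --  Flagged by correctness analysis: if a new literal
         --  (say, Pause) is later added to Command, this "others"
         --  quietly handles it instead of forcing the programmer
         --  to decide what Pause should do at compile time.
         when others => null;
      end case;
   end Dispatch;
begin
   Dispatch (Reset);
end Dispatch_Demo;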
Custom analysis also can be developed rapidly based
on an extensive library of utilities and tool templates.
Prototyping
Prototyping can be used effectively for rapid development
of the key elements of a software system early in the process of concept
development or design. Prototypes allow projects to evaluate preliminary
design decisions, analyze incomplete requirement specifications, and rapidly
arrive at proof-of-concept demonstrations. Using existing utilities and
careful selection of what to implement can minimize the effort required
to arrive at an effective prototype. Consultants with extensive Ada experience
can assist in the rapid development of software with the help of proven
implementation experience and existing libraries of utilities. Early prototyping
also can give the development staff more experience with Ada, the development
tools, and implementation techniques.
Development Practice
Consulting in this domain is often called process
engineering because it deals with the processes associated with software
development. Although generally accepted development methodologies are
available, actual practice always differs for each project because of customer
requirements, available development tools, and project staff. One key aspect
of process definition is its integration with the host and the tools it
provides. Consulting support for the development processes of configuration
management, testing, documentation, verification and quality assurance,
and automated code generation is discussed in the following sections.
Configuration Management
Rational Apex offers subsystems as its primary mechanism
for software partitioning and support for the configuration-management
processes of history collection, internal release, definition of testing
configurations, and specification of customer delivery. Consulting can
help determine how best to map the software design structure into subsystems.
Methods for updating design interfaces, releasing, and building system
configurations can be defined to match project requirements. Support for
complex strategies of parallel development, multimachine implementation,
and target download and test also can be defined.
Existing software that has already been developed
in subsystems can be analyzed to improve partitioning, reduce recompilation
and downloading requirements, and check the consistency of each subsystem
and readiness for release. Analysis also can be supported by tools that
look into the configuration-management databases of the Rational Environment,
extract information, and display it in a convenient hypertable format.
Testing
Test Automation: Tools supporting management
of test programs, test input and output, and regression testing have been
available for a long time and can be used with good results. Even unit-test
generation for analysis of test coverage is possible in simple cases. More
difficult is the formal testing of integrated software at higher levels
in the hierarchy. This software is often decision-making or state-machine-oriented
software with complex inputs and outputs. Instrumentation techniques (with
code elimination for the final target version) can be used to support cause-and-effect
testing of complex code. Input values can be inserted into the code, and
the events resulting from execution can be collected to ensure that all
required events occurred in the correct sequence. It is important, however,
to design this approach into the software early in the development.
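A minimal sketch of such instrumentation (all names hypothetical) is a package that records events as they occur so a test driver can later check that all required events happened in the correct order; setting the constant to False lets the compiler eliminate the recording code from the final target build:

```ada
--  Hypothetical instrumentation sketch for cause-and-effect testing.
package Instrumentation is
   Enabled : constant Boolean := True;  --  False in target builds
   type Event is (Msg_Received, Mode_Changed, Msg_Sent);
   procedure Record_Event (E : Event);
   function Event_Count return Natural;
   function Nth_Event (N : Positive) return Event;
end Instrumentation;

package body Instrumentation is
   Log  : array (1 .. 1_000) of Event;
   Last : Natural := 0;

   procedure Record_Event (E : Event) is
   begin
      if Enabled then          --  statically False on the target,
         Last := Last + 1;     --  so this code can be eliminated
         Log (Last) := E;
      end if;
   end Record_Event;

   function Event_Count return Natural is
   begin
      return Last;
   end Event_Count;

   function Nth_Event (N : Positive) return Event is
   begin
      return Log (N);
   end Nth_Event;
end Instrumentation;
```

Application code would call Record_Event at significant decision points, and a regression driver would compare the recorded sequence against the expected one.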
Host-Based Target Emulation: Developers
on most projects assume from the outset that all testing must be performed
on the target. They often come to this conclusion for one or more of the
following reasons:
-
The code has direct hardware dependencies that the
host hardware does not have.
-
The target has a specific run-time system and/or
interfaces for input and output that are present only on the target.
-
The target compiler has implementation-dependent
characteristics that the host compiler does not have.
-
Real-time software is time-dependent and cannot
be adequately tested on the host because the host executes at a different
rate.
Although final verification and, of course, timing-related
testing must be performed on the target, a great deal of algorithmic and
basic integration testing can be performed on the host where the environment
is more comfortable for the user. This is not accomplished without effort,
but the resulting time savings can be dramatic in most cases. Host testing
allows all users to test in parallel with good debugging support and to
repair code immediately within the configuration-management environment.
The basic approach is the following:
-
Hardware dependencies can be isolated and implemented
in a different way on the host machine. In almost all cases, compiler implementation
dependencies can be emulated on the Rational host. In some cases, emulation
requires the use of interfaces with alternative Rational Environment bodies,
but this is often desirable from a design point of view.
-
Target-specific interfaces can be uploaded and compiled
on the Rational Environment. Thus, clients can compile against these interfaces
and bodies can be written to emulate their execution on the Rational Environment.
Most emulations require a small subset of functionality to support effective
testing on the host. Remote procedure call through the network can be used
to connect to interfaces executing on other hosts.
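A small sketch of the alternative-body technique (the interface and its contents are hypothetical): the package spec is identical on host and target, and only the body differs, so clients compile unchanged against either:

```ada
--  Hypothetical target interface: one spec, two bodies.
package Bus_IO is
   type Message is array (1 .. 32) of Integer;
   procedure Send    (M : Message);
   procedure Receive (M : out Message);
end Bus_IO;

--  Host body: emulates the bus with an in-memory buffer so
--  algorithmic and integration testing can proceed without
--  target hardware.  The target body would instead talk to
--  the real bus controller.
package body Bus_IO is
   Pending : Message := (others => 0);

   procedure Send (M : Message) is
   begin
      Pending := M;
   end Send;

   procedure Receive (M : out Message) is
   begin
      M := Pending;
   end Receive;
end Bus_IO;
```

Most emulation bodies need only this small subset of real behavior to support effective host testing.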
Verification and Quality Assurance
The software-analysis tools described above can
be used directly or customized to ensure that software is written in the
required style and does not use Ada constructs that are prohibited by the
project. Simple requirements-tracing capability also can be supported through
annotations and traversal collection.
Reporting software problems and requirements
changes and tracking their progress until completion also can be accomplished
through tool support on the Rational Environment.
Ada Software Auditing
A software audit is a review by a team of
expert Ada developers of the source code of an Ada software project. Peer
review has been proven to be the most effective method of locating software
defects and improving the overall quality of software. Auditing by external
experts delivers the effectiveness of peer review with the additional
benefit of independent auditors who are not biased by prior knowledge
of the software. External auditors are also unencumbered by project history,
political issues, or personal bias.
Software Auditing is best performed before
key release points in the software development lifecycle. An audit can
be performed after detailed design is complete to check the integrity of
the program structure, type model, data model, error-handling model, and
multi-threading design if present. Validation of the design at this point
can potentially save more costly restructuring efforts in later lifecycle
phases. More mature software can be analyzed for code defects, internal
consistency, efficiency, and adherence to project standards. Well-established
code can be audited for portability, reusability, and future maintainability.
The Ada Analyzer, a static-analysis tool
with the ability to analyze code in many dimensions, assists auditors
in locating Ada constructs that affect the analysis objectives of the
audit. The Ada Analyzer can be used initially to help the auditors understand
the structure and content of the code. It can then locate all instances
of Ada constructs matching a given analysis requirement, finding the "needles
in the haystack" in voluminous amounts of code at computer speed.
Audit deliverables consist of reports that
describe specific quality issues, and a list of all instances in the code
that violate or impact that issue. Reports are delivered in two formats.
An online form, with hypertext links to actual Ada constructs, can be used
interactively to further investigate reported constructs "in context" to
decide whether an update is required. Hard copy reports with line number
locations of all reported constructs can be distributed at review meetings,
or sent via e-mail to individual developers responsible for sections of
the code. It is expected that all reports will be scrutinized carefully
before a decision to modify the software is made; some changes may be
deemed too risky to incorporate into mature software.
Automated Generation of Application Software
Several projects have achieved dramatic improvements
in coding efficiency and reduced testing costs through use of automated
generation of software. Typically this involves the collection of all system
data items that share a common pattern of implementation. All bus messages,
for example, are generally handled in the same way. Specific attributes
may alter scaling or data format, but most of the message-processing code
is very similar. This code can be generated automatically, saving time
and reducing the errors introduced by manual coding. Consulting can
be used to identify areas where automatic generation is possible and to
employ existing software utilities to reduce project development time.
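As a hypothetical sketch of the pattern (names and parameters invented for illustration): when every bus message follows the same processing skeleton, that skeleton can be written once as a generic, and a code generator then emits one instantiation per message in the system data table, with only the per-message attributes varying:

```ada
--  Hypothetical common skeleton for bus-message processing.
--  Only the raw type, scaling, and handler differ per message.
generic
   type Raw_Value is range <>;
   Scale : Float;
   with procedure Handle (Value : Float);
package Message_Processor is
   procedure Process (Raw : Raw_Value);
end Message_Processor;

package body Message_Processor is
   procedure Process (Raw : Raw_Value) is
   begin
      Handle (Float (Raw) * Scale);  --  same pattern for every message
   end Process;
end Message_Processor;
```

Each message then becomes one generated instantiation, for example: package Altitude is new Message_Processor (Raw_Value => Altitude_Counts, Scale => 0.5, Handle => Update_Altitude); where the type, scale factor, and handler are drawn from the system data table.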
Summary
Especially in early project phases, consulting can
be an effective way of optimizing staff experience and use of available
development tools. My extensive experience with Ada, the Rational Environment,
and development practice can positively affect project success within a
short period of time.