The document summarizes the software development process recommended by NASA's Software Engineering Laboratory (SEL) at Goddard Space Flight Center. The SEL process involves 8 phases: requirements definition, requirements analysis and specifications, preliminary design, detailed design, implementation, system testing, acceptance testing, and maintenance/operations. It emphasizes getting requirements right, with two separate requirements phases. It also stresses metrics tracking, reviews involving both managers and engineers, and tailoring the process based on project needs.
Tony DaSilva: How Rocket Scientists Do It
This page last changed on Aug 04, 2008 by tonyd.
"Okay, so you're a rocket scientist. That don't impress me much. So you got the brain but have
you got the touch." - Shania Twain
• Introduction
• Process Recommendation Highlights
• Process Context And Overview
• The People
• The Requirements Definition Phase
• The Requirements Analysis And Specifications Phase
• The Preliminary Design Phase
• The Detailed Design Phase
• The Implementation Phase
• The System Testing Phase
• The Acceptance Testing Phase
• The Maintenance and Operations Phase
• Keys To Success
• Dos And Don'ts
Introduction
Unlike the IEEE and SEI organizations, NASA's Goddard Space Flight Center builds large, mission-critical,
computationally intensive, real-time software. Thus, I listen more closely, and with a less skeptical
attitude, when they speak about SW development. Via the beauty of web push technology, I recently
stumbled upon this freely available NASA document:
"Recommended Approach to SW Development", NASA Software Engineering Laboratory (SEL), Goddard
Space Flight Center, 1992.
I would have uploaded it and attached it to this Wiki page, but it exceeds the 10MB attachment size
limit. However, in case you're interested in reading it yourself, here's the external link to it: SEL SW Dev
Approach. The purpose of this article is to review and summarize the SEL recommended approach to SW
development as described in their document.
Full Disclosure
The content of this article below the box that you're reading is colored by my personal interpretation
of the SEL's intent and meaning. Words in "quotes" were lifted directly from the document. I tried
to bend their advice to fit our ATS application domain, and I also tried to map their language into the
language that we (sort-of) speak in ATS. If you'd like, read the document yourself to form your own
opinions. We can then compare notes.
The date that the SEL document was published, 1992, shows that it isn't very current. Usage of the terms
"VAX", "FORTRAN", "Ada", "JCL", "DCL", and my favorite, "System Delivery Tape", may trigger a light-
hearted chuckle. However, aged material aside, it's seriously chock full of timeless insights and golden
nuggets of wisdom.
The process described in the document is founded on data collected from over 100 SW development
projects. SW product sizes ranged from 30 KSLOC to 300 KSLOC. The SEL calls the application domain
covered by their SW products flight dynamics mission SW. Like our SW products, failure of their SW can
lead to, or directly cause, large financial loss and/or loss of human life.
Document generated by Confluence on Aug 04, 2008 09:17
One of the great attributes of the document is that it is totally self-contained. The recommended
artifacts are not spread out all over the place among different documents and difficult to link/associate
together. Definitions and descriptions of practices and methods and tools and roles and phases and
review templates and document templates are all cohesively located in this one 228 page document.
Process Recommendation Highlights
After reading the entire document, I went back over it and cherry-picked (<--- that's for you, Phil)
the concepts and ideas that resonated with me. Here's the list:
• The SEL's guiding philosophy is: "...intellectual control and reliance on the human mind to build a
quality product the first time." This nearly 20-year-old philosophy is consistent with one of the agile
manifesto's four main tenets: we value "Individuals and interactions over processes and tools".
• Out of the 8 phases in the recommended process, there are TWO requirements phases: a
Requirements Definition phase and a Requirements Analysis & Specifications phase. Unlike every
other process definition that I've ever seen (obviously, I haven't seen them all), this fact drives
home the importance of getting the requirements right early in the process for real-time, mission-
critical SW systems. This characteristic clearly flies in the face of modern agile approaches that
recommend against BDUF (Big Design Up Front) and advise getting to working code ASAP. However,
considering the class and application domain of SW that NASA develops, and the consequences of
failure, I'm on board with the SEL's two-requirements-phase recommendation.
• When Management team responsibilities are defined in each phase, the tasks are not just defined as
"status taking" and "schedule hawking". There's technical content too:
° "Both the Management team and the Development team review the test results to ensure that
all discrepancies are identified and corrected."
° "All build test plans are reviewed for correctness and completeness by the Management team."
° "Prepare a preliminary build plan."
° "Reviews technical products"
• Training is recommended LOTS of times throughout the document.
• The SEL maintains and uses a "project histories" database to store and reuse lessons learned.
The SEL database was one of the main drivers behind the process document being reviewed in this
article.
• The SEL classifies requirements into 5 types: mandatory, requires review, needs clarification,
information only, and TBD. These counts (especially the number of TBDs) are tracked rigorously
and measured frequently throughout the project; the metrics are used as prime indicators of
project and product health.
• The importance of pragmatic, down-to-earth, tailorability is expressed in a section right up front.
The number and formality of reviews, the number of documents produced, and the level of detail
provided in each document should vary depending on the size and complexity of the SW, and the
extent of the modifications that need to be made to existing SW.
• The SEL mandates neither Object Oriented Analysis and Design methods nor Structured
Analysis and Design methods. They leave it up to the team to decide which method to employ. They
do, however, recommend not mixing the two. "The analysis methodology for the project should be
appropriate to the type of problem the system addresses."
• The SEL approach uses "Requirements Question and Answer" forms as one of the important metrics
to track the health of the product and project. The requirements definition team "responds to ALL
developer questions".
• Separate System Test and Acceptance Test teams are identified. The System Test team is comprised
of SW developers. The Acceptance test team is comprised of analysts and Requirements Definition
team members.
• Metrics are a highly important feature of the SEL SW development process. They are tracked,
evaluated, and updated throughout the project as objective indicators of how well the team and the
product are progressing. A sample metrics list is given as:
° Number of Requirements changes/modifications
° Number of Requirements additions
° Number of TBDs
° Number of Requirements questions
° Number of Requirements answers
° Number of planned SW units
° Number of designed units
° Number of reviewed units
° Number of planned tests
° Number of tests executed
° Number of tests passed
° Number of defects reported
° Number of defects resolved
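To make the role of these indicators concrete, here's a minimal sketch of how such a snapshot might be tallied. All of the names below are mine, not the SEL's; their process prescribes forms and a project database, not code:

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class ReqClass(Enum):
    """The five SEL requirement classifications."""
    MANDATORY = "mandatory"
    REQUIRES_REVIEW = "requires review"
    NEEDS_CLARIFICATION = "needs clarification"
    INFORMATION_ONLY = "information only"
    TBD = "TBD"

@dataclass
class MetricsSnapshot:
    """A periodic snapshot of SEL-style objective indicators
    (field and method names are illustrative, not from the document)."""
    requirement_classes: list   # one ReqClass per requirement
    questions: int              # requirements questions raised
    answers: int                # requirements answers given
    tests_executed: int
    tests_passed: int
    defects_reported: int
    defects_resolved: int

    def tbd_count(self) -> int:
        """The TBD count: a prime SEL indicator of requirements churn."""
        return Counter(self.requirement_classes)[ReqClass.TBD]

    def open_questions(self) -> int:
        return self.questions - self.answers

    def test_pass_rate(self) -> float:
        return self.tests_passed / self.tests_executed if self.tests_executed else 0.0

# Toy data for one measurement point.
snap = MetricsSnapshot(
    requirement_classes=[ReqClass.MANDATORY, ReqClass.TBD,
                         ReqClass.MANDATORY, ReqClass.TBD],
    questions=20, answers=17,
    tests_executed=80, tests_passed=72,
    defects_reported=30, defects_resolved=25,
)
```

The TBD count and the question/answer gap are the kinds of numbers the SEL re-measures frequently as leading indicators of project health.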
Process Context And Overview
The organizational context in which flight dynamics SW is developed is shown in the figure below.
Computationally intensive mission support SW (orbit determination, orbit adjustment, attitude
determination, maneuver planning) is engineered in the Flight Dynamics Facility. The Systems Technology
Laboratory develops simulators, embedded systems, flight dynamics utilities, and SW that supports
advanced system studies.
The 8 phase SEL SW development life cycle is illustrated below.
Notice that, at any given point in time, multiple activities are being executed in parallel. Requirements
definition activity continues throughout the process. In order to bound the project and control scope
creep, there's early baselining of requirements. Clear entry/exit criteria are defined for each phase, but
there are no "pens down, we're done with requirements" proclamations to convey a false sense of
cosmetic security to stakeholders.
Each phase is defined by
• Entry & exit criteria
• Output products/artifacts
• Metrics collected
• Methods & tools
• Key activities, and who is responsible for doing them
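A phase defined along those five dimensions could be modeled as a simple record. This is purely illustrative; the field names and the Preliminary Design example data are my paraphrase of the document, not its own notation:

```python
from dataclasses import dataclass

@dataclass
class PhaseDefinition:
    """One SEL life-cycle phase, captured along the dimensions the
    document uses to define each phase (names are illustrative)."""
    name: str
    entry_criteria: list
    exit_criteria: list
    products: list            # output artifacts
    metrics: list             # metrics collected during the phase
    methods_and_tools: list
    key_activities: dict      # responsible team -> its activities

# Example: the Preliminary Design phase, roughly as summarized below.
prelim_design = PhaseDefinition(
    name="Preliminary Design",
    entry_criteria=["SW Requirements and Specifications Review held"],
    exit_criteria=["Preliminary Design Review held"],
    products=["Preliminary Design Report", "System Test Plan"],
    metrics=["Number of TBDs", "Number of Requirements questions"],
    methods_and_tools=["walk-throughs", "design inspections", "prototyping"],
    key_activities={"Management team": ["prepare preliminary build plan"]},
)
```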
The Maintenance & Operation phase, like in most process descriptions, is (curiously) not addressed in the
document.
The SEL doesn't use the DoD CSCI/CSC/CSU terminology that we internally use in ATS to characterize the
design of their SW. They use subsystem levels (1..N), packages, and units to document their designs. A
Unit is flexibly defined as: "any set of program statements that are logically treated as a whole.
A main program, a subprogram, or a subroutine may each be termed a unit." A module is
defined as a collection of logically related units (typically 5 to 10 units). A component is the most abstract
concept that the SEL uses for describing the SW. It is defined as "any constituent part".
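The unit/module containment vocabulary can be sketched as a toy hierarchy. The SEL defines these terms only in prose; the class and example names below are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Unit:
    """Any set of program statements logically treated as a whole:
    a main program, a subprogram, or a subroutine."""
    name: str

@dataclass
class Module:
    """A collection of logically related units (typically 5 to 10)."""
    name: str
    units: List[Unit] = field(default_factory=list)

# Hypothetical flight-dynamics module with two units.
orbit_mod = Module("orbit_determination",
                   units=[Unit("propagate"), Unit("estimate_state")])
```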
The review for each phase is specified in terms of its format and its hard copy materials content.
Presenters, participants, schedule, agenda, and materials distribution characterize the review format.
Content templates describe the hard copy review material.
The People
In the SEL process, the following individual roles are defined:
1. SW engineer
2. Analyst (determines detailed requirements AND performs acceptance tests).
3. Project manager (allocates/tracks resources, performs active monitoring, acts as a technical
consultant).
4. Task Leader (provides technical direction and daily technical oversight).
5. Application Specialist (a developer who has engineered similar SW previously and also understands
the complex mathematics and physics of flight dynamics).
6. QA representative
7. Project librarian
In addition to individual roles assigned to specific people, the following group roles are defined:
1. The Requirements Definition team
2. The SW Development team
3. The Management team
4. The Maintenance and Operation group
5. The Configuration Control Board (CCB)
6. The System Test team
7. The Acceptance Test team
A typical Requirements Definition team is composed of Flight Dynamics analysts and application
specialists.
A typical SEL SW Development team is composed of: a project manager, a task leader, N developers, K
application specialists, a QA representative, a project librarian, and a CCB.
A typical Acceptance Test team is comprised of analysts and Requirements Definition team members.
A typical System Test team is comprised of members of the SW Development team and J analysts.
The Requirements Definition Phase
"The purpose of the Requirements Definition phase is to produce a clear, complete, consistent, and
testable specification of the technical requirements for the product."
The major actors in this phase are:
• The Requirements Definition team. The team collects and itemizes all high level requirements. It
develops the operational concept. The Requirements Definition team develops system concepts,
defines detailed requirements, defines all external interfaces, and derives specifications. The system
functions and algorithms that meet each detailed requirement are defined and specified, and the
overall architecture concept is defined.
• The Management team: The team develops a plan for this phase, staffs and trains the
Requirements Definition team, and interfaces with the customer.
The major output artifacts of the Requirements Definition phase are:
1. The System and Operations Concept (SOC) Document
2. The Requirements and Specifications Document in 3 parts:
a. Detailed requirements
b. Functional or Object Oriented Specifications (data in, data out, in-to-out transformation steps).
c. Necessary mathematical background information
The reviews executed in this phase are: the System Concept Review (SCR) and the System Requirements
Review (SRR).
The tools employed to develop the outputs are selected from this set: Structured Analysis, Object
Oriented Analysis, Walk-throughs, inspections, and prototyping (to resolve requirements issues).
The Requirements Analysis And Specifications Phase
"The purpose of the Requirements Analysis phase is to ensure that the requirements and specifications
are feasible, complete, and consistent, and that they are understood by the development team."
The major actors in this phase are:
• The Requirements Definition team. The team continues to resolve ambiguities, discrepancies, and
TBDs. It responds to all developer questions.
• The SW Development team. The team analyzes and classifies requirements. It produces the Data
Flow Diagrams or Object Oriented diagrams that characterize the required behavior of the product.
Structured or Object Oriented analysis is used to clarify & amplify the behavioral requirements. The
team scrutinizes the Requirements and Specifications document for omissions, contradictions, and
TBDs.
• The Management team: The team prepares the SW Development Plan, staffs and trains the SW
Development team, facilitates requirements issue resolution, and reviews technical products.
The major output artifacts of the Requirements Analysis And Specifications phase are:
1. The Requirements Analysis Report
2. The SW Development Plan
3. A Requirements and Specifications artifact update
4. Prototyping plans.
The Review that takes place in this phase is the SW Requirements and Specifications Review.
The tools employed to develop the outputs are selected from this set: Walk-throughs, Requirements Q &
A forms, requirements classification, project library, and prototyping.
The Preliminary Design Phase
"The purpose of the Preliminary Design phase is to define the high level SW architecture that will best
satisfy the requirements and specifications for the system."
The major actors in this phase are:
• The Requirements Definition team. The team continues to resolve requirements issues, answer
developer questions, and it participates in SW design walk-throughs.
• The SW Development team. The team defines the high level functions or objects in the system,
prepares the preliminary design report, conducts the preliminary design review, and develops
scaffolding skeleton code for prototyping and performance analysis.
• The Management team. The team reassesses schedules/staffing/training needs and controls
requirements changes. The focus switches from planning to controlling. The team prepares a
preliminary incremental build plan and establishes the independent Acceptance Test team.
The major output artifacts of the Preliminary Design phase are:
1. The Preliminary Design Report
2. The System Test Plan
The Review that takes place in this phase is the Preliminary Design Review.
The tools employed to develop the outputs are selected from this set: Walk-throughs, design inspections,
prototyping, functional decomposition, object oriented design.
The Detailed Design Phase
"The purpose of the Detailed Design phase is to produce a completed design specification that will satisfy
all requirements for the system and that can be directly implemented in code."
The major actors in this phase are:
• The Requirements Definition team. The team continues to resolve requirements issues, answer
developer questions, and it participates in design walk-throughs.
• The SW Development team. The team prepares detailed design diagrams, conducts walk-throughs,
prepares the detailed design document, and conducts the Critical Design Review.
• The Management team. The team assesses lessons learned from the previous phases, controls
requirements changes, and prepares the build plan.
• The Acceptance test team: The team begins work on the acceptance test plan and the build 1 test
plan (the order in which units should be implemented & integrated, capabilities in the build).
The major output artifacts of the Detailed Design phase are:
1. The Detailed Design Report
2. The Build Plan
3. The Build Test Plan
The Review that takes place in this phase is the Critical Design Review.
The tools employed to develop the outputs are selected from this set: Walk-throughs, design inspections,
functional decomposition, object oriented design.
The Implementation Phase
"The purpose of the Implementation phase is to build a complete, high-quality SW system from the
blueprint provided in the detailed design document."
The major actors in this phase are:
• The Requirements Definition team. The team continues to resolve requirements issues, answer
developer questions, and it participates in the Build Design reviews.
• The SW Development team. The team codes new units and revises existing units. It tests and
integrates units, conducts the build tests, prepares the System Test plan from the Requirements and
Specifications artifact, and drafts the User's Guide.
• The Management team. The team reassesses schedules/staffing/training and controls requirements
changes.
• The Acceptance test team: The team completes the draft acceptance test plan.
The major output artifacts of the Implementation phase are:
1. The Source code
2. Build test results
3. The System Test plan
4. Draft user's guide
The Review that takes place in this phase is the Build Design Review.
The tools employed to develop the outputs are selected from this set: code reading, unit testing,
integration testing, build testing, source code configuration management.
The System Testing Phase
"The purpose of the System Testing phase is to verify the end-to-end functionality of the system in
satisfying all requirements and specifications." Discrepancies found during testing are classified as
code, design, or requirements errors. In addition, errors are prioritized into 3 levels of severity: L1, L2,
L3. In the SEL process, the documents are fixed first, and then the code is fixed.
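A discrepancy record carrying both classifications, plus the SEL's documents-before-code fixing rule, might look like this sketch (the type and function names are mine, not the SEL's):

```python
from dataclasses import dataclass
from enum import Enum

class ErrorSource(Enum):
    """The three SEL discrepancy classifications."""
    CODE = "code"
    DESIGN = "design"
    REQUIREMENTS = "requirements"

class Severity(Enum):
    """The three SEL severity levels; L1 is the most severe."""
    L1 = 1
    L2 = 2
    L3 = 3

@dataclass
class Discrepancy:
    description: str
    source: ErrorSource
    severity: Severity

def fix_order(d: Discrepancy) -> list:
    """SEL rule: when a design or requirements document is wrong, the
    document is corrected first, and then the code is fixed."""
    steps = []
    if d.source in (ErrorSource.DESIGN, ErrorSource.REQUIREMENTS):
        steps.append("update affected document")
    steps.append("fix code")
    return steps
```

For a pure coding error the work queue is just the code fix; a design or requirements error puts the document update ahead of it.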
The major actors in this phase are:
• The SW Development team. The team verifies end-to-end functionality against the Requirements
and Specifications document according to tests recorded in the System Test plan. The team also
identifies SW units and modules for reuse and develops the System Description document.
• The Management team. The team reassesses schedules/staffing/training, controls requirements
changes, and conducts configuration audits.
• The System Test team: The team executes the tests in the System Test plan, analyzes and reports
the test results, and evaluates the User's Guide.
• The Acceptance Test team: The team finalizes the acceptance test plan.
The major output artifacts of the System Testing phase are:
1. The tested source code
2. The System Description document
3. The Acceptance Test plan
4. User's guide
The Review that takes place in this phase is the Acceptance Test Readiness Review.
The tools employed to develop the outputs are selected from this set: regression testing, source code
configuration management, discrepancy reports, configuration audits.
The Acceptance Testing Phase
"The purpose of the Acceptance Testing phase is to demonstrate that the system meets its requirements
in its operational environment." Testing is performed on the operational equipment as opposed to a lab
environment. No new requirements or enhancements can be accepted during the Acceptance Testing
phase. Five levels of test disposition are defined: Cannot be evaluated; Failed and no workaround exists;
Failed but workaround exists; Cosmetic errors; Passed.
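The five dispositions map naturally onto an enumeration. The `blocks_acceptance` policy below is my own illustrative guess at how the dispositions might gate acceptance, not something the SEL document states:

```python
from enum import Enum

class Disposition(Enum):
    """The five SEL acceptance-test dispositions."""
    CANNOT_BE_EVALUATED = "cannot be evaluated"
    FAILED_NO_WORKAROUND = "failed, no workaround exists"
    FAILED_WITH_WORKAROUND = "failed, but workaround exists"
    COSMETIC_ERRORS = "cosmetic errors"
    PASSED = "passed"

def blocks_acceptance(d: Disposition) -> bool:
    """Hypothetical policy: an outright failure with no workaround, or a
    test that cannot be evaluated, would typically block acceptance."""
    return d in (Disposition.CANNOT_BE_EVALUATED,
                 Disposition.FAILED_NO_WORKAROUND)
```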
Important Quality Information Measured By The SEL
As of 1992, the SEL measured an average error density of 4.5 errors/KSLOC at the end of
Acceptance Testing. Furthermore, they decomposed this number into: 2.6 errors/KSLOC at
the end of the Implementation phase; 1.3 errors/KSLOC at the end of the System Testing phase;
and 0.6 errors/KSLOC at the end of the Acceptance Testing phase. The SEL also found that error
detection rates decreased by 50% from phase to phase.
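These numbers are internally consistent, as a quick check shows: the per-phase densities sum to the cumulative 4.5 errors/KSLOC, and each phase's detection rate is roughly half of the previous phase's.

```python
# Per-phase error densities (errors/KSLOC) reported by the SEL (1992).
implementation = 2.6
system_testing = 1.3
acceptance_testing = 0.6

# The per-phase figures sum to the cumulative density at acceptance.
total = implementation + system_testing + acceptance_testing  # 4.5

# Detection rates drop by roughly 50% from phase to phase.
ratio_sys = system_testing / implementation      # 0.5
ratio_acc = acceptance_testing / system_testing  # ~0.46
```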
The major actors in this phase are:
• The SW Development team. The team trains the Acceptance Test team in how to use the system. It
also corrects discrepancies found during testing.
• The Management team. The team ensures the quality and progress of the testing and starts
preparing the SW Development History.
• The Acceptance test team: The team executes the tests in the Acceptance Test plan, analyzes and
reports the test results, and formally accepts the system.
The major output artifacts of the Acceptance Testing phase are:
1. The system source code and document set
2. The finalized System Description document
3. The finalized User's guide
4. The Acceptance test results
5. The phase by phase SW Development History (no later than 1 month after formal system
acceptance).
The tools employed to develop the outputs are selected from this set: regression testing, source code
configuration management, discrepancy reports.
The Maintenance and Operations Phase
In their process document, the SEL, like most other process sponsors, doesn't define the details of the
Maintenance and Operations Phase. They state that the overall recommended approach defined in the
previous sections of the document can be tailored to apply to this phase.
Keys To Success
At the tail end of the SEL SW process document is what I think is the most valuable section. It is
appropriately named "Keys To Success". Based on their extensive experience, here are their insights:
1. Understand your environment. The environment can be expressed as: the nature of the problem to
be solved, the limits and capabilities of your staff, and the supporting SW and HW infrastructure in
your organization.
2. Match the process to the Environment. Tailor the process and ensure that the activities have a
rationale and can be enforced. If you cannot or won't enforce a behavior/activity, then don't
superficially include it in the plan. Make metrics collection easy, and an integral part of the process.
3. Experiment to improve the process. Stretch it a little at a time to improve it continuously. Employ
new techniques and extensions, cut out the fat, and evaluate the impact of the change as
objectively as possible. Don't change too many things at once.
4. Don't attempt to use excessively foreign technology. Do not select and attempt a significantly
different technology just because it was successful in other situations. The technology must fit the
local environment and culture.
Dos And Don'ts
In addition to the "Keys to Success" section, the SEL has arrived at a set of 9 Do and 8 Don't
recommendations to facilitate success. I've already listed these in a separate Wiki page here, but I think
they are so valuable that I'm repeating them below:
1. Do adhere to a "living" Software Development Plan. Update it frequently, at least once per phase.
2. Do empower project personnel. Clearly assign responsibilities and decision-making authority to
specific roles, and assign specific people to those roles.
3. Do minimize bureaucracy. Establish the minimum documentation level and meeting frequency
necessary to communicate status and decisions to all stakeholders. Resist the temptation to address
difficulties by adding more meetings and documents and tightening management control. "More
meetings plus more documentation plus more management does not equal more success."
4. Do establish and manage the software baseline. Prioritize and track the number of TBDs, number of
Requirements & Specifications, and determine schedule/cost impacts throughout the project.
5. Do take periodic snapshots of project health and progress and don't be afraid to replan. Adjust the
scope, staffing, schedule when necessary.
6. Do re-estimate size, staffing, and schedules regularly (we recommend monthly). More information,
experience, and knowledge are acquired as the project progresses and ambiguity decreases.
7. Do define and manage phase transitions. Publish clear and unambiguous entry/exit criteria.
8. Do foster a team spirit. Maximize commonality and minimize differences between the staff. Establish
a common language specific to the project. Cross-train. Help and applaud each other.
9. Do start the project with a small senior staff. This group should establish the approach and priorities
and organize the work. In addition, the group should establish reasonable schedules.
1. Don't allow team members to proceed in an undisciplined manner. Provide training on specific
methods and techniques.
2. Don't set unreasonable goals. This behavior is worse for morale than not setting any goals at all.
Don't hold personnel accountable for impossible commitments.
3. Don't implement changes without assessing their impact. Assess cost, schedule and technical risk.
Little changes add up over time.
4. Don't gold plate. Implement only what is required.
5. Don't overstaff, especially early in the project. Bring on additional staff only when there is useful
work for them to do.
6. Don't assume that an intermediate schedule slippage can be absorbed later. It may be absorbed if
the right decisions are made, but don't assume it to be absolutely true.
7. Don't relax standards in an attempt to reduce costs/schedule. This behavior lowers the quality of
intermediate products/artifacts and leads to more rework downstream. It also sends the message
that schedules are more important than quality.
8. Don't assume that a large amount of documentation ensures success. Determine the level of
formality and amount of detail required based on the size, life cycle duration, and lifetime of the
system.