This document provides an overview of advanced software engineering and software process improvement (SPI). It discusses SPI frameworks like the Capability Maturity Model (CMM) and defines what SPI entails. The document outlines the five activities in the SPI process: assessment and gap analysis, education and training, selection and justification, installation/migration, and evaluation. It also discusses SPI risks, success factors, maturity models, and returns on investment. Finally, it covers the People CMM and trends toward more agile SPI approaches.
3. What is SPI?
SPI implies that
elements of an effective software process can be defined in an effective manner
an existing organizational approach to software development can be assessed
against those elements, and
a meaningful strategy for improvement can be defined.
The SPI strategy transforms the existing approach to software
development into something that is more focused, more repeatable, and
more reliable (in terms of the quality of the product produced and the
timeliness of delivery).
4. Information Systems Development
[Diagram: resources feed development activities, which yield products.
Activities: planning, analysis, design, construction, testing, training,
implementation, follow-up, enhancements.
Resources and products include hardware, equipment, software, documentation, etc.]
5. Software Process Improvement
Efforts
Carnegie Mellon University’s Software Engineering
Institute’s Capability Maturity Model (SEI’s CMM)
The International Organization for Standardization’s 9001
specification (ISO 9001)
Proprietary SPI approaches from consulting firms
6. SPI Framework
a set of characteristics that must be present if an effective software process
is to be achieved
a method for assessing whether those characteristics are present
a mechanism for summarizing the results of any assessment, and
a strategy for assisting a software organization in implementing those
process characteristics that have been found to be weak or missing.
An SPI framework assesses the “maturity” of an organization’s software
process and provides a qualitative indication of a maturity level.
9. Constituencies
Quality certifiers: Quality(Process) --> Quality(Product)
Formalists: process modeling languages
Tool advocates
Practitioners: little formal process modeling
Reformers: organisational change
Ideologists: a particular software process for a specific organisation
10. Maturity Models
A maturity model is applied within the context of an SPI
framework.
The intent of the maturity model is to provide an overall
indication of the “process maturity” exhibited by a software
organization.
an indication of the quality of the software process, the degree to which
practitioners understand and apply the process, and
the general state of software engineering practice.
11. Four levels of Immaturity
Schorsch suggests four levels of immaturity
Level 0: Negligent – failure to allow processes to succeed
Level 1: Obstructive – counterproductive processes are imposed
Level 2: Contemptuous – disregard for good software engineering
Level 3: Undermining – total neglect of the organisation’s own charter
12. Is SPI for Everyone?
Can a small company initiate SPI activities and do it
successfully?
Answer: a qualified “yes”
It should come as no surprise that small organizations are
more informal, apply fewer standard practices, and tend to be
self-organizing.
SPI will be approved and implemented only after its proponents
demonstrate financial leverage.
13. The SPI Process—I
Five activities
Assessment and Gap Analysis
Assessment examines a wide range of actions and tasks that
will lead to a high quality process.
Consistency. Are important activities, actions and tasks
applied consistently across all software projects and by all
software teams?
Sophistication. Are management and technical actions
performed with a level of sophistication that implies a
thorough understanding of best practice?
Acceptance. Is the software process and software engineering
practice widely accepted by management and technical staff?
Commitment. Has management committed the resources
required to achieve consistency, sophistication and
acceptance?
Gap analysis—The difference between local application and
best practice represents a “gap” that offers opportunities
for improvement.
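As a rough illustration of gap analysis, the four assessment criteria can be scored and compared against a best-practice target. The 1-5 scale, the target value, and the local scores below are invented for this sketch; no SPI framework prescribes them.

```python
# Hypothetical gap-analysis sketch: rate each assessment criterion
# (consistency, sophistication, acceptance, commitment) on an assumed
# 1-5 scale and compare against an assumed best-practice target of 5.

BEST_PRACTICE = 5  # assumed target score for every criterion

def gap_analysis(scores):
    """Return the per-criterion gap between local practice and best practice."""
    return {criterion: BEST_PRACTICE - score
            for criterion, score in scores.items()}

# Illustrative local scores, not real assessment data.
local = {"consistency": 3, "sophistication": 2,
         "acceptance": 4, "commitment": 3}
gaps = gap_analysis(local)

# The largest gaps are the best candidates for improvement.
priorities = sorted(gaps, key=gaps.get, reverse=True)
```

The criteria with the widest gaps float to the front of the improvement queue.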
14. The SPI Process—II
Education and Training
Three types of education and training should be conducted:
Generic concepts and methods. Directed toward both managers and practitioners, this category
stresses both process and practice. The intent is to provide professionals with the intellectual tools
they need to apply the software process effectively and to make rational decisions about
improvements to the process.
Specific technology and tools. Directed primarily toward practitioners, this category stresses
technologies and tools that have been adopted for local use. For example, if UML has been chosen for
analysis and design modeling, a training curriculum for software engineering using UML would be
established.
Business communication and quality-related topics. Directed toward all stakeholders, this category
focuses on “soft” topics that help enable better communication among stakeholders and foster a
greater quality focus.
15. The SPI Process—III
Selection and Justification
choose the process model (Chapters 2 and 3) that best fits your organization, its
stakeholders, and the software that you build
decide on the set of framework activities that will be applied, the major work
products that will be produced and the quality assurance checkpoints that will
enable your team to assess progress
develop a work breakdown for each framework activity (e.g., modeling),
defining the task set that would be applied for a typical project
Once a choice is made, time and money must be expended to install it within
an organization and these resource expenditures should be justified.
16. The SPI Process—IV
Installation/Migration
actually software process redesign (SPR) activities. Scacchi [Sca00] states
that “SPR is concerned with identification, application, and refinement
of new ways to dramatically improve and transform software
processes.”
three different process models are considered:
the existing (“as-is”) process,
a transitional (“here-to-there”) process, and
the target (“to be”) process.
17. The SPI Process—V
Evaluation
assesses the degree to which changes have been instantiated and adopted,
the degree to which such changes result in better software quality or other
tangible process benefits, and
the overall status of the process and the organizational culture as SPI activities
proceed
From a qualitative point of view, past management and practitioner
attitudes about the software process can be compared to attitudes polled
after installation of process changes.
18. Risk Management for SPI
manage risk at three key points in the SPI process [Sta97b]:
prior to the initiation of the SPI roadmap,
during the execution of SPI activities (assessment, education,
selection, installation), and
during the evaluation activity that follows the instantiation of
some process characteristic.
In general, the following categories [Sta97b] can be identified
for SPI risk factors:
budget and cost
content and deliverables
culture
maintenance of SPI deliverables
mission and goals
organizational management and organizational stability
process stakeholders
schedule for SPI development
SPI development environment and process
SPI project management and SPI staff
19. Critical Success Factors
The top five CSFs are [Ste99]:
Management commitment and support
Staff involvement
Process integration and understanding
A customized SPI strategy
Solid management of the SPI project
21. The CMMI model
An integrated capability model that includes software and
systems engineering capability assessment.
The model has two instantiations
Staged, where the model is expressed in terms of maturity levels;
Continuous, where a capability rating is computed for each process area.
22. SEI Capability Maturity Model
Approximate distribution of organizations across the five levels:
Optimizing (Process Control): < 1%
Managed (Process Measurement): 2-3%
Defined (Process Definition): 20%
Repeatable (Basic Management Control): 30%
Initial: 45%
23. CMM - Initial (Level 1)
• The software process is characterized as ad hoc,
occasionally even chaotic
• Few processes are defined
• Success depends on individual effort and heroics
“BASICALLY NO CONTROL”
24. CMM - Repeatable (Level 2)
• Basic project management processes are
established to track cost, schedule, and
functionality
• The necessary process discipline is in place to
repeat earlier successes on projects with similar
applications
• Success achieved through basic project
management; not advanced technologies
“BASIC MANAGEMENT CONTROL”
25. CMM - Defined (Level 3)
• The software process for both management and
engineering activities is documented,
standardized, and integrated into a standard
software process for the organization
• All projects use an approved, tailored version of
the organization’s standard software process for
developing and maintaining software
• Formality lends itself to improvement
“PROCESS DEFINITION”
26. CMM - Managed (Level 4)
• Detailed measures of the software process and
product quality are collected
• Both the software process and products are
quantitatively understood and controlled
• A software metrics program is in use
“PROCESS MEASUREMENT”
27. CMM - Optimizing (Level 5)
• Continuous process improvement is enabled by
quantitative (metrics) feedback from the process
• Continuous process improvement is enabled by piloting
innovative ideas and technologies
“PROCESS CONTROL”
28. The continuous CMMI model
This is a finer-grain model that considers individual or groups of
practices and assesses their use.
The maturity assessment is not a single value but a set of values
showing the organisation’s maturity in each area.
The CMMI rates each process area from levels 1 to 5.
The advantage of a continuous approach is that organisations can
pick and choose process areas to improve according to their local
needs.
29. CMMI model components
Process areas
24 process areas that are relevant to process capability and improvement
are identified. These are organised into 4 groups.
Goals
Goals are descriptions of desirable organisational states. Each process area
has associated goals.
Practices
Practices are ways of achieving a goal - however, they are advisory and
other approaches to achieve the goal may be used.
33. CMMI practices
Examples of practices and their associated goals:

Goal: The requirements are analysed and validated, and a definition of the
required functionality is developed.
Associated practices:
Analyse derived requirements to ensure that they are necessary and sufficient.
Validate requirements to ensure that the resulting product will perform as
intended in the user’s environment, using multiple techniques as appropriate.

Goal: Root causes of defects and other problems are systematically determined.
Associated practices:
Select the defects and other problems for analysis.
Perform causal analysis of selected defects and other problems and propose
actions to address them.

Goal: The process is institutionalised as a defined process.
Associated practices:
Establish and maintain an organisational policy for planning and performing
the requirements development process.
Assign responsibility and authority for performing the process, developing the
work products and providing the services of the requirements development process.
34. CMMI assessment
Examines the processes used in an organisation and assesses
their maturity in each process area.
Based on a 6-point scale:
Not performed;
Performed;
Managed;
Defined;
Quantitatively managed;
Optimizing.
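A continuous assessment produces a rating per process area rather than a single number. The sketch below maps illustrative ratings onto the six-point scale; the process areas and numbers are invented, not real assessment data.

```python
# Sketch of a continuous CMMI capability profile: each process area
# is rated independently on the six-point scale.
SCALE = ["Not performed", "Performed", "Managed", "Defined",
         "Quantitatively managed", "Optimizing"]

def profile(ratings):
    """Map each process area's numeric rating (0-5) to its scale name."""
    return {area: SCALE[level] for area, level in ratings.items()}

# Illustrative ratings only.
ratings = {"Requirements management": 3, "Risk management": 2,
           "Configuration management": 4, "Verification": 1}
capability = profile(ratings)

# Under a continuous model the organisation can pick the weakest
# area to improve according to its local needs.
weakest = min(ratings, key=ratings.get)
```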
35. A process capability profile
[Figure: bar chart rating each process area (project monitoring and control,
supplier agreement management, risk management, configuration management,
requirements management, verification, validation) on the 1-5 capability scale.]
36. The People CMM
“a roadmap for implementing workforce practices that
continuously improve the capability of an organization’s
workforce.” [Cur02]
defines a set of five organizational maturity levels that provide
an indication of the relative sophistication of workforce
practices and processes
38. Other SPI Frameworks
SPICE—an international initiative to support the International Standard
ISO/IEC 15504 for (Software) Process Assessment [ISO08]
Bootstrap—an SPI framework for small and medium-sized organizations
that conforms to SPICE [Boo06]
PSP and TSP—individual and team-specific SPI frameworks ([Hum97],
[Hum00]) that focus on process in-the-small: a more rigorous approach to
software development coupled with measurement
TickIT—an auditing method [Tic05] that assesses an organization’s
compliance with ISO Standard 9001:2000
39. SPI Return on Investment
“How do I know that we’ll achieve a reasonable return for the money
we’re spending?”
ROI = [Σ(benefits) – Σ(costs)] / Σ(costs) × 100%
where
benefits include the cost savings associated with higher product quality (fewer
defects), less rework, reduced effort associated with changes, and the income that
accrues from shorter time-to-market.
costs include both direct SPI costs (e.g., training, measurement) and indirect costs
associated with greater emphasis on quality control and change management
activities and more rigorous application of software engineering methods (e.g., the
creation of a design model).
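The ROI formula can be evaluated directly. The benefit and cost figures below are hypothetical, chosen only to show the arithmetic.

```python
def spi_roi(benefits, costs):
    """ROI = [sum(benefits) - sum(costs)] / sum(costs) x 100%."""
    total_benefits = sum(benefits)
    total_costs = sum(costs)
    return (total_benefits - total_costs) / total_costs * 100

# Hypothetical benefits: rework savings, defect-cost savings, earlier revenue.
benefits = [40_000, 25_000, 15_000]
# Hypothetical direct and indirect SPI costs: training, measurement, installation.
costs = [30_000, 10_000, 10_000]

roi = spi_roi(benefits, costs)  # 60.0, i.e. a 60% return
```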
40. SPI Trends
future SPI frameworks must become significantly more agile
Rather than an organizational focus (that can take years to complete
successfully), contemporary SPI efforts should focus on the project level
To achieve meaningful results (even at the project level) in a short time
frame, complex framework models may give way to simpler models.
Rather than dozens of key practices and hundreds of supplementary
practices, an agile SPI framework should emphasize only a few pivotal
practices
41. SPI AFTERTHOUGHTS
“...according to the SEI model, Apple Computer
should not exist.” Tom DeMarco
Small organizations may not be able to afford the
overhead required by an SEI-type model
You can’t skip levels
It takes time (2 to 3 years/level) to move from one
level to the next
Not many organizations are beyond Level 1
New organizations are unlikely to start at Level 3
Levels are important in some contracts
43. Requirements engineering
The process of establishing the services that the customer
requires from a system and the constraints under which it
operates and is developed.
The requirements themselves are the descriptions of the system
services and constraints that are generated during the
requirements engineering process.
44. Functional and non-functional requirements
Functional requirements
Statements of services the system should provide, how the system should
react to particular inputs and how the system should behave in particular
situations.
Non-functional requirements
constraints on the services or functions offered by the system such as
timing constraints, constraints on the development process, standards,
etc.
Domain requirements
Requirements that come from the application domain of the system and
that reflect characteristics of that domain.
46. Requirements Engineering-II
Inception—ask a set of questions that establish …
basic understanding of the problem
the people who want a solution
the nature of the solution that is desired, and
the effectiveness of preliminary communication and collaboration
between the customer and the developer
Elicitation—elicit requirements from all stakeholders
Elaboration—create an analysis model that identifies data,
function and behavioral requirements
Negotiation—agree on a deliverable system that is realistic for
developers and customers
47. Requirements Engineering-III
Specification—can be any one (or more) of the following:
A written document
A set of models
A formal mathematical model
A collection of user scenarios (use-cases)
A prototype
Validation—a review mechanism that looks for
errors in content or interpretation
areas where clarification may be required
missing information
inconsistencies (a major problem when large products or systems are
engineered)
conflicting or unrealistic (unachievable) requirements.
Requirements management
48. Inception
Identify stakeholders
“who else do you think I should talk to?”
Recognize multiple points of view
Work toward collaboration
The first questions
Who is behind the request for this work?
Who will use the solution?
What will be the economic benefit of a successful
solution?
Is there another source for the solution that you need?
49. Eliciting Requirements
meetings are conducted and attended by both software engineers and customers
rules for preparation and participation are established
an agenda is suggested
a "facilitator" (can be a customer, a developer, or an outsider) controls the meeting
a "definition mechanism" (can be work sheets, flip charts, or wall stickers or an
electronic bulletin board, chat room or virtual forum) is used
the goal is
to identify the problem: objects, services, constraints, performance, mini specifications,
issues list
propose elements of the solution
negotiate different approaches, and
specify a preliminary set of solution requirements
50. Eliciting Requirements
[Flow diagram: elicit requirements by conducting FAST meetings, making lists
of functions and classes, and making lists of constraints, etc. If formal
prioritization is needed, use QFD to prioritize requirements; otherwise
prioritize requirements informally. Then define actors, write scenarios,
draw the use-case diagram, complete the template, and create use-cases.]
51. Quality Function Deployment
Function deployment determines the “value” (as
perceived by the customer) of each function
required of the system
Normal, Expected and Exciting Requirements
Information deployment identifies data objects and
events
Task deployment examines the behavior of the
system
Value analysis determines the relative priority of
requirements
Customer Voice Table
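Value analysis can be sketched as weighted scoring of requirements. The category weights, requirement names, and customer-value numbers below are assumptions for illustration; QFD itself does not prescribe them.

```python
# Hypothetical value-analysis sketch: weight each requirement's
# customer value by its QFD category. The weights are invented
# for illustration, not taken from QFD.
CATEGORY_WEIGHT = {"expected": 1.0, "normal": 2.0, "exciting": 3.0}

def prioritize(requirements):
    """Sort requirements by customer value x category weight, highest first."""
    def score(req):
        return req["customer_value"] * CATEGORY_WEIGHT[req["category"]]
    return sorted(requirements, key=score, reverse=True)

# Illustrative requirements for a home-security product.
reqs = [
    {"name": "arm/disarm system", "category": "expected", "customer_value": 9},
    {"name": "remote video view", "category": "exciting", "customer_value": 6},
    {"name": "event log", "category": "normal", "customer_value": 5},
]
ranked = prioritize(reqs)
```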
52. Elicitation Work Products
a statement of need and feasibility.
a bounded statement of scope for the system or product.
a list of customers, users, and other stakeholders who
participated in requirements elicitation
a description of the system’s technical environment.
a list of requirements (preferably organized by function) and
the domain constraints that apply to each.
a set of usage scenarios that provide insight into the use of
the system or product under different operating conditions.
any prototypes developed to better define requirements.
53. Building the Analysis Model
Elements of the analysis model
Scenario-based elements
Functional—processing narratives for
software functions
Use-case—descriptions of the interaction
between an “actor” and the system
Class-based elements
Implied by scenarios
Behavioral elements
State diagram
Flow-oriented elements
Data flow diagram
54. Use-Cases
A collection of user scenarios that describe the thread of usage of a system
Each scenario is described from the point-of-view of an “actor”—a person
or device that interacts with the software in some way
Each scenario answers the following questions:
Who is the primary actor, the secondary actor(s)?
What are the actor’s goals?
What preconditions should exist before the story begins?
What main tasks or functions are performed by the actor?
What extensions might be considered as the story is described?
What variations in the actor’s interaction are possible?
What system information will the actor acquire, produce, or change?
Will the actor have to inform the system about changes in the external
environment?
What information does the actor desire from the system?
Does the actor wish to be informed about unexpected changes?
55. Use-Case Diagram
[Figure: SafeHome use-case diagram. The homeowner accesses the system via
the Internet, arms/disarms the system, responds to alarm events, and
encounters error conditions. The system administrator reconfigures sensors
and related system features.]
56. Class Diagram
From the SafeHome system: the Sensor class
Attributes: name/id, type, location, area, characteristics
Operations: identify(), enable(), disable(), reconfigure()
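The Sensor class sketched above can be written out directly. The attribute types and the enable/disable/reconfigure semantics are assumptions, since the diagram gives only names.

```python
# Sketch of the SafeHome Sensor class from the class diagram.
# Attribute types and method behavior are assumed; the slide lists only names.
class Sensor:
    def __init__(self, name, sensor_type, location, area, characteristics):
        self.name = name                  # name/id
        self.sensor_type = sensor_type    # e.g. "motion", "smoke" (assumed)
        self.location = location
        self.area = area
        self.characteristics = characteristics
        self.enabled = False              # assumed initial state

    def identify(self):
        return f"{self.name} ({self.sensor_type}) at {self.location}"

    def enable(self):
        self.enabled = True

    def disable(self):
        self.enabled = False

    def reconfigure(self, **changes):
        # Assumed semantics: update named attributes in place.
        for attr, value in changes.items():
            setattr(self, attr, value)
```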
57. State Diagram
Example state: Reading Commands
State variables: system status = “ready”; display msg = “enter cmd”;
display status = steady
Entry action: subsystems ready
State activities: do: poll user input panel; do: read user input;
do: interpret user input
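A minimal sketch of how this state could be represented in code. The state variables and activities mirror the diagram; everything else (the class and method names, the command-normalization logic) is invented for illustration.

```python
# Minimal sketch of the "Reading Commands" state from the diagram.
class ReadingCommands:
    def __init__(self):
        # State variables taken from the diagram.
        self.system_status = "ready"
        self.display_msg = "enter cmd"
        self.display_status = "steady"

    def on_entry(self):
        # Entry action: subsystems ready.
        return "subsystems ready"

    def do_activities(self, raw_input):
        # do: poll user input panel; do: read user input;
        # do: interpret user input (normalization logic is invented).
        return raw_input.strip().lower()
```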
58. Analysis Patterns
Pattern name: A descriptor that captures the essence of the pattern.
Intent: Describes what the pattern accomplishes or represents
Motivation: A scenario that illustrates how the pattern can be used to address the
problem.
Forces and context: A description of external issues (forces) that can affect how
the pattern is used and also the external issues that will be resolved when the
pattern is applied.
Solution: A description of how the pattern is applied to solve the problem with an
emphasis on structural and behavioral issues.
Consequences: Addresses what happens when the pattern is applied and what
trade-offs exist during its application.
Design: Discusses how the analysis pattern can be achieved through the use of
known design patterns.
Known uses: Examples of uses within actual systems.
Related patterns: One or more analysis patterns that are related to the named
pattern because (1) they are commonly used with the named pattern; (2) they are
structurally similar to the named pattern; or (3) they are variations of the named pattern.
59. Negotiating Requirements
Identify the key stakeholders
These are the people who will be involved in the negotiation
Determine each of the stakeholders’ “win conditions”
Win conditions are not always obvious
Negotiate
Work toward a set of requirements that lead to “win-win”
60. Validating Requirements - I
Is each requirement consistent with the overall objective for the
system/product?
Have all requirements been specified at the proper level of
abstraction? That is, do some requirements provide a level of
technical detail that is inappropriate at this stage?
Is the requirement really necessary or does it represent an add-on
feature that may not be essential to the objective of the system?
Is each requirement bounded and unambiguous?
Does each requirement have attribution? That is, is a source
(generally, a specific individual) noted for each requirement?
Do any requirements conflict with other requirements?
61. Validating Requirements - II
Is each requirement achievable in the technical environment that will house the system
or product?
Is each requirement testable, once implemented?
Does the requirements model properly reflect the information, function and behavior
of the system to be built?
Has the requirements model been “partitioned” in a way that exposes progressively
more detailed information about the system?
Have requirements patterns been used to simplify the requirements model? Have all
patterns been properly validated? Are all patterns consistent with customer
requirements?
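A few of these checklist questions can be checked mechanically. The sketch below flags requirements that lack attribution or use ambiguous wording; the rule set and the ambiguous-word list are crude simplifications invented for illustration.

```python
# Hypothetical checker for two validation questions:
# "Does each requirement have attribution?" and
# "Is each requirement bounded and unambiguous?" (very crude heuristic).
AMBIGUOUS_WORDS = {"fast", "user-friendly", "flexible", "robust", "etc"}

def validate(requirements):
    """Return a list of (requirement id, problem) findings."""
    findings = []
    for req in requirements:
        if not req.get("source"):
            findings.append((req["id"], "no attribution"))
        words = set(req["text"].lower().replace(".", "").split())
        if words & AMBIGUOUS_WORDS:
            findings.append((req["id"], "ambiguous wording"))
    return findings

# Illustrative requirements only.
reqs = [
    {"id": "R1", "text": "The system shall respond fast.", "source": ""},
    {"id": "R2", "text": "The alarm shall sound within 2 seconds.",
     "source": "homeowner interview"},
]
issues = validate(reqs)
```

R1 is flagged twice (no source, vague wording); R2, which is bounded and attributed, passes.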