This short Lightning Talk introduces the Running Tested Features (RTF) metric, a wonderfully useful metric that is easy to collect and promotes agility. It provides examples of RTF both when development makes steady progress and when software breaks. The talk also discusses what happens when people try to game the RTF metric.
The Running Tested Features metric provides developers, managers, and customers alike with a clear, unambiguous gauge of real software development progress. Usable on any kind of development project, RTF's focus on outcomes rather than process makes it an especially good fit for Agile projects. And because RTF works with both Agile and Waterfall projects, it makes an excellent progress metric for teams transitioning to Agile.
Promoting Agility with Running Tested Features - Lightning Talk
1. Promoting Agility with Running Tested Features
Camille Bell, Agile Coach
cbell@CamilleBellConsulting.com
Twitter: @agilecamille
B'more On Rails Lightning Talk, March 8, 2011
2. Running Tested Features Definition
1. The desired software is broken down into named features (requirements, stories), which are part of what it means to deliver the desired system.
2. For each named feature, there are one or more automated acceptance tests which, when they work, will show that the feature in question is implemented.
3. The RTF metric shows, at every moment in the project, how many features are passing all their acceptance tests.
(Ron Jeffries, "A Metric Leading to Agility")
3. Calculating RTF is Simple
• No weighting, no partial credit: a feature is either running and passing all of its tests, or it isn't.
• So a feature counts as either 0 or 1.
• Add up the 1s.
• Run all tests for each measurement.
• Track over time.
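Those counting rules can be sketched in a few lines. A minimal illustration, assuming each feature simply maps to a list of its acceptance-test results (all names here are hypothetical, not from the talk):

```python
def rtf(features):
    """Count running tested features: a feature scores 1 only if it has
    at least one automated acceptance test and every one of them passes;
    otherwise it scores 0. No weighting, no partial credit."""
    return sum(
        1
        for tests in features.values()
        if tests and all(tests)
    )

# Example snapshot: only 2 of 4 features are running tested features.
snapshot = {
    "login": [True, True],
    "search": [True],
    "checkout": [True, False],  # one failing test -> counts as 0
    "reports": [],              # no automated tests yet -> counts as 0
}
print(rtf(snapshot))  # -> 2
```

Recording this count after each full test run gives the over-time trend the talk tracks.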
4. RTF Works on any Project, even Waterfall
RTF on Agile software development projects should steadily increase over time. RTF on Waterfall development only increases at project end.
5. Example 1: Buggy New Feature Breaks Old
Monday: 6 running tested features. Tuesday: no new features added, but existing features still pass their tests. On Wednesday, a new feature is added which fails its tests, and a side effect makes a pre-existing feature fail its tests as well. On Thursday, new tests are added to the old feature and the new feature is fixed. All tests pass.
6. Example 2: New Feature Breaks Everything
Monday: 6 running tested features. Tuesday: no new features added, but existing features still pass their tests. On Wednesday, a new feature is added which breaks the build, crashes the system, corrupts the DB, or does something similarly catastrophic. On Thursday, new tests are added and the new feature is fixed. All tests pass.
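Plugging the two narratives into the 0-or-1 counting rule gives daily RTF readings like these. The numbers are inferred from the stories above: in Example 1, Wednesday assumes only the failing new feature and the one broken old feature score 0; in Example 2, a broken build means nothing passes at all:

```python
# Hypothetical daily RTF values inferred from the two example narratives.
example_1 = {"Mon": 6, "Tue": 6, "Wed": 5, "Thu": 7}  # buggy feature breaks one old feature
example_2 = {"Mon": 6, "Tue": 6, "Wed": 0, "Thu": 7}  # catastrophic breakage: nothing runs

for day in ("Mon", "Tue", "Wed", "Thu"):
    print(f"{day}: example 1 -> {example_1[day]}, example 2 -> {example_2[day]}")
```

Either way, the sudden Wednesday drop is the signal: the team sees real progress go backwards and fixes the breakage before adding anything else.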
7. Cautions: Gaming the RTF
• Don't lie: RTF doesn't count unless
  - it's a user feature,
  - it's integrated and running, and
  - it has automated tests and is passing all of them.
• Many developers don't test very well:
  - they don't think beyond the happy path;
  - they don't think about negative, boundary, and data testing, performance, etc.
  - This improves with education and experience.
8. Other RTF Gaming Is OK
• Adding automated tests to existing untested features
• Breaking larger user stories into lots of small stories
• Integrating earlier
• Testing earlier
• Shortening iterations
9. Camille Bell
Agile Technical Coaching
Scrum, XP, Kanban & Lean Consulting
Retrospectives
Agile Boot Camps
Agile Training
or just to chat about things agile
cbell@CamilleBellConsulting.com