Prioritizing Verification and Validation Resources on Mission and Safety Critical Software

Kenneth A. Costello, NASA IV&V Program
Discussion Outline


• Quick Overview of the NASA Approach to IV&V
      – What is IV&V?
      – Objectives
      – Goal
• NASA Approach to Determining IV&V Tasking
      – Software Integrity Level Assessment Process
      – Developing the task list
• Dealing with IV&V Implementation Constraints
      – Managing Risk with SILAP
      – Slicing the Data
      – Data views
• Future


What is IV&V at NASA?
•   An engineering discipline employing rigorous methods for evaluating
    the correctness and quality of the software product throughout the
    software life cycle from a system level viewpoint
•   The NASA Software IV&V approach covers not only expected operating
    conditions but the full spectrum of the system and its interfaces in the
    face of unexpected operating conditions or inputs
•   Three nominal aspects are technical, managerial and financial
      – Technical – The IV&V Program has its own approach to determining where
        to focus analysis activities
      – Managerial – The IV&V Program is functionally managed by the NASA Office of Safety and Mission Assurance, while project-level management comes from the IV&V Program
      – Financial – The IV&V Program is funded from NASA Corporate General &
        Administrative funds for Agency identified high priority Projects


A Lifecycle Approach
• The NASA approach is not end-of-life-cycle testing
• It is, however, a testing approach
• Each task that is performed “tests” a development
  artifact ranging from system and software
  requirements to source code and test results
• Each task in the NASA IV&V Work Breakdown
  Structure is designed to “test” a development
  artifact or process
• Testing is throughout the life cycle


Objectives of IV&V
• To find defects within the system with a focus on
  software and its interactions with the system
• To make an assessment of whether or not the
  system is usable in an operational environment,
  again with a focus on the software within the
  system
• To identify any latent risks associated with the
  software



Ultimate Goal of IV&V
Establish confidence that the software is fit for its purpose within the context of the system
• Note that the software may not be free from defects
     – That is rarely the case and can be difficult to prove
• However, the software must be sufficient for its intended
  use
     – The system should be as described by the requirements
     – The requirements should be representative of the user’s needs
• The type of use will determine the level of confidence that is
  needed
     – The consequence of software defect/failure will drive the needed
       level of confidence





Approach to Determining IV&V Tasking



Software Integrity Level Assessment Process


• The NASA Software Integrity Level Assessment Process
  (SILAP)
      – Combination of the best of the current processes and best practices
        from industry and academia
      – The process has been used on over 15 projects and executed
        multiple times with excellent results
• Process supports two key goals
      – Provides the NASA IV&V Program with a consistent risk-based
        approach to determining IV&V tasking
      – Serves as an extremely effective communication tool with the development projects




SILAP Overview


• Measure of software goodness
     – Software integrity level
          • How good (dependable, reliable, etc.) does a software
            component need to be in terms of its role in the system
             – Understand how the component fits within and
               impacts the system
             – Understand what is required of that component to
               be able to maintain the functionality of the system
     – Software integrity level is equivalent to the tolerable level
       of risk that is associated with the system.
          • A software component can be associated with risk because
                – a failure (or defect) can lead to a threat, or
                – its functionality includes mitigation of consequences of initiating
                  events in the system’s environment that can lead to a threat

SILAP Overview (2)

• Determine overall needed integrity level
• Impact of a defect
     – Determine the worst-case error (at the software component level) that has a reasonable or credible fault/failure scenario
     – Consider the system architecture and try to understand how that
       software fault/failure scenario may affect the system
     – This determines the Consequence of the defect
•   The probability that the developer may insert an error into a software component is then determined
      – Not an assessment of the probability of a failure, but rather the probability
        that an error of any type may exist in the operational software
      – This determination is known as the Error Potential
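As a concrete illustration of the two determinations just described, the sketch below records a Consequence score and an Error Potential score per software component. The class name, field names, and the 1–5 scale for Error Potential are illustrative assumptions (the Consequence scale of 1–5 does appear later in the Consequence Criteria chart); this is not the actual SILAP data model.

```python
from dataclasses import dataclass


@dataclass
class ComponentScores:
    """Per-component SILAP inputs (illustrative field names and scale)."""
    name: str
    consequence: int       # worst credible impact of a defect, 1 (low) to 5 (high)
    error_potential: int   # likelihood an error exists in the delivered software, 1 to 5


# Hypothetical component; the two scores are kept independent -- no combined
# "risk" number is computed, matching the approach described in these slides.
acs = ComponentScores(name="Attitude Control", consequence=4, error_potential=3)
print(acs)
```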




SILAP Overview (3)


• These two factors provide a risk-like view of the software components
      – However, there is no relationship defined between Consequence
        and Error Potential that results in a risk level
      – Rather Consequence and EP are addressed independently of each
        other as will be shown later
• This risk-based approach is the prime reason that communications with development projects have improved
• The risk-based approach also allows the process to be
  used to define different levels of risk reduction



Consequence


•   Consequence consists of the following items
      – Human Safety – This is a measure of the impact that a failure of this
        component would have on human life
      – Asset Safety – This is a measure of the impact that a failure would have on
        hardware
      – Performance – This is a measure of the impact that a failure would have on
        a mission being able to meet its goals
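To make the roll-up of these three factors concrete, here is a minimal sketch that simply takes the worst (maximum) sub-score as the overall Consequence. The actual SILAP process uses a weighting scheme (mentioned later on the Data Views slide), so the aggregation rule shown here is an assumption for illustration only.

```python
from dataclasses import dataclass


@dataclass
class Consequence:
    human_safety: int   # 1-5, per the Consequence Criteria chart later in the deck
    asset_safety: int   # 1-5
    performance: int    # 1-5

    def overall(self) -> int:
        # Simplifying assumption: the worst sub-score dominates.
        # The real SILAP process applies a weighting scheme instead.
        return max(self.human_safety, self.asset_safety, self.performance)


# Hypothetical component: no crew impact, loses a primary instrument,
# misses one primary science objective.
print(Consequence(human_safety=1, asset_safety=4, performance=3).overall())  # -> 4
```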




Error Potential


•   Error Potential consists of the following items
      – Developer Characteristics
           • Experience – This is a measure of the system developer’s experience in developing similar
             systems
           • Organization – This is a measure of the complexity of the organization developing the
             system (distance and number of organizations involved tend to increase the probability of
             errors being introduced into the system)
      – Software/System Characteristics
           • Difficulty – This is a measure of the complexity of the software being developed
           • Degree of Innovation – This is a measure of the level of innovation needed in order to
             develop this system/software
           • System Size – This is a measurement of the size of the system in terms of the software (i.e.,
             Source Lines of Code)
      – Development Process Characteristics
            • Formality of the Process – This is a measure of the maturity of the developer’s processes
            • Re-use Approach – This is a measure of the level of re-use for the system/software
            • Artifact Maturity – This is a measure of the current state of the development documentation in relation to the state of the overall development project (e.g., the project is past critical design review but the requirements documents are still full of TBDs and incomplete items)
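The sketch below combines the eight Error Potential factors listed above into a single score. Averaging them is purely an illustrative assumption; the presentation does not state SILAP's actual aggregation rule, and the factor values are invented.

```python
from statistics import mean


def error_potential(factors: dict) -> float:
    """Combine 1-5 factor scores into one Error Potential value.

    Averaging is an illustrative assumption; SILAP's actual aggregation
    rule is not described in this presentation.
    """
    return mean(factors.values())


# Hypothetical scoring of a development effort against the eight factors above.
ep = error_potential({
    "experience": 2,             # developer has built similar systems before
    "organization": 4,           # many distributed organizations involved
    "difficulty": 3,
    "degree_of_innovation": 4,
    "system_size": 3,            # e.g., judged from source lines of code
    "process_formality": 2,
    "reuse_approach": 3,
    "artifact_maturity": 4,      # documentation lags the overall project state
})
print(round(ep, 1))  # -> 3.1
```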


Developing Tasking


•   Once the scoring is complete, a tasking set is developed based on each individual score
      – There is a set of tasking associated with a given Consequence score
      – There is a set of tasking associated with a given Error Potential score
•   The tasks are then combined into one set of tasks for that component
      – The tasks are not exclusive to a given score, e.g., the same task may show up for both the Consequence score and the Error Potential score
•   This results in a matrix of software components and scores that
    provides the starting set of requirements or tasks for IV&V on that
    project
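A minimal sketch of this combination step: each score indexes a task set, and the two sets are unioned per component so that a task driven by both scores appears only once. The task names and the score-to-task mapping are invented for illustration and are not the real IV&V Work Breakdown Structure.

```python
# Hypothetical score-to-task mappings; the real IV&V Work Breakdown
# Structure tasks are not reproduced here.
CONSEQUENCE_TASKS = {
    3: {"requirements analysis"},
    4: {"requirements analysis", "design analysis"},
    5: {"requirements analysis", "design analysis", "code analysis"},
}
ERROR_POTENTIAL_TASKS = {
    3: {"test plan analysis"},
    4: {"test plan analysis", "code analysis"},
}


def tasking(consequence, error_potential):
    # Union the two task sets; a task driven by both scores appears once.
    return (CONSEQUENCE_TASKS.get(consequence, set())
            | ERROR_POTENTIAL_TASKS.get(error_potential, set()))


# A component scoring 4 on both gets the union of the two level-4 task sets.
print(sorted(tasking(consequence=4, error_potential=4)))
```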




SILAP Process



• More specific details of the SILAP process can be
  found at the NASA IV&V Program IV&V
  Management System (IMS) website
• http://ims.ivv.nasa.gov
• Look under the documentation link for Work
  Instruction 9-8-1
      – At the time of the writing of this presentation the work
        instruction had not been approved for publishing








Dealing with Implementation Constraints



Managing Risk With SILAP


• The SILAP has been used effectively on many different
  types of projects and across several iterations
      – The process is easy to use and generates data that is
        understandable on many levels, i.e., technical and managerial
      – The data can also be grouped to provide different views of the risk
        associated with a system
      – These different views provide differing levels of risk reduction but can still be tuned to meet the goals and objectives of the IV&V Program




Managing Risk With SILAP (2)


•   As an example, the SILAP data can provide viewpoints that allow for differing levels of financial commitment while still meeting the objectives of the IV&V program (although at a minimal level)
      – The goal in developing the viewpoints was always to keep in mind the desire to meet our primary objective while working within a funding constraint
      – Several other options for dealing with a funding constraint were also explored
      – However, the approach using SILAP provided the most consistency in application and management




Managing Risk with SILAP (3)


•   The approach to developing the viewpoints was to start with one of the underlying principles of SILAP: that the identified software components could be ranked in priority order
      – Items of high consequence and error potential were more important than
        those of lower values
•   The foundation of the process is determining the consequence of a
    defect in a software component
      – Performing IV&V on items of high consequence is generally more important
        than performing IV&V on items of higher error potential
      – Also keep in mind that the tasking associated with each factor, consequence and error potential, is disjoint and separate
•   So with this in mind, not only can components be ranked by a
    combination of the two scores, but also by each score individually
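Ranking by a combination of the two scores can be sketched as a simple sort, with Consequence as the primary key and Error Potential as the tie-breaker. The component names and scores below are hypothetical.

```python
# Hypothetical (name, consequence, error_potential) records.
components = [
    ("Telemetry Handler", 3, 4),
    ("Attitude Control",  5, 2),
    ("Payload Sequencer", 3, 2),
    ("Thermal Monitor",   2, 5),
]

# Consequence is the primary key and Error Potential breaks ties,
# both ordered from highest to lowest.
ranked = sorted(components, key=lambda c: (c[1], c[2]), reverse=True)
for name, consequence, ep in ranked:
    print(f"{name}: consequence={consequence}, error potential={ep}")
```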



Slicing the Data


•   Consequence is the foundation
      – It is important to examine software components with high consequence
        irrespective of their error potential
      – It is less important to examine low consequence components irrespective of
        their error potential
•   Several ranking approaches were developed
•   To understand these approaches, there was also a need to understand
    some of the details in the criteria for scoring consequence
•   In the table on the next page, the shaded areas represent scores of
    strong consequence
      – In other words, serious or greater injury, permanent loss of an important
        part of the spacecraft, or a permanent loss of primary mission data




Consequence Criteria


Consequence scores range from 1 to 5 for each factor:

• Human Safety
      – 1: No impact to human safety
      – 2: Discomfort or nuisance
      – 3: Minor injury or potential for minor injury
      – 4: Major injury or potential for major injury
      – 5: Loss of life
• Asset Safety
      – 1: No damage to any s/c component; short-term partial loss of other critical asset
      – 2: Temporary loss to a s/c component; partial loss of other critical asset; degradation of a non-primary component
      – 3: Permanent loss of non-primary component; complete loss of other critical asset; degradation of a primary component
      – 4: Permanent loss of primary component
      – 5: Complete loss of vehicle/spacecraft/system
• Performance
      – 1: Unrecoverable loss of non-primary mission/science data
      – 2: Unable to achieve non-primary mission/science objective; loss of non-primary capability
      – 3: Unable to achieve a particular primary mission/science objective; unrecoverable loss of primary mission/science data
      – 4: Unable to achieve multiple mission/science objectives; loss of primary capability
      – 5: Unable to achieve minimum mission/science objectives or minimum success criteria
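The shading in the original chart does not survive in plain text, so the sketch below restates the "strong consequence" region as a predicate. The Asset Safety and Performance thresholds come from the Data Views slide (Asset Safety > 3, Performance > 2); the Human Safety threshold of > 3 ("serious or greater injury") is an assumption.

```python
def strong_consequence(human_safety: int, asset_safety: int, performance: int) -> bool:
    """Return True if any sub-score falls in the shaded ("strong") region.

    Asset Safety and Performance thresholds come from the Data Views slide;
    the Human Safety threshold (> 3, i.e. major injury or worse) is an
    assumption based on the "serious or greater injury" wording.
    """
    return human_safety > 3 or asset_safety > 3 or performance > 2


print(strong_consequence(human_safety=1, asset_safety=4, performance=2))  # True
print(strong_consequence(human_safety=2, asset_safety=2, performance=2))  # False
```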




Slicing the Data (2)


•   Understanding this breakdown between strong and weak
    consequences creates a basis for further refining an IV&V approach
•   With the primary objective being to provide confidence that the mission
    will succeed, we can use the strong consequences as delineators in
    this definition
      – That is to say, the minimal level of confidence in a mission is defined by assurance that the software will not cause a major injury or worse, destroy a major spacecraft component or the entire spacecraft, or cause the permanent loss of data or a complete inability to meet one or more mission objectives.
•   From this statement several viewpoints were developed based on
    differing rankings of consequence and error potential




Data Views

•   Each view brings a different level of risk reduction but is easily determined simply by sorting the component scores
      –   Consequence > 2 along with EP tasking
      –   Consequence > 2 with no EP tasking
      –   No EP tasking
      –   Performance > 2, Asset safety > 3
      –   Performance > 2, Asset safety > 3 with no EP tasking
•   Note that Performance and Asset safety are called out
      – Creates a viewpoint based on the components of Consequence
      – This was needed due to the weighting scheme used; combinations of scores exist
        that may place the individual component in the shaded area on the previous chart,
        but the overall resulting Consequence score may be a 2
      – Also note that for the purpose of this example, human safety scores were not included
•   For this presentation, a generic project was developed and the following
    potential cost savings were generated based on these scores
•   The project had 44 identified software components and the savings are based
    on the average cost of performing a given task from the WBS
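The views above reduce to simple filters over the per-component scores. The sketch below expresses two of them over hypothetical component records; the field names, the component data, and the either/or reading of "Performance > 2, Asset safety > 3" are assumptions.

```python
# Hypothetical component records: Consequence sub-scores, overall Consequence,
# and Error Potential (field names are assumptions).
components = [
    {"name": "Attitude Control",  "consequence": 4, "performance": 4, "asset_safety": 4, "ep": 3},
    {"name": "Telemetry Handler", "consequence": 2, "performance": 3, "asset_safety": 2, "ep": 4},
    {"name": "Thermal Monitor",   "consequence": 2, "performance": 2, "asset_safety": 2, "ep": 5},
]

# View: keep components with overall Consequence > 2.
consequence_view = [c for c in components if c["consequence"] > 2]

# View built from the Consequence sub-scores: Performance > 2 or Asset Safety > 3
# (the either/or reading is an assumption).
sub_score_view = [c for c in components if c["performance"] > 2 or c["asset_safety"] > 3]

print([c["name"] for c in consequence_view])  # -> ['Attitude Control']
print([c["name"] for c in sub_score_view])    # -> ['Attitude Control', 'Telemetry Handler']
```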

Bands of Potential Funding


View                           % Savings   # of components out
Full Tasking                       –                0
PF > 2, AS > 3                    13%              11
EP out                            39%               0
PF > 2, AS > 3 and EP out         42%              11
Consequence > 2                   40%              29
Consequence > 2 and EP out        50%              29

• The cost reduction is taken against the nominal cost for doing all of the tasking
• The "# of components out" column shows the number of components removed from the identified 44



•     In each of these cases, our current belief is that we can still meet the
      desire of NASA in providing confidence in the most important portions
      of the mission
•     It is not the best possible risk reduction, but it meets the minimal
      needs of NASA
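A rough sketch of how such savings figures could be computed: price the tasks that remain under a given view against the nominal (full-tasking) cost. The task counts and the single average task cost below are made up; only the method is illustrated.

```python
def percent_savings(nominal_tasks: int, remaining_tasks: int, avg_task_cost: float = 1.0) -> float:
    """Savings relative to the nominal cost of performing every task.

    With a single average task cost the result reduces to a task-count ratio;
    the actual estimate used average costs per WBS task.
    """
    nominal_cost = nominal_tasks * avg_task_cost
    remaining_cost = remaining_tasks * avg_task_cost
    return 100.0 * (nominal_cost - remaining_cost) / nominal_cost


# Hypothetical: a view drops enough tasking to leave 264 of 440 planned tasks.
print(f"{percent_savings(nominal_tasks=440, remaining_tasks=264):.0f}% savings")  # -> 40% savings
```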
The Future of SILAP


•   SILAP is the planned approach for determining IV&V tasking going forward
•   Further investigations are underway to examine the scoring data and to document and analyze any trends that are found
•   A tool is currently being developed to allow for the easy capture and
    analysis of the scoring data for each individual software component
•   Other work is also ongoing to determine how to integrate the results of
    SILAP into other views of the systems using various modeling
    techniques
      – The goal is to provide even greater focus on the critical components of the
        system and show how they relate to the success or failure of the system
      – Provide a more detailed approach to performing validation of software using
        a model based analysis approach



Questions?


•   If you have any further questions or comments, you can contact:

Dr. Butch Caffall, IV&V Program Manager, Director IV&V Facility
NASA IV&V Facility
304 367 8201
Butch.Caffall@nasa.gov


Kenneth Costello, IV&V Program Chief Engineer
NASA IV&V Facility
304 367 8343
Kenneth.A.Costello@nasa.gov


Christina Moats, IV&V Planning and Scoping Lead
NASA IV&V Facility
304 367 8340
Christina.D.Moats@nasa.gov


