National Guard
Black Belt Training

Module 25

Measurement System Analysis (MSA)
Attribute Data
This material is not for general distribution, and its contents should not be quoted, extracted for publication, or otherwise copied or distributed without prior coordination with the Department of the Army, ATTN: ETF.




CPI Roadmap – Measure
8-STEP PROCESS: 1. Validate the Problem | 2. Identify Performance Gaps | 3. Set Improvement Targets | 4. Determine Root Cause | 5. Develop Counter-Measures | 6. See Counter-Measures Through | 7. Confirm Results & Process | 8. Standardize Successful Processes

Phases: Define – Measure – Analyze – Improve – Control

ACTIVITIES
• Map Current Process / Go & See
• Identify Key Input, Process, Output Metrics
• Develop Operational Definitions
• Develop Data Collection Plan
• Validate Measurement System
• Collect Baseline Data
• Identify Performance Gaps
• Estimate Financial/Operational Benefits
• Determine Process Stability/Capability
• Complete Measure Tollgate

TOOLS
• Process Mapping
• Process Cycle Efficiency/TOC
• Little's Law
• Operational Definitions
• Data Collection Plan
• Statistical Sampling
• Measurement System Analysis
• TPM
• Generic Pull
• Setup Reduction
• Control Charts
• Histograms
• Constraint Identification
• Process Capability

Note: Activities and tools vary by project. Lists provided here are not necessarily all-inclusive.




 Learning Objective
          Understand how to conduct and interpret a
           measurement system analysis with Attribute Data








 Attribute Measurement Systems
          Most physical measurement systems use
           measurement devices that provide continuous data
             For Measurement System Analysis with continuous data, we
              can use control charts or Gage R&R methods
          Attribute/ordinal measurement systems utilize
           accept/reject criteria or ratings (such as 1 - 5) to
           determine if an acceptable level of quality has been
           attained
             Kappa and Kendall techniques can be used to
              evaluate these Attribute and Ordinal Measurement
              Systems





 Are You Really Stuck With Attribute Data?
      Many inspection or checking processes could collect
       continuous data, but use attribute data instead to simplify
       the task for the person taking and recording the data
      Examples:
         On-time Delivery can be recorded in 2 ways:
           a) in hours late or
           b) whether the delivery was on-time or late
         Many functional tests will evaluate a product on a
          continuous scale (temperature, pressure drop, voltage
          drop, dimensional, hardness, etc) and record the results
          as pass/fail
                      Strive to get continuous data!





 Attribute and Ordinal Measurements
        Attribute and Ordinal measurements often rely on
         subjective classifications or ratings
               Examples include:
                     Rating different features of a service as either good or
                      bad, or on a scale from 1 to 5 with 5 being best
                     Rating different aspects of employee performance as
                      excellent, satisfactory, needs improvement
                     Rating wine on a) aroma, b) taste, and c) after taste
        Should we evaluate these measurement systems before
         using them to make decisions on our CPI project?
        What are the consequences of not evaluating them?






 MSA – Attribute Data
       What methodologies are appropriate to assess
        Attribute Measurement Systems?
               Attribute Systems – the Kappa technique, which treats all
                misclassifications equally
               Ordinal Systems – Kendall's technique, which considers
                the rank of the misclassification
                      For example, if we are judging an advertising service on a
                       scale from 1 to 5, Inspector A rating the service a '1' while
                       Inspector B rates it a '5' is a greater misclassification
                       than Inspector A rating it a '4' while Inspector B rates it a '5.'








 Data Scales
        Nominal: Contains numbers that have no basis on which to arrange
         in any order or to make any assumptions about the quantitative
         difference between them. These numbers are just names or labels.
         For example:
            In an organization: Dept. 1 (Accounting), Dept. 2 (Customer
              Service), Dept. 3 (Human Resources)
           In an insurance co.: Business Line 1, Line 2, Line 3
           Modes of transport: Mode 1 (air), Mode 2 (truck), Mode 3 (sea)

        Ordinal: Contains numbers that can be ranked in some natural
         sequence. This scale, however, cannot make an inference about the
         degree of difference between the numbers. Examples:
           On service performance: excellent, very good, good, fair, poor
           Salsa taste test: mild, hot, very hot, makes me suffer
           Customer survey: strongly agree, agree, disagree, strongly
            disagree




 Kappa Techniques
          Kappa is appropriate for non-quantitative systems
           such as:
                Good or bad
                Go/No Go
                Differentiating noises (hiss, clank, thump)
                Pass/fail








 Kappa Techniques
          Kappa for Attribute Data:
                Treats all misclassifications equally
                Does not assume that the ratings are equally
                 distributed across the possible range
                Requires that the units be independent and that the
                 persons doing the judging or rating make their
                 classifications independently
                Requires that the assessment categories be mutually
                 exclusive








 Operational Definitions
      There are some quality characteristics that are either difficult
       or very time consuming to define
      To assess classification consistency, several units must be
       classified by more than one rater or judge
      If there is substantial agreement among the raters, there is
       the possibility, although no guarantee, that the ratings are
       accurate
      If there is poor agreement among the raters, the usefulness
       of the rating is very limited

                      Poor attribute measurement systems can almost
                      always be traced to poor operational definitions






 Consequences?
          What are the important concerns?
                What are the risks if agreement within and between
                 raters is not good?
             Are bad items escaping to the next operation in the
              process or to the external customer?
             Are good items being reprocessed unnecessarily?
             What is the standard for assessment?
             How is agreement measured?
             What is the Operational Definition for assessment?








 What Is Kappa? “K”
                       K = (P_observed - P_chance) / (1 - P_chance)

 P_observed
          Proportion of units on which both Judges agree = proportion both
           Judges agree are good + proportion both Judges agree are bad
 P_chance (expected)
          Proportion of agreements expected by chance = (proportion Judge
           A says good * proportion Judge B says good) + (proportion Judge
           A says bad * proportion Judge B says bad)

           Note: the equation applies to a two-category analysis, e.g., good or
           bad. A small numeric sketch follows.
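
The sketch below works through this two-category Kappa formula in Python. The judge ratings are made-up illustrative values, not data from this module.

```python
# Minimal sketch of the two-category Kappa calculation defined above.
# The ratings for Judge A and Judge B are made-up illustrative values.
judge_a = ["good", "good", "bad", "good", "bad", "bad", "good", "bad", "good", "good"]
judge_b = ["good", "bad",  "bad", "good", "bad", "good", "good", "bad", "good", "good"]

n = len(judge_a)

# P_observed: proportion of units on which both judges agree (good-good or bad-bad)
p_observed = sum(a == b for a, b in zip(judge_a, judge_b)) / n

# P_chance: agreement expected by chance, from each judge's marginal proportions
p_a_good = judge_a.count("good") / n
p_b_good = judge_b.count("good") / n
p_chance = p_a_good * p_b_good + (1 - p_a_good) * (1 - p_b_good)

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"P_observed={p_observed:.3f}  P_chance={p_chance:.3f}  Kappa={kappa:.3f}")
```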




 Kappa
                              K = (P_observed - P_chance) / (1 - P_chance)

            For perfect agreement, P_observed = 1 and K = 1
                      As a rule of thumb, if Kappa is lower than 0.7, the
                       measurement system is not adequate
                      If Kappa is 0.9 or above, the measurement system is
                       considered excellent
            The lower limit for Kappa can range from 0 to -1
                      For P_observed = P_chance (expected), K = 0
                      Therefore, a Kappa of 0 indicates that the agreement is
                       the same as would be expected by random chance




 Attribute MSA Guidelines
          When selecting items for the study consider the
           following:
                If you only have two categories, good and bad, you
                 should have a minimum of 20 good and 20 bad
                As a maximum, have 50 good and 50 bad
                Try to keep approximately 50% good and 50% bad
                Have a variety of degrees of good and bad


                      If only good items are chosen for the study, what
                            might happen to P-chance (expected)?
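
One way to see the answer (a sketch, not part of the original module): when nearly every item in the study is good, both judges call almost everything good, P_chance climbs toward 1, and the Kappa denominator (1 - P_chance) collapses toward zero.

```python
# Sketch: why a study with (almost) only good items is a problem.
# If both judges call nearly everything "good", P_chance approaches 1 and
# the Kappa denominator (1 - P_chance) collapses toward zero.
for p_good in (0.5, 0.9, 0.99):
    p_chance = p_good * p_good + (1 - p_good) * (1 - p_good)
    print(f"both judges say good {p_good:.0%} of the time -> P_chance = {p_chance:.4f}")
```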







 Attribute MSA Guidelines (Cont.)
          If you have more than two categories, with one of the
           categories being good and the other categories being
           different error modes, you should have approximately
           50% of the items being good and a minimum of 10%
           of the items in each of the error modes
          You might combine some of the error modes as
           “other”
          The categories should be mutually exclusive or, if not,
           they should also be combined







 Within Rater/Repeatability Considerations
      Have each rater evaluate the same item at least twice
      Calculate a Kappa for each rater by creating separate
       Kappa tables, one for each rater
              If the Kappa for a particular rater is small, that rater does not
               repeat well within themselves
              A rater who does not repeat well within themselves will not
               repeat well with the other raters, and this will obscure how well
               the others repeat among themselves
      Calculate a between-rater Kappa by creating a Kappa table
       from the first judgment of each rater
      Between-rater Kappa is calculated as pairwise comparisons
       (A to B, B to C, A to C), as sketched below
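
As a rough companion to the steps above, the sketch below uses scikit-learn's cohen_kappa_score (an implementation of Cohen's Kappa) to compute a within-rater Kappa from each rater's two passes and pairwise between-rater Kappas from the first passes. The raters and ratings shown are hypothetical.

```python
# Sketch: within-rater and pairwise between-rater Kappa.
# Ratings are hypothetical; each rater judged the same items twice.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

ratings = {  # rater -> (first pass, second pass)
    "A": (["good", "bad", "good", "bad", "good", "bad", "good", "bad"],
          ["good", "bad", "good", "good", "good", "bad", "good", "bad"]),
    "B": (["good", "bad", "bad", "bad", "good", "bad", "good", "good"],
          ["good", "bad", "bad", "bad", "good", "good", "good", "good"]),
    "C": (["bad", "bad", "good", "bad", "good", "bad", "good", "bad"],
          ["bad", "bad", "good", "bad", "good", "bad", "bad", "bad"]),
}

# Within-rater (repeatability): compare each rater's first and second pass.
for rater, (first, second) in ratings.items():
    print(f"{rater} within-rater Kappa: {cohen_kappa_score(first, second):.3f}")

# Between-rater: pairwise comparisons on each rater's first pass.
for r1, r2 in combinations(ratings, 2):
    k = cohen_kappa_score(ratings[r1][0], ratings[r2][0])
    print(f"{r1} vs {r2} between-rater Kappa: {k:.3f}")
```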




 Example: Data Set = Attribute Ordinal.mtw
        An educational testing organization is training five new
         appraisers for the written portion of the twelfth-grade
         standardized essay test
        The appraisers' ability to rate essays consistent with the
         standards needs to be assessed
        Each appraiser rated fifteen essays on a five-point scale
         (-2, -1, 0, 1, 2)
        The organization also rated the essays and supplied the “official
         score”
        Each essay was rated twice and the data captured in the file
         Attribute Ordinal.mtw
        Open the file and evaluate the appraisers' performance





 Minitab and Attribute Measurement Systems
  Stat>Quality Tools>Attribute Agreement Analysis








 Minitab Dialog Box



1. Double click on the
  appropriate variable
  to place it in the
  required dialog box:

   Attribute = Rating
   Samples = Sample
   Appraisers = Appraiser


  2. Click on OK







 Within Appraiser Percent
                    This output represents the percent agreement and the 95%
                            confidence interval around that percentage
        [Chart: Within Appraisers – percent agreement with 95.0% CI for each appraiser: Duncan, Hayes, Holmes, Montgomery, Simpson]






 Within Appraiser Session Window Output
                      This output is the same information contained in the graph
                        with the addition of a Between-Appraiser assessment








 Let’s Do It Again
                      Stat>Quality Tools>Attribute Agreement Analysis








 Introducing a Known Standard
 1. Double click on the
   appropriate variable
   to place it in the
   required dialog box
    (same as before)

 2. If you have a known
     standard (the real answer)
     for the items being inspected,
     let Minitab know what column
     that information is in.




  3. Click on OK





 Appraiser vs. Standard
        [Charts: Within Appraisers and Appraiser vs Standard – percent agreement with 95.0% CI for each appraiser: Duncan, Hayes, Holmes, Montgomery, Simpson]






 Within Appraiser




                       In addition to the Within-Appraiser
                      graphic, Minitab will give percentages






 Each Appraiser vs. Standard




                      Some appraisers will repeat their own ratings well but
                       may not match the standard well (look at Duncan)







 More Session Window Output




                      The session window will give percentage data as to how
                      all the appraisers did when judged against the standard




 Kappa and Minitab
          Minitab will calculate a Kappa for each (within) appraiser for each category




                      Note: This is only a part of the total data set for illustration




 Kappa vs. Standard
                          Minitab will also calculate a Kappa statistic for each
                                appraiser as compared to the standard




                      Note: This is only a part of the total data set for illustration




 Kappa and Minitab


                                                                    Minitab will not provide a
                                                                    Kappa between a specific
                                                                    pair of appraisers, but will
                                                                    provide an overall Kappa
                                                                    between all appraisers for
                                                                    each possible category of
                                                                    response




                How might this output help us improve our measurement system?





 What If My Data Is Ordinal?
             Stat>Quality Tools>Attribute Agreement Analysis








 Ordinal Data



    If your data is
    Ordinal, you
    must also check
    this box








 What Is Kendall's Coefficient?




       Kendall's coefficient can be thought of as an R-squared value: it is the correlation
        between the responses when the data are treated as ordinal rather than simply attribute.
          The lower the number, the more severe the misclassifications.
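
For a rough feel of what rank-aware analysis adds, the sketch below uses Kendall's tau from SciPy (a related rank statistic, not the exact Kendall's coefficient that Minitab reports) on hypothetical ratings against a hypothetical official score on the -2..2 scale used in the essay example. Large, rank-reversing disagreements pull the coefficient down more than small ones.

```python
# Sketch: rank-aware agreement with Kendall's tau (related to, but not the same
# statistic as, Minitab's Kendall's coefficient). Ratings are hypothetical,
# on the -2..2 scale used in the essay example.
from scipy.stats import kendalltau

standard   = [-2, -1, 0, 1, 2, -2, -1, 0, 1, 2]   # hypothetical official scores
appraiser1 = [-2, -1, 0, 1, 2, -1, -1, 0, 1, 2]   # small disagreements only
appraiser2 = [ 2, -1, 0, 1, -2, -1, 1, 0, -1, 2]  # some large, rank-reversing misses

for name, scores in [("Appraiser 1", appraiser1), ("Appraiser 2", appraiser2)]:
    tau, _ = kendalltau(standard, scores)
    print(f"{name}: Kendall tau vs standard = {tau:.3f}")
```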







 Kendall’s









 Kendall’s (Cont.)








 Exercise: Seeing Stars
           Divide into teams of two
           One person will be the rater and one the recorder
           Have each rater inspect each star and determine if it is Good
            or Bad (Kappa)
           Record the results in Minitab
           Mix up the stars and repeat with the same rater two more times
           Compare results to other raters and to the known standard
           Take 30 minutes to complete the exercise and be prepared to
            review your findings with the class








 Takeaways
           How to set up and conduct an MSA
           Use attribute data only if the measurement cannot be
            converted to continuous data
           Operational definitions are extremely important
           Attribute measurement systems require a great deal
            of maintenance
           Kappa is an easy method to test how repeatable and
            reproducible a subjective measurement system is







         What other comments or questions
                   do you have?








 References
           Cohen, J., "A Coefficient of Agreement for Nominal Scales,"
            Educational and Psychological Measurement, Vol. 20, pp. 37-46, 1960
           Futrell, D., "When Quality Is a Matter of Taste, Use
            Reliability Indexes," Quality Progress, May 1995








          APPENDIX – A Practical Example of Kappa
                      Evaluating the Measurement System for
                            Determining Civilian Awards








 Kappa Example #1
         The Chief of Staff (COS) of the 1st Infantry Division is preparing for the
          redeployment of 3 brigade combat teams supporting Operation Iraqi Freedom.
         The Secretary of General Staff (SGS) informs the COS that awards for civilian
          personnel (Department of the Army Civilians and military dependents) who
          provided volunteer support prior to and during the deployment are always a
          "significant emotional issue." There are hundreds of submissions for awards.
         A board of senior Army personnel decides who receives an award. The
          measurement system the board uses to determine who receives an award is a
          major concern because of differences between board members as well as
          differences within each board member.
         The COS directs the SGS (a certified Army Black Belt) to conduct a
          measurement system study using historical data to "level set" the board
          members. Kappa for each board member as well as Kappa between board
          members must be calculated.
         The COS' guidance is to retrain and/or replace board members until the
          measurement system is no longer a concern.





 Consider the Following Data
      • The Lean Six Sigma Pocket Toolbook, pp. 100-103, outlines
        the procedures for calculating Kappa. Kappa is MSA for
        attribute data.

      • The SGS' study involves two categories for
        recommendations, "Award" and "No Award".

      • We select 40 candidate packets from historical data and
        ensure that 20 are definitely for "Award" and 20 are for "No
        Award".

      • Board Members 1 and 2 evaluate each candidate's packet.
        The results are shown in the tables on the following slides.




 Consider the Following Data








 Consider the Following Data








 Contingency Table for Board Member 1

                            Populate Each Cell with the Evaluation Data

   Contingency Table: Counts
                                            Board Member 1 - 1st
                                            Award    No Award    Total
   Board Member 1 - 2nd    Award              15         3         18
                           No Award            3        19         22
                           Total              18        22         40
         Board Member 1 – 1st : shows the results of Board Member 1’s 1st recommendations. The 1st board
         member recommended an “Award” or “No Award” for each of the 40 candidates on the first review
         of the files.

         Board Member 1 – 2nd : shows the results of Board Member 1’s 2nd recommendations. The 1st
         board member recommended an “Award” or “No Award” for each of the 40 candidates on the
         second review of the files.






 Contingency Table: Cell 1
                                       The first cell represents the number of
                                       times Board Member 1 recommended a
                                       candidate should receive an “Award” in
                                       both the first and second evaluation.



        (Award first / Award second cell = 15 in the contingency table above)








 Contingency Table: Cell 2
                                                  The second cell represents the number of
                                                  times Board Member 1 recommended a
                                                  candidate as “No Award” the first time
                                                  and “Award” the second evaluation.



        (No Award first / Award second cell = 3 in the contingency table above)








 Contingency Table: Cell 3


        (Award first / No Award second cell = 3 in the contingency table above)




    The third cell represents the number of times Board Member 1
    recommended “Award” on the first evaluation and “No Award”
    on the second evaluation.






 Contingency Table: Cell 4


        (No Award first / No Award second cell = 19 in the contingency table above)




                                 The fourth cell represents the number of times Board
                                 Member 1 recommended “No Award” on the first
                                 evaluation and “No Award” on the second evaluation.






 Contingency Table: Sum of Row and Columns


        (Row totals 18 and 22, and column totals 18 and 22, in the contingency table above)




                                 The numbers on the margins are the totals of the rows
                                 and columns of data. The sum in both instances is 40,
                                 the total number of candidate packets reviewed.





 Contingency Table – Counts & Proportions
   Contingency Table: Counts
                                            Board Member 1 - 1st
                                            Award    No Award    Total
   Board Member 1 - 2nd    Award              15         3         18
                           No Award            3        19         22
                           Total              18        22         40

   Contingency Table: Proportions
                                            Board Member 1 - 1st
                                            Award    No Award    Total
   Board Member 1 - 2nd    Award            0.375      0.075      0.45
                           No Award         0.075      0.475      0.55
                           Total            0.45       0.55

   Board Member 1 Proportions: the lower table is the data in the upper table
   expressed as a proportion of the total (e.g., the 0.45 margin represents 18/40).




 Contingency Table – Sum of Percentages



       The margin values of the proportions table above (0.45 and 0.55 across the columns,
       0.45 and 0.55 down the rows) are the sums of the row and column proportions.
       Each set of margins must sum to 1.0.





 Calculating Kappa
                       K = (P_observed - P_chance) / (1 - P_chance)

       P_observed
          Proportion of candidates for which both Board Members agree
           = proportion both Board Members agree are "Award" +
           proportion both Board Members agree are "No Award".
       P_chance
          Proportion of agreements expected by chance = (proportion
           Board Member 1 says "Award" * proportion Board Member 2
           says "Award") + (proportion Board Member 1 says "No Award"
           * proportion Board Member 2 says "No Award")

          The verbiage for defining Kappa will vary slightly depending on whether
              we are defining a Within-Rater Kappa or a Between-Rater Kappa





 Calculate Kappa for Board Member 1
    Contingency Table: Proportions
                                             Board Member 1 - 1st
                                             Award    No Award    Total
    Board Member 1 - 2nd    Award            0.375      0.075      0.45
                            No Award         0.075      0.475      0.55
                            Total            0.45       0.55

    P_observed is the sum of the probabilities on the diagonal:
               P_observed = (0.375 + 0.475) = 0.850

    P_chance is the probabilities for each classification multiplied and then summed:
               P_chance = (0.450*0.450) + (0.550*0.550) = 0.505

    Then K_Board Member 1 = (0.850 - 0.505)/(1 - 0.505) = 0.697

    Kappa for Board Member 1 is sufficiently close to 0.700 that we conclude that Board Member 1
    exhibits repeatability.
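
A short Python check of this arithmetic, working directly from the counts in Board Member 1's contingency table (a sketch, not part of the original module):

```python
# Sketch: Kappa for Board Member 1 computed from the 2x2 contingency counts.
# counts[(first_pass, second_pass)] = number of candidate packets
counts = {("Award", "Award"): 15, ("Award", "No Award"): 3,
          ("No Award", "Award"): 3, ("No Award", "No Award"): 19}
n = sum(counts.values())  # 40 packets

# P_observed: proportion on the diagonal (same recommendation both times)
p_observed = (counts[("Award", "Award")] + counts[("No Award", "No Award")]) / n

# Marginal proportions of "Award" on the first and second pass
p_first_award = (counts[("Award", "Award")] + counts[("Award", "No Award")]) / n
p_second_award = (counts[("Award", "Award")] + counts[("No Award", "Award")]) / n

p_chance = p_first_award * p_second_award + (1 - p_first_award) * (1 - p_second_award)
kappa = (p_observed - p_chance) / (1 - p_chance)
print(round(kappa, 3))  # 0.697
```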




 Calculate Kappa for Board Member 2
         Contingency Table: Counts
                                                  Board Member 2 - 1st
                                                  Award    No Award    Total
         Board Member 2 - 2nd    Award             ___        ___       ___
                                 No Award          ___        ___       ___
                                 Total             ___        ___

         Contingency Table: Proportions
                                                  Board Member 2 - 1st
                                                  Award    No Award    Total
         Board Member 2 - 2nd    Award             ___        ___       ___
                                 No Award          ___        ___       ___
                                 Total             ___        ___

                                 K_Board Member 2 = ?




 Kappa Between Board Members

         To calculate a Kappa between Board Members, we
          use a similar procedure.
         We calculate Kappa from the first recommendations of
          the pair of Board Members.
         NOTE: If a Board Member has poor
          Within-Board Member repeatability (less than 85%),
          there is no need to calculate a Between-Board
          Member rating.







 Kappa – Board Member 1 to Board Member 2

        Contingency Table: Counts
                                                 Board Member 1 - 1st
                                                 Award    No Award    Total
        Board Member 2 - 1st    Award              14         5         19
                                No Award            4        17         21
                                Total              18        22         40

                        The 14 in the Award/Award cell is the number of times both board members
                        agreed the candidate should receive an "Award"
                        (using their first evaluation)







 Kappa Between Board Members
                             The 5 in the contingency table above is the number of times Board Member 1
                             recommended "No Award" and Board Member
                             2 recommended "Award" (using their first
                             evaluation)






 Board Member 1 to Board Member 2 Kappa

                 The 4 in the contingency table above is the number of times Board Member 1
                 recommended "Award" and Board Member 2 recommended "No
                 Award" (using their first measurement)






 Between Board Member Kappa

                                 The 17 in the contingency table above is the number of times both Board Members
                                 agreed the candidate was "No Award"
                                 (using their first measurement)







 Kappa Between Board Members
  Calculate Between-Board Member Kappa:

        Contingency Table: Counts
                                                 Board Member 1 - 1st
                                                 Award    No Award    Total
        Board Member 2 - 1st    Award              14         5         19
                                No Award            4        17         21
                                Total              18        22         40

        Contingency Table: Proportions
                                                 Board Member 1 - 1st
                                                 Award    No Award    Total
        Board Member 2 - 1st    Award            0.350      0.125      0.475
                                No Award         0.100      0.425      0.525
                                Total            0.450      0.550

        The lower table represents the data in the upper table, with each
        cell expressed as a proportion of the total.





 Remember How to Calculate Kappa?
                            K = (P_observed - P_chance) / (1 - P_chance)

        P_observed
                Proportion of items on which both Board Members agree =
                 proportion both Board Members agree are "Award" + proportion
                 both Board Members agree are "No Award".

        P_chance
                Proportion of agreements expected by chance = (proportion
                 Board Member 1 says "Award" * proportion Board Member 2 says
                 "Award") + (proportion Board Member 1 says "No Award" *
                 proportion Board Member 2 says "No Award")

      The verbiage for defining Kappa will vary slightly depending on whether we are
        defining a Within-Board Member Kappa or a Between-Board Member Kappa





 Calculate Kappa for Board Member 1 to Board Member 2
          Contingency Table: Proportions
                                                   Board Member 1 - 1st
                                                   Award    No Award    Total
          Board Member 2 - 1st    Award            0.350      0.125      0.475
                                  No Award         0.100      0.425      0.525
                                  Total            0.450      0.550

      P_observed is the sum of the probabilities on the diagonal:
                 P_observed = (0.350 + 0.425) = 0.775

      P_chance is the probability for each classification multiplied and then summed:
                 P_chance = (0.475*0.450) + (0.525*0.550) = 0.503

      Then K_Board Member 1/2 = (0.775 - 0.503)/(1 - 0.503) = 0.548

      The Board Members evaluate candidate packets differently too often. The SGS
      will retrain each Board Member before dismissing a Board Member and finding a
      replacement.
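
As a cross-check (a sketch only, not part of the original module), the same between-member Kappa can be reproduced with scikit-learn's cohen_kappa_score by expanding the contingency counts back into the 40 first-pass recommendations:

```python
# Sketch: between-member Kappa via scikit-learn, reconstructing the 40
# first-pass recommendations from the contingency counts (14, 5, 4, 17).
from sklearn.metrics import cohen_kappa_score

member_1 = ["Award"] * 14 + ["No Award"] * 5 + ["Award"] * 4 + ["No Award"] * 17
member_2 = ["Award"] * 14 + ["Award"] * 5 + ["No Award"] * 4 + ["No Award"] * 17

print(round(cohen_kappa_score(member_1, member_2), 3))  # ~0.548
```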




 Improvement Ideas
          How might we improve this measurement system?
                Additional training
                Physical standards/samples
                Rater certification (and periodic re-certification)
                 process
                Better operational definitions








 Kappa Conclusions
          Is the current measurement system adequate?
          Where would you focus your improvement efforts?
           Which rater would you want to conduct any training
            that needs to be done?



              Class Challenge: After the exposure to Minitab in the preceding slides,
              input the data from the previous example into Minitab. As homework,
              perform the analysis and compare the computer output and its simplicity
              with the manual calculations performed in the previous slides.
              Hint: You will need to stack columns.


 

Ähnlich wie NG BB 25 Measurement System Analysis - Attribute (20)

NG BB 20 Data Collection
NG BB 20 Data CollectionNG BB 20 Data Collection
NG BB 20 Data Collection
 
NG BB 15 MEASURE Roadmap
NG BB 15 MEASURE RoadmapNG BB 15 MEASURE Roadmap
NG BB 15 MEASURE Roadmap
 
NG BB 32 Failure Modes and Effects Analysis
NG BB 32 Failure Modes and Effects AnalysisNG BB 32 Failure Modes and Effects Analysis
NG BB 32 Failure Modes and Effects Analysis
 
NG BB 27 Process Capability
NG BB 27 Process CapabilityNG BB 27 Process Capability
NG BB 27 Process Capability
 
NG BB 46 Mistake Proofing
NG BB 46 Mistake ProofingNG BB 46 Mistake Proofing
NG BB 46 Mistake Proofing
 
NG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of ExperimentsNG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of Experiments
 
NG BB 31 Cause and Effect (XY) Matrix
NG BB 31 Cause and Effect (XY) MatrixNG BB 31 Cause and Effect (XY) Matrix
NG BB 31 Cause and Effect (XY) Matrix
 
NG BB 31 Cause and Effect (XY) Matrix
NG BB 31 Cause and Effect (XY) MatrixNG BB 31 Cause and Effect (XY) Matrix
NG BB 31 Cause and Effect (XY) Matrix
 
NG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement EventNG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement Event
 
NG BB 49 Risk Assessment
NG BB 49 Risk AssessmentNG BB 49 Risk Assessment
NG BB 49 Risk Assessment
 
NG BB 36 Simple Linear Regression
NG BB 36 Simple Linear RegressionNG BB 36 Simple Linear Regression
NG BB 36 Simple Linear Regression
 
NG BB 36 Simple Linear Regression
NG BB 36 Simple Linear RegressionNG BB 36 Simple Linear Regression
NG BB 36 Simple Linear Regression
 
NG BB 45 Quick Change Over
NG BB 45 Quick Change OverNG BB 45 Quick Change Over
NG BB 45 Quick Change Over
 
NG BB 33 Hypothesis Testing Basics
NG BB 33 Hypothesis Testing BasicsNG BB 33 Hypothesis Testing Basics
NG BB 33 Hypothesis Testing Basics
 
NG BB 33 Hypothesis Testing Basics
NG BB 33 Hypothesis Testing BasicsNG BB 33 Hypothesis Testing Basics
NG BB 33 Hypothesis Testing Basics
 
NG BB 39 IMPROVE Roadmap
NG BB 39 IMPROVE RoadmapNG BB 39 IMPROVE Roadmap
NG BB 39 IMPROVE Roadmap
 
NG BB 54 Sustain the Gain
NG BB 54 Sustain the GainNG BB 54 Sustain the Gain
NG BB 54 Sustain the Gain
 
NG BB 43 Standardized Work
NG BB 43 Standardized WorkNG BB 43 Standardized Work
NG BB 43 Standardized Work
 
NG BB 11 Power Steering
NG BB 11 Power SteeringNG BB 11 Power Steering
NG BB 11 Power Steering
 
NG BB 02 Table of Contents
NG BB 02 Table of ContentsNG BB 02 Table of Contents
NG BB 02 Table of Contents
 

Mehr von Leanleaders.org

Mehr von Leanleaders.org (20)

Variation and mistake proofing
Variation and mistake proofingVariation and mistake proofing
Variation and mistake proofing
 
D11 Define Review
D11 Define ReviewD11 Define Review
D11 Define Review
 
Blankgage.MTW
Blankgage.MTWBlankgage.MTW
Blankgage.MTW
 
Chi-sq GOF Calculator.xls
Chi-sq GOF Calculator.xlsChi-sq GOF Calculator.xls
Chi-sq GOF Calculator.xls
 
D04 Why6Sigma
D04 Why6SigmaD04 Why6Sigma
D04 Why6Sigma
 
D10 Project Management
D10 Project ManagementD10 Project Management
D10 Project Management
 
Attrib R&R.xls
Attrib R&R.xlsAttrib R&R.xls
Attrib R&R.xls
 
Blank Logo LEAN template
Blank Logo LEAN templateBlank Logo LEAN template
Blank Logo LEAN template
 
D07 Project Charter
D07 Project CharterD07 Project Charter
D07 Project Charter
 
ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)
 
Cause and Effect Tree.vst
Cause and Effect Tree.vstCause and Effect Tree.vst
Cause and Effect Tree.vst
 
LEAN template
LEAN templateLEAN template
LEAN template
 
I07 Simulation
I07 SimulationI07 Simulation
I07 Simulation
 
D01 Define Spacer
D01 Define SpacerD01 Define Spacer
D01 Define Spacer
 
Attribute Process Capability Calculator.xls
Attribute Process Capability Calculator.xlsAttribute Process Capability Calculator.xls
Attribute Process Capability Calculator.xls
 
A05 Continuous One Variable Stat Tests
A05 Continuous One Variable Stat TestsA05 Continuous One Variable Stat Tests
A05 Continuous One Variable Stat Tests
 
XY Matrix.xls
XY Matrix.xlsXY Matrix.xls
XY Matrix.xls
 
D06 Project Selection
D06 Project SelectionD06 Project Selection
D06 Project Selection
 
G04 Root Cause Relationships
G04 Root Cause RelationshipsG04 Root Cause Relationships
G04 Root Cause Relationships
 
15 Deliv template
15 Deliv template15 Deliv template
15 Deliv template
 

Kürzlich hochgeladen

EMBODO Lesson Plan Grade 9 Law of Sines.docx
EMBODO Lesson Plan Grade 9 Law of Sines.docxEMBODO Lesson Plan Grade 9 Law of Sines.docx
EMBODO Lesson Plan Grade 9 Law of Sines.docxElton John Embodo
 
Expanded definition: technical and operational
Expanded definition: technical and operationalExpanded definition: technical and operational
Expanded definition: technical and operationalssuser3e220a
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfVanessa Camilleri
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONHumphrey A Beña
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management SystemChristalin Nelson
 
Millenials and Fillennials (Ethical Challenge and Responses).pptx
Millenials and Fillennials (Ethical Challenge and Responses).pptxMillenials and Fillennials (Ethical Challenge and Responses).pptx
Millenials and Fillennials (Ethical Challenge and Responses).pptxJanEmmanBrigoli
 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptshraddhaparab530
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxlancelewisportillo
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parentsnavabharathschool99
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptxmary850239
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...Postal Advocate Inc.
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfErwinPantujan2
 
Presentation Activity 2. Unit 3 transv.pptx
Presentation Activity 2. Unit 3 transv.pptxPresentation Activity 2. Unit 3 transv.pptx
Presentation Activity 2. Unit 3 transv.pptxRosabel UA
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4MiaBumagat1
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptxmary850239
 

Kürzlich hochgeladen (20)

EMBODO Lesson Plan Grade 9 Law of Sines.docx
EMBODO Lesson Plan Grade 9 Law of Sines.docxEMBODO Lesson Plan Grade 9 Law of Sines.docx
EMBODO Lesson Plan Grade 9 Law of Sines.docx
 
Expanded definition: technical and operational
Expanded definition: technical and operationalExpanded definition: technical and operational
Expanded definition: technical and operational
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
 
Paradigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTAParadigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTA
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management System
 
Millenials and Fillennials (Ethical Challenge and Responses).pptx
Millenials and Fillennials (Ethical Challenge and Responses).pptxMillenials and Fillennials (Ethical Challenge and Responses).pptx
Millenials and Fillennials (Ethical Challenge and Responses).pptx
 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.ppt
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
 
Presentation Activity 2. Unit 3 transv.pptx
Presentation Activity 2. Unit 3 transv.pptxPresentation Activity 2. Unit 3 transv.pptx
Presentation Activity 2. Unit 3 transv.pptx
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
 

NG BB 25 Measurement System Analysis - Attribute
  • 1. UNCLASSIFIED / FOUO UNCLASSIFIED / FOUO National Guard Black Belt Training Module 25 Measurement System Analysis (MSA) Attribute Data This material is not for general distribution, and its contents should not be quoted, extracted for publication, or otherwise UNCLASSIFIED / FOUO copied or distributed without prior coordination with the Department of the Army, ATTN: ETF. UNCLASSIFIED / FOUO
  • 2. UNCLASSIFIED / FOUO CPI Roadmap – Measure 8-STEP PROCESS 6. See 1.Validate 2. Identify 3. Set 4. Determine 5. Develop 7. Confirm 8. Standardize Counter- the Performance Improvement Root Counter- Results Successful Measures Problem Gaps Targets Cause Measures & Process Processes Through Define Measure Analyze Improve Control TOOLS •Process Mapping ACTIVITIES • Map Current Process / Go & See •Process Cycle Efficiency/TOC • Identify Key Input, Process, Output Metrics •Little’s Law • Develop Operational Definitions •Operational Definitions • Develop Data Collection Plan •Data Collection Plan • Validate Measurement System •Statistical Sampling • Collect Baseline Data •Measurement System Analysis • Identify Performance Gaps •TPM • Estimate Financial/Operational Benefits •Generic Pull • Determine Process Stability/Capability •Setup Reduction • Complete Measure Tollgate •Control Charts •Histograms •Constraint Identification •Process Capability Note: Activities and tools vary by project. Lists provided here are not necessarily all-inclusive. UNCLASSIFIED / FOUO
  • 3. UNCLASSIFIED / FOUO Learning Objective  Understand how to conduct and interpret a measurement system analysis with Attribute Data Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 3
  • 4. UNCLASSIFIED / FOUO Attribute Measurement Systems  Most physical measurement systems use measurement devices that provide continuous data  For continuous data Measurement System Analysis we can use control charts or Gage R&R methods  Attribute/ordinal measurement systems utilize accept/reject criteria or ratings (such as 1 - 5) to determine if an acceptable level of quality has been attained  Kappa and Kendall techniques can be used to evaluate these Attribute and Ordinal Measurement Systems Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 4
  • 5. UNCLASSIFIED / FOUO Are You Really Stuck With Attribute Data?  Many inspection or checking processes have the ability to collect continuous data, but decide to use attribute data to simplify the task for the person taking and recording the data  Examples:  On-time Delivery can be recorded in 2 ways: a) in hours late or b) whether the delivery was on-time or late  Many functional tests will evaluate a product on a continuous scale (temperature, pressure drop, voltage drop, dimensional, hardness, etc) and record the results as pass/fail Strive to get continuous data! Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 5
  • 6. UNCLASSIFIED / FOUO Attribute and Ordinal Measurements  Attribute and Ordinal measurements often rely on subjective classifications or ratings  Examples include:  Rating different features of a service as either good or bad, or on a scale from 1 to 5 with 5 being best  Rating different aspects of employee performance as excellent, satisfactory, needs improvement  Rating wine on a) aroma, b) taste, and c) after taste  Should we evaluate these measurement systems before using them to make decisions on our CPI project?  What are the consequences of not evaluating them? Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 6
  • 7. UNCLASSIFIED / FOUO MSA – Attribute Data  What methodologies are appropriate to assess Attribute Measurement Systems?  Attribute Systems – the Kappa technique, which treats all misclassifications equally  Ordinal Systems – Kendall's technique, which considers the rank of the misclassification  For example, if we are judging an advertising service on a scale from 1 to 5, Inspector A rating the service a '1' while Inspector B rates it a '5' is a greater misclassification than Inspector A rating it a '4' while Inspector B rates it a '5.' Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 7
  • 8. UNCLASSIFIED / FOUO Data Scales  Nominal: Contains numbers that have no basis on which to arrange in any order or to make any assumptions about the quantitative difference between them. These numbers are just names or labels. For example:  In an organization: Dept. 1 (Accounting), Dept. 2 (Customer Service), Dept. 3 ( Human Resources)  In an insurance co.: Business Line 1, Line 2, Line 3  Modes of transport: Mode 1 (air), Mode 2 (truck), Mode 3 (sea)  Ordinal: Contains numbers that can be ranked in some natural sequence. This scale, however, cannot make an inference about the degree of difference between the numbers. Examples:  On service performance: excellent, very good, good, fair, poor  Salsa taste test: mild, hot, very hot, makes me suffer  Customer survey: strongly agree, agree, disagree, strongly disagree Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 8
  • 9. UNCLASSIFIED / FOUO Kappa Techniques  Kappa is appropriate for non-quantitative systems such as:  Good or bad  Go/No Go  Differentiating noises (hiss, clank, thump)  Pass/fail Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 9
  • 10. UNCLASSIFIED / FOUO Kappa Techniques  Kappa for Attribute Data:  Treats all misclassifications equally  Does not assume that the ratings are equally distributed across the possible range  Requires that the units be independent and that the persons doing the judging or rating make their classifications independently  Requires that the assessment categories be mutually exclusive Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 10
  • 11. UNCLASSIFIED / FOUO Operational Definitions  There are some quality characteristics that are either difficult or very time consuming to define  To assess classification consistency, several units must be classified by more than one rater or judge  If there is substantial agreement among the raters, there is the possibility, although no guarantee, that the ratings are accurate  If there is poor agreement among the raters, the usefulness of the rating is very limited Poor attribute measurement systems can almost always be traced to poor operational definitions Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 11
  • 12. UNCLASSIFIED / FOUO Consequences?  What are the important concerns?  What are the risks if agreement within and between raters is not good?  Are bad items escaping to the next operation in the process or to the external customer?  Are good items being reprocessed unnecessarily?  What is the standard for assessment?  How is agreement measured?  What is the Operational Definition for assessment? Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 12
  • 13. UNCLASSIFIED / FOUO What Is Kappa? “K”  K = (P_observed − P_chance) / (1 − P_chance)  P_observed = Proportion of units on which both Judges agree = proportion both Judges agree are good + proportion both Judges agree are bad  P_chance (expected) = Proportion of agreements expected by chance = (proportion Judge A says good * proportion Judge B says good) + (proportion Judge A says bad * proportion Judge B says bad)  Note: equation applies to a two-category analysis, e.g., good or bad Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 13
  • 14. UNCLASSIFIED / FOUO Kappa  K = (P_observed − P_chance) / (1 − P_chance)  For perfect agreement, P_observed = 1 and K = 1  As a rule of thumb, if Kappa is lower than 0.7, the measurement system is not adequate  If Kappa is 0.9 or above, the measurement system is considered excellent  The lower limit for Kappa can range from 0 to -1  For P_observed = P_chance (expected), K = 0  Therefore, a Kappa of 0 indicates that the agreement is the same as would be expected by random chance Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 14
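For readers who want to check this arithmetic outside of Minitab, the short Python sketch below simply restates the two-category Kappa formula from the preceding slides. It is an illustration only, not part of the original module, and the good/bad ratings in the example are made up.

```python
# Minimal sketch of the two-category Kappa described above, applied to one
# rater's first and second pass over the same units (made-up data).

def kappa_two_category(first_pass, second_pass, good="good", bad="bad"):
    """K = (P_observed - P_chance) / (1 - P_chance) for a good/bad system."""
    n = len(first_pass)
    assert n == len(second_pass) and n > 0

    # P_observed: proportion of units on which the two passes agree
    p_observed = sum(a == b for a, b in zip(first_pass, second_pass)) / n

    # P_chance: agreement expected by chance, from each pass's marginal proportions
    p_chance = (first_pass.count(good) / n) * (second_pass.count(good) / n) \
             + (first_pass.count(bad) / n) * (second_pass.count(bad) / n)

    return (p_observed - p_chance) / (1 - p_chance)


first  = ["good", "good", "bad", "bad", "good", "bad", "good", "bad"]
second = ["good", "bad",  "bad", "bad", "good", "bad", "good", "good"]
print(round(kappa_two_category(first, second), 3))  # 0.5 for this made-up data
```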
  • 15. UNCLASSIFIED / FOUO Attribute MSA Guidelines  When selecting items for the study consider the following:  If you only have two categories, good and bad, you should have a minimum of 20 good and 20 bad  As a maximum, have 50 good and 50 bad  Try to keep approximately 50% good and 50% bad  Have a variety of degrees of good and bad If only good items are chosen for the study, what might happen to P-chance (expected)? Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 15
  • 16. UNCLASSIFIED / FOUO Attribute MSA Guidelines (Cont.)  If you have more than two categories, with one of the categories being good and the other categories being different error modes, you should have approximately 50% of the items being good and a minimum of 10% of the items in each of the error modes  You might combine some of the error modes as “other”  The categories should be mutually exclusive or, if not, they should also be combined Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 16
  • 17. UNCLASSIFIED / FOUO Within Rater/Repeatability Considerations  Have each rater evaluate the same item at least twice  Calculate a Kappa for each rater by creating separate Kappa tables, one for each rater  If a Kappa measurement for a particular rater is small, that rater does not repeat well within self  If the rater does not repeat well within self, then they will not repeat well with the other raters and this will hide how good or bad the others repeat between themselves  Calculate a between-rater Kappa by creating a Kappa table from the first judgment of each rater  Between-rater Kappa will be made as pairwise comparisons (A to B, B to C, A to C) Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 17
  • 18. UNCLASSIFIED / FOUO Example: Data Set = Attribute Ordinal.mtw  An educational testing organization is training five new appraisers for the written portion of the twelfth-grade standardized essay test  The appraisers' ability to rate essays consistent with the standards needs to be assessed  Each appraiser rated fifteen essays on a five-point scale (-2, -1, 0, 1, 2)  The organization also rated the essays and supplied the “official score”  Each essay was rated twice and the data captured in the file Attribute Ordinal.mtw  Open the file and evaluate the appraisers' performance Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 18
  • 19. UNCLASSIFIED / FOUO Minitab and Attribute Measurement Systems Stat>Quality Tools>Attribute Agreement Analysis Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 19
  • 20. UNCLASSIFIED / FOUO Minitab Dialog Box 1. Double click on the appropriate variable to place it in the required dialog box: Attribute = Rating Samples = Sample Appraisers = Appraiser 2. Click on OK Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 20
  • 21. UNCLASSIFIED / FOUO Within Appraiser Percent This output represents the percent agreement and the 95% confidence interval around that percentage [Chart: Assessment Agreement – Within Appraisers; percent agreement with 95% CI plotted for each appraiser (Duncan, Hayes, Holmes, Montgomery, Simpson)] Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 21
  • 22. UNCLASSIFIED / FOUO Within Appraiser Session Window Output This output is the same information contained in the graph with the addition of a Between-Appraiser assessment Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 22
  • 23. UNCLASSIFIED / FOUO Let’s Do It Again Stat>Quality Tools>Attribute Agreement Analysis Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 23
  • 24. UNCLASSIFIED / FOUO Introducing a Known Standard 1. Double click on the appropriate variable to place it in the required dialog box (same as before) 2. If you have a known standard (the real answer) for the items being inspected, let Minitab know what column that information is in. 3. Click on OK Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 24
  • 25. UNCLASSIFIED / FOUO Appraiser vs. Standard [Charts: Assessment Agreement – Within Appraisers and Appraiser vs Standard; percent agreement with 95% CI plotted for each appraiser (Duncan, Hayes, Holmes, Montgomery, Simpson)] Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 25
  • 26. UNCLASSIFIED / FOUO Within Appraiser In addition to the Within-Appraiser graphic, Minitab will give percentages Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 26
  • 27. UNCLASSIFIED / FOUO Each Appraiser vs. Standard Some appraisers will repeat their own ratings well but may not match the standard well (look at Duncan) Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 27
  • 28. UNCLASSIFIED / FOUO More Session Window Output The session window will give percentage data as to how all the appraisers did when judged against the standard Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 28
  • 29. UNCLASSIFIED / FOUO Kappa and Minitab Minitab will calculate a Kappa for each (within) appraiser for each category Note: This is only a part of the total data set for illustration Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 29
  • 30. UNCLASSIFIED / FOUO Kappa vs. Standard Minitab will also calculate a Kappa statistic for each appraiser as compared to the standard Note: This is only a part of the total data set for illustration Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 30
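If Minitab is not available, the within-appraiser and appraiser-vs-standard kappas can be approximated in Python. The sketch below assumes the stacked layout described earlier (Appraiser, Sample, Rating, plus a known Standard) has been exported to a hypothetical CSV file; it uses scikit-learn's Cohen's kappa, which tracks Minitab's overall kappas only approximately (Minitab also reports per-category kappas), so treat it as a cross-check rather than a replacement.

```python
# Rough sketch only: approximate Minitab's Attribute Agreement Analysis kappas.
# "attribute_ordinal.csv" is a hypothetical export of the Attribute Ordinal.mtw
# worksheet with columns Appraiser, Sample, Rating, Standard (two trials each).
import pandas as pd
from sklearn.metrics import cohen_kappa_score

df = pd.read_csv("attribute_ordinal.csv")
df["Trial"] = df.groupby(["Appraiser", "Sample"]).cumcount() + 1  # 1st or 2nd rating

for appraiser, grp in df.groupby("Appraiser"):
    wide = grp.pivot(index="Sample", columns="Trial", values="Rating")
    within = cohen_kappa_score(wide[1], wide[2])  # repeatability: trial 1 vs trial 2
    standard = grp.drop_duplicates("Sample").set_index("Sample")["Standard"]
    vs_standard = cohen_kappa_score(wide[1], standard.loc[wide.index])  # trial 1 vs standard
    print(f"{appraiser}: within = {within:.2f}, vs standard = {vs_standard:.2f}")
```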
  • 31. UNCLASSIFIED / FOUO Kappa and Minitab Minitab will not provide a Kappa between a specific pair of appraisers, but will provide an overall Kappa between all appraisers for each possible category of response How might this output help us improve our measurement system? Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 31
  • 32. UNCLASSIFIED / FOUO What If My Data Is Ordinal? Stat>Quality Tools>Attribute Agreement Analysis Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 32
  • 33. UNCLASSIFIED / FOUO Ordinal Data If your data is Ordinal, you must also check this box Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 33
  • 34. UNCLASSIFIED / FOUO What Is Kendall’s? Kendall's coefficient can be thought of as an R-squared value: it is the correlation between the responses when the ordering of the categories is taken into account (ordinal) rather than ignored (attribute). The lower the number, the more severe the misclassifications. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 34
  • 35. UNCLASSIFIED / FOUO Kendall’s Kendall's coefficient can be thought of as an R-squared value: it is the correlation between the responses when the ordering of the categories is taken into account (ordinal) rather than ignored (attribute). The lower the number, the more severe the misclassifications. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 35
  • 36. UNCLASSIFIED / FOUO Kendall’s (Cont.) Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 36
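For reference, the textbook form of Kendall's coefficient of concordance is W = 12S / (m²(n³ − n)), where m is the number of raters, n is the number of items, and S is the sum of squared deviations of each item's rank-sum from the mean rank-sum. The sketch below implements that formula without the tie correction Minitab applies, so its value will differ slightly when ratings are tied; the essay scores in the example are made up.

```python
# Minimal sketch of Kendall's coefficient of concordance (no tie correction).
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ratings):
    """ratings: shape (m_raters, n_items) of ordinal scores for the same items."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.vstack([rankdata(row) for row in ratings])  # rank items within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))

scores = [[-2, -1, 0, 1, 2, 1],   # made-up ratings: 3 raters x 6 essays
          [-2,  0, 0, 1, 2, 1],
          [-1, -1, 0, 2, 2, 0]]
print(round(kendalls_w(scores), 3))  # ~0.89; 1.0 would mean perfect agreement on ordering
```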
  • 37. UNCLASSIFIED / FOUO Exercise: Seeing Stars  Divide into teams of two  One person will be the rater and one the recorder  Have each rater inspect each star and determine if it is Good or Bad (Kappa)  Record the results in Minitab  Mix up the stars and repeat with the same rater 2 more times  Compare results to other raters and to the known standard  Take 30 minutes to complete the exercise and be prepared to review your findings with the class Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 37
  • 38. UNCLASSIFIED / FOUO Takeaways  How to set up/conduct an MSA  Use attribute data only if the measurement cannot be converted to continuous data  Operational definitions are extremely important  Attribute measurement systems require a great deal of maintenance  Kappa is an easy method to test how repeatable and reproducible a subjective measurement system is Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 38
  • 39. UNCLASSIFIED / FOUO What other comments or questions do you have? UNCLASSIFIED / FOUO
  • 40. UNCLASSIFIED / FOUO References  Cohen, J., “A Coefficient of Agreement for Nominal Scales,” Educational and Psychological Measurement, Vol. 20, pp. 37-46, 1960  Futrell, D., “When Quality Is a Matter of Taste, Use Reliability Indexes,” Quality Progress, May 1995 Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 40
  • 41. UNCLASSIFIED / FOUO APPENDIX – A Practical Example of Kappa Evaluating the Measurement System for Determining Civilian Awards Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 41
  • 42. UNCLASSIFIED / FOUO Kappa Example #1  The Chief of Staff (COS) of the 1st Infantry Division is preparing for the redeployment of 3 brigade combat teams supporting Operation Iraqi Freedom.  The Secretary of General Staff (SGS) informs the COS that awards for civilian personnel (Department of the Army Civilians and military dependents) who provided volunteer support prior to and during the deployment are always a “significant emotional issue.” There are hundreds of submissions for awards.  A board of senior Army personnel decides who receives an award. The measurement system the board uses to determine who receives an award is a major concern due to board member-to-board member differences as well as within-board member differences.  The COS directs the SGS (a certified Army Black Belt) to conduct a measurement system study using historical data to “level set” the board members. Kappa for each board member as well as Kappa between board members must be calculated.  The COS's guidance is to retrain and/or replace board members until the measurement system is not a concern. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 42
  • 43. UNCLASSIFIED / FOUO Consider the Following Data • The Lean Six Sigma Pocket Toolbook, pp. 100-103, outlines the procedures for calculating Kappa. Kappa is MSA for attribute data. • The SGS's study involves two categories for recommendations, “Award” and “No Award”. • We select 40 candidate packets from historical data and ensure that 20 are definitely for “Award” and 20 are for “No Award”. • Board Members 1 and 2 evaluate each candidate's packet. The results are shown in the tables on the following slides. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 43
  • 44. UNCLASSIFIED / FOUO Consider the Following Data Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 44
  • 45. UNCLASSIFIED / FOUO Consider the Following Data Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 45
  • 46. UNCLASSIFIED / FOUO Contingency Table for Board Member 1 – Populate Each Cell with the Evaluation Data
  Contingency Table: Counts
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 1 - 2nd  Award    15        3        18
                     No Award     3       19        22
  Total                          18       22        40
  Board Member 1 – 1st: shows the results of Board Member 1’s 1st recommendations. The 1st board member recommended an “Award” or “No Award” for each of the 40 candidates on the first review of the files. Board Member 1 – 2nd: shows the results of Board Member 1’s 2nd recommendations. The 1st board member recommended an “Award” or “No Award” for each of the 40 candidates on the second review of the files. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 46
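The contingency table can also be tabulated directly from the two passes of recommendations; the sketch below is illustrative only, with two hypothetical vectors arranged to reproduce the counts shown above.

```python
# Sketch: build the 2x2 contingency table (counts and proportions) with pandas.
# first_pass / second_pass are hypothetical stand-ins for Board Member 1's 40 calls.
import pandas as pd

first_pass  = ["Award"] * 15 + ["No Award"] * 3 + ["Award"] * 3 + ["No Award"] * 19
second_pass = ["Award"] * 18 + ["No Award"] * 22

counts = pd.crosstab(
    pd.Series(second_pass, name="Board Member 1 - 2nd"),
    pd.Series(first_pass, name="Board Member 1 - 1st"),
    margins=True,
)
proportions = counts / len(first_pass)  # each cell as a proportion of the 40 packets
print(counts)
print(proportions.round(3))
```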
  • 47. UNCLASSIFIED / FOUO Contingency Table: Cell 1 The first cell represents the number of times Board Member 1 recommended a candidate should receive an “Award” in both the first and second evaluation. [Counts table repeated from the previous slide with the Award/Award cell (15) highlighted] Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 47
  • 48. UNCLASSIFIED / FOUO Contingency Table: Cell 2 The second cell represents the number of times Board Member 1 recommended a candidate as “No Award” the first time and “Award” the second evaluation. [Counts table repeated with the 1st = No Award / 2nd = Award cell (3) highlighted] Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 48
  • 49. UNCLASSIFIED / FOUO Contingency Table: Cell 3 [Counts table repeated with the 1st = Award / 2nd = No Award cell (3) highlighted] The third cell represents the number of times Board Member 1 recommended “Award” on the first evaluation and “No Award” on the second evaluation. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 49
  • 50. UNCLASSIFIED / FOUO Contingency Table: Cell 4 [Counts table repeated with the No Award/No Award cell (19) highlighted] The fourth cell represents the number of times Board Member 1 recommended “No Award” on the first evaluation and “No Award” on the second evaluation. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 50
  • 51. UNCLASSIFIED / FOUO Contingency Table: Sum of Rows and Columns [Counts table repeated with the row and column totals highlighted] The numbers on the margins are the totals of the rows and columns of data. The sum in both instances is 40, the total number of candidate packets reviewed. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 51
  • 52. UNCLASSIFIED / FOUO Contingency Table – Counts & Proportions
  Contingency Table: Counts
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 1 - 2nd  Award    15        3        18
                     No Award     3       19        22
  Total                          18       22        40
  Contingency Table: Proportions
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 1 - 2nd  Award   0.375     0.075     0.45   (0.45 represents 18/40)
                     No Award   0.075     0.475     0.55
  Total                         0.45      0.55      1.00
  Board Member 1 Proportions: the lower table is the data in the upper table expressed as a proportion of the total. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 52
  • 53. UNCLASSIFIED / FOUO Contingency Table – Sum of Proportions
  Contingency Table: Proportions
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 1 - 2nd  Award   0.375     0.075     0.45
                     No Award   0.075     0.475     0.55
  Total                         0.45      0.55      1.00
  The margins are the sums of the proportions across the rows and down the columns; in each direction the proportions must sum to 1.0. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 53
  • 54. UNCLASSIFIED / FOUO Calculating Kappa  K = (P_observed − P_chance) / (1 − P_chance)  P_observed = Proportion of candidates for which both Board Members agree = proportion both Board Members agree are “Award” + proportion both Board Members agree are “No Award”.  P_chance = Proportion of agreements expected by chance = (proportion Board Member 1 says “Award” * proportion Board Member 2 says “Award”) + (proportion Board Member 1 says “No Award” * proportion Board Member 2 says “No Award”) The verbiage for defining Kappa will vary slightly depending on whether we are defining a Within-Rater Kappa or a Between-Rater Kappa Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 54
  • 55. UNCLASSIFIED / FOUO Calculate Kappa for Board Member 1
  Contingency Table: Proportions
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 1 - 2nd  Award   0.375     0.075     0.45
                     No Award   0.075     0.475     0.55
  Total                         0.45      0.55      1.00
  P_observed is the sum of the proportions on the diagonal: P_observed = (0.375 + 0.475) = 0.850
  P_chance is the marginal proportions for each classification multiplied and then summed: P_chance = (0.450*0.450) + (0.550*0.550) = 0.505
  Then K_Board Member 1 = (0.850 - 0.505)/(1 - 0.505) = 0.697
  Kappa for Board Member 1 is sufficiently close to 0.700 that we conclude Board Member 1 exhibits acceptable repeatability. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 55
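A few lines of Python can verify this arithmetic. The helper below simply restates the slide's formula from a 2x2 table of counts; it is a sketch for checking the numbers, not part of the original module.

```python
# Sketch: Kappa from a 2x2 count table laid out as
# [[both Award, 2nd Award / 1st No Award], [2nd No Award / 1st Award, both No Award]].
def kappa_from_counts(table):
    n = sum(sum(row) for row in table)
    p_observed = (table[0][0] + table[1][1]) / n
    row_margins = [sum(row) / n for row in table]                              # 2nd-pass proportions
    col_margins = [sum(table[r][c] for r in range(2)) / n for c in range(2)]   # 1st-pass proportions
    p_chance = row_margins[0] * col_margins[0] + row_margins[1] * col_margins[1]
    return (p_observed - p_chance) / (1 - p_chance)

print(round(kappa_from_counts([[15, 3], [3, 19]]), 3))  # ~0.697 for Board Member 1
```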
  • 56. UNCLASSIFIED / FOUO Calculate Kappa for Board Member 2 Contingency Table: Board Member 2 - 1st Counts Award No Award Member 2 Award Board - 2nd No Award Contingency Table: Board Member 2 - 1st Proportion Award No Award Member 2 Award Board - 2nd No Award K Board Member 2 = ? Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 56
  • 57. UNCLASSIFIED / FOUO Kappa Between Board Members  To calculate a Kappa for between Board Members, we will use a similar procedure.  We calculate Kappa for the first recommendations of the pair of the Board Members.  NOTE: If there is a Board Member who has poor Within-Board Member repeatability (less than 85%), there is no need to calculate a Between-Board Member rating. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 57
  • 58. UNCLASSIFIED / FOUO Kappa – Board Member 1 to Board Member 2
  Contingency Table: Counts
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 2 - 1st  Award    14        5        19
                     No Award     4       17        21
  Total                          18       22        40
  The Award/Award cell (14) is the number of times both board members agreed the candidate should receive an “Award” (using their first evaluation). Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 58
  • 59. UNCLASSIFIED / FOUO Kappa Between Board Members [Between-member counts table repeated with the Board Member 1 = No Award / Board Member 2 = Award cell (5) highlighted] Number of times Board Member 1 recommended “No Award” and Board Member 2 recommended “Award” (using their first evaluation). Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 59
  • 60. UNCLASSIFIED / FOUO Board Member 1 to Board Member 2 Kappa [Between-member counts table repeated with the Board Member 1 = Award / Board Member 2 = No Award cell (4) highlighted] Number of times Board Member 1 recommended “Award” and Board Member 2 recommended “No Award” (using their first measurement). Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 60
  • 61. UNCLASSIFIED / FOUO Between Board Member Kappa [Between-member counts table repeated with the No Award/No Award cell (17) highlighted] Number of times both Board Members agreed the candidate was “No Award” (using their first measurement). Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 61
  • 62. UNCLASSIFIED / FOUO Kappa Between Board Members – Calculate the Between-Board Member Kappa:
  Contingency Table: Counts
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 2 - 1st  Award    14        5        19
                     No Award     4       17        21
  Total                          18       22        40
  Contingency Table: Proportions
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 2 - 1st  Award   0.350     0.125     0.475
                     No Award   0.100     0.425     0.525
  Total                         0.450     0.550     1.000
  The lower table represents the data in the top table with each cell expressed as a proportion of the total. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 62
  • 63. UNCLASSIFIED / FOUO Remember How to Calculate Kappa?  K = (P_observed − P_chance) / (1 − P_chance)  P_observed = Proportion of items on which both Board Members agree = proportion both Board Members agree are “Award” + proportion both Board Members agree are “No Award”.  P_chance = Proportion of agreements expected by chance = (proportion Board Member 1 says “Award” * proportion Board Member 2 says “Award”) + (proportion Board Member 1 says “No Award” * proportion Board Member 2 says “No Award”) The verbiage for defining Kappa will vary slightly depending on whether we are defining a Within-Board Member Kappa or a Between-Board Member Kappa Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 63
  • 64. UNCLASSIFIED / FOUO Calculate Kappa for Board Member 1 to Board Member 2
  Contingency Table: Proportions
                               Board Member 1 - 1st
                               Award   No Award   Total
  Board Member 2 - 1st  Award   0.350     0.125     0.475
                     No Award   0.100     0.425     0.525
  Total                         0.450     0.550     1.000
  P_observed is the sum of the proportions on the diagonal: P_observed = (0.350 + 0.425) = 0.775
  P_chance is the marginal proportions for each classification multiplied and then summed: P_chance = (0.475*0.450) + (0.525*0.550) = 0.503
  Then K_Board Member 1/2 = (0.775 - 0.503)/(1 - 0.503) = 0.548
  The Board Members evaluate candidate packets differently too often. The SGS will retrain each Board Member before dismissing a Board Member and finding a replacement. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 64
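Reusing the hypothetical kappa_from_counts helper sketched after the Board Member 1 calculation, the between-member counts reproduce the 0.548 figure without rounding the marginal proportions.

```python
# Sketch: between-member Kappa from the counts on the preceding slides,
# using the hypothetical kappa_from_counts helper defined earlier.
print(round(kappa_from_counts([[14, 5], [4, 17]]), 3))  # ~0.548 between Board Members 1 and 2
```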
  • 65. UNCLASSIFIED / FOUO Improvement Ideas  How might we improve this measurement system?  Additional training  Physical standards/samples  Rater certification (and periodic re-certification) process  Better operational definitions Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 65
  • 66. UNCLASSIFIED / FOUO Kappa Conclusions  Is the current measurement system adequate?  Where would you focus your improvement efforts?  What rater would you want to conduct any training that needs to be done? Class Challenge: After exposure to Minitab in the following slides, input the data from previous example into Minitab. As homework, perform the analysis and compare the computer output and simplicity with the manual calculations performed in the previous slides. Hint: You will need to stack columns. Measurement System Analysis - Attribute UNCLASSIFIED / FOUO 66