Javier Garcia-Verdugo Sanchez – Six Sigma Training – W1 Attributive Data (MSA)
Analysis of Measurement Systems, Part 2: Attributive Data
Week 1
Knorr-Bremse Group
About this Module
Based on this technique you can assess and judge measurement systems much better than described in the ISO 9000 standard.
• Part 1: Introduction to Measurement System Analysis – concept definition and description of the basic terms
• Part 2: Attributive Measurements – Kappa analysis
• Part 3: Continuous Measurements – the method for the Gage R&R study
• Some exercises
The DMAIC Cycle
[Diagram: the DMAIC cycle with typical tools per phase]
• Define: project charter (SMART), business score card, QFD + VOC, strategic goals, project strategy, documentation
• Measure: baseline analysis, process map, C+E matrix, measurement system, definition of critical inputs, FMEA
• Analyze: FMEA, statistical tests, simulation, tolerancing, process capability, multi-vari studies, regression
• Improve: adjustment to the optimum, FMEA, statistical tests, tolerancing
• Control: maintain improvements, SPC, control plans
Content and Terminology
• Discrimination
• Terms connected with accuracy:
– True value
– Systematic error / bias
– Linearity
– Stability (over time)
• Terms connected with precision:
– Repeatability
– Reproducibility
• P/T ratio (precision to tolerance)
• R&R % (repeatability and reproducibility)
• Process-capability-related variation from the measurement system
Possible Sources for Process Variation
[Diagram: tree decomposition of observed process variation]
• Observed process variation = actual process variation + measurement variation
• Actual process variation: short-term variation, long-term variation, variation within a sample
• Measurement variation: variation due to the measurement system (repeatability, calibration, stability, linearity) and variation due to the operator (reproducibility/precision)
In order to work on the actual process variation, the measurement variation has to be determined and separated from the process variation.
Sources of Measurement Variation
[Fishbone diagram: sources of measurement variation]
• Operator: training, technique, sufficient work time, ease of data entry
• Tool: mechanical instability, electrical instability, wear, maintenance standard, calibration frequency, algorithm instability
• Methods: work methods, standard procedures
• Environment: humidity, cleanliness, vibration, line voltage variation, temperature fluctuation
Needed Information
• How big is the measurement error?
• What are the sources of the measurement error?
• Is the gauge stable over time?
• Is the gauge suitable for this examination?
• How can we improve the measurement system?

The measurement system to be examined includes:
• Measurement tools (hardware and software)
• All procedures for using the tools
• Which operator?
• Set-up and handling procedures
• Off-line calculations and data entry
• Calibration frequency and technique
Effects of Measurement Error
Measurement system bias (average), determined through a "Calibration Study", relates to accuracy:

$$\mu_{total} = \mu_{product} + \mu_{measurement}$$

Measurement system variability, determined through an "R&R Study", relates to precision:

$$\sigma_{total}^{2} = \sigma_{product}^{2} + \sigma_{measurement}^{2}$$
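To make the variance decomposition concrete, here is a minimal sketch (Python, with illustrative parameter values that are not from the training material) showing that independent product and measurement variation add up in the observed variance:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative assumption: normally distributed product values plus an
# independent, unbiased measurement error.
sigma_product = 2.0   # true process standard deviation
sigma_measure = 0.5   # measurement system standard deviation

product = rng.normal(loc=100.0, scale=sigma_product, size=100_000)
observed = product + rng.normal(loc=0.0, scale=sigma_measure, size=product.size)

# Independent sources add in variance: sigma_total^2 = sigma_product^2 + sigma_measurement^2
print(round(observed.var(), 2))                # close to 4.25
print(sigma_product**2 + sigma_measure**2)     # 4.25
```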
The True Process Variation
Observed Variation (Total Variation) = Actual Process Variation + Measurement Variation

Can we observe the truth?
Attributive Measurements
Most administrative assessments are of a subjective nature. We are talking about a good vs. bad classification, or an assessment sorted into groups where possible.
These attributive results can be evaluated by applying the Kappa calculation using contingency tables.
With physical measurements we mostly get continuous results. Here we can calculate means and standard deviations and evaluate the root causes of variation.
In practice, it is often observed that continuously measurable criteria are judged as attributes.
Questions for Measurement Systems
Which information exists about the measurement process?
• Is there a description or instruction for the execution?
• Is there a detailed flowchart available?
• Are the inspectors qualified?
Which information do we have about:
• Discrimination
• Repeatability
• Reproducibility
• Which correlation is there to customers or suppliers?
• What is the variation for the process and the measurement system?
Our knowledge determines the further procedure.
Attributive Measurements
• Attributive measurements are based on subjective classifications and ratings.
• Examples:
– Rating of features as good or bad
– Classification of wine aroma or taste
– Rating of employee satisfaction on a scale of 1 to 5
– Rating of a service as acceptable or unacceptable
We should evaluate these measurement systems before we change processes. Otherwise we may overlook an important factor which could be a major portion of the observed variation.
Reliability Coefficient Kappa
• A statistical method to evaluate attributive data sets is the reliability coefficient. It tells us how strongly the ratings agree compared with what would be expected by random chance.
• All differences in the rating are handled equally; no direction is given.
• There are several ways to perform the evaluation. A single rater can be evaluated, but also several raters against each other. Furthermore, more than 2 classes can be evaluated separately.
The Kappa Technique
This method judges classification data.
• The following conditions should be adhered to during data collection to get a meaningful result:
– The inspectors make their decisions independently.
– Use at least two categories (classes).
– A category can be used more frequently than another.
– The categories exclude each other.
• Kappa (K) is defined as the share of agreement between inspectors or categories relative to the maximum possible agreement.
The Kappa Coefficient
The Kappa (K) equation:

$$K = \frac{P_{observed} - P_{chance}}{1 - P_{chance}}$$

Description:
• P_observed = the proportion of results in agreement = both inspectors assess good or both inspectors assess bad
• P_chance = the proportion of results in agreement by chance = (proportion of good-rated units by inspector A × proportion of good-rated units by inspector B) + (proportion of bad-rated units by inspector A × proportion of bad-rated units by inspector B)
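The formula translates directly into code. Below is a minimal two-rater, two-category sketch of the calculation (the function name and the 'g'/'b' label encoding are my own assumptions, not part of the training material):

```python
from typing import Sequence

def kappa_two_raters(rater_a: Sequence[str], rater_b: Sequence[str]) -> float:
    """Kappa for two raters rating the same parts as 'g' (good) or 'b' (bad)."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0

    # P_observed: proportion of parts on which both raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # P_chance: agreement expected if both raters judged independently
    # at random with their own good/bad proportions.
    p_good_a = list(rater_a).count("g") / n
    p_good_b = list(rater_b).count("g") / n
    p_chance = p_good_a * p_good_b + (1 - p_good_a) * (1 - p_good_b)

    return (p_observed - p_chance) / (1 - p_chance)
```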
For Clarification
Example 1: 24 parts assessed by 2 inspectors with 96% agreement.

Number of parts:
               Rater A good   Rater A bad   Total
Rater B good        16             0          16
Rater B bad          1             7           8
Total               17             7          24

Proportion:
               Rater A good   Rater A bad   Total
Rater B good       0.667         0.000       0.667
Rater B bad        0.042         0.292       0.333
Total              0.708         0.292       1.000

(Raw data: part 4 was rated good by rater A and bad by rater B; parts 6, 10, 12, 16, 18, 22 and 24 were rated bad by both raters; all other parts were rated good by both.)

P_observed = (0.667 + 0.292) = 0.959
P_chance = (0.667 × 0.708) + (0.333 × 0.292) = 0.570
K = (0.959 − 0.570) / (1 − 0.570) = 0.905
For Clarification
Example 2: 24 parts assessed by 2 inspectors with 83% agreement.

Number of parts:
               Rater A good   Rater A bad   Total
Rater B good        13             2          15
Rater B bad          2             7           9
Total               15             9          24

Proportion:
               Rater A good   Rater A bad   Total
Rater B good       0.542         0.083       0.625
Rater B bad        0.083         0.292       0.375
Total              0.625         0.375       1.000

(Raw data: parts 4 and 7 were rated good by rater A and bad by rater B; parts 11 and 17 were rated bad by rater A and good by rater B; parts 6, 10, 12, 16, 18, 22 and 24 were rated bad by both raters; all other parts were rated good by both.)

P_observed = (0.542 + 0.292) = 0.834
P_chance = (0.625 × 0.625) + (0.375 × 0.375) = 0.531
K = (0.834 − 0.531) / (1 − 0.531) = 0.646
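Using the kappa_two_raters sketch from above, both worked examples can be reproduced from the raw per-part ratings. Note that the exact results differ slightly from the slides because the slides round P_observed and P_chance before dividing:

```python
# Example 1: both raters agree 'bad' on 7 parts; A and B disagree only on part 4.
bad_both = {6, 10, 12, 16, 18, 22, 24}
a1 = ["b" if i in bad_both else "g" for i in range(1, 25)]
b1 = ["b" if i in bad_both or i == 4 else "g" for i in range(1, 25)]
print(round(kappa_two_raters(a1, b1), 3))  # 0.903 (slide: 0.905 with rounded intermediates)

# Example 2: four disagreements (parts 4 and 7 vs. parts 11 and 17).
a2 = ["b" if i in bad_both or i in {11, 17} else "g" for i in range(1, 25)]
b2 = ["b" if i in bad_both or i in {4, 7} else "g" for i in range(1, 25)]
print(round(kappa_two_raters(a2, b2), 3))  # 0.644 (slide: 0.646 with rounded intermediates)
```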
The Kappa Coefficient
• Kappa can have a value between −1 and 1.
• A value of 1 is achieved at absolute agreement.
• A practical rule is that we do not accept Kappa values < 0.7.
• At values around 0.9 we talk about an excellent measurement system.
• A value around zero means that rating a part as "good" or "bad" is the same as would be expected by chance.
• A value of −1 means that the ratings are exactly contrary, e.g. appraiser against appraiser or appraiser against a standard.
Kappa values can be calculated for several persons as well as for a single person. We also have the possibility of rating classes (categories). Examples will follow.
Poor Kappa ratings are usually caused by an inadequate "operational definition" or a poorly trained rater.
Example: Leakage Test Evaluation

Measurement System Analysis (MSA) for attributive data: acceptable / not acceptable (P = pass, F = fail).

Due to customer complaints about the reliability of the leakage test, it was decided to analyze the capability of the current measurement system. A decision for or against the investment in a new test bench was made based on the results of this MSA.

The analysis was performed with:
• 49 independent parts (samples), including 5 not acceptable parts
• 3 appraisers: George, Kevin and Paul
• 3 ratings per appraiser
• File: Leak Test Attribute Study.mtw

(Raw data: samples 10, 13, 21, 26 and 44 were rated F; all other ratings were P, with one exception: George rated sample 21 as P in his third trial.)
Example: Leakage Test Evaluation

Minitab: Stat > Quality Tools > Attribute Agreement Analysis…
The Graphical Analysis

Minitab represents the agreement in percent; as additional information, a 95% confidence interval is given.

[Chart: Assessment Agreement, Within Appraisers; percent matched per appraiser (George, Kevin, Paul) with 95% CI]

The numbers for the graphic:
Appraiser   # Inspected   # Matched   Percent   95% CI
George           49            48       97.96   (89.15; 99.95)
Kevin            49            49      100.00   (94.07; 100.00)
Paul             49            49      100.00   (94.07; 100.00)
The Evaluation in the Session Window
Attribute Agreement Analysis for George 1; George 2; George 3; Kevin1; ...

Within Appraisers

Assessment Agreement
Appraiser   # Inspected   # Matched   Percent   95% CI
George           49            48       97.96   (89.15; 99.95)
Kevin            49            49      100.00   (94.07; 100.00)
Paul             49            49      100.00   (94.07; 100.00)
# Matched: Appraiser agrees with him/herself across trials.

Fleiss' Kappa Statistics
Appraiser   Response     Kappa    SE Kappa         Z   P(vs > 0)
George             F   0.92105   0.0824786   11.1672      0.0000
                   P   0.92105   0.0824786   11.1672      0.0000
Kevin              F   1.00000   0.0824786   12.1244      0.0000
                   P   1.00000   0.0824786   12.1244      0.0000
Paul               F   1.00000   0.0824786   12.1244      0.0000
                   P   1.00000   0.0824786   12.1244      0.0000

Between Appraisers

Assessment Agreement
# Inspected   # Matched   Percent   95% CI
     49            48       97.96   (89.15; 99.95)
# Matched: All appraisers' assessments agree with each other.

Fleiss' Kappa Statistics
Response      Kappa    SE Kappa         Z   P(vs > 0)
       F   0.974754   0.0238095   40.9397      0.0000
       P   0.974754   0.0238095   40.9397      0.0000

The analysis showed excellent agreement within the appraisers and also between the appraisers.
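The between-appraisers Fleiss kappa can also be computed outside Minitab, e.g. with Python and statsmodels. The sketch below rebuilds the leakage data from the summary above (5 failing samples, one discordant rating by George on sample 21); it reproduces Minitab's overall figure, though it is an illustration rather than a re-run of the original .mtw file:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# 49 samples x 9 ratings (3 appraisers x 3 trials); 0 = P (pass), 1 = F (fail).
ratings = np.zeros((49, 9), dtype=int)
for sample in (10, 13, 21, 26, 44):   # the 5 not acceptable parts
    ratings[sample - 1, :] = 1
ratings[21 - 1, 2] = 0                # George's third trial on sample 21 was P

# aggregate_raters converts raw ratings into a subjects x categories count table.
table, _categories = aggregate_raters(ratings)
print(round(fleiss_kappa(table), 6))  # 0.974754, matching the Minitab output
```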
The Worksheet Modification

You may want to analyse the data in one attribute column. In the first step we stack the results for each appraiser in a separate column; then we stack the results of all appraisers in one column (operator). Minitab: Data > Stack > Columns…

For the analysis we need to store the operator identification. In addition we need to create one column to identify the samples. Minitab: Calc > Make Patterned Data > Simple Set of Numbers…
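If the ratings live in a spreadsheet rather than Minitab, the same stacking can be done with pandas. The column names below are an assumed layout (one column per appraiser and trial), not the actual worksheet:

```python
import pandas as pd

# Assumed wide layout: one row per sample, one column per appraiser/trial.
wide = pd.DataFrame({
    "Sample":   [1, 2, 3],
    "George 1": ["P", "P", "F"],
    "George 2": ["P", "P", "F"],
    "Kevin 1":  ["P", "P", "F"],
    "Kevin 2":  ["P", "F", "F"],
})

# Stack all rating columns into one attribute column, keeping the sample id.
long = wide.melt(id_vars="Sample", var_name="Rater_Trial", value_name="Rating")

# Derive the operator and trial identification columns needed for the analysis.
long[["Operator", "Trial"]] = long["Rater_Trial"].str.split(expand=True)
print(long[["Sample", "Operator", "Trial", "Rating"]])
```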
Example: Surface Inspection

The surface quality of the base material for PCBs has to be very high. Classification is done in accordance with the norm MIL 13949 into the classes A, D, B or scrap.

Legend:
• Class 1 (MIL B)
• Class 2 (MIL D)
• Class 3 (MIL A)
• Scrap (S)

In this example 10 panels have been assessed by 3 inspectors (Mary, Paul and Suzanne) 3 times each.

Disagreements between classes across all ratings:
S vs A: 6    S vs D: 0    S vs B: 0
A vs D: 3    A vs B: 0    D vs B: 10

File: Attribute Gage Study.xls (contains the full 10 × 9 rating table)
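The class-vs-class disagreement tally (S vs A, D vs B, ...) can be reproduced by counting, within each sample, every pair of ratings that names two different classes. How exactly the slide counted is not stated, so the sketch below (with toy data) is one plausible reading:

```python
from collections import Counter
from itertools import combinations

def pairwise_disagreements(rows):
    """Count unordered class pairs over all rating pairs within each sample."""
    tally = Counter()
    for ratings in rows:
        for x, y in combinations(ratings, 2):
            if x != y:
                tally[tuple(sorted((x, y)))] += 1
    return tally

demo = [["A", "A", "S"], ["D", "B", "B"], ["D", "B", "D"]]
print(pairwise_disagreements(demo))
# Counter({('B', 'D'): 4, ('A', 'S'): 2})
```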
The Evaluation with Minitab
After checking the table in the worksheet we can start the evaluation.

Minitab: Stat > Quality Tools > Attribute Agreement Analysis…
The Graphical Analysis

Minitab represents the agreement in percent; as additional information, a 95% confidence interval is given.

[Chart: Assessment Agreement, Within Appraisers; percent matched per appraiser (Mary, Paul, Suzanne) with 95% CI]

The numbers for the graphic:
Appraiser   # Inspected   # Matched   Percent (%)   95% CI
Mary             10             8         80.0      (44.4; 97.5)
Paul             10             9         90.0      (55.5; 99.7)
Suzanne          10             8         80.0      (44.4; 97.5)
The Evaluation in the Session Window
Within Appraisers:

Fleiss' Kappa Statistics
Appraiser   Response     Kappa   SE Kappa         Z   P(vs > 0)
Mary               A   0.86425   0.182574   4.73371      0.0000
                   B   0.82955   0.182574   4.54361      0.0000
                   D   0.58333   0.182574   3.19505      0.0007
                   S   1.00000   0.182574   5.47723      0.0000
             Overall   0.80707   0.113821   7.09075      0.0000
Paul               A   0.82955   0.182574   4.54361      0.0000
                   B   1.00000   0.182574   5.47723      0.0000
                   D   1.00000   0.182574   5.47723      0.0000
                   S   0.81366   0.182574   4.45662      0.0000
             Overall   0.91045   0.106205   8.57258      0.0000
Suzanne            A   0.86425   0.182574   4.73371      0.0000
                   B   0.82955   0.182574   4.54361      0.0000
                   D   0.71154   0.182574   3.89726      0.0000
                   S   0.76000   0.182574   4.16269      0.0000
             Overall   0.80831   0.112123   7.20908      0.0000

If we considered only the overall results (Kappa > 0.7), we could say that all appraisers are qualified. But have a look at the details: two of the three appraisers show weakness in stability (repeatability)!
The Evaluation in the Session Window
Now, have a look at the agreement between the appraisers. Between the appraisers we find a weak agreement; this has to be improved. The two classes with the highest quality (B and D) deliver the poorest results. It seems that parts with minor failures have the highest chance of misinterpretation.

Fleiss' Kappa Statistics
Response      Kappa    SE Kappa         Z   P(vs > 0)
       A   0.645483   0.0527046   12.2472      0.0000
       B   0.518717   0.0527046    9.8420      0.0000
       D   0.299481   0.0527046    5.6823      0.0000
       S   0.600000   0.0527046   11.3842      0.0000
 Overall   0.525026   0.0312782   16.7857      0.0000

In such cases the appraisers will receive tasks according to their experience.
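Minitab's per-response kappas above can be approximated by collapsing the multi-class ratings to "this class vs. everything else" and running Fleiss' kappa on the binary result. This is my reading of the per-response statistic, not Minitab's documented algorithm, so take the sketch as an approximation:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

def per_category_kappa(ratings: np.ndarray) -> dict:
    """Fleiss' kappa per category, binarizing each class against the rest."""
    kappas = {}
    for category in np.unique(ratings):
        binary = (ratings == category).astype(int)   # 1 = this class, 0 = any other
        table, _ = aggregate_raters(binary)
        kappas[str(category)] = round(fleiss_kappa(table), 5)
    return kappas

# Toy data: 4 samples x 3 ratings with the classes A, B, D and S.
demo = np.array([
    ["A", "A", "A"],
    ["B", "D", "B"],
    ["S", "S", "S"],
    ["D", "D", "B"],
])
print(per_category_kappa(demo))
```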
Example: Document Assessment

One additional example you will find in the file Attribute Gage Study.xls. Here 3 inspectors (A, B, C) assessed 15 documents (invoices) two times each.

Sample   A, 1st   A, 2nd   B, 1st   B, 2nd   C, 1st   C, 2nd
  1       good     good     good     good     good     good
  2       bad      bad      good     bad      bad      bad
  3       good     good     good     good     good     good
  4       good     bad      good     good     good     good
  5       bad      bad      bad      bad      bad      bad
  6       good     good     good     good     good     good
  7       bad      bad      bad      bad      bad      bad
  8       good     good     bad      good     good     bad
  9       good     good     good     good     good     good
 10       bad      bad      bad      bad      bad      bad
 11       good     good     good     good     good     good
 12       good     good     good     bad      good     good
 13       bad      bad      bad      bad      bad      bad
 14       good     good     bad      good     good     good
 15       good     good     good     good     good     good