Pre-control and Some Simple Alternatives


Stefan H. Steiner
Dept. of Statistics and Actuarial Sciences
University of Waterloo
Waterloo, N2L 3G1 Canada

Pre-control, also called Stoplight control, is a quality monitoring scheme similar to a control chart. However, the goal of Pre-control is the prevention of nonconforming units. The appeal of Pre-control is due mainly to its ease of implementation, lack of assumptions, and reported successes. However, it has come under much criticism since many of its decision rules appear ad hoc. In addition, there is some confusion regarding different versions of Pre-control suggested in the literature. This article compares the various Pre-control schemes and contrasts them with more traditional control charts such as acceptance control charts and X charts. Based on this analysis, the conditions under which Pre-control is appropriate are laid out. Also, alternatives that retain much of the simplicity of Pre-control but have improved operating characteristics are suggested.

Key Words: Acceptance Control Charts; Grouped Data; Modified Pre-control; Operating Characteristic; Process Capability; Stoplight Control

1. Introduction

Pre-control (PC), sometimes called Stoplight control, was developed to monitor the proportion of nonconforming units produced in a manufacturing process. Implementation is typically very straightforward. All test units are classified into one of three groups: green, yellow, or red, where the colors loosely correspond to good, questionable, and poor quality products (see Figure 1). Decisions to stop and adjust the process or to continue operation are made based on the number of green, yellow, and red units observed in a small sample. The goal of PC is to detect when the proportion of nonconforming units produced becomes too large. Thus, PC monitors the process to ensure that process capability (C_pk) remains large.

PC was initially proposed in 1954 (see Shainin and Shainin, 1989, and Ermer and Roepke, 1991, for more details) as an easier alternative to Shewhart charts. However, since that time, at least three different versions of PC have been suggested in the literature. In this article the three versions are called:

- Classical PC
- Two-stage PC
- Modified PC

Classical PC refers to the original formulation of Pre-control as described by Shainin and Shainin (1989) and Traver (1985). Two-stage PC is a modification discussed by Salvia (1988) that improves the method's operating characteristics by taking an additional sample if the initial sample yields ambiguous results. Modified PC was suggested by Gurska and Heaphy (1991), and represents a departure from the philosophy of Classical and Two-stage PC in that it compromises between the design philosophy of Shewhart-type charts and the simplicity of application of PC.

PC schemes are defined by two attributes: their group classification method and their decision criteria. A third attribute, the qualification procedure, is the same for all three PC versions, and is thus not discussed in this article. The most important difference between the three versions of PC is the group classification method. Classical PC and Two-stage PC base the classification of units on engineering tolerance or specification limits. A unit is classified as green if its quality dimension of interest falls into the central half of the tolerance range. A yellow unit has a quality dimension that falls into the remaining tolerance range, and a red unit falls outside the tolerance range. Assuming, without loss of generality, that the upper and lower specification limits (USL and LSL) are 1 and -1 respectively, group classification is based on the endpoints: t = [-1, -.5, .5, 1]. The classification scheme is illustrated in Figure 1. The colored circle can be used for the ease of the operators as a dial indicator overlay (Shainin and Shainin, 1989).

Figure 1: PC Classification Criteria (green in the central half of the tolerance range, yellow in the remaining tolerance range, red outside the specification limits)

Let the quality dimension of interest be Y. Then, given a probability density function for observations f(y), the group probabilities for Classical and Two-stage PC are given by (1.1).

P(green) = P_g = \int_{-.5}^{.5} f(y)\,dy
P(yellow) = P_y = \int_{-1}^{-.5} f(y)\,dy + \int_{.5}^{1} f(y)\,dy        (1.1)
P(red) = P_r = \int_{-\infty}^{-1} f(y)\,dy + \int_{1}^{\infty} f(y)\,dy
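Under an assumed normal model for Y (an assumption; (1.1) holds for any density f), the group probabilities are easy to evaluate numerically. The following sketch is illustrative, not part of the original article, and uses only the Python standard library:

```python
# Group probabilities (1.1) for Classical/Two-stage PC under an assumed
# normal model for Y, with specification limits at -1 and 1.
from statistics import NormalDist

def pc_group_probs(mu, sigma):
    """Return (P_g, P_y, P_r) for the endpoints t = [-1, -0.5, 0.5, 1]."""
    f = NormalDist(mu, sigma)
    p_green = f.cdf(0.5) - f.cdf(-0.5)
    p_yellow = (f.cdf(-0.5) - f.cdf(-1.0)) + (f.cdf(1.0) - f.cdf(0.5))
    p_red = f.cdf(-1.0) + (1.0 - f.cdf(1.0))
    return p_green, p_yellow, p_red

# Example: a centered process whose spread 6*sigma covers 88% of the tolerance
pg, py, pr = pc_group_probs(0.0, 0.29333)
```

For a centered process with sigma = .29333 almost all units are green; the yellow and red probabilities are what drive the decision rules.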

Using Modified PC, on the other hand, the group classification scheme is the same as that used with Classical PC and Two-stage PC with one very important difference. Modified PC uses control limits, as defined in Shewhart charts, rather than tolerance limits to define the boundaries between green, yellow and red units. As shown later, this change is fundamental and makes Modified PC much more like a Shewhart chart than like Classical PC. Group probabilities for the Modified PC procedure are given in (1.2). Note that, to avoid confusion, throughout this article the current process mean and standard deviation are denoted µ and σ, whereas estimates of the in-control mean and standard deviation used to set the control limits for Modified PC and Shewhart-type charts are denoted µ_c and σ_c.

P_m(green) = \int_{\mu_c - 1.5\sigma_c}^{\mu_c + 1.5\sigma_c} f(y)\,dy
P_m(yellow) = \int_{\mu_c - 3\sigma_c}^{\mu_c - 1.5\sigma_c} f(y)\,dy + \int_{\mu_c + 1.5\sigma_c}^{\mu_c + 3\sigma_c} f(y)\,dy        (1.2)
P_m(red) = \int_{-\infty}^{\mu_c - 3\sigma_c} f(y)\,dy + \int_{\mu_c + 3\sigma_c}^{\infty} f(y)\,dy

The group probabilities given by Equations (1.1) and (1.2) are the same when µ_c = 0 and σ_c = 1/3. This makes sense because in that case control limits and tolerance limits are the same. It has been suggested that Classical PC and Two-stage PC are only applicable if the current process spread (six process standard deviations) covers less than 88% of the tolerance range (Traver, 1985). With specification limits at ±1, as defined previously, this condition corresponds to the constraint σ < .29333.

The second important distinction between the three PC versions is their decision criteria. With Classical PC the decision to continue operation or to adjust the process is based on only one or two sample units. The decision rules are given below:

Classical PC Decision Procedure
- Sample two consecutive parts A and B.
- If part A is green continue operation (no need to measure B).

- If part A is yellow measure part B. If part B is green continue operation, otherwise stop and adjust the process.
- If part A is red stop and adjust the process (no need to measure B).

The Two-stage PC and Modified PC decision criteria are more complicated since they are based on a two-stage procedure. If the initial two observations do not provide clear evidence regarding the state of the process, additional observations (up to three more) are taken. The decision criteria for Two-stage and Modified PC are given below:

Two-stage and Modified PC Decision Procedure
- Sample two consecutive parts. If either part is red stop the process and adjust. If both parts are green continue operation.
- If either or both of the parts are yellow continue to sample, up to an additional three units. In the combined sample, if three greens are obtained continue operation; if three yellows or a single red are observed stop the process.

The advantage of this more complicated decision procedure is that more information regarding the state of the process is obtained and thus decision errors are less likely. The disadvantage is that, on average, larger sample sizes are needed, and thus implementation is more demanding. Table 1 summarizes the comparison of the three versions of PC.

Table 1: Comparison of Pre-control Versions

Pre-control Version   Group Classification   Decision Criteria
Classical             tolerance limits       two observations
Two-stage             tolerance limits       five observations
Modified              control limits         five observations

Based on these comparisons it can be concluded that the purposes of the various versions of PC are quite different. By design, Classical PC and Two-stage PC tolerate

some deviation in the target mean of a process so long as the proportion nonconforming does not become too large. Thus, process stability is not required, and Classical and Two-stage PC can be quickly implemented since there is no need to estimate the current process mean and standard deviation and show stability. Thus Classical and Two-stage PC are very similar to Acceptance Control Charts. By contrast, the goal of Modified PC is to monitor the process for deviations from stability. As a result, mean drifts are not tolerated, and Modified PC is very similar to Shewhart-type charts such as X charts.

2. Comparison with Traditional Control Charting Methods

In this section the various PC methods are compared with appropriate traditional control charting techniques. In addition, the performance of PC under some special situations is explored. Classical and Two-stage PC are compared with Acceptance Control Charts (ACCs), and Modified PC is compared with an X chart. Previous comparisons of Classical and Two-stage PC with more traditional control charts were made with X & R charts (Mackertich, 1990; Ermer and Roepke, 1991); however, this is the wrong comparison. For both ACCs and Classical and Two-stage PC the goal is not statistical control, but rather producing parts within specification.

An ACC is designed to monitor a process when the process variability is much smaller than the specification (tolerance) spread. Under that assumption moderate drifts in the mean (from the target value) are tolerable since they do not yield a significant increase in the proportion of nonconforming units. Like Classical and Two-stage PC, ACCs are based on specification limits. However, ACC limits are derived based on a distributional assumption and require a known and constant process standard deviation. ACC limits are usually set assuming a normal process, although if justified other assumptions could be made. For more information on the design of ACCs see Ryan (1989).
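The Classical and Two-stage decision procedures of Section 1 can also be checked by direct simulation. The sketch below is illustrative, not from the article; the normal process model, seed, and trial count are arbitrary assumptions:

```python
# Monte Carlo sketch of the Classical and Two-stage PC decision rules,
# assuming a normal process with specification limits at -1 and 1.
import random

def classify(y):
    # t = [-1, -0.5, 0.5, 1]: central half green, rest of tolerance yellow
    if abs(y) <= 0.5:
        return "green"
    return "yellow" if abs(y) <= 1.0 else "red"

def classical_pc_accepts(draw):
    a = classify(draw())
    if a == "green":
        return True
    if a == "red":
        return False
    return classify(draw()) == "green"      # A was yellow: B must be green

def two_stage_pc_accepts(draw):
    colors = [classify(draw()), classify(draw())]
    if "red" in colors:
        return False
    if colors.count("green") == 2:
        return True
    while len(colors) < 5:                  # up to three additional units
        c = classify(draw())
        if c == "red":
            return False
        colors.append(c)
        if colors.count("green") == 3:      # three greens: continue operation
            return True
        if colors.count("yellow") == 3:     # three yellows: stop the process
            return False
    return True                             # unreachable: a decision occurs by unit 5

rng = random.Random(1)
draw = lambda: rng.gauss(0.0, 0.29333)      # in-control process, mu = 0
trials = 20000
p_accept = sum(two_stage_pc_accepts(draw) for _ in range(trials)) / trials
```

For a centered process the estimated probability of acceptance is close to one, consistent with the low false alarm rates discussed below.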
Modified PC, on the other hand, bases the grouping criteria on the control limits, and thus Modified PC has the same goal as a Shewhart X chart, namely statistical control of a process.

PC and traditional control charting techniques differ in a number of ways. The first obvious difference is that PC uses only information from the classified (or grouped) observations, whereas traditional control charts (X charts and ACCs) use variables data. Grouping data results in decision rules that are easy to implement, but clearly, the grouping also discards information. The loss of information can be quantified by calculating the expected statistical (Fisher) information available in the PC grouped observations. Calculating the Fisher information available in grouped data is discussed in Kulldorff (1961) and Steiner, Geyer and Wesolowsky (1995). Figure 2 shows the expected information about the mean and standard deviation for various parameter values. The plots are scaled so that variables data (an infinite number of groups) would provide an expected information content of unity for all values of µ and σ. The expected information graph about µ is symmetric about µ = 0; thus negative mean values are not shown. Figure 2 shows that the group classification scheme used in PC is not very efficient when µ is close to zero and/or σ is small. The small amount of information available when µ equals zero may not be a major concern if at that mean value it is very easy to conclude that the process should continue operation. This would be the case, for example, if the process standard deviation were very small compared with the specification range. This issue is explored further below.

Figure 2: Expected Information about µ (left, assuming σ = .2933) and σ (right, assuming µ = 0), t = [-1, -.5, .5, 1]
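The grouped-data information calculation can be sketched numerically. For a multinomial grouping with probabilities P_i(µ), the expected Fisher information about µ is the sum of P_i'(µ)^2 / P_i(µ), while for normal variables data it is 1/σ². The code below is an illustration, not the authors' computation; it assumes a normal model and approximates the derivatives by central differences:

```python
# Relative efficiency of the PC grouping: I_grouped(mu) * sigma^2, scaled so
# that variables data would give 1 (an assumed normal model throughout).
from statistics import NormalDist

def group_probs(mu, sigma):
    f = NormalDist(mu, sigma)
    pg = f.cdf(0.5) - f.cdf(-0.5)
    py = (f.cdf(-0.5) - f.cdf(-1.0)) + (f.cdf(1.0) - f.cdf(0.5))
    pr = 1.0 - pg - py
    return [pg, py, pr]

def grouped_info_about_mean(mu, sigma, h=1e-5):
    probs = group_probs(mu, sigma)
    hi = group_probs(mu + h, sigma)
    lo = group_probs(mu - h, sigma)
    info = 0.0
    for p, p_hi, p_lo in zip(probs, hi, lo):
        dp = (p_hi - p_lo) / (2 * h)        # central-difference derivative
        info += dp * dp / p
    return info

sigma = 0.29333
efficiency = grouped_info_about_mean(0.5, sigma) * sigma**2
```

Consistent with Figure 2, the efficiency is well below one, and at µ = 0 the three-group classification carries almost no information about the mean because the group probabilities are locally flat there.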

PC and traditional control charts also differ in their decision rules. All control charts require some decision rules to determine whether the process should be allowed to continue operation or whether it should be stopped. It is difficult to evaluate the effectiveness of the decision rules in isolation from the grouping criteria. However, the overall effectiveness of various control charts can be compared through their Operating Characteristic (OC) curves. An OC curve shows the probability a control chart signals (or fails to signal) for different parameter values. OC curves for ACCs and X charts can be derived from results given in Ryan (1989). The probability Classical and Two-stage PC schemes continue operating, called the probability of acceptance of the process, can be found by calculating the probability of each combination of green and yellow units that leads to acceptance. For Classical and Two-stage PC the results are given by Equations (2.1) and (2.2); see Salvia (1988).

P_accept(Classical) = P_g + P_y P_g        (2.1)

P_accept(Two-stage) = (P_g + P_y)^2 - 2 P_g P_y (1 - P_g^3 - 3 P_g^2 P_y) - P_y^2 (1 - P_g^3)        (2.2)

where P_g and P_y are given by Equations (1.1). Figure 3 shows the OC curves for Classical PC, Two-stage PC, as derived from (2.1) and (2.2), and an ACC with sample size n = 2. Figure 3 shows that an ACC with sample size n = 2 is superior to Classical PC since it has significantly better power to detect mean shifts. Similarly, the Two-stage PC scheme is superior to an ACC with n = 2. Note that different σ values simply shift the OC curves horizontally without changing the ranking. Also, a process whose quality characteristic is approximately normally distributed is assumed. A different underlying distribution for the quality characteristic may dramatically change the probabilities of acceptance, but generally all the considered procedures are similarly affected.
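Equations (2.1) and (2.2) are easy to evaluate directly. This sketch (not from the article) assumes a normal process and reproduces the qualitative ordering in Figure 3, with Two-stage PC having more power than Classical PC at large mean shifts:

```python
# Probability of acceptance for Classical and Two-stage PC, from (2.1) and
# (2.2), with group probabilities (1.1) under an assumed normal model.
from statistics import NormalDist

def group_probs(mu, sigma):
    f = NormalDist(mu, sigma)
    pg = f.cdf(0.5) - f.cdf(-0.5)
    py = (f.cdf(-0.5) - f.cdf(-1.0)) + (f.cdf(1.0) - f.cdf(0.5))
    return pg, py

def p_accept_classical(pg, py):
    return pg + py * pg                                      # equation (2.1)

def p_accept_two_stage(pg, py):
    return ((pg + py) ** 2
            - 2 * pg * py * (1 - pg ** 3 - 3 * pg ** 2 * py)
            - py ** 2 * (1 - pg ** 3))                       # equation (2.2)

sigma = 0.29333
oc_points = []
for shift in (0.0, 1.0, 2.0):                                # shift in sigma units
    pg, py = group_probs(shift * sigma, sigma)
    oc_points.append((shift, p_accept_classical(pg, py), p_accept_two_stage(pg, py)))
```

Both schemes accept a centered process with probability near one; at a two-sigma shift the Two-stage acceptance probability drops well below the Classical one, i.e., Two-stage PC signals more often.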

Figure 3: OC curves for Classical and Two-stage PC versus an ACC with n = 2, N(µ, σ = .2933) process

Note, however, that on average the Two-stage PC scheme requires a larger sample size. Two-stage PC uses a partially sequential procedure and requires sample sizes of between one and five units. In the Appendix, the average sampling number of the Two-stage PC procedure, denoted E(n), is derived. E(n) depends on the group probabilities, given by (1.1), that are in turn dependent on the process parameters µ and σ. Figure 4 shows plots of E(n) versus the process mean assuming normality. For example, E(n; µ = 0, σ = .29333) = 2.4, and E(n; µ = 2σ, σ = .29333) = 3.5.

Figure 4: Average sample size E(n) for Two-stage PC as a function of µ (σ = .29333 on left, σ = 0.2 on right), N(µ, σ) process
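The average sampling number can be computed from the path probabilities (A.1) derived in the Appendix. This sketch is not the authors' code and assumes a normal process; for a centered process with σ = .29333 it gives a value close to the 2.4 quoted above:

```python
# Average sampling number E(n) for Two-stage PC via the path
# probabilities (A.1), under an assumed normal model.
from statistics import NormalDist

def group_probs(mu, sigma):
    f = NormalDist(mu, sigma)
    pg = f.cdf(0.5) - f.cdf(-0.5)
    py = (f.cdf(-0.5) - f.cdf(-1.0)) + (f.cdf(1.0) - f.cdf(0.5))
    pr = 1.0 - pg - py
    return pg, py, pr

def expected_sample_size(mu, sigma):
    pg, py, pr = group_probs(mu, sigma)
    p = [
        pr,                                                        # p_1
        pg**2 + pr * (py + pg),                                    # p_2
        pr * py**2 + 2*pg*py*pr + py**3,                           # p_3
        3*py**2*pg*pr + 2*pg**2*py*pr + 2*pg**3*py + 3*py**3*pg,   # p_4
        5*py**2*pg**2*pr + 5*py**3*pg**2 + 5*pg**3*py**2,          # p_5
    ]
    assert abs(sum(p) - 1.0) < 1e-9          # the five paths are exhaustive
    return sum(i * pi for i, pi in enumerate(p, start=1))

en_centered = expected_sample_size(0.0, 0.29333)
en_shifted = expected_sample_size(2 * 0.29333, 0.29333)
```

As expected, E(n) grows with the mean shift, since shifted processes produce more yellow units and hence more second-stage sampling.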

These results suggest a comparison of Two-stage PC with ACCs having larger sample sizes. OC curves of Two-stage PC and ACCs with sample sizes of 3, 4 and 5 are shown in Figure 5. Clearly, of the compared curves the PC procedure has the least power to detect mean shifts. In addition, ACCs are likely easier to implement since they require fixed sample sizes. On the other hand, the power of Two-stage PC is fairly competitive with an ACC with n = 3, and ACCs have the disadvantage of requiring an estimate of σ, an assumption that σ does not change, and precise or variables measurements rather than grouped data. In addition, Two-stage PC requires a smaller sample size on average when the process is centered at µ = 0. In conclusion, Two-stage PC appears a reasonable alternative to ACCs when little process knowledge is available or if estimating the process mean and standard deviation is expensive.

Figure 5: Comparison of Two-stage PC and ACCs with n = 3, n = 4, n = 5, N(µ, σ = .29333) process

Classical and Two-stage PC schemes are designed to prevent defectives, but they do not adapt to changes in process variability. Figures 6 and 7 explore the relation between the probability Classical and Two-stage PC signal and the probability of a defect. Figure 6

shows a contour plot of the probability of a defect for various combinations of µ and σ, assuming USL = 1 and LSL = -1. The probability of a defect, or a nonconforming unit, is given by P_r in Equations (1.1). Figure 7 shows contour plots of the probability that Classical and Two-stage PC signal (one minus the probability of acceptance as given by (2.1) and (2.2)). Ideally Figures 6 and 7 would look similar; however, this is not the case. The PC methods are too conservative when σ is small. When σ is small compared with the specification limits a significant drift in the process mean can be tolerated before the proportion defective becomes large. However, using PC, large mean values together with small σ values are likely to lead to a signal since then many yellow units would be observed. As a result, PC is not applicable when σ is very small compared with the specification limits.

Figure 6: Contour plot of the probability of a defect

Figure 7: Contour plots of P(signal) for Classical and Two-stage PC

ACCs do not suffer from this shortcoming since they adjust their control limits for different σ values. Table 2 gives specific values shown on the contour plots and equivalent values from an ACC with n = 5 for a direct comparison. Most enlightening are the two cases µ = 0, σ = .2 and µ = .6, σ = .1. In both cases the probability of a defect is fairly small and approximately the same. However, when µ = 0, σ = .2 the probability the PC scheme signals is very small, whereas when µ = .6, σ = .1 the probability of a signal using PC is very large. Clearly, when σ is small, PC rejects some processes that yield very few defects. This is appropriate if the goal is also to keep the mean centered on the target value. However, the goal of PC is defect detection and prevention. Thus, PC is too conservative when σ is small, say when 6σ covers less than 40% of the tolerance range. As shown in Table 2, an ACC with n = 5 does not suffer from this problem to the same degree.

Table 2: Probability of defect and signals (columns: µ; σ; P(defect); P(signal) Classical PC; P(signal) Two-stage PC; P(signal) ACC n = 5)
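The two contrasting cases can be verified directly from (1.1) and (2.2). In this sketch (not from the article; normality assumed) both processes produce almost no defects, yet Two-stage PC signals almost surely for the off-center, small-σ process:

```python
# P(defect) and the Two-stage PC signal probability for one (mu, sigma)
# combination, from (1.1) and (2.2), under an assumed normal model.
from statistics import NormalDist

def defect_and_signal(mu, sigma):
    f = NormalDist(mu, sigma)
    pg = f.cdf(0.5) - f.cdf(-0.5)
    py = (f.cdf(-0.5) - f.cdf(-1.0)) + (f.cdf(1.0) - f.cdf(0.5))
    p_defect = f.cdf(-1.0) + (1.0 - f.cdf(1.0))            # P(red): outside spec
    p_accept = ((pg + py) ** 2
                - 2 * pg * py * (1 - pg ** 3 - 3 * pg ** 2 * py)
                - py ** 2 * (1 - pg ** 3))                 # equation (2.2)
    return p_defect, 1.0 - p_accept

d_centered, s_centered = defect_and_signal(0.0, 0.2)   # few defects, PC rarely signals
d_shifted, s_shifted = defect_and_signal(0.6, 0.1)     # few defects, yet PC signals often
```

The second case illustrates the conservatism: the defect rate is negligible, but the many yellow units make a signal nearly certain.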

Modified PC, unlike Classical and Two-stage PC, classifies units based on control limits. As a result, Modified PC is the same in philosophy as an X control chart, since its goal is to monitor the stability of a process. With Modified PC and X charts ideally any drift in the process mean would be detected. The operating characteristics of Modified PC can be determined using Equations (2.2) and (1.2) with µ_c = 0 and σ_c = 1/3. Modified PC is compared with traditional X control charts in Table 3. The average sample size of Modified PC can be determined through the result in the Appendix setting µ = 0 and σ = 1/3. A plot of the result would be similar to those shown in Figure 4.

Table 3: Modified PC compared with X control charts (columns: µ; P(signal) Modified PC; P(signal) X chart n = 3; P(signal) X chart n = 4; P(signal) X chart n = 5; rows: µ = 0, ±1σ, ±2σ)

Table 3 shows that Modified PC has a very high false alarm rate, an order of magnitude larger than that for X charts. When the process is stable (µ = 0) a Modified PC chart signals a problem on average almost 2.4% of the time. This false alarm rate is too high, since searching for assignable causes is usually time consuming and expensive, and too many false alarms greatly reduce the credibility of the control chart and may lead to it being ignored.

In summary, Classical and Two-stage PC are fairly competitive with ACCs, except under certain circumstances. When the process variation is very small compared with the tolerance range (6σ < .4T, where T is the tolerance range), Classical and Two-stage PC lead to undesirable results such as rejecting a process that is producing virtually all parts within specification. On the other hand, when the process variation is relatively large compared to the tolerance range (6σ > .88T), Classical and Two-stage PC yield

excessively large false alarm rates. However, if 6σ lies between 40% and 88% of the tolerance range (with specification limits of -1 and 1, .29333 > σ > .13333), Classical or Two-stage PC yield good results. Modified PC, on the other hand, is designed to have the same goal as a Shewhart chart. Thus, Modified PC is similar to Two-stage PC with 6σ equal to 100% of the tolerance range. As a result, Modified PC is not a good method since it yields too many false alarms.

3. Alternatives to Traditional Pre-control

As discussed in Section 2, PC has a number of important advantages over traditional control charts, mainly in terms of simplicity of implementation. However, its design choices, in particular the grouping criteria and the decision rules, appear quite ad hoc. We may ask whether the performance of PC could be improved while retaining its simplicity. In this section, three simple variations of PC called Ten Unit PC, Mean Shift PC, and Simplified PC are considered. Each variation is very similar to Two-stage PC, and requires only minor modifications in the design. The goal is not to develop the optimal approach under particular assumptions, but rather to consider simple changes that retain the flavor of PC. If optimal charts are desired, and more process information is available, the grouped data ACCs and Shewhart-type charts developed by Steiner, Geyer, and Wesolowsky (1994, 1995) should be considered.

One reason why Two-stage PC is fairly competitive with ACCs in terms of power is its partially sequential decision procedure. As a result, one possible improvement approach is to make the decision process more sequential. A totally sequential procedure is theoretically feasible, but would be difficult to implement since it would occasionally require large sample sizes and defining the acceptance/rejection criteria would be difficult. A version of PC that allows a maximum of ten units at each decision point is a compromise.
The proposed decision rules for Ten Unit PC are given below. This sampling procedure should be repeated each time the process is to be monitored.

Ten Unit PC Decision Process
- Take sample units one at a time, defining #G and #Y as the number of green and yellow units observed so far respectively.
- Stop the process if there are any red units, or if at any time #Y - #G ≥ 2 together with #Y ≥ 3, or if at any time #Y ≥ 5.
- Continue operation of the process and stop sampling if at any time #G - #Y ≥ 2.
- Otherwise, continue to sample.

By enumerating all the possible paths to acceptance and rejection of Ten Unit PC it is possible to derive expressions for the probability of a signal and the average sample size. Table 4 shows a comparison of the operating characteristics of the proposed Ten Unit PC scheme and Two-stage PC for various combinations of process mean and standard deviation. Table 4 shows that Ten Unit PC has lower false alarm rates, more power to detect two-sigma shifts in the mean, and requires approximately the same average number of units as Two-stage PC.

Table 4: Comparison of Ten Unit and Two-stage PC (columns: µ; σ; P(signal) Two-stage PC; E(n) Two-stage PC; P(signal) Ten Unit PC; E(n) Ten Unit PC)

A second improvement idea stems from the realization that using PC, units are classified into one of five regions (see Figure 1) but only three groups (green, yellow and red) are used during the decision process. As a result, important information pertaining to

the process mean is ignored. The proposed Mean Shift PC utilizes this information. Mean Shift PC is a very simple adaptation of Two-stage PC since it does not require any changes in the data collection process. The Mean Shift PC method classifies observations into one of four groups: Green, Yellow+, Yellow-, and Red. Red units above or below the specification limits are not distinguished because they are very rare and automatically lead to a signal in any case. The Mean Shift PC decision rules, given below, are similar to those used in Two-stage PC.

Mean Shift PC Decision Process
- Sample two consecutive parts. If either unit is red stop the process. If both units are green continue operation.
- If either or both units are yellow, sample an additional three units. If in the combined sample three greens are obtained continue the process. If three Yellow+ or three Yellow- or a single Red are observed then stop the process.

Table 5: Comparison of Two-stage PC and Mean Shift PC (columns: µ; σ; P(signal) Two-stage PC; P(signal) Mean Shift PC; rows: µ = 0, ±1σ, ±2σ)

Table 5 compares the power of the Mean Shift PC and Two-stage PC. The table shows that Mean Shift PC has a much smaller false alarm rate than Two-stage PC, and virtually identical power to detect two-sigma mean shifts. This advantage in terms of operating characteristics will typically outweigh the slightly more complicated decision

process. Thus, Mean Shift PC is a good alternative to Two-stage PC, though the process standard deviation should also be monitored in some way.

The third variation, called Simplified PC, uses only green and yellow units. The group probabilities and decision process for Simplified PC are given below:

P(green) = P_g = \int_{-.5}^{.5} f(y)\,dy
P(yellow) = P_y = \int_{-\infty}^{-.5} f(y)\,dy + \int_{.5}^{\infty} f(y)\,dy        (3.1)

Simplified PC Decision Process
- Sample five consecutive parts.
- If three or more units are yellow, stop the process.
- Otherwise continue operation.

The loss of efficiency using Simplified PC rather than Two-stage PC is very slight when the process variation is in the range where PC is appropriate. This small loss of efficiency arises because the probability of a red unit is usually very small, and thus the red classification provides very little information. In fact, as designed, Simplified PC has a slightly smaller false alarm rate, though not quite as much power to detect mean shifts. Table 6 shows a comparison. Compared with Two-stage PC, Simplified PC has the advantage of requiring less effort in the group classification, and the decision rules are more straightforward. In addition, although Simplified PC requires a larger sample size on average, often the required sample sizes are the same, and implementation is usually easier if the sample size is fixed.
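Because Simplified PC uses a fixed sample of five and a single count rule, its signal probability is simply a binomial tail in P_y. This sketch is not the authors' code; the normal model is an assumption, and the optional limit c anticipates the adapted group limits [-c, c] discussed next:

```python
# P(signal) for Simplified PC: three or more yellows in a sample of five,
# where a unit is yellow whenever it falls outside the group limits [-c, c].
from math import comb
from statistics import NormalDist

def p_signal_simplified(mu, sigma, c=0.5, n=5, k=3):
    f = NormalDist(mu, sigma)
    p_y = 1.0 - (f.cdf(c) - f.cdf(-c))      # yellow probability, as in (3.1)
    return sum(comb(n, j) * p_y**j * (1.0 - p_y)**(n - j) for j in range(k, n + 1))

false_alarm = p_signal_simplified(0.0, 0.29333)   # centered process, default limits
```

With the widened limits c = .7 and σ = .1, this function reproduces the operating characteristics quoted below for the adapted scheme: a signal probability of about .031 at the APL mean µ = .6 and about .969 at the RPL mean µ = .8.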

Table 6: Comparison of Two-stage PC and Simplified PC (columns: µ; σ; P(signal) Two-stage PC; P(signal) Simplified PC; rows: µ = 0, ±1σ, ±2σ)

Simplified PC is also appealing since it can fairly easily be adapted when the process variation is much smaller than the tolerance range. As shown in Section 2, PC is not appropriate if the process standard deviation is less than around 1/15th of the tolerance range. When σ is small, observing yellow units with the traditional grouping criterion, as defined by (1.1), is not necessarily evidence of a problem; see Figures 6 and 7. In this circumstance the Simplified PC group limits [-.5, .5] can be replaced with the group limits [-c, c], where c is closer to the tolerance limits (.5 < c < 1). By adjusting the group limits it is possible to detect only process mean shifts that yield an excessive proportion of nonconforming units. The decision process remains unchanged. The best value for c depends on the process standard deviation and the acceptable and rejectable process levels (APL and RPL). A good choice for c is the midpoint between the APL mean and the RPL mean. APL and RPL mean values can be estimated from acceptable and unacceptable process capability values. Process capability is defined as C_pk = min((USL - µ)/3σ, (µ - LSL)/3σ), where LSL and USL are the lower and upper specification limits respectively. For example, assume the process standard deviation is σ = .1, and that an acceptable process capability value is C_pk = 4/3 (an average of 6 defects out of 100,000 units) whereas C_pk = 2/3 (an average of 94 defects out of 10,000 units) is unacceptable. Then, looking only at the upper specification limit (the problem is symmetric on the lower limit), the corresponding APL and RPL mean values are µ = .6 and µ = .8 respectively. In this case choosing c = .7 is reasonable. Figure 8 illustrates the operating

characteristics obtained in this example for various parameter values. For example, the probability of a signal when µ = .6 is .031 and the probability of a signal when µ = .8 is .969. If this false alarm rate is too large, adjusting the decision barrier c slightly higher may be appropriate. An OC curve for Two-stage PC would show a much greater chance of rejecting the process at the acceptable mean value µ = .6.

Figure 8: OC curve of the adapted Simplified PC procedure

4. Conclusions

This article compares and contrasts previously recommended PC approaches and suggests simple variations that improve performance. Classical and Two-stage PC are compared with acceptance control charts rather than X charts since this is a more appropriate comparison. It is concluded that Classical and Two-stage PC are good methods when the process standard deviation σ lies in the range T/15 ≤ σ ≤ 11T/75, where T is the tolerance range. Two-stage PC is preferred over Classical PC unless the additional sampling required is very onerous. Modified PC, on the other hand, has the same goal as an X chart, but is shown to have an excessively large false alarm rate, and is thus not recommended.

Three alternatives to PC are suggested that retain the great advantage of very simple implementation. The Ten Unit approach is recommended when improved operating

characteristics are desired and taking some additional observations is not difficult. The Mean Shift approach yields better results than Two-stage PC when the goal is the detection of process mean shifts, and Simplified PC utilizes a simplified classification and decision procedure while providing similar operating characteristics. Simplified PC can also easily be adapted to handle situations when the process standard deviation is smaller than T/15.

Appendix

In this appendix an expression for the average sampling number E(n) of the Two-stage PC scheme, discussed in Section 2, is derived. E(n) is determined by calculating the probability of each possible path to either acceptance or rejection of the process. Below in (A.1), p_i is the probability that Two-stage PC reaches a decision in i units. For example, Two-stage PC would stop after two units if we observe either two green units (process acceptable), or if the first observation is yellow or green and the second observation is red (process rejectable).

p_1 = p_r
p_2 = p_g^2 + p_r (p_y + p_g)
p_3 = p_r p_y^2 + 2 p_g p_y p_r + p_y^3        (A.1)
p_4 = 3 p_y^2 p_g p_r + 2 p_g^2 p_y p_r + 2 p_g^3 p_y + 3 p_y^3 p_g
p_5 = 5 p_y^2 p_g^2 p_r + 5 p_y^3 p_g^2 + 5 p_g^3 p_y^2

E(n) = \sum_{i=1}^{5} i p_i

References

Ermer, D.S., Roepke, J.R. (1991), "An Analytical Analysis of Pre-control," ASQC Quality Congress Transactions - Milwaukee.

Gurska, G.F., Heaphy, M.S. (1991), "STOP LIGHT CONTROL - Revisited," ASQC Statistics Division Newsletter, Fall.

Kulldorff, G. (1961), Contributions to the Theory of Estimation from Grouped and Partially Grouped Samples, John Wiley and Sons, New York.

Mackertich, N.A. (1990), "Pre-control vs. Control Charting: A Critical Comparison," Quality Engineering, 2.

Ryan, T.P. (1989), Statistical Methods for Quality Improvement, John Wiley & Sons, New York.

Shainin, D., Shainin, P. (1989), "Pre-control Versus X & R Charting: Continuous or Immediate Quality Improvement?," Quality Engineering, 1.

Salvia, A.A. (1988), "Stoplight Control," Quality Progress, September.

Steiner, S.H., Geyer, P.L., Wesolowsky, G.O. (1994), "Control Charts Based on Grouped Data," International Journal of Production Research, 32.

Steiner, S.H., Geyer, P.L., Wesolowsky, G.O. (1995), "Shewhart Control Charts to Detect Mean Shifts Based on Grouped Data," submitted to Quality and Reliability Engineering International.

Traver, R.W. (1985), "Pre-control: A Good Alternative to X-R Charts," Quality Progress, September.


More information

2.3. Quality Assurance: The activities that have to do with making sure that the quality of a product is what it should be.

2.3. Quality Assurance: The activities that have to do with making sure that the quality of a product is what it should be. 5.2. QUALITY CONTROL /QUALITY ASSURANCE 5.2.1. STATISTICS 1. ACKNOWLEDGEMENT This paper has been copied directly from the HMA Manual with a few modifications from the original version. The original version

More information

Part One of this article (1) introduced the concept

Part One of this article (1) introduced the concept Establishing Acceptance Limits for Uniformity of Dosage Units: Part Two Pramote Cholayudth The concept of sampling distribution of acceptance value (AV) was introduced in Part One of this article series.

More information

Control Charts. An Introduction to Statistical Process Control

Control Charts. An Introduction to Statistical Process Control An Introduction to Statistical Process Control Course Content Prerequisites Course Objectives What is SPC? Control Chart Basics Out of Control Conditions SPC vs. SQC Individuals and Moving Range Chart

More information

Chapter 2 Modeling Distributions of Data

Chapter 2 Modeling Distributions of Data Chapter 2 Modeling Distributions of Data Section 2.1 Describing Location in a Distribution Describing Location in a Distribution Learning Objectives After this section, you should be able to: FIND and

More information

Chapter 7: Dual Modeling in the Presence of Constant Variance

Chapter 7: Dual Modeling in the Presence of Constant Variance Chapter 7: Dual Modeling in the Presence of Constant Variance 7.A Introduction An underlying premise of regression analysis is that a given response variable changes systematically and smoothly due to

More information

CHAPTER 2 Modeling Distributions of Data

CHAPTER 2 Modeling Distributions of Data CHAPTER 2 Modeling Distributions of Data 2.2 Density Curves and Normal Distributions The Practice of Statistics, 5th Edition Starnes, Tabor, Yates, Moore Bedford Freeman Worth Publishers Density Curves

More information

Dealing with Categorical Data Types in a Designed Experiment

Dealing with Categorical Data Types in a Designed Experiment Dealing with Categorical Data Types in a Designed Experiment Part II: Sizing a Designed Experiment When Using a Binary Response Best Practice Authored by: Francisco Ortiz, PhD STAT T&E COE The goal of

More information

Analytic Performance Models for Bounded Queueing Systems

Analytic Performance Models for Bounded Queueing Systems Analytic Performance Models for Bounded Queueing Systems Praveen Krishnamurthy Roger D. Chamberlain Praveen Krishnamurthy and Roger D. Chamberlain, Analytic Performance Models for Bounded Queueing Systems,

More information

Acceptance Sampling by Variables

Acceptance Sampling by Variables Acceptance Sampling by Variables Advantages of Variables Sampling o Smaller sample sizes are required o Measurement data usually provide more information about the manufacturing process o When AQLs are

More information

Optimal Clustering and Statistical Identification of Defective ICs using I DDQ Testing

Optimal Clustering and Statistical Identification of Defective ICs using I DDQ Testing Optimal Clustering and Statistical Identification of Defective ICs using I DDQ Testing A. Rao +, A.P. Jayasumana * and Y.K. Malaiya* *Colorado State University, Fort Collins, CO 8523 + PalmChip Corporation,

More information

STA Module 4 The Normal Distribution

STA Module 4 The Normal Distribution STA 2023 Module 4 The Normal Distribution Learning Objectives Upon completing this module, you should be able to 1. Explain what it means for a variable to be normally distributed or approximately normally

More information

STA /25/12. Module 4 The Normal Distribution. Learning Objectives. Let s Look at Some Examples of Normal Curves

STA /25/12. Module 4 The Normal Distribution. Learning Objectives. Let s Look at Some Examples of Normal Curves STA 2023 Module 4 The Normal Distribution Learning Objectives Upon completing this module, you should be able to 1. Explain what it means for a variable to be normally distributed or approximately normally

More information

Graphical Analysis of Data using Microsoft Excel [2016 Version]

Graphical Analysis of Data using Microsoft Excel [2016 Version] Graphical Analysis of Data using Microsoft Excel [2016 Version] Introduction In several upcoming labs, a primary goal will be to determine the mathematical relationship between two variable physical parameters.

More information

Beware the Tukey Control Chart

Beware the Tukey Control Chart Quality Digest Daily, August, 213 Manuscript 28 Another bad idea surfaces Donald J. Wheeler I recently read about a technique for analyzing data called the Tukey control chart. Since Professor John Tukey

More information

MORPHOLOGICAL BOUNDARY BASED SHAPE REPRESENTATION SCHEMES ON MOMENT INVARIANTS FOR CLASSIFICATION OF TEXTURES

MORPHOLOGICAL BOUNDARY BASED SHAPE REPRESENTATION SCHEMES ON MOMENT INVARIANTS FOR CLASSIFICATION OF TEXTURES International Journal of Computer Science and Communication Vol. 3, No. 1, January-June 2012, pp. 125-130 MORPHOLOGICAL BOUNDARY BASED SHAPE REPRESENTATION SCHEMES ON MOMENT INVARIANTS FOR CLASSIFICATION

More information

Flexible Lag Definition for Experimental Variogram Calculation

Flexible Lag Definition for Experimental Variogram Calculation Flexible Lag Definition for Experimental Variogram Calculation Yupeng Li and Miguel Cuba The inference of the experimental variogram in geostatistics commonly relies on the method of moments approach.

More information

Noise-based Feature Perturbation as a Selection Method for Microarray Data

Noise-based Feature Perturbation as a Selection Method for Microarray Data Noise-based Feature Perturbation as a Selection Method for Microarray Data Li Chen 1, Dmitry B. Goldgof 1, Lawrence O. Hall 1, and Steven A. Eschrich 2 1 Department of Computer Science and Engineering

More information

5 Classifications of Accuracy and Standards

5 Classifications of Accuracy and Standards 5 Classifications of Accuracy and Standards 5.1 Classifications of Accuracy All surveys performed by Caltrans or others on all Caltrans-involved transportation improvement projects shall be classified

More information

Statistical Process Control: A Case-Study on Haleeb Foods Ltd., Lahore

Statistical Process Control: A Case-Study on Haleeb Foods Ltd., Lahore 11 ISSN 1684 8403 Journal of Statistics Vol: 12, No.1 (2005) Statistical Process Control: A Case-Study on Haleeb Foods Ltd., Lahore Sarwat Zahara Khan *, Muhammad Khalid Pervaiz * and Mueen-ud-Din Azad

More information

Performance Contracts in SDN Systems

Performance Contracts in SDN Systems Performance Contracts in SDN Systems May 2017 Published in IEEE Softwarization - May 2017 Abstract SDN virtualizes connectivity and access to the underlying bearers. This enables more variety of routes

More information

Moving Average (MA) Charts

Moving Average (MA) Charts Moving Average (MA) Charts Summary The Moving Average Charts procedure creates control charts for a single numeric variable where the data have been collected either individually or in subgroups. In contrast

More information

UvA-DARE (Digital Academic Repository) Memory-type control charts in statistical process control Abbas, N. Link to publication

UvA-DARE (Digital Academic Repository) Memory-type control charts in statistical process control Abbas, N. Link to publication UvA-DARE (Digital Academic Repository) Memory-type control charts in statistical process control Abbas, N. Link to publication Citation for published version (APA): Abbas, N. (2012). Memory-type control

More information

Machine Learning Techniques for Data Mining

Machine Learning Techniques for Data Mining Machine Learning Techniques for Data Mining Eibe Frank University of Waikato New Zealand 10/25/2000 1 PART V Credibility: Evaluating what s been learned 10/25/2000 2 Evaluation: the key to success How

More information

Process capability analysis

Process capability analysis 6 Process capability analysis In general, process capability indices have been quite controversial. (Ryan, 2000, p. 186) Overview Capability indices are widely used in assessing how well processes perform

More information

Detecting Polytomous Items That Have Drifted: Using Global Versus Step Difficulty 1,2. Xi Wang and Ronald K. Hambleton

Detecting Polytomous Items That Have Drifted: Using Global Versus Step Difficulty 1,2. Xi Wang and Ronald K. Hambleton Detecting Polytomous Items That Have Drifted: Using Global Versus Step Difficulty 1,2 Xi Wang and Ronald K. Hambleton University of Massachusetts Amherst Introduction When test forms are administered to

More information

Multiple Comparisons of Treatments vs. a Control (Simulation)

Multiple Comparisons of Treatments vs. a Control (Simulation) Chapter 585 Multiple Comparisons of Treatments vs. a Control (Simulation) Introduction This procedure uses simulation to analyze the power and significance level of two multiple-comparison procedures that

More information

PASS Sample Size Software. Randomization Lists

PASS Sample Size Software. Randomization Lists Chapter 880 Introduction This module is used to create a randomization list for assigning subjects to one of up to eight groups or treatments. Six randomization algorithms are available. Four of the algorithms

More information

Scanner Parameter Estimation Using Bilevel Scans of Star Charts

Scanner Parameter Estimation Using Bilevel Scans of Star Charts ICDAR, Seattle WA September Scanner Parameter Estimation Using Bilevel Scans of Star Charts Elisa H. Barney Smith Electrical and Computer Engineering Department Boise State University, Boise, Idaho 8375

More information

Digital Image Processing. Prof. P. K. Biswas. Department of Electronic & Electrical Communication Engineering

Digital Image Processing. Prof. P. K. Biswas. Department of Electronic & Electrical Communication Engineering Digital Image Processing Prof. P. K. Biswas Department of Electronic & Electrical Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 21 Image Enhancement Frequency Domain Processing

More information

What is Process Capability?

What is Process Capability? 6. Process or Product Monitoring and Control 6.1. Introduction 6.1.6. What is Process Capability? Process capability compares the output of an in-control process to the specification limits by using capability

More information

Stat 528 (Autumn 2008) Density Curves and the Normal Distribution. Measures of center and spread. Features of the normal distribution

Stat 528 (Autumn 2008) Density Curves and the Normal Distribution. Measures of center and spread. Features of the normal distribution Stat 528 (Autumn 2008) Density Curves and the Normal Distribution Reading: Section 1.3 Density curves An example: GRE scores Measures of center and spread The normal distribution Features of the normal

More information

NCSS Statistical Software

NCSS Statistical Software Chapter 245 Introduction This procedure generates R control charts for variables. The format of the control charts is fully customizable. The data for the subgroups can be in a single column or in multiple

More information

Network Traffic Measurements and Analysis

Network Traffic Measurements and Analysis DEIB - Politecnico di Milano Fall, 2017 Sources Hastie, Tibshirani, Friedman: The Elements of Statistical Learning James, Witten, Hastie, Tibshirani: An Introduction to Statistical Learning Andrew Ng:

More information

appstats6.notebook September 27, 2016

appstats6.notebook September 27, 2016 Chapter 6 The Standard Deviation as a Ruler and the Normal Model Objectives: 1.Students will calculate and interpret z scores. 2.Students will compare/contrast values from different distributions using

More information

Improving the Post-Smoothing of Test Norms with Kernel Smoothing

Improving the Post-Smoothing of Test Norms with Kernel Smoothing Improving the Post-Smoothing of Test Norms with Kernel Smoothing Anli Lin Qing Yi Michael J. Young Pearson Paper presented at the Annual Meeting of National Council on Measurement in Education, May 1-3,

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 4,000 116,000 120M Open access books available International authors and editors Downloads Our

More information

Averages and Variation

Averages and Variation Averages and Variation 3 Copyright Cengage Learning. All rights reserved. 3.1-1 Section 3.1 Measures of Central Tendency: Mode, Median, and Mean Copyright Cengage Learning. All rights reserved. 3.1-2 Focus

More information

Name: Date: Period: Chapter 2. Section 1: Describing Location in a Distribution

Name: Date: Period: Chapter 2. Section 1: Describing Location in a Distribution Name: Date: Period: Chapter 2 Section 1: Describing Location in a Distribution Suppose you earned an 86 on a statistics quiz. The question is: should you be satisfied with this score? What if it is the

More information

Pedestrian Detection Using Correlated Lidar and Image Data EECS442 Final Project Fall 2016

Pedestrian Detection Using Correlated Lidar and Image Data EECS442 Final Project Fall 2016 edestrian Detection Using Correlated Lidar and Image Data EECS442 Final roject Fall 2016 Samuel Rohrer University of Michigan rohrer@umich.edu Ian Lin University of Michigan tiannis@umich.edu Abstract

More information

Application of Characteristic Function Method in Target Detection

Application of Characteristic Function Method in Target Detection Application of Characteristic Function Method in Target Detection Mohammad H Marhaban and Josef Kittler Centre for Vision, Speech and Signal Processing University of Surrey Surrey, GU2 7XH, UK eep5mm@ee.surrey.ac.uk

More information

To calculate the arithmetic mean, sum all the values and divide by n (equivalently, multiple 1/n): 1 n. = 29 years.

To calculate the arithmetic mean, sum all the values and divide by n (equivalently, multiple 1/n): 1 n. = 29 years. 3: Summary Statistics Notation Consider these 10 ages (in years): 1 4 5 11 30 50 8 7 4 5 The symbol n represents the sample size (n = 10). The capital letter X denotes the variable. x i represents the

More information

Continuous Improvement Toolkit. Normal Distribution. Continuous Improvement Toolkit.

Continuous Improvement Toolkit. Normal Distribution. Continuous Improvement Toolkit. Continuous Improvement Toolkit Normal Distribution The Continuous Improvement Map Managing Risk FMEA Understanding Performance** Check Sheets Data Collection PDPC RAID Log* Risk Analysis* Benchmarking***

More information

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini Metaheuristic Development Methodology Fall 2009 Instructor: Dr. Masoud Yaghini Phases and Steps Phases and Steps Phase 1: Understanding Problem Step 1: State the Problem Step 2: Review of Existing Solution

More information

OPTIMIZING A VIDEO PREPROCESSOR FOR OCR. MR IBM Systems Dev Rochester, elopment Division Minnesota

OPTIMIZING A VIDEO PREPROCESSOR FOR OCR. MR IBM Systems Dev Rochester, elopment Division Minnesota OPTIMIZING A VIDEO PREPROCESSOR FOR OCR MR IBM Systems Dev Rochester, elopment Division Minnesota Summary This paper describes how optimal video preprocessor performance can be achieved using a software

More information

3. Data Analysis and Statistics

3. Data Analysis and Statistics 3. Data Analysis and Statistics 3.1 Visual Analysis of Data 3.2.1 Basic Statistics Examples 3.2.2 Basic Statistical Theory 3.3 Normal Distributions 3.4 Bivariate Data 3.1 Visual Analysis of Data Visual

More information

Math 214 Introductory Statistics Summer Class Notes Sections 3.2, : 1-21 odd 3.3: 7-13, Measures of Central Tendency

Math 214 Introductory Statistics Summer Class Notes Sections 3.2, : 1-21 odd 3.3: 7-13, Measures of Central Tendency Math 14 Introductory Statistics Summer 008 6-9-08 Class Notes Sections 3, 33 3: 1-1 odd 33: 7-13, 35-39 Measures of Central Tendency odd Notation: Let N be the size of the population, n the size of the

More information

APPROACHES TO THE PROCESS CAPABILITY ANALYSIS IN THE CASE OF NON- NORMALLY DISTRIBUTED PRODUCT QUALITY CHARACTERISTIC

APPROACHES TO THE PROCESS CAPABILITY ANALYSIS IN THE CASE OF NON- NORMALLY DISTRIBUTED PRODUCT QUALITY CHARACTERISTIC APPROACHES TO THE PROCESS CAPABILITY ANALYSIS IN THE CASE OF NON- NORMALLY DISTRIBUTED PRODUCT QUALITY CHARACTERISTIC Jiří PLURA, Milan ZEMEK, Pavel KLAPUT VŠB-Technical University of Ostrava, Faculty

More information

AND NUMERICAL SUMMARIES. Chapter 2

AND NUMERICAL SUMMARIES. Chapter 2 EXPLORING DATA WITH GRAPHS AND NUMERICAL SUMMARIES Chapter 2 2.1 What Are the Types of Data? 2.1 Objectives www.managementscientist.org 1. Know the definitions of a. Variable b. Categorical versus quantitative

More information

Pair-Wise Multiple Comparisons (Simulation)

Pair-Wise Multiple Comparisons (Simulation) Chapter 580 Pair-Wise Multiple Comparisons (Simulation) Introduction This procedure uses simulation analyze the power and significance level of three pair-wise multiple-comparison procedures: Tukey-Kramer,

More information

Incompatibility Dimensions and Integration of Atomic Commit Protocols

Incompatibility Dimensions and Integration of Atomic Commit Protocols The International Arab Journal of Information Technology, Vol. 5, No. 4, October 2008 381 Incompatibility Dimensions and Integration of Atomic Commit Protocols Yousef Al-Houmaily Department of Computer

More information

Chapters 5-6: Statistical Inference Methods

Chapters 5-6: Statistical Inference Methods Chapters 5-6: Statistical Inference Methods Chapter 5: Estimation (of population parameters) Ex. Based on GSS data, we re 95% confident that the population mean of the variable LONELY (no. of days in past

More information

Avoiding Costs From Oversizing Datacenter Infrastructure

Avoiding Costs From Oversizing Datacenter Infrastructure Avoiding Costs From Oversizing Datacenter Infrastructure White Paper # 37 Executive Summary The physical and power infrastructure of data centers is typically oversized by more than 100%. Statistics related

More information

Quantification of the characteristics that influence the monitoring of crimping operations of electric terminals for use in the automotive industry

Quantification of the characteristics that influence the monitoring of crimping operations of electric terminals for use in the automotive industry Quantification of the characteristics that influence the monitoring of crimping operations of electric terminals for use in the automotive industry Vasco A. van Zeller vascovanzeller@ist.utl.pt Instituto

More information

Lecture 3 Questions that we should be able to answer by the end of this lecture:

Lecture 3 Questions that we should be able to answer by the end of this lecture: Lecture 3 Questions that we should be able to answer by the end of this lecture: Which is the better exam score? 67 on an exam with mean 50 and SD 10 or 62 on an exam with mean 40 and SD 12 Is it fair

More information

CHAPTER 4 WEIGHT-BASED DESIRABILITY METHOD TO SOLVE MULTI-RESPONSE PROBLEMS

CHAPTER 4 WEIGHT-BASED DESIRABILITY METHOD TO SOLVE MULTI-RESPONSE PROBLEMS 72 CHAPTER 4 WEIGHT-BASED DESIRABILITY METHOD TO SOLVE MULTI-RESPONSE PROBLEMS 4.1 INTRODUCTION Optimizing the quality of a product is widespread in the industry. Products have to be manufactured such

More information

Categorical Data in a Designed Experiment Part 2: Sizing with a Binary Response

Categorical Data in a Designed Experiment Part 2: Sizing with a Binary Response Categorical Data in a Designed Experiment Part 2: Sizing with a Binary Response Authored by: Francisco Ortiz, PhD Version 2: 19 July 2018 Revised 18 October 2018 The goal of the STAT COE is to assist in

More information

Lecture 3 Questions that we should be able to answer by the end of this lecture:

Lecture 3 Questions that we should be able to answer by the end of this lecture: Lecture 3 Questions that we should be able to answer by the end of this lecture: Which is the better exam score? 67 on an exam with mean 50 and SD 10 or 62 on an exam with mean 40 and SD 12 Is it fair

More information

Constructive floorplanning with a yield objective

Constructive floorplanning with a yield objective Constructive floorplanning with a yield objective Rajnish Prasad and Israel Koren Department of Electrical and Computer Engineering University of Massachusetts, Amherst, MA 13 E-mail: rprasad,koren@ecs.umass.edu

More information

Assignment 4/5 Statistics Due: Nov. 29

Assignment 4/5 Statistics Due: Nov. 29 Assignment 4/5 Statistics 5.301 Due: Nov. 29 1. Two decision rules are given here. Assume they apply to a normally distributed quality characteristic, the control chart has three-sigma control limits,

More information

α - CUT FUZZY CONTROL CHARTS FOR BOTTLE BURSTING STRENGTH DATA

α - CUT FUZZY CONTROL CHARTS FOR BOTTLE BURSTING STRENGTH DATA International Journal of Electronics, Communication & Instrumentation Engineering Research and Development (IJECIERD ISSN 2249-684X Vol. 2 Issue 4 Dec 2012 17-30 TJPRC Pvt. Ltd., α - CUT FUZZY CONTROL

More information

A Path Decomposition Approach for Computing Blocking Probabilities in Wavelength-Routing Networks

A Path Decomposition Approach for Computing Blocking Probabilities in Wavelength-Routing Networks IEEE/ACM TRANSACTIONS ON NETWORKING, VOL. 8, NO. 6, DECEMBER 2000 747 A Path Decomposition Approach for Computing Blocking Probabilities in Wavelength-Routing Networks Yuhong Zhu, George N. Rouskas, Member,

More information

Using Excel for Graphical Analysis of Data

Using Excel for Graphical Analysis of Data Using Excel for Graphical Analysis of Data Introduction In several upcoming labs, a primary goal will be to determine the mathematical relationship between two variable physical parameters. Graphs are

More information

John A. Conte, P.E. 2/22/2012 1

John A. Conte, P.E. 2/22/2012 1 John A. Conte, P.E. 2/22/2012 1 Objectives Excited to be here! Students, faculty, engineers Share my engineering career Some thoughts on Six Sigma Some thoughts on Process Capability Cp, Cpk, Pp and Ppk

More information

ANNUAL REPORT OF HAIL STUDIES NEIL G, TOWERY AND RAND I OLSON. Report of Research Conducted. 15 May May For. The Country Companies

ANNUAL REPORT OF HAIL STUDIES NEIL G, TOWERY AND RAND I OLSON. Report of Research Conducted. 15 May May For. The Country Companies ISWS CR 182 Loan c.l ANNUAL REPORT OF HAIL STUDIES BY NEIL G, TOWERY AND RAND I OLSON Report of Research Conducted 15 May 1976-14 May 1977 For The Country Companies May 1977 ANNUAL REPORT OF HAIL STUDIES

More information

6-1 THE STANDARD NORMAL DISTRIBUTION

6-1 THE STANDARD NORMAL DISTRIBUTION 6-1 THE STANDARD NORMAL DISTRIBUTION The major focus of this chapter is the concept of a normal probability distribution, but we begin with a uniform distribution so that we can see the following two very

More information

INF 4300 Classification III Anne Solberg The agenda today:

INF 4300 Classification III Anne Solberg The agenda today: INF 4300 Classification III Anne Solberg 28.10.15 The agenda today: More on estimating classifier accuracy Curse of dimensionality and simple feature selection knn-classification K-means clustering 28.10.15

More information

Convex combination of adaptive filters for a variable tap-length LMS algorithm

Convex combination of adaptive filters for a variable tap-length LMS algorithm Loughborough University Institutional Repository Convex combination of adaptive filters for a variable tap-length LMS algorithm This item was submitted to Loughborough University's Institutional Repository

More information

Quantitative Models for Performance Enhancement of Information Retrieval from Relational Databases

Quantitative Models for Performance Enhancement of Information Retrieval from Relational Databases Quantitative Models for Performance Enhancement of Information Retrieval from Relational Databases Jenna Estep Corvis Corporation, Columbia, MD 21046 Natarajan Gautam Harold and Inge Marcus Department

More information

Modified S-Control Chart for Specified value of Cp

Modified S-Control Chart for Specified value of Cp American International Journal of Research in Science, Technology, Engineering & Mathematics Available online at http://www.iasir.net ISSN (Print): 38-349, ISSN (Online): 38-358, ISSN (CD-ROM): 38-369

More information

2.1 Objectives. Math Chapter 2. Chapter 2. Variable. Categorical Variable EXPLORING DATA WITH GRAPHS AND NUMERICAL SUMMARIES

2.1 Objectives. Math Chapter 2. Chapter 2. Variable. Categorical Variable EXPLORING DATA WITH GRAPHS AND NUMERICAL SUMMARIES EXPLORING DATA WITH GRAPHS AND NUMERICAL SUMMARIES Chapter 2 2.1 Objectives 2.1 What Are the Types of Data? www.managementscientist.org 1. Know the definitions of a. Variable b. Categorical versus quantitative

More information

SELECTION OF A MULTIVARIATE CALIBRATION METHOD

SELECTION OF A MULTIVARIATE CALIBRATION METHOD SELECTION OF A MULTIVARIATE CALIBRATION METHOD 0. Aim of this document Different types of multivariate calibration methods are available. The aim of this document is to help the user select the proper

More information

Vocabulary. 5-number summary Rule. Area principle. Bar chart. Boxplot. Categorical data condition. Categorical variable.

Vocabulary. 5-number summary Rule. Area principle. Bar chart. Boxplot. Categorical data condition. Categorical variable. 5-number summary 68-95-99.7 Rule Area principle Bar chart Bimodal Boxplot Case Categorical data Categorical variable Center Changing center and spread Conditional distribution Context Contingency table

More information

WHAT YOU SHOULD LEARN

WHAT YOU SHOULD LEARN GRAPHS OF EQUATIONS WHAT YOU SHOULD LEARN Sketch graphs of equations. Find x- and y-intercepts of graphs of equations. Use symmetry to sketch graphs of equations. Find equations of and sketch graphs of

More information

Chapter 2. Descriptive Statistics: Organizing, Displaying and Summarizing Data

Chapter 2. Descriptive Statistics: Organizing, Displaying and Summarizing Data Chapter 2 Descriptive Statistics: Organizing, Displaying and Summarizing Data Objectives Student should be able to Organize data Tabulate data into frequency/relative frequency tables Display data graphically

More information

Lecture 3: Linear Classification

Lecture 3: Linear Classification Lecture 3: Linear Classification Roger Grosse 1 Introduction Last week, we saw an example of a learning task called regression. There, the goal was to predict a scalar-valued target from a set of features.

More information

EDGE EXTRACTION ALGORITHM BASED ON LINEAR PERCEPTION ENHANCEMENT

EDGE EXTRACTION ALGORITHM BASED ON LINEAR PERCEPTION ENHANCEMENT EDGE EXTRACTION ALGORITHM BASED ON LINEAR PERCEPTION ENHANCEMENT Fan ZHANG*, Xianfeng HUANG, Xiaoguang CHENG, Deren LI State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing,

More information

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES 6.1 INTRODUCTION The exploration of applications of ANN for image classification has yielded satisfactory results. But, the scope for improving

More information

AC : DETERMINING PROCESS CAPABILITY OF AN INDUSTRIAL PROCESS IN LABORATORY USING COMPUTER AIDED HARDWARE AND SOFTWARE TOOLS
