LESSON 10
10.1 INTRODUCTION
Quality, as it is said, comes not by chance but by intention. All successful companies build quality into their manufacturing systems. It is on account of high quality that German cars, Swiss watches, Japanese electronics, etc. have established global acceptance. Thus, it is imperative for all organizations to establish systems for quality management and control. Let us now study the techniques and standards for quality control accepted globally.
l Small-Group Activities

The nature of each of these elements may be outlined briefly: (i) Design standardisation denotes that the design of components and their assembly in a product has been rationalised, tested rigorously and proven in manufacture. It is a powerful means for improving the flow of new products through the product and process design function. It also has major implications for simplifying the factory-floor environment and the entire product service task in the field. A proven standard design serves to eliminate various bugs from the production process, making possible its optimisation and error-free operation. (ii) Taguchi methods provide a powerful means for isolating the critical product design parameters that need to be controlled in the manufacturing process. They also enable manufacturing management to relate the variability in their products to monetary losses. Taguchi's quality loss function enables management to think of quality in terms of money rather than merely in terms of statistical distributions, standard deviations, variability and so on. The importance of Taguchi methods lies in their demonstration that the cost of variability, that is, the cost of quality to the company and to society, can be calculated through the quality loss function. The function, for example, enables a company to evaluate the significance of a 50 per cent reduction in product variability in terms of monetary gains. The company can then analyse whether the methods by which it can achieve that 50 per cent reduction in variability are worth the reduced quality losses. (iii) Quality Function Deployment (QFD) represents a comprehensive analytic framework for quality. Its purpose is to enable a company to translate any customer preference or desire about products into what has to be done, in design, manufacturing or distribution, to the product and the process, to satisfy the customer. Quality function deployment provides structure to the product development cycle. The foundation of this structure is customer requirements. QFD proceeds in a systematic manner from design concept to manufacturing process to manufactured product, ensuring at each step that quality assurance is built into both process and product. QFD also implies that the company has a documented quality policy that is understood, implemented and maintained at all levels in the organisation, and that responsibility and authority are clearly defined. (iv) Performance measurement and statistical quality control are applicable both to the factory of the enterprise and to its vendors or suppliers.
The latter are expected to supply materials, components and inputs of the required standards and specifications of quality. Without a proper frame of measurement, a company cannot assess and evaluate the success or effectiveness of its efforts towards improving the cost and quality of its operations and outputs.
(v) The concept of employee involvement is essentially concerned with extending decision-making to the lowest possible hierarchic level of the company. It also denotes a high level of workers' motivation and morale and their identification with the goals of the organisation. A high level of employee involvement, i.e., their motivation, commitment and empowerment towards productivity, innovation and problem-solving, depends on the strength of an organisation's culture, i.e., its system of shared values, beliefs, norms and vision. (vi) The concept of small-group activities is closely aligned with employee involvement. Small voluntary groups of workers, known as quality circles or productivity teams, represent a mechanism for evoking, sustaining and utilising employee involvement. Small-group activities represent a powerful way of improving productivity, quality and work performance in the organisation in a continuing manner.

Six Sigma or Zero Defects in TQM: Six sigma is a major part of the TQM programme. It is defined as 3.4 defects per million opportunities. It stresses that the goal of zero defects is achievable. The concept and method of six sigma are applicable to everyone and to all functions, i.e., manufacturing, engineering, marketing, personnel, etc. As a concept, it aims at reducing process variation and reducing and finally eliminating all defects. As a method, it looks at the output of work, the customers of that output, the customers' critical requirements, the suppliers and the firm's critical requirements of them, the processes used by the firm, and the tools and approaches for continuously improving the firm's processes. Six sigma, in essence, is a measure of variation.
Methodology of Six Sigma: The application of six sigma as a concept and a method involves the following six steps:
1. Specify clearly the products or services, i.e., the output, you provide. These include output from your processes that the customer receives from you and which incorporates your value-added element.
2. Specify the customers of the output and determine what they consider important.
3. Identify your suppliers and specify your critical requirements of them. Your ability to satisfy your customers depends on the ability of your suppliers to meet your critical requirements.
4. Delineate the process for doing your work. Map key sub-processes or activities and identify tasks, decision points, storage points, wait points or queues, workflow and items of rework.
5. Examine each link or step in the process with a view to assessing whether or not it adds value to the product or service to satisfy the customer. Improve the process in the light of such an examination.
6. Continue the improvement process by measuring and analysing defects or deficiencies and then proceeding to remove them in a planned manner.
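The defect counts quoted for six sigma follow directly from the normal distribution. The sketch below (an illustration, not part of the original text) computes the defects per million opportunities expected when the specification limits sit k standard deviations from the target, with and without the conventional 1.5-sigma drift of the process mean that underlies the 3.4 defects-per-million figure:

```python
from math import erfc, sqrt

def dpmo(k, shift=0.0):
    """Defects per million opportunities when the spec limits lie at
    +/- k standard deviations from the target and the process mean has
    drifted by `shift` standard deviations (normal model)."""
    # One-sided tail probability P(Z > z) for a standard normal variable
    tail = lambda z: 0.5 * erfc(z / sqrt(2.0))
    fraction_defective = tail(k - shift) + tail(k + shift)
    return fraction_defective * 1_000_000

# A centred 3-sigma process yields about 2,700 defects per million;
# a 6-sigma process with the customary 1.5-sigma shift yields about 3.4.
```

This makes concrete the sense in which "six sigma is a measure of variation": tightening the process so that the limits sit further out in the tails reduces the defect rate by orders of magnitude.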
l Perpetual Improvement

The keynote of these four PIs is teamwork or cooperation. In TQM, however, the concept of teamwork is larger and more inclusive. It implies that (a) employees are viewed as assets; (b) suppliers are viewed as partners; and (c) customers are viewed as guides. Involving all three of them intimately in the company's team effort to accomplish TQM is a continuing thrust of the company's manufacturing policies. The underlying assumptions or key premises of TQM may be briefly summarised:
l Quality cannot be improved by investment in high technology alone.
l Quality depends on, and comes from, people.
l Quality is the result of attitudes and values; it is the result of viewing quality as a way of life.
l Organisational culture and management style govern the quality of products and services.
Quality Control implies working to a set standard of quality which is achievable and which has a ready market. Thus Quality Control means adherence to a standard, or prevention of a change from the set standard. In general, this is essential because once an acceptable quality has been reached, a manager must ensure that there is no deterioration from the standard. However, in a changing world one is often faced with the fact that the quality which is acceptable to the customer today may not be acceptable to him a year later. Therefore, there is a need for a breakthrough (creation of change) for improving existing standards. Thus preventing change (control) and creating change (breakthrough) are two important functions of quality management. Unfortunately, a large number of managers simply have no time for breakthrough because they are obsessed with the day-to-day problems of keeping controls at the existing levels. Many breakthrough programmes call for a change of existing practices. There is always a resistance to change, especially if the objective is not properly understood. This happens when the people likely to be affected by the change are not involved in the breakthrough efforts; many breakthrough programmes have failed to click because of this attitude. A training programme suited to the requirements of the organisation and the persons involved has been found to be helpful in ensuring a breakthrough in attitude. Quality control has the objective of coordinating the quality maintenance and improvement efforts of all groups in the organisation with a view to providing full consumer satisfaction. Statistical Quality Control enables these objectives to be attained most economically, reducing scrap and rework, reducing machine downtime and minimising inspection. A successful statistical quality control programme should result in better quality to the consumer at a lower cost. One would instinctively recognise two aspects of quality: quality of design and quality of performance.
The difference between an Ambassador and a Maruti is the quality of design. Once the quality of design has been established, quality of performance concerns itself with how well the target is hit. Statistical Quality Control is, in general, concerned with the quality of performance, but SQC applications have occasionally resulted in the improvement of the design as well.
These questions are answered in the succeeding paragraphs. Data can be of two types: attributes and variables. The former is generated when items are inspected and classified as good or defective: the number of off-beats in a unit of time, the number of defective moulds, the number of NTs rejections, etc. The latter involves the actual measurement of the degree of conformance to requirements: diameter, weight, temperature, chemical composition, hardness, etc. The pros and cons of the two types of data are summarised in the table:
Table 10.1: Comparison of Attribute and Variable Data
Characteristic                                      Attribute               Variable
Cost of measuring instrument                        Nil or low              High
Grade of operator                                   Unskilled/Semiskilled   Skilled
Speed                                               Quick                   Slow
Recording of data                                   Simple                  Complex
Overall cost per observation                        Low                     High
Information value per observation                   Low                     High
Number of observations needed for valid inference   Large                   Small
Variable data will naturally be preferred for control purpose where the characteristic concerned is important.
distributions, or for distributions which lack a clearly dominant single peak. (2) MODE (the value which occurs most often in the data) is usually used for severely skewed distributions, for describing an irregular situation where two peaks are found, or for eliminating the effects of extreme values. (3) MEDIAN (the middle value when the figures are arranged according to magnitude) is usually used for distributions where the mode is not well defined, for reducing the effects of extreme values, or for data which can be ranked but are not economically measurable: shades of colour, visual appearance, odours.

The mean is the most generally used measure of central tendency in quality work. It is employed to report average size, average yield, average per cent defective, etc. Control charts have been devised to analyse and keep track of it. Such control charts can give the earliest obtainable warning of significant changes in the central value of the group.

The mode is the value which corresponds to the greatest frequency, the peak value. It is the number that appears most often and is in this sense most typical of the data. Understandably, then, the mode is the measure instinctively picked out when bar charts are used, for example, to compare sizes of inspected parts with blueprint limits: it is the size of the parts described by the tallest bar.

In contrast, the median is generally reserved for a few special situations, such as destructive testing, where it can sometimes be used, through a statistical trick, to reduce the number of parts tested. If, for example, the average of five parts tested is used to decide whether a life test has been met, then the life span of the third part to fail can sometimes be used to predict the average of all five, and thereby the result of the test becomes available much sooner.
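The three measures can be computed directly with Python's standard statistics module; a small sketch with hypothetical part sizes:

```python
from statistics import mean, median, mode

# Hypothetical inspected part sizes, with one extreme value (12.0)
sizes = [10.1, 10.2, 10.2, 10.3, 10.2, 10.4, 12.0]

print(mean(sizes))    # arithmetic average, pulled upward by the extreme 12.0
print(median(sizes))  # middle value, resistant to the extreme value
print(mode(sizes))    # most frequent value: 10.2
```

Comparing the mean with the median on the same data shows the sensitivity of the mean to extreme values that the text describes.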
Dispersion

The extent to which the data are scattered about the zone of central tendency is known as the dispersion. The measure of dispersion is the second of the two most fundamental measures in statistical analysis. The following are the measures of dispersion: (1) Range, (2) Variance and Standard Deviation, (3) Mean Deviation, (4) Coefficient of Variation.

(1) Range: The simplest measure of dispersion in a sample is the range, which is
defined as the difference between the largest and the smallest values included in the distribution: Range = largest value minus smallest value = R. The advantage of the range as a measure of dispersion is its utmost simplicity. However, the range can sometimes be misleading because of the effect of just one extreme value. The range is the most commonly used measure of dispersion in everyday life. Examples are: (i) in weather forecasts, the minimum and maximum temperature in a day; (ii) in SPC (Statistical Process Control), mean and range charts; (iii) in studying variation in money rates and share prices.
(2) Variance and Standard Deviation: A second measure of dispersion is the variance. This is defined as the measure of dispersion about the mean and is determined by squaring each deviation, adding these squares (all of which necessarily have plus signs) and dividing by the number of them.
Expressed as a formula:

Variance = (sum of di^2)/n

where di is the deviation of the i-th observation from the mean and n is the number of observations. While the variance is of fundamental importance in statistical analysis, the most useful measure of dispersion is the square root of the variance, known as the standard deviation. For a simple list of observations:

Std. Deviation = s = square root of the Variance = sqrt[(sum of di^2)/n]

When the frequency fi of each value of the variable is given:

Std. Deviation = s = sqrt[(sum of fi.di^2)/n]

(3) Mean Deviation: The mean deviation of a set of observations is the arithmetic average of the absolute deviations of each individual observation from a measure of central tendency (mean, mode or median):

Mean Deviation from the mean = (sum of |d|)/n, where d is the deviation from the mean.
Mean Deviation from the mode = (sum of |dk|)/n, where dk is the deviation from the mode.
Mean Deviation from the median = (sum of |dm|)/n, where dm is the deviation from the median.
Significance of Mean Deviation: As the mean deviation is not affected as much by extreme values as the standard deviation is, the mean deviation is useful for many studies in the economic field, e.g., computing the personal distribution of wealth in a community or a nation.

(4) Coefficient of Variation: As the standard deviation is analogous to an absolute error, being based on the deviations of observations from the central value (which may be looked upon as the true value), a measure of relative dispersion is comparable to a measure of relative error. Such a measure, to be of any use, should be free from units for the sake of comparability. The most commonly used measure of this type is the coefficient of variation, given by c.v. = 100 s/X, where s is the standard deviation and X is the mean.

The pth percentile of a variable refers to the value below which p% of the observations lie. For example, the median is the 50th percentile. The percentiles can be obtained by drawing a graph of the cumulative frequencies on the y-axis against the end of the class interval up to which the frequencies are cumulated on the x-axis, and reading off the x value corresponding to any desired percentile value.
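The dispersion measures above can be computed in a few lines; a minimal sketch using the divide-by-n (population) formulas that match the text:

```python
from math import sqrt

def dispersion(data):
    """Range, variance, standard deviation, mean deviation (from the
    mean) and coefficient of variation, using divide-by-n formulas."""
    n = len(data)
    x_bar = sum(data) / n
    r = max(data) - min(data)                       # range
    var = sum((x - x_bar) ** 2 for x in data) / n   # variance
    s = sqrt(var)                                   # standard deviation
    md = sum(abs(x - x_bar) for x in data) / n      # mean deviation
    cv = 100 * s / x_bar                            # coefficient of variation
    return r, var, s, md, cv
```

Running the function on a small set such as [2, 4, 4, 4, 5, 5, 7, 9] (mean 5) shows the relationships directly: the variance is the square of the standard deviation, and the mean deviation is smaller than the standard deviation because it is less influenced by the extreme values.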
Variation consists of two parts: (1) Chance causes: This is the variation which is natural or inherent in the process. (2) Assignable causes: This variation is unnatural or external due to assignable causes that can be traced.
Variations resulting from chance causes show a stable pattern and follow statistical laws, i.e., laws of distribution: normal, Poisson, hypergeometric, etc. Examples: number of machines under breakdown, variation in alloy steel sheets rolled/forged. The pattern of distribution can be predicted from samples of size n taken out of the population (N). A process under statistical control need not necessarily yield products conforming to specifications; it only produces results that conform to the control limits. The main objective of quality control is to prevent defects during production. The differences between chance causes of variation and assignable causes of variation are given hereunder:
Chance Causes of Variation
(1) It consists of many individual causes.
(2) All causes taken together normally account for a substantial amount of variation.
(3) In the case of raw materials, these can be within the conformance specifications.
(4) In the case of a running machine, there can be slight variation of the machine.
(5) Lack of human perfection in setting or reading instruments gives rise to chance variations.
(6) Chance variation cannot be economically eliminated from the process.

Assignable Causes of Variation
(1) It consists of one or a very few individual causes.
(2) Any one assignable cause can lead to a large amount of variation.
(3) In the case of raw materials, the entire lot or batch may be defective.
(4) In the case of a machine, a faulty machine set-up gives rise to assignable-cause variation.
(5) Setting or reading of precise instruments by an untrained operator gives rise to assignable-cause variation.
(6) Assignable-cause variation, once detected, can be eliminated, and the action of removing the assignable cause is usually economically justified.
Inspection in a manufacturing industry is carried out to compare products with known standards or specifications. To ensure the specified quality for the acceptability of the product, the inspection stages are:

(1) Incoming Raw Materials Stage: Here, inspection is carried out to find whether the incoming lot is acceptable or rejectable for the manufacturer under the agreed terms of the inspection plan. Single sampling plans and multiple sampling plans are in use for this purpose.

(2) Process Control: Inspection during manufacturing is termed Process Control Inspection. The inspection is carried out to find whether the quality of the products being produced is good or bad, and to take action to bring the process under control. Process inspection should be done at appropriate points in the process so as to provide an immediate and accurate reflection of the quality status and condition of all parts being processed. Process inspection may include the following checks:

(a) Set-up and first-piece inspection: First-piece inspection is carried out by checking the first item produced in the production set-up. It will establish whether the machine set-up, jigs and fixtures, and gauges are correct or not, and whether the proper material is being used for the job. It also eliminates the necessity of scrapping a substantial part of a production run, by locating the cause of rejection and correcting the deficiencies before production starts. Therefore, production should not begin until the first piece is found acceptable.

(b) Patrol inspection: Patrol inspection is perhaps the most crucial of all functions to keep the process in control throughout production. It consists of inspection at appropriate intervals of time to verify conformity during manufacturing, and is also known as floor inspection. This inspection may be conducted by operators/inspectors monitoring specified operations or by automatic inspection. Wherever applicable, the last piece must be included in the patrol inspection.

(3) Final Inspection: Final inspection of finished goods, before these are despatched to the next stage of production or to the customer, helps in locating various assignable causes and taking suitable remedial actions.

Errors associated with inspection: Errors arise due to the following:
(1) Lack of common understanding of the standards of inspection.
(2) Lack of consistency among various inspectors.
(3) Improper sampling from the source population.
The errors at (1) and (2) can be minimised but not eliminated altogether, whereas the error at (3) can be eliminated through the choice of a correct sampling plan.
Hundred Per Cent Inspection vs Sampling Inspection:
(1) In hundred per cent inspection the total cost of inspection is very high and at times prohibitive; in sampling inspection the volume of inspection is very low and hence the total cost of inspection involved is low.
(2) Hundred per cent inspection is subject to errors due to operator's fatigue, negligence and poor supervision by inspectors, and its results cannot be predicted with accuracy; sampling inspection is based on a scientific sampling plan system, is free from such errors, and its results can be predicted with accuracy.
(3) Hundred per cent inspection involves no sampling error; sampling inspection, being based on a sample drawn from the population, is subject to sampling error, but the magnitude of this error can be estimated.
(4) Hundred per cent inspection is not at all suitable for destructive testing; sampling inspection is the only way of inspecting for a destructive test.
Thus, we may infer that Sampling inspection is generally superior to Hundred per cent inspection.
[Figure: normal distribution curve centred at X, with the process spread of 6s marked and limits at X - 3s and X + 3s]
Procedure for Evaluation

This involves the following steps:
(i) Collect data on a number of rational subgroups and review them for consistency or homogeneity.
(ii) Eliminate data which do not conform to the general pattern observed. If more than one-third of the data gets eliminated, reject the entire data set and collect fresh data till homogeneity is achieved.
(iii) Calculate the process capability as given below.

Calculation of Process Capability: Attribute Data

(a) Uniform sample size (n = number of observations in each sample):
(1) Count the number of samples, say k = 25.
(2) Let the sample size (number of observations in each sample) = n.
(3) Add all the defectives observed in all the 25 samples; sum of defectives = D.
(4) Find m = np = D/k, where p is the proportion defective.
(5) For p < 0.10, read the limits from the concerned statistical table against the value of m; else (p >= 0.10) calculate the limits as np ± 3 sqrt(npq), where q = 1 - p.
(6) Test for homogeneity: see whether all readings are within the control limits; if so, accept the data as homogeneous. Otherwise, reject the readings (observations) which are outside the control limits and test again for homogeneity. If the total number of observations rejected is more than one-third, reject the entire data and collect fresh data.

(b) Variable sample size (number of observations in each sample not uniform):
(1) Let the number of samples = k.
(2) Let n1, n2, ..., nk be the sample sizes (n1 + n2 + ... + nk = N).
(3) Let d1, d2, ..., dk be the number of defectives found in the corresponding samples (d1 + d2 + ... + dk = D).
(4) p = D/N, where p = fraction defective.
(5) Find n1.p, n2.p, ..., nk.p.
(6) For p < 0.10, use control limits from the respective statistical table; else (p >= 0.10) calculate the control limits as ni.p ± 3 sqrt(ni.p.q), where q = 1 - p.
(7) Test for homogeneity as in (a) for uniform sample size.
(8) Accept the p of the homogeneous data as the standard of capability of the process.

Calculation of Process Capability: Variable Data
A variable data set has two parameters: central tendency and dispersion. Whereas the central tendency can be corrected easily, the dispersion is much more difficult to correct and hence is critical for assessing the process capability.

Range method:
(1) Let the number of samples k = 25.
(2) Let n = number of observations in each sample = 4 or 5 (uniform).
(3) Let R1, R2, ..., Rk be the ranges of samples 1, 2, ..., k respectively.
(4) Average range R = (R1 + R2 + ... + Rk)/k.
(5) Read off the values of D3 and D4 from the statistical table against the sample size selected.
(6) Then the Upper and Lower Control Limits (UCL and LCL) are given by D4.R and D3.R.
(7) If no reading is outside the control limits, accept R as the standard index of process variability.
(8) If readings are beyond the control limits, reject them and find the revised limits, and so on till homogeneity is achieved. In this case the revised R shall be the standard index of process variability.

Then calculate the process capability by the formula:

Process Capability = 6s = 6R/d2

where d2 is read from the statistical table against n.

Variation between samples, i.e., the stability of the process, can be checked by examining the consistency of the sample means as given below:
(1) Calculate the sample means X1, X2, ..., Xk of all the samples 1, 2, ..., k.
(2) Calculate the grand average X = (X1 + X2 + ... + Xk)/k.
(3) Read the value of A2 from the statistical table against the sample size.
(4) The control limits for the averages are given by:

X ± A2.R
(5) Plot the averages on the graph and study the graph carefully for any systematic or other variations about the limits, and investigate the causes thereof.

Check Your Progress 1

State whether the following statements are true or false:
1. The objective of quality control is to make change acceptable to everyone.
2. The number of defective pieces and number of defects can be classified as attribute data.
3. The measures of central tendency are mean, median, mode and range.
4. Variance is square root of standard deviation.
5. The loss of a liquid substance through evaporation during heating is an assignable cause.
6. A wrong reading of electric current due to faulty meter is an assignable cause.
7. The probability of error in 100 per cent inspection is very low.
If there is evidence of lack of control, the process should be stopped, investigated, corrected and restarted. Till the process gets stabilised, segregate the goods produced into good and bad lots. A point outside the control limits is an index of an out-of-control situation, whereas the pattern of points indicates the nature of action desired at any point of time.
where A2, D4, D3 are read from the statistical table against the selected sample size.

l Sampling frequency for (X, R) charts: Duncan's study reveals that:
1. If a shift in the process average causes a high rate of loss as compared to the cost of inspection, it is better to take small samples quite frequently rather than large samples less frequently, e.g., it is better to take samples of 4-5 every half hour rather than 8-10 every hour.
2. If it is possible to decide quickly and the cost of looking for trouble is low, then use 2-sigma or 1.5-sigma control limits rather than 3-sigma control limits; use 3-sigma control limits if the cost of looking for trouble is high.
3. If the unit cost of inspection is relatively high, it is better to take a sample size of 2 or 3 at relatively long intervals, i.e., once or twice in a shift, and use control limits of ± 2 sigma (or 1.5 sigma).
4. A control chart schedule should take into account detection of changes in the process of the required degree with the desired confidence.

However, (X, R) charts are not easily understood by operators/inspectors, and these charts cannot be used for go/no-go type of data.

(2) p, np chart: This chart is applicable to attribute data (number of defective units of product).
l This chart is used to control the overall fraction defective of a process. The data required for this chart is already available from inspection records.
l The chart is easily understood as compared to the (X, R) chart.
l The chart provides an overall picture of the quality. However, these charts do not recognise the degree of defectiveness in units of product, and the standard and limits vary with the sample size.
Static Standard Control Limits:

UCL = np + 3 sqrt(np(1 - p))
LCL = np - 3 sqrt(np(1 - p))

where p = Total number of defective pieces / [Number of samples (k) x Sample size (n)]
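The np-chart limit calculation can be sketched in a few lines; the sample counts below are made up for illustration, and any sample outside the limits is flagged for investigation:

```python
from math import sqrt

def np_chart(defectives, n):
    """np-chart centre line and control limits from the pooled fraction
    defective, for k samples of constant size n; returns the limits and
    the 1-based indices of samples falling outside them."""
    k = len(defectives)
    p_bar = sum(defectives) / (k * n)          # pooled fraction defective
    centre = n * p_bar
    width = 3 * sqrt(n * p_bar * (1 - p_bar))
    ucl, lcl = centre + width, max(0.0, centre - width)
    outside = [i for i, d in enumerate(defectives, start=1)
               if d > ucl or d < lcl]
    return centre, lcl, ucl, outside

centre, lcl, ucl, outside = np_chart([2, 3, 1, 4, 2, 3, 15, 2], n=50)
# sample 7, with 15 defectives, falls above the UCL
```

Repeating the calculation after discarding flagged samples gives the homogeneity screening described earlier for attribute data.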
If the rejection percentage (p) is < 10, then the np chart is convenient to use with a constant sample size, and the control limits may be read directly from the statistical table.

(3) C chart
l C chart is applicable to attribute data (number of defects per unit of product).
l This chart is used to control the overall number of defects per unit.
l This chart gives all the advantages listed above for np charts. However, it does not provide detailed information on, and control of, individual characteristics as in the case of (X, R) charts.
Static Standard Control Limits:

UCL = C + 3s
LCL = C - 3s

where s = sqrt(C)
Having determined C, the control limits can be directly read from the chart.

Advantages of Control Charts

There are numerous advantages of control charts. The letters of the words CONTROL CHARTS themselves can be used to highlight these advantages:
C Controls the process (at desired economic levels).
O Optimises technical resources (as it provides the information needed to take remedial action).
N Narrows the heterogeneity (among units of a product).
T Traces differences among operators, supervisors, machines, etc.
R Reduces the cost of inspection.
O Overhauling and maintenance of machines is indicated whenever necessary.
L Leads to the detection of inspection errors.
C Creates quality consciousness.
H History of the process at a glance.
A Acceptability of the product by the consumer is enhanced.
R Reduces waste of materials.
T Trains the operator and improves his skill.
S Standardises the stage processes.
Problems on Control Charts Q1. Draw the control charts for X (mean) and R (range) from the data relating to 10 samples, each of size 5.
X 3.456 3.467 3.385 3.380 3.387 3.450 3.560 3.670 3.577 3.213
R 0.012 0.056 0.021 0.045 0.028 0.058 0.018 0.035 0.023 0.067
A2 = 0.577, D4 = 2.115, D3 = 0

Solution:
k = number of samples = 10
n = number of observations in each sample = 5
X = (sum of sample means)/10 = 34.545/10 = 3.4545
R = (sum of ranges)/10 = 0.441/10 = 0.0441

For X chart:
UCL = X + A2 x R = 3.4545 + 0.577 x 0.0441 = 3.48
LCL = X - A2 x R = 3.4545 - 0.577 x 0.0441 = 3.43

For R chart:
UCL = D4 x R = 2.115 x 0.0441 = 0.093
LCL = D3 x R = 0 x 0.0441 = 0

[Chart: the ten sample means plotted against sample number 1-10, with the centre line X and the UCL and LCL lines drawn in]
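The Q1 limit calculation can be reproduced with a short function; A2, D3 and D4 are the standard tabulated control-chart constants for the chosen sample size (0.577, 0 and 2.115 for n = 5), and the grand mean and average range are those given in the solution:

```python
def xbar_r_limits(x_grand_mean, r_mean, a2, d3, d4):
    """Control limits for X-bar and R charts from the grand mean, the
    average range and the tabulated constants A2, D3, D4."""
    x_ucl = x_grand_mean + a2 * r_mean   # X-bar chart upper limit
    x_lcl = x_grand_mean - a2 * r_mean   # X-bar chart lower limit
    r_ucl = d4 * r_mean                  # R chart upper limit
    r_lcl = d3 * r_mean                  # R chart lower limit
    return (x_lcl, x_ucl), (r_lcl, r_ucl)

(x_lcl, x_ucl), (r_lcl, r_ucl) = xbar_r_limits(3.4545, 0.0441,
                                               a2=0.577, d3=0.0, d4=2.115)
# x_ucl is about 3.48, x_lcl about 3.43, r_ucl about 0.093, r_lcl is 0
```

Each sample mean from the data table would then be compared against (x_lcl, x_ucl) and each range against (r_lcl, r_ucl).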
Q2. Draw the control charts for X (mean) and R (range) for the above example using the estimate s = R/d2 of the process standard deviation, given that for n = 5, d2 = 2.326 and d3 = 0.864.

Solution:
X = 34.545/10 = 3.4545
R = 0.441/10 = 0.0441
Estimated process standard deviation s = R/d2 = 0.0441/2.326 = 0.0190

For X chart:
UCL = X + 3s/sqrt(n) = 3.4545 + 3 x 0.0190/sqrt(5) = 3.4545 + 0.0254 = 3.48
LCL = X - 3s/sqrt(n) = 3.4545 - 0.0254 = 3.43

For R chart:
UCL = R + 3 x d3 x s = 0.0441 + 3 x 0.864 x 0.0190 = 0.0441 + 0.0492 = 0.093
LCL = R - 3 x d3 x s = 0.0441 - 0.0492 = -0.005, taken as 0

These limits agree with those obtained from the A2, D3, D4 constants in Q1, as they must.
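The agreement between the two approaches is no accident: the tabulated constants are related by A2 = 3/(d2 sqrt(n)), D4 = 1 + 3 d3/d2 and D3 = max(0, 1 - 3 d3/d2), which can be checked numerically for n = 5:

```python
from math import sqrt

n = 5
d2, d3 = 2.326, 0.864   # standard tabulated values for samples of size 5

a2 = 3 / (d2 * sqrt(n))
d4_const = 1 + 3 * d3 / d2
d3_const = max(0.0, 1 - 3 * d3 / d2)

# a2 comes out near 0.577, d4_const near 2.115, and d3_const is 0,
# matching the constants used in Q1
```

This is why limits computed via A2, D3 and D4 and limits computed via s = R/d2 describe the same chart.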
Sample #:          1  2  3  4  5  6   7  8  9  10  11  12  13  14  15  16  17  18  19  20
Defects observed:  3  6  2  7  3  6  14  2  7  13   2   2   8   3   9   3   6   1   2   8
Draw the UCL and LCL for the above mentioned data.
Solution: Here the required process control chart is the C chart.
k = 20
Sum of defects = 115
C = 115/20 = 5.75
UCL = C + 3 sqrt(C) = 5.75 + 3 sqrt(5.75) = 12.94
LCL = C - 3 sqrt(C) = 5.75 - 3 sqrt(5.75) = -1.44, i.e., 0

Since samples number 7 and 10 fall above the UCL, we delete them to bring the process under control. The revised C chart is obtained by deleting these samples and repeating the same calculations:
C = 88/18 = 4.89
UCL = C + 3 sqrt(C) = 4.89 + 3 sqrt(4.89) = 11.52
LCL = C - 3 sqrt(C) = 4.89 - 3 sqrt(4.89) = -1.74, i.e., 0
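The iterative screening of Q3 can be sketched as below, taking the defect counts as printed in the data table; the loop keeps discarding out-of-control samples and recomputing the limits until every remaining point lies within them:

```python
from math import sqrt

def c_chart_screen(counts):
    """Iteratively compute c-chart limits (C +/- 3 sqrt(C)) and discard
    samples outside them until all remaining points are in control.
    Returns the final C-bar, the final limits and the 1-based indices
    of the discarded samples."""
    kept = {i: c for i, c in enumerate(counts, start=1)}
    discarded = []
    while True:
        c_bar = sum(kept.values()) / len(kept)
        ucl = c_bar + 3 * sqrt(c_bar)
        lcl = max(0.0, c_bar - 3 * sqrt(c_bar))
        out = [i for i, c in kept.items() if c > ucl or c < lcl]
        if not out:
            return c_bar, lcl, ucl, discarded
        for i in out:
            del kept[i]
        discarded.extend(out)

counts = [3, 6, 2, 7, 3, 6, 14, 2, 7, 13, 2, 2, 8, 3, 9, 3, 6, 1, 2, 8]
c_bar, lcl, ucl, dropped = c_chart_screen(counts)
# samples 7 and 10 are dropped, matching the conclusion of the solution
```

Note that the code works from the individual counts rather than the quoted sum of defects, so its intermediate values may differ slightly from the worked figures, while flagging the same two samples.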
Here (N, n, c) as a set constitute the sampling plan, called a Sampling Plan by Attributes.

Risk Involved

With any sampling plan, there is always a risk that: (a) very bad lots will be passed; (b) good lots will be rejected. These two risks are called the Consumer's Risk and the Producer's Risk respectively.
We can plot a curve between the per cent defectives in the lot and the probability of acceptance of the lot under any given sampling plan, known as the OC (Operating Characteristic) curve. The producer sends a lot of Acceptable Quality Level (AQL) quality (given in per cent defectives) which may still get rejected; the probability of this is the Producer's Risk (PR). On the other hand, the customer (manufacturing plant) faces the risk of accepting lots as bad as the LTPD (Lot Tolerance Percent Defective); the probability of acceptance of such lots is the Consumer's Risk (CR). The probability of acceptance can be determined by making use of the following expression, which is based on the hypergeometric distribution:
[Figure: Average Outgoing Quality (AOQ) plotted against the incoming proportion defective p; the peak of the curve is the Average Outgoing Quality Limit (AOQL).]
Pa = Σ (from b = 0 to c) P(b)

where P(b) = [C(B, b) × C(G, g)] / C(N, n)

and
P(b) = probability of finding b bad items in a sample of size n taken from a lot of size N
B = number of bad items in the lot
G = number of good items in the lot
g = number of good items in the sample
b = number of bad items in the sample
Pa = probability of acceptance of the lot
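The hypergeometric acceptance probability can be evaluated directly with Python's math.comb. The lot and sample figures below (a lot of 50 with 5 bad items, a sample of 10, acceptance number 1) are illustrative assumptions, not taken from the text:

```python
from math import comb  # Python 3.8+

def p_accept(N, n, c, B):
    """Probability of accepting a lot of size N containing B bad items,
    when a single sample of size n is drawn and the lot is accepted if
    the sample contains at most c bad items (hypergeometric model):
    P(b) = C(B, b) * C(N - B, n - b) / C(N, n)."""
    G = N - B  # good items in the lot
    return sum(comb(B, b) * comb(G, n - b) for b in range(c + 1)) / comb(N, n)

# Illustrative (assumed) figures: N = 50, B = 5, n = 10, c = 1.
pa = p_accept(50, 10, 1, 5)
print(round(pa, 4))  # 0.7419
```

Summing the hypergeometric terms from b = 0 to c is exactly the Pa expression above; for large lots the Poisson approximation used later in this lesson gives nearly the same values with far less arithmetic.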
[Figure: Distribution of sample means on a control chart, with inner limits at X̄ + 1σ and X̄ - 1σ and control limits at UCL = X̄ + 3σ and LCL = X̄ - 3σ.]
Therefore, when a sample mean falls outside a control limit, there are two possibilities. The process may still be in control: the point may be one of the 27 per 10,000 sample means that fall outside the 3σ limits by chance alone. Or the process may be out of control, in which case assignable causes can be identified and the process brought back under control. Because either situation can produce a point above the UCL or below the LCL, the decision to search for assignable causes involves a tradeoff between two types of risk, called Type I and Type II errors, shown in Table 10.2.
Table 10.2: Type I and Type II Errors
                            Assignable causes are    Assignable causes are
                            searched for             not searched for
Process is in control       Type I error             Correct approach
Process is out of control   Correct approach         Type II error
Type I error: This is also called the producer's risk. It occurs when the process is in control but appears to be out of control. It leads to wasted effort, time and money in searching for an assignable cause that does not in fact exist. The risk is reduced by using wider control limits.

Type II error: This is also called the consumer's risk. It occurs when the process is out of control but the QC manager is not looking for any assignable causes. This error is harmful to the quality system because defective products may be produced for as long as the process remains out of control. The risk can be reduced by narrowing the control limits.

If the costs of undetected errors are high compared to the cost of correcting the process, lower consumer's risks are indicated. However, if the costs of restoring the process are high compared to the cost of producing defective parts, lower producer's risks are appropriate.

Drawing the OC Curve

Step 1: Find N, n and c.
Step 2: Find the probability of accepting the lot for different values of the proportion defective (p). For this, do the following:
- Multiply the sample size by the proportion defective, i.e., calculate np for each value of p.
- Look up Poisson's table for each value of np in the column for c.
- Note down the probability of acceptance (Pa) for each value of p.

Step 3: Plot Pa on the Y-axis and p on the X-axis.

Step 4: Find α and β at AQL and LTPD:
At p = AQL, α (producer's risk) = 1 - Pa
At p = LTPD, β (consumer's risk) = Pa
Worked Example: For a lot of 4000 items, a sample of size 100 is drawn each time. The acceptance number is two for a single-sampling plan. Find the producer's risk and consumer's risk at an AQL and LTPD of 1 and 6 percent defectives per lot respectively.

Answer
Given: Lot size (N) = 4000
Sample size (n) = 100
Acceptance number (c) = 2
Acceptable Quality Level (AQL) = 1%
Lot Tolerance Percent Defective (LTPD) = 6%
Calculation of the probability of acceptance, Pa = P(X ≤ 2), read from the table of the Poisson distribution:

Percentage    Proportion       np = 100p   Pa = P(X ≤ 2)   Remark
defectives    defective (p)
1 (AQL)       0.01             1.0         0.920           α = 1 - 0.920 = 0.080
2             0.02             2.0         0.677
3             0.03             3.0         0.423
4             0.04             4.0         0.238
5             0.05             5.0         0.125
6 (LTPD)      0.06             6.0         0.062           β = 0.062
7             0.07             7.0         0.030
8             0.08             8.0         0.014
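The Pa column and the two risks can be reproduced numerically instead of reading a Poisson table, using Pa = P(X ≤ c) = Σ e^(-np)(np)^k / k!. A minimal Python check of the worked example:

```python
import math

def pa_poisson(n, p, c):
    """Poisson approximation to the lot-acceptance probability:
    Pa = P(X <= c) for X ~ Poisson(np)."""
    m = n * p
    return sum(math.exp(-m) * m**k / math.factorial(k) for k in range(c + 1))

n, c = 100, 2          # sample size and acceptance number from the example
AQL, LTPD = 0.01, 0.06

alpha = 1 - pa_poisson(n, AQL, c)   # producer's risk at p = AQL
beta = pa_poisson(n, LTPD, c)       # consumer's risk at p = LTPD
print(f"{alpha:.3f} {beta:.3f}")    # 0.080 0.062
```

Evaluating pa_poisson over p = 0.01 to 0.08 reproduces the whole Pa column, so the same function can also be used to plot the OC curve of Step 3.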
Most organizations report cost benefits but, more important, it has been found that effective QCs report higher group cohesion, performance norms, job satisfaction, intrinsic satisfaction, satisfaction with co-workers, self-monitoring, and organizational commitment.

In manufacturing, the Japanese practice is that the responsibility for quality rests with the manufacturer of the part rather than with the quality department acting as a staff function, i.e., the responsibility lies with the production department itself. The workers are organised into teams (3 to 25 members per team) who themselves take decisions on solutions to quality problems. If even one item produced is sub-standard and is likely to affect the subsequent process, the process is stopped immediately; the entire team discusses cause and effect, decides the remedial action, rectifies the process and then restarts production. This helps in improving quality and reducing rejections, and motivates workers, who feel proud of being part of the decision process. Overall, it helps in achieving higher productivity, lowering wastivity and reducing the cost of production per unit.

Review Questions
1. What do you understand by Quality Control?
2. Write a short note on Collection and Presentation of Data.
3. What are the major statistical measures of Central Tendency?
4. Define Statistical Quality Control. Describe briefly the techniques of SQC used in: (a) Inspection of Incoming Materials, (b) Inspection during Process Control.
5. Define Acceptance Sampling and explain how the same is used for Inspection of Incoming Materials.
6. Explain the Importance and Benefits of SQC Techniques.
The unit focuses on quality as a vehicle for delivering value, real or perceived, to the customer, whose needs and expectations are changing over time. It imparts a new dimension to the concept by shifting the emphasis from quality control to quality assurance. The ISO 9000 series of standards provides a comprehensive guideline and is recognized by industry as the minimum level of acceptable quality. The increasing trend to adopt quality strategies like TQM, JIT and Quality Circles has also been highlighted in the section.

Check Your Progress: Answers
CYP 1: 1. False, 2. False, 3. False, 4. True, 5. False, 6. True, 7. True
CYP 2: 1. T, 2. T, 3. T, 4. T

10.9 LESSON END ACTIVITY

Take an example of any ISO 9000 company and study its manufacturing processes. You may use the internet for this activity.

10.10 KEYWORDS

Quality Control: Working to a set standard of quality which is achievable and which has a ready market.
Measure of central tendency: A parameter in a series of statistical data which reflects a central value of the series.
Dispersion: The extent to which the data are scattered about the zone of central tendency.
Control Chart: A graphical representation with the order of sampling along the x-axis and the sample statistic along the y-axis.
Statistical Quality Control: The application of statistical techniques to determine how far the product conforms to the standards of quality and precision, and to what extent its quality deviates from the standard quality.
Acceptance Sampling: The acceptance or rejection of a consignment of items on the basis of the quality of a sample drawn from it.
Total Quality Management: A quality-focused, customer-oriented, integrative management method that emphasizes continuing and cumulative gains in quality, productivity and cost reduction.

10.12 SUGGESTED READINGS

Productions and Operations Management, Upendra Kachru, Excel Books, New Delhi.
Operations Management (Theory & Problems), Joseph G. Monks, McGraw Hill Intl.
Productions and Operations Management, Everest E. Adam & Albert, PHI Publications, 4th Ed.
Productions and Operations Management, S.N. Chary, TMH Publications.
Productions and Operations Management, Chunawala and Patil, Himalaya.