Lean


Quality in Lean Six Sigma:

• Quality means the totality of features or characteristics of a product or service.

• The goal of quality is to meet or exceed customer expectations.

• Quality is assessed based on relevant features and their performance.

• If the ratio of Performance (P) to Expectations (E) (P/E) is less than 1, it indicates a lower-
quality product, which can lead to customer attrition.

Customer Satisfaction and Delight:

• Customer satisfaction is achieved when P/E is around 1.

• When P/E exceeds 1 (P/E > 1), it leads to customer delight.

• At P/E < 1, customers may start exploring other options due to dissatisfaction.

• Customer loyalty is obtained when P/E > 1, leading to repeat sales for the organization.

• Repeat customers often contribute a significant portion of an organization's revenue, sometimes up to 80%.

Expectations and Quality Standards:

• Meeting customer expectations is fundamental to quality.

• In some cases, quality standards or industry regulations dictate mandatory inspections to ensure a product's fitness for use.

• For example, during World War II, all products were required to go through mandatory
inspections before use to ensure they were "fit for use."

ISO and Standardization:

• In 1947, the International Organization for Standardization (ISO) was formed.

• ISO plays a crucial role in developing and publishing international standards for quality and
other areas.

• These standards help ensure product quality and consistency.

Cost Implications:

• The cost of finding and correcting defects or rework can be significant.

• In Lean Six Sigma, the focus is on reducing defects to minimize rework and associated costs.

These notes summarize the key concepts related to quality, customer satisfaction, delight, and the
importance of meeting expectations in Lean Six Sigma. Quality is not just about meeting basic
functionality but exceeding customer expectations to achieve delight and loyalty. Additionally,
adhering to standards and minimizing rework costs are critical aspects of quality management.

Key Figures in Quality and Process Improvement:


• Walter A. Shewhart: The originator of the control chart, a fundamental tool in statistical
quality control. He emphasized the distinction between assignable causes and chance causes
of variation.

• Dr. Joseph M. Juran: A prominent quality management expert who developed a model
known as "Juran on Quality Improvement." He stressed the importance of understanding
customer needs, product features, and the process for quality improvement.

• W. Edwards Deming: A renowned statistician and quality management expert who is well-known for the Plan-Do-Study-Act (PDSA) cycle, also known as the Deming Cycle. He played a crucial role in promoting quality management in Japan.

Quality Improvement in Japan:

• The Japanese invited these scientists to improve the quality of their manufacturing
processes.

• Joseph M. Juran's model, "Juran on Quality Improvement," was instrumental in achieving quality improvements.

• The focus in Japan was on defect-free manufacturing and achieving total quality.

DMAIC and DMADV in Lean Six Sigma:

• DMAIC (Define, Measure, Analyze, Improve, Control) is a structured process used in Lean Six
Sigma for improving existing processes.

• DMADV (Define, Measure, Analyze, Design, Verify) is another structured process used for
designing and developing new products or improving the performance of existing products.

Quality Improvement Models:

• PDCA: Plan-Do-Check-Act is a continuous improvement cycle often associated with Dr. Shewhart.

• PDSA: Plan-Do-Study-Act is a similar continuous improvement cycle popularized by Dr. Deming.

• Dr. Juran's Quality Triangle: An approach that focuses on quality, cost, and time as key factors
in achieving quality improvements.

These notes summarize the key figures, models, and concepts related to quality and process improvement, with a particular emphasis on the contributions of Walter A. Shewhart, Joseph M. Juran, and W. Edwards Deming, as well as the quality improvement initiatives in Japan.

Key Figures in Quality Management:

• Dr. Kaoru Ishikawa: Known for the development of the Fishbone Diagram (Ishikawa diagram)
used for root cause analysis in quality management.

• Philip B. Crosby: Known for the "Absolutes of Quality Management," a set of principles that
emphasize prevention of defects and the importance of conformance to requirements.
Quality Control and Customer Requirements:

• Quality is often represented as the relationship between a dependent variable (Y) and
independent variables (X).

• Y represents the key process output variable (KPOV) and is governed by customer specification limits, including the Upper Specification Limit (USL) and Lower Specification Limit (LSL).

• X represents key process input variables (KPIV) and is governed by process control limits, including the Upper Control Limit (UCL) and Lower Control Limit (LCL).

Process Improvement and Profitability:

• A scenario is described where the goal is to produce 500 cakes per day with an initial defect
rate of 30 defective cakes out of 500.

• The cost per cake works out to INR 15, with each defective cake reworked once at the same unit cost, giving a total cost of INR 7,950 (500 × 15 + 30 × 15). The selling price (SP) per cake is INR 20, so the total revenue is INR 10,000, leading to a profit of INR 2,050.

• After process improvement, the defect rate is reduced to 10 defective cakes out of 500,
resulting in a higher profit of INR 2,350.

• Further process improvement leads to zero defective cakes and a profit of INR 2,500,
highlighting that fewer defects lead to higher profitability.
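To make the arithmetic explicit, here is a minimal Python sketch of the cake scenario, assuming (as the figures above imply) a unit cost of INR 15 and one rework of each defective cake at the same cost:

def daily_profit(units=500, defectives=0, unit_cost=15, selling_price=20):
    total_cost = units * unit_cost + defectives * unit_cost  # production cost plus rework
    revenue = units * selling_price
    return revenue - total_cost

for d in (30, 10, 0):
    print(d, "defective ->", daily_profit(defectives=d))  # 2050, 2350, 2500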

These notes summarize the key concepts related to quality management, customer requirements,
process improvement, and the impact of defect reduction on profitability. They also mention Dr.
Kaoru Ishikawa's Fishbone Diagram and Philip B. Crosby's quality management principles.

Six Sigma Methodology:

• Six Sigma is a methodology aimed at reducing defects and errors in processes.

• The primary goal is to minimize variations and improve the quality of products or services.

Factors Affecting Defect Reduction:

• Reducing defects involves identifying factors (X1, X2, X3) that affect the key process output
variable (Y).

• In a baking scenario, factors like Baking Temperature, Baking Time, and Quantity of Baking
Soda can influence the quality of the cake.

Lean Approach for Efficiency:

• Lean is an approach that focuses on eliminating waste and non-value-adding activities.

• By improving efficiency and reducing waste, the Lean approach complements Six Sigma's
focus on reducing defects.

Six Sigma for Effectiveness:

• Six Sigma aims for improved process effectiveness, reducing variations and defects.

• It is often measured by the standard deviation and expressed as defects per million
opportunities (DPMO).
• A high Sigma level (e.g., 6 Sigma) indicates high process performance (99.99966% yield, or 3.4 defects per million opportunities).

Industry Applications:

• The application of Six Sigma varies across industries; some industries have more critical processes and therefore demand higher levels of performance.

• Industries like airlines, healthcare, and space agencies have critical processes that benefit
from Six Sigma.

Historical Context:

• In 1987, Motorola officially launched its Six Sigma program, making significant strides in
quality management.

• The term "Six Sigma" was coined by Bill Smith, an engineer at Motorola.

• General Electric (GE) promoted Six Sigma across its business verticals, expanding its
adoption.

• Ford pioneered flow-based car manufacturing, a precursor of Lean, emphasizing efficiency and waste reduction.

These notes provide an overview of Six Sigma, its aim to reduce defects, factors affecting defect
reduction, the Lean approach, effectiveness, industry applications, and the historical development of
Six Sigma.

Key Roles in a Six Sigma Organization:

• Apex Council: Analyzes and reviews projects, allocates budgets and resources, and provides
oversight.

• Project Champion/Sponsors: Senior members who support projects, such as the CFO who
helps justify project importance.

• Project Owner: Responsible for driving high-quality output from the process.

• Master Black Belt (MBB): Develops Six Sigma strategy and is an expert in Six Sigma
methodologies.

• Black Belt (BB): Drives projects and is responsible for process improvement; may have a
team of Green Belts.

• Green Belts (GB): Lead projects and are part of the project team.

• Team Members/Yellow Belts: Implement project improvements at the site.

Selection of Team Members:

• Team members are selected based on their skills and attitude.

• "Rat Stars" are individuals with high skills and a positive attitude.

Define Phase in Six Sigma:

• The Define phase is the first step in a Six Sigma project.


• It involves understanding the voice of the customer (VOC) to determine what customers
want.

• Identifying what is critical to quality (CTQ) is a crucial part of this phase.

• Process mapping techniques and creating a project charter are commonly used in the Define
phase.

• There are two types of customers in Six Sigma: internal (within the organization) and external
(actual product or service users).

• Understanding and satisfying the VOC is vital to the success of a Six Sigma project.

These notes provide an overview of key roles in a Six Sigma organization, the selection of team
members, and the activities involved in the Define phase of a Six Sigma project, with a focus on
understanding customer needs and the critical-to-quality elements.

Voice of the Customer (VOC):

• VOC represents the feedback, complaints, and expectations that customers have, which an
organization may struggle to meet.

• Understanding VOC is essential for improving products, services, and customer satisfaction.

Sources of VOC Data:

• VOC data can be gathered from various sources:

• Reviews and ratings from customers.

• Direct feedback from customers.

• Sales and marketing data analysis.

• Customer portals for online feedback.

• Annual general meetings (AGM) where customers may voice concerns.

• Internal meetings and benchmarking against competitors.

• Social media for tracking customer sentiment.

• Service centers for handling customer inquiries and complaints.

• Surveys such as Customer Satisfaction (CSAT) and Net Promoter Score (NPS).

• Research and development efforts to understand customer needs.

Challenges with VOC:

• Sometimes VOC can be vague or generic.

• It's important to work with subject matter experts and sales teams to convert VOC into
specific, actionable statements.

Converting VOC into Quantitative Form:


• VOC can be converted into quantitative data for analysis and prioritization.

• Grouping VOC into categories or buckets can help identify common themes.

• Common groupings can be created to classify VOC data.

• The Kano Model, introduced by Dr. Noriaki Kano, helps prioritize VOC based on its criticality
and impact on customer satisfaction.

Kano Model Segments:

1. Basic Needs: VOC that, when fulfilled, prevent dissatisfaction. Providing more of these doesn't significantly increase satisfaction.

2. Performance Needs (One-Dimensional): More is better and helps the product compete with
others.

3. Delighters: Fulfilling these needs delights the customer, but if not met, the customer may not
be aware of their absence.

These notes provide insights into understanding VOC, its sources, challenges in handling VOC, and
how the Kano Model can be used to prioritize VOC based on customer criticality and satisfaction.

Customer Satisfaction and Kano Model:

• Customer satisfaction can be classified into three categories based on the Kano Model:

1. Basic Needs (Must Be): These are critical and expected by customers. Failure to meet these needs results in dissatisfaction. Examples are safe arrival and reliable booking-system uptime.

2. Performance Needs (More the Better): Satisfaction rises roughly in proportion to how well these needs are fulfilled, making them the basis for competitiveness. Examples include free upgrades.

3. Delighters: Meeting these needs delights customers, but customers are not dissatisfied by their absence because they do not expect them. They provide a competitive advantage. An example is extra seat comfort.

Critical to Quality (CTQ):

• CTQ represents key measurable characteristics of selected VOC that are critical to the quality
of a project.

• For example, if the VOC is "App crashes 4 out of 5 times," the CTQ may involve metrics like
the number of incidents reported for app crashes or the total number of times the app
crashed and the total number of logins.

Process Mapping Techniques:

• Process mapping involves visualizing and understanding the current state (as-is) of a process.
Various techniques can be used, including:

1. Cross-functional process flowcharts with swimlanes.


2. Value Stream Mapping (VSM) to analyze the flow of materials and information in a
process.

3. Top-down chart to represent sub-processes.

• Process mapping helps identify the supplier-input-process-output-customer (SIPOC) relationships and analyze the efficiency of the process.

These notes provide an overview of customer satisfaction, the Kano Model, CTQ aspects, and
techniques for process mapping, emphasizing their importance in understanding and improving
quality and processes.

Process Mapping:

• Process mapping is a technique used to visualize and understand the flow of a process. It helps in identifying suppliers, inputs, processes, outputs, and customers (SIPOC) in a process.

• Process mapping often includes various elements such as activities, decision points, data
recording, and terminals.

• There are different ways to represent process mapping, including flowcharts, swim lanes,
and other visual diagrams.

Example: Pizza Making Process:

• This example illustrates a pizza making process with suppliers (e.g., the vendors of flour and water), inputs (e.g., dough and pizza base), and various process steps (e.g., adding sauce, toppings, cheese, baking, etc.).

• The final product (pizza) is delivered to the customer, in this case, named Rahul.

Types of Process Flowcharts:

• Process flowcharts can come in different formats:

1. Activity Flowchart: Represents the steps or activities in a process.

2. Decision-Making Flowchart: Shows decision points in a process where different paths can be taken.

3. Record Data Flowchart: Symbolizes the documentation and recording of data.

4. Swim Lane Flowchart: Maps different departments or participants involved in the process.

Project Charter:

• A project charter is a document that describes the objectives, timelines, and the people or
teams involved in a project.

• It is used by the Apex Council to make decisions about projects, allocate resources, and
review project progress.

These notes provide an overview of process mapping techniques, illustrated with a pizza making
process, and the purpose and use of a project charter in project management.
Project Charter Elements:

1. Minimum Requirements: The project charter should include essential components, such as
the project title, business case, problem statement, project goals, project scope, project
timeline, and project team members.

2. Business Case:

• The business case in the project charter should address the urgency, impact, and
potential savings of the project. It helps justify why the project is necessary.

3. Problem Statement:

• The problem statement in the charter defines the pain point, including what it is,
where it's occurring, its duration, magnitude, and how it was identified.

4. Project Goals:

• The project goals should be specific, measurable, achievable, relevant, and time-
bound (SMART). They describe the current state and the target state for critical-to-
quality (CTQ) aspects.

5. Project Scope:

• The project scope sets the boundaries and limits of the project, defining what will
and won't be addressed.

6. Project Timeline:

• The project charter should clearly define the start and end dates for each phase of
the project, especially in the context of the DMAIC methodology.

7. Project Team Members:

• Identify the individuals who will be part of the project team, designating their roles
according to Six Sigma roles for assigning responsibilities.

ORACI Chart:

• The ORACI chart is used to define roles and responsibilities within the project team. It stands for Owner, Responsible, Accountable, Consulted, and Informed.

Project Activities:

• Activities such as creating the charter, collecting data, conducting measurement system
analysis, and other tasks should be outlined to guide the project's progression.

These notes provide an overview of the key elements found in a project charter, which is a crucial
document for initiating and managing Six Sigma projects, ensuring that they are well-defined, well-
organized, and have clearly defined roles and responsibilities.

ARMI Chart:

• ARMI stands for Approver, Resource, Member, and Interested Party.

• The ARMI chart helps define roles and responsibilities in a project, designating individuals or groups as Approvers, Resources, Members, or Interested Parties.

Project Charter:

• After forming the project charter, it needs to be approved or signed off to conclude the
Define phase.

Measure Phase:

• The Measure phase in Six Sigma is the second phase in the DMAIC methodology, following
the Define phase.

• It involves defining the operational definition of Critical to Quality (CTQ) characteristics and
standardizing parameters.

Elements of the Measure Phase:

• In the Measure phase, several elements are critical:

1. Data Types & Data Collection Plan: Determining what types of data (e.g.,
continuous/variable or discrete/attribute) will be collected and how the data will be
gathered.

2. Samples & Sampling Strategies: Planning how samples will be selected and the
strategy for collecting data.

3. Data Distribution: Understanding how data is distributed.

4. Measurement System Analysis: Assessing the reliability and accuracy of the measurement system.

5. Process Stability & Capability: Evaluating the stability and capability of the process.

Data Types:

• Two primary types of data are continuous/variable data (measurable characteristics like
height or weight) and discrete/attribute data (categorical data like yes/no or pass/fail).

These notes provide an overview of ARMI charts, the transition from the project charter to the Measure phase, and the key elements in the Measure phase, including data types and measurement system analysis. Understanding data types and ensuring a reliable measurement system are crucial in Six Sigma projects.

Types of Data:

1. Ordinal Data: Data that is arranged in a specific order, such as ratings, rankings, or grades.

2. Nominal Data: Categorical data with no inherent order; the categories can be counted, such as the number of bags, students, or colleges.

3. Continuous Data: Data that has the following characteristics:

• There are infinitely many possible values between any two data points.

• It has a unit of measurement.


• Time is always considered continuous.

• For ratio metrics, data is classified as continuous or discrete based on the numerator: a measured numerator gives continuous data, while a counted numerator gives discrete data.

Data Collection Process in Six Sigma:

1. Why Collect Data: It's important to define the purpose and reasons for collecting data.

2. What Data Needs to Be Collected: Determine which specific data points are relevant to the
project.

3. How to Collect Data: Develop a data collection plan, which may involve a pilot data
collection plan to ensure effectiveness.

4. Collect Data: Execute the data collection plan as outlined.

5. Ensure Consistency and Stability: It's critical to maintain data consistency and stability
throughout the collection process.

Sampling Techniques:

• Sampling involves selecting a subset of a population to represent the entire population.

• Two common sampling techniques:

1. Simple Random Sampling: Every unit has an equal opportunity to get selected as a
sample.

2. Stratified Random Sampling: The population is divided into homogeneous subgroups (strata), and random samples are taken from each stratum.

These notes provide an overview of different data types, the data collection process in Six Sigma, and
key sampling techniques used to collect data effectively. Understanding data types and employing
appropriate sampling methods are crucial for accurate data collection in Six Sigma projects.

Sampling Techniques:

1. Simple Random Sampling Technique:

• Applicable for homogeneous data.

• Every unit in the population has an equal opportunity to be selected as a sample.

2. Stratified Random Sampling Technique:

• Suitable for non-homogeneous data.

• The population is divided into strata or groups, and samples are collected from each
stratum.

3. Systematic Sampling Technique:

• Used for homogeneous data.


• Every "k"th unit is taken as a sample, where "k" is calculated based on the total
population size (N) and the desired sample size (n).

• Collected data can be in the form of individual samples or subgroup samples.

Factors Influencing Sample Size:

• Three key elements affect the determination of sample size:

1. Confidence Level: Decided based on the criticality of the process. Higher criticality
typically requires a larger sample size for a higher confidence level.

2. Error Margin/Precision: A smaller error margin indicates high precision, and a larger
margin indicates lower precision.

3. Standard Deviation of the Population (σ): Reflects how far the values in the
population are from their mean value.

Sample Size Formula (Continuous Data):

• The sample size (n) can be calculated using the formula:

• n = [(Z * σ) / Δ]^2

• Where:

• Z represents the critical value for a specific confidence level (e.g., 1.96 for a
95% confidence level).

• σ is the standard deviation of the population.

• Δ is the desired error margin (precision).

• The formula is used to determine the appropriate sample size based on the specified
confidence level, standard deviation, and precision.
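As an illustration, here is a small Python sketch of the continuous-data formula; the function name and the figures (95% confidence, σ = 10, Δ = 2) are hypothetical:

import math
from scipy.stats import norm

def sample_size_continuous(confidence, sigma, delta):
    z = norm.ppf(1 - (1 - confidence) / 2)  # two-sided critical value, e.g. 1.96 at 95%
    return math.ceil(((z * sigma) / delta) ** 2)

print(sample_size_continuous(0.95, sigma=10, delta=2))  # 97 samples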

These notes provide an overview of different sampling techniques used in Six Sigma, as well as the
key factors that influence sample size determination. Understanding the relationship between
sample size and confidence level, error margin, and standard deviation is crucial for effective data
collection and analysis in Six Sigma projects.

Discrete Data:

• Discrete data consists of distinct, separate values, often represented by integers.

• It is used for data that can be counted and cannot take on an infinite number of values.

Sample Size Formula for Discrete Data:

• The sample size (n) for discrete data can be calculated using the formula:

• n = [Z^2 * P * (1 - P)] / Δ^2

• Where:
• Z is the critical value for the desired confidence level.

• P represents the proportion of defectives in the population.

• Δ is the desired level of precision or error margin.
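A matching Python sketch for the discrete-data formula; the function name and the figures (95% confidence, P = 0.10, Δ = 0.03) are again hypothetical:

import math
from scipy.stats import norm

def sample_size_discrete(confidence, p, delta):
    z = norm.ppf(1 - (1 - confidence) / 2)  # critical value for the confidence level
    return math.ceil(z ** 2 * p * (1 - p) / delta ** 2)

print(sample_size_discrete(0.95, p=0.10, delta=0.03))  # 385 samples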

Data Distribution:

• Data distribution refers to the pattern of data values in a dataset.

Central Tendency:

• Central tendency measures include:

• Mean: The average of the data.

• Median: The middle value when the data is sorted.

• Mode: The value that occurs most frequently.

Dispersion/Variation of Data:

• Measures of variation include:

• Range: The difference between the maximum and minimum values.

• Variance: The average of the squared differences from the mean.

• Standard Deviation: The square root of the variance.
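For illustration, these measures can be computed with Python's statistics module (the data values here are hypothetical):

import statistics as st

data = [4, 7, 7, 8, 9, 12]
print(st.mean(data))           # mean: the average of the data
print(st.median(data))         # median: the middle value when sorted
print(st.mode(data))           # mode: the most frequent value (7)
print(max(data) - min(data))   # range: maximum minus minimum
print(st.pvariance(data))      # population variance: mean squared deviation
print(st.pstdev(data))         # population standard deviation: square root of variance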

Graphical Tools:

• Graphical tools like histograms and box plots are used to visualize data distributions.

Shapes of Data Sets:

• Different shapes of data sets can be observed:

• Bell-Shaped: Represents a normal distribution, with symmetry around the center.

• Right Skewed (Positively Skewed): Data values cluster on the left side with a tail to
the right.

• Left Skewed (Negatively Skewed): Data values cluster on the right side with a tail to
the left.

• Uniform Distribution: Data is evenly spread without a clear peak.

• Bimodal Distribution: Two distinct peaks in the data.

Normal Distribution Curve:

• The normal distribution curve, or Gaussian distribution curve, is bell-shaped and represents a
symmetrical distribution of data.

• Mean, median, and mode are equal and located at the center of the curve.

• About 68% of data falls within one standard deviation of the mean, about 95% within two, and about 99.7% within three.
These notes provide an overview of discrete data, sample size calculation for discrete data, data
distribution, measures of central tendency and variation, graphical tools, and different shapes of data
sets, emphasizing the significance of the normal distribution curve. Understanding data distribution
and its characteristics is essential for data analysis in Six Sigma projects.


Normality Test:

• Normality tests are used to check if a dataset follows a normal distribution, which is a bell-
shaped distribution.

• One common normality test is the Anderson-Darling test.

Anderson-Darling Test:

• In the Anderson-Darling test, the null hypothesis is that the data is normally distributed.

• If the p-value is greater than 0.05 (the typical significance level), there is no evidence against normality, and the data can be treated as normally distributed.

• If the p-value is less than or equal to 0.05, it indicates that the data is non-normal.
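As a hedged illustration, here is a normality check in Python with scipy. Note that scipy's anderson function reports a test statistic and critical values rather than a p-value, so the statistic is compared against the 5% critical value; the sample data is hypothetical:

import numpy as np
from scipy.stats import anderson

rng = np.random.default_rng(1)
data = rng.normal(loc=50, scale=5, size=100)  # hypothetical measurements

result = anderson(data, dist='norm')
crit_5pct = result.critical_values[list(result.significance_level).index(5.0)]
if result.statistic < crit_5pct:
    print("No evidence against normality at the 5% level")
else:
    print("Data appears non-normal at the 5% level")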

Outliers:

• Outliers are data points that are significantly different from the rest of the data and may
distort statistical analyses.

Boxplot:

• A boxplot is a graphical tool used to identify outliers and visualize the distribution of data.

• It consists of:

• Upper Adjacent Value: The largest observation at or below Q3 + 1.5 × IQR, marking the upper limit beyond which points are treated as potential outliers.

• Median: The middle value of the dataset.

• Interquartile Range (IQR): The range between the upper quartile (Q3) and lower
quartile (Q1).

• Lower Adjacent Value: The smallest observation at or above Q1 − 1.5 × IQR, marking the lower limit beyond which points are treated as potential outliers.

Use of IQR:

• The Interquartile Range (IQR) is calculated as Q3 - Q1, where Q1 and Q3 are the lower and
upper quartiles, respectively.

• It is used to identify potential outliers.

• A process with a smaller IQR is generally considered more stable.
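A minimal Python sketch of the 1.5 × IQR outlier rule (the data values are hypothetical):

import numpy as np

data = np.array([12, 13, 13, 14, 15, 15, 16, 17, 18, 42])  # 42 is a suspect point
q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
lower_fence = q1 - 1.5 * iqr
upper_fence = q3 + 1.5 * iqr
outliers = data[(data < lower_fence) | (data > upper_fence)]
print(iqr, outliers)  # points outside the fences are flagged as potential outliers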

USL/LSL (Upper Specification Limit/Lower Specification Limit):


• USL and LSL represent the upper and lower limits for a product or process to meet quality
specifications.

• Comparing data to these limits helps in assessing if the process is within the required quality
parameters.

• It can assist in quick comparisons, identifying outliers, and detecting skewness in data.

These notes provide an overview of normality tests, the Anderson-Darling test, dealing with outliers
using boxplots and IQR, and the importance of USL and LSL in quality control. Understanding the
normality of data and addressing outliers are essential steps in data analysis in Six Sigma projects.


Measurement System Analysis (MSA):

• MSA is a process used to assess the adequacy and reliability of a measurement system.

• The goal is to statistically verify that the measurement system provides unbiased results with
minimal variability and accurately measures the factor being examined.

Factors in Measurement System Analysis:

1. Process: The process refers to the method or procedure used to collect measurements. It
should be consistent and standardized.

2. People/Operators: The individuals or operators who perform measurements. Operator training and consistency are essential to reduce variability.

3. Gauge/Measuring Instrument: The quality and calibration of the measuring instrument are
critical for accurate measurements.

Measurement System Error:

• Measurement system error refers to the difference between the observed reading and the
actual reading. It represents the variation introduced by the measurement system itself.

Types of Variation in Measurement System Analysis:

1. Overall Variation: The total variation in measurements, including both the variation from the
measurement system and the variation from the part being measured.

2. Part-to-Part Variation: The variation in measurements caused by differences in the parts themselves.

3. Measurement System Variation: The variation introduced by the measurement system, including the gauge, operator, and process.

Reproducibility:

• Reproducibility refers to the consistency of measurements when different operators, instruments, or samples are used.
• If measurements are reproducible, different operators and instruments should produce
consistent results.

Repeatability:

• Repeatability refers to the consistency of measurements when the same operator, instrument, and sample are used.

• If measurements are repeatable, the same operator and instrument should produce
consistent results.

Sources of Variation:

• High variation can be observed due to factors like operator methods, different operators, or
problems with samples.

Conclusion: Measurement System Analysis is crucial in Six Sigma to ensure that measurement
systems are reliable and produce consistent and accurate results. It helps identify and reduce
measurement errors to improve data quality and decision-making in process improvement projects.


Continuous Data - Gauge R&R (Repeatability and Reproducibility):

• Gauge R&R is used to assess the performance of a measurement system for continuous data.

Prerequisites for Continuous Data Gauge R&R:

• A minimum of 10 samples should be used.

• Each sample should have a minimum of 2 replicates.

Calculation:

• Gauge R&R helps evaluate the contribution of variation in the overall system.

• If the variation is less than 10%, the measurement system is generally accepted.

• If the variation is between 10% and 30%, it becomes a business call.

• If the variation is greater than 30%, the measurement system is usually rejected.

Discrete/Attribute Data - Attribute Agreement Analysis (AAA):

• Attribute Agreement Analysis is used to evaluate measurement systems for discrete or attribute data.

Prerequisites for Discrete/Attribute Data Attribute Agreement Analysis:

• A minimum of 30 samples is required.

• At least 2 operators should be involved.

• Each sample should have a minimum of 2 replicates.

• Optional use of standard or master data.


Kappa Value:

• The Kappa value is used to assess the measurement system's performance.

• If the Kappa value is 0.9 or above, the measurement system is typically accepted.

• If the Kappa value is between 0.8 and 0.9, it becomes a business call.

• If the Kappa value is below 0.8 (e.g., 0.2), the measurement system is often rejected.

Within-Appraiser Repeatability and Between-Appraiser Reproducibility:

• These factors are part of the Attribute Agreement Analysis and are used to assess the
agreement and consistency of measurements among different appraisers (operators).

These notes provide an overview of the evaluation of measurement systems, considering both
continuous data (Gauge R&R) and discrete/attribute data (Attribute Agreement Analysis). The criteria
for acceptance or rejection are based on the level of variation and the Kappa value. Measurement
system analysis is crucial to ensure reliable and consistent measurements in Six Sigma projects.

Process Stability and Control Charts: Process stability, often assessed using control charts, is a
fundamental concept in statistical process control. Control charts are tools developed by Dr. Walter
A. Shewhart to monitor and maintain the stability and predictability of a process. Here are key points
related to process stability and control charts:

Stable Process:

1. A stable process is one that is consistent and predictable over time.

2. In a stable process, all variations in the process are expected to fall within control limits,
specifically the Upper Control Limit (UCL) and the Lower Control Limit (LCL).

3. These variations are due to common causes, also known as common cause variation.

Common Cause Variations:

1. Common cause variations are inherent to the process and are part of its natural variation.

2. They are typically consistent and predictable over time, and they are referred to as chronic in
nature.

3. Common cause variations are variations that can be expected in day-to-day operations and
do not indicate any specific problems or anomalies.

Special Cause Variations:

1. Special cause variations are variations that are not part of the natural, common cause
variation in the process.

2. They are typically sporadic and unpredictable and are often referred to as assignable causes.

3. Special cause variations indicate that something unusual or abnormal has occurred in the
process.
Process Improvement and Control Charts:

1. In the context of Six Sigma projects, one of the objectives is to reduce common cause variation and maintain process stability.

2. When special cause variations are identified, an analysis is conducted to identify their root
causes and take corrective actions to address them.

Process Capability:

1. Process capability is an important aspect of process analysis.

2. It involves checking whether the process is capable of consistently meeting customer requirements.

3. The capability of a process can be assessed using various statistical measures and is a critical
component of Six Sigma quality improvement efforts.

Example: Let's consider a production process in a manufacturing facility. The goal is to produce a
specific component with consistent dimensions to meet customer specifications. To assess process
stability, a control chart is used. If the control chart shows that the process is within control limits
(UCL and LCL) and common cause variations are the only source of variation, it indicates a stable
process. However, if special cause variations are observed, an analysis is performed to determine the
root causes and take corrective actions to maintain process stability and improve product quality.
Process capability studies are also conducted to ensure that the process consistently meets customer
requirements for component dimensions.

Process Capability for Continuous Data: Process capability is a measure of how well a process can
consistently produce items that meet customer specifications. It assesses whether a process is
"capable" of producing products within the defined specification limits. The capability of a process is
typically determined using statistical measures.

Key Terms in Process Capability for Continuous Data:

1. LSL (Lower Specification Limit): The lowest acceptable value or limit set by the customer or
quality standards for a particular characteristic. Anything below this limit is considered
unacceptable.

2. USL (Upper Specification Limit): The highest acceptable value or limit set by the customer or
quality standards for a particular characteristic. Anything above this limit is considered
unacceptable.

3. CP (Process Capability Index): A measure that quantifies how well a process can produce
items within specification limits. It is calculated as (USL - LSL) divided by (6σ), where σ
represents the standard deviation of the process. The larger the CP value, the more capable
the process.

4. Voice of Customer (VOC): The customer's requirements and expectations for a specific
product or service. VOC includes the defined specification limits (LSL and USL).
5. Voice of Process (VOP): The actual performance of the process in terms of the characteristic
being measured. It is determined by calculating process parameters, such as the mean and
standard deviation.

Process Capability Categories:

• Capable Process: If the process is capable of producing items within specification limits
(between LSL and USL), it is considered a capable process. No major changes are needed to
meet customer requirements.

• Not Capable Process: If the process is not capable of consistently producing items within
specification limits, it is considered not capable. This may lead to rejections or defects.
Process improvements are required to meet customer requirements.

• Just Capable Process: A process whose spread exactly coincides with the LSL and USL (its 6σ spread equals the specification width) is referred to as just capable. It can produce items within specification, but it's very close to the limits.

• Highly Capable Process: A highly capable process has a large margin between the process's
spread (6σ) and the specification limits (LSL and USL). It can consistently produce items well
within specification limits, providing a high level of confidence in meeting customer
requirements.

Example: Imagine a manufacturing process that produces pens. The customer's requirement is that
the pens should have a length between 10 cm (LSL) and 10.2 cm (USL). The process's average pen
length (Voice of Process) is measured to be 10 cm, and the process's standard deviation (σ) is
determined to be 0.1 cm.

To assess the process capability (CP), you can use the formula: CP = (USL - LSL) / (6σ) = (10.2 - 10) / (6
* 0.1) = 0.33.

In this case, CP is relatively low, indicating that the process is not highly capable of consistently
producing pens within the specified length range. Process improvements may be necessary to
increase the CP value and meet customer requirements with higher confidence.
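A short Python sketch of the Cp formula applied to the pen example above:

def cp(usl, lsl, sigma):
    return (usl - lsl) / (6 * sigma)  # Cp = (USL - LSL) / 6σ

print(round(cp(usl=10.2, lsl=10.0, sigma=0.1), 2))  # 0.33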

Cpk (Process Capability Index) Calculation: Cpk is a measure used to assess how well a process can
produce items within customer specifications. It takes into account both the mean (average) and
standard deviation of the process and is a more comprehensive indicator of process capability
compared to Cp.

Here's how Cpk is calculated:

1. Identify the lower specification limit (LSL) and upper specification limit (USL) based on customer requirements. In this example, LSL is 10 and USL is 13.

2. Calculate the process mean (average). In this example, the process mean is 12.

3. Calculate the process standard deviation (σ). In this example, σ is 0.444, so 3σ ≈ 1.333.

4. Use the following formula for Cpk:

Cpk = Min[(USL - Mean) / (3σ), (Mean - LSL) / (3σ)]


• Each term is the distance between the process mean and a specification limit, divided by three times the standard deviation.

5. Calculate Cpk for both the upper and lower specification limits and take the minimum of the
two values.

Example Calculation: With LSL = 10, USL = 13, Mean = 12, and 3σ ≈ 1.333, the calculation would be:

1. For the upper specification limit (USL):

Cpk (USL) = (13 - 12) / 1.333 = 0.75

2. For the lower specification limit (LSL):

Cpk (LSL) = (12 - 10) / 1.333 = 1.50

Taking the minimum of the two values: Cpk = Min[0.75, 1.50] = 0.75

Therefore, Cpk = 0.75, indicating that the process is not capable of consistently meeting customer
specifications. The process capability is below the acceptable level (Cpk < 1), and improvements are
needed to meet customer requirements more reliably.

Sigma Level and Process Capability:

Sigma level, often denoted as "σ level," is a measure that indicates how well a process can
consistently meet customer specifications. It is related to process capability indices like Cp and Cpk.
The relationship between these terms is as follows:

1. Cp (Process Capability Index): Cp is a measure of process capability based on the spread of the process data. It compares the spread of the process to the width of the specification limits (USL and LSL). Cp is calculated as (USL - LSL) / (6σ), where σ is the standard deviation of the process.

2. Cpk (Process Capability Index with Respect to the Mean): Cpk also considers the mean of
the process in addition to the spread. It is calculated as the minimum of two values: (USL -
Mean) / (3σ) and (Mean - LSL) / (3σ). Cpk evaluates how well the process center aligns with
the specification limits.

3. Sigma Level (σ level): Sigma level is a measure of process performance expressed in standard
deviations. It indicates how many standard deviations the process mean is away from the
nearest specification limit. A higher sigma level indicates a more capable process. Sigma level
is often used in Six Sigma methodologies.

Example Calculation: Consider the following data:

• USL (Upper Specification Limit) = 35

• LSL (Lower Specification Limit) = 30

• Process Mean (Average) = 30

• Standard Deviation (σ) = 1.667, so 3σ = 5

1. Calculate the term for the Upper Specification Limit:

(USL - Mean) / (3σ) = (35 - 30) / 5 = 1

2. Calculate the term for the Lower Specification Limit:

(Mean - LSL) / (3σ) = (30 - 30) / 5 = 0 (the process mean sits exactly on the Lower Specification Limit)

Cpk = Min[1, 0] = 0

3. Calculate the Sigma Level:

Sigma Level = 3 × Cpk = 3 × 0 = 0

In this case, the Sigma Level is 0, indicating that the process is not capable of consistently meeting
customer specifications, as the process mean aligns with one of the specification limits. A higher
Sigma Level would signify a more capable process that is further away from the nearest specification
limit.
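A small Python sketch combining Cpk and the Sigma Level = 3 × Cpk relationship, using the figures from the example above:

def cpk(usl, lsl, mean, sigma):
    # Cpk = minimum distance from the mean to a limit, in units of 3σ
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

c = cpk(usl=35, lsl=30, mean=30, sigma=1.667)
print(round(c, 2), round(3 * c, 2))  # Cpk = 0.0, Sigma Level = 0.0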

Attribute Data and Sigma Levels:

In the context of attribute data, you are examining characteristics like defects or non-conformities.
These characteristics are binary in nature, meaning they are either present (defective) or absent
(non-defective) in a unit.

• Defect: An attribute that signifies a non-conformity or characteristic of a product that doesn't meet the desired standard.

• Defective: A unit that contains defects.

• Defect Opportunity: This refers to the probability of possible defects occurring within the
same unit. It's the number of opportunities for defects to occur in a single unit.

Calculations:

1. Consider a sample of 5,000 units, of which 329 were found to be defective. The Defects per Unit (DPU) is calculated as:

DPU = Number of Defects / Number of Units = 329 / 5,000 ≈ 0.0658

2. To determine the Parts Per Million (PPM) for this data, you can use the formula:

PPM = DPU * 1,000,000 = 0.0658 * 1,000,000 = 65,800 PPM

This means there are approximately 65,800 defects per million units.

3. The Sigma level can then be read off a Sigma & DPMO conversion table: a PPM of 65,800 corresponds to a Sigma level of about 3.01.

For a Given Number of Defect Opportunities:


Now suppose the number of Defect Opportunities per unit is 12. To calculate the Defects per Opportunity, you can use the following formula:

Defects per Opportunity = Number of Defects / Total Number of Opportunities

In this case, it's:

Defects per Opportunity = 329 / (12 * 5,000) ≈ 0.00548

4. To calculate the DPMO for this new data, you can use the formula:

DPMO = Defects per Opportunity * 1,000,000 = 0.00548 * 1,000,000 = 5,480 DPMO

5. Using the DPMO, you can find the corresponding Sigma level, which is calculated as 4.04.

6. Finally, for another set of data where the number of defective forms is not specified, the Defective Per Unit, PPM, and Sigma level work out as follows:

• Defective Per Unit = 0.0224

• PPM = 0.0224 × 10^6 = 22,400

• Sigma level ≈ 3.51

These calculations indicate the level of process performance and quality based on the attributes and
defects associated with the data. Higher Sigma levels are indicative of better process quality and
reliability.
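A Python sketch of the DPU/PPM/DPMO arithmetic above; the Sigma level conversion assumes the conventional 1.5σ long-term shift, which reproduces the table values quoted in these notes:

from scipy.stats import norm

defects, units, opportunities = 329, 5000, 12

dpu = defects / units                        # defects per unit
ppm = dpu * 1_000_000                        # parts per million
dpo = defects / (units * opportunities)      # defects per opportunity
dpmo = dpo * 1_000_000                       # defects per million opportunities
sigma_level = norm.ppf(1 - dpmo / 1_000_000) + 1.5  # assumes the 1.5σ shift convention

print(round(dpu, 4), round(ppm), round(dpmo), round(sigma_level, 2))
# 0.0658 65800 5483 4.04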

Analyze Phase:

The Analyze Phase is a crucial step in the Lean Six Sigma methodology, aimed at identifying the root
causes of problems and understanding the factors contributing to variability in a process. This phase
involves various steps and tools to help achieve process improvement.

Steps in the Analyze Phase:

1. Process Door Approach: In this step, you take a closer look at the process itself to
understand its flow, inputs, and outputs. It's important to identify where potential issues
might occur.

2. Data Door Approach: Data analysis is a key part of the Analyze Phase. You collect and
analyze relevant data to identify trends, patterns, and potential causes of problems.

3. Segregation of Causes: The causes of problems in a process can be classified into different
categories. This step involves organizing and segregating these causes for further analysis.

4. Validation of Causes: To ensure that the identified causes are indeed contributing to the
problem, a validation process is carried out. This may involve conducting experiments or
additional data analysis.

5. Brainstorming: Brainstorming is a creative and free-thinking process to identify possible, potential, or probable causes of the problem. It's a way to generate ideas and hypotheses about what might be going wrong.
6. Cause & Effect Diagram (Ishikawa or Fishbone Diagram): This tool, developed by Dr.
Ishikawa, is also known as the Fishbone diagram due to its shape. It helps in visualizing the
potential causes of a problem. The main categories, often referred to as the 6Ms (Man,
Machine, Material, Method, Measurement, and Mother Nature), are used to categorize
causes. Under each category, you list specific factors or causes that could be contributing to
the problem.

Types of Causes:

During the Analyze Phase, it's important to distinguish between different types of causes:

1. Non-Controllable Causes: These are factors that cannot be easily controlled or changed
within the scope of the project. They may be external factors that influence the process but
are not directly manageable.

2. Direct Improvement Causes: These are the causes that, when addressed or improved, have a
direct impact on the problem and can be controlled or influenced by process changes.

The Analyze Phase is an essential part of the Lean Six Sigma approach as it guides teams to identify
the causes behind process issues and lays the groundwork for effective solutions. The Cause & Effect
Diagram is a valuable tool for organizing and visualizing these causes, making it easier to prioritize
and address them.

Likelihood and Root Causes Analysis:

1. Why-Why Analysis: This is a systematic approach used to identify the root causes of
problems. You ask "why" multiple times, often five times, to dig deeper into the underlying
issues. The goal is to not only find the immediate or surface-level cause but to identify the
fundamental reasons for a problem. This approach helps in taking corrective and preventive
actions.

2. Pareto Chart/Diagram: This is a graphical tool used to prioritize and focus efforts on the
most significant causes of a problem. The Pareto principle, often known as the 80/20 rule,
suggests that roughly 80% of effects come from 20% of the causes. In problem-solving, it
means that a majority of problems are caused by a small number of issues. By using a Pareto
chart, you can visualize and prioritize the vital few causes that need attention while leaving
aside the trivial many.

Pareto Chart and the 80/20 Rule:

• The Pareto Chart is named after Vilfredo Pareto, an Italian economist. Dr. Joseph M. Juran
introduced this principle into problem-solving approaches, emphasizing that roughly 80% of
problems result from 20% of the causes.

• The chart is a bar graph with causes on the x-axis and their frequency or impact on the y-
axis.

• By analyzing the Pareto Chart, you can identify the most critical issues, focus your efforts on
addressing them, and achieve a more significant impact in problem-solving.

Lean Principles:
Lean principles are a fundamental part of Lean Six Sigma. They aim to streamline processes, eliminate waste, and enhance efficiency. Lean thinking was popularized by James P. Womack and Daniel T. Jones, who founded the Lean Enterprise Institute in 1997. Some core principles include:

• Value: Identifying what adds value from the customer's perspective and eliminating what
doesn't.

• Value Stream: Mapping the entire process from start to finish, identifying areas for
improvement.

• Flow: Ensuring a smooth, continuous flow of work or processes.

• Pull: Allowing customers to "pull" products or services as needed rather than pushing excess
inventory.

• Perfection: Striving for continuous improvement to achieve perfection in processes.

Lean principles complement Lean Six Sigma by emphasizing efficiency and waste reduction, which
aligns with the Six Sigma focus on reducing defects and variability in processes.

Principles of Lean:

1. Identify Values: The first principle of Lean involves recognizing the value of a product or
service from the customer's perspective. It's about understanding what the customer truly
values and is willing to pay for.

2. Map the Value Stream: Value Stream Mapping is a critical step in Lean. It involves visually
mapping out the entire process, from start to finish, to identify how value flows and where
waste occurs. This helps in understanding the current state of the process.

3. Create Flow: Once you've identified the value stream and potential areas of waste, the next
step is to streamline the process to create a smooth and continuous flow of work. This aims
to eliminate interruptions and delays.

4. Establish Pull: Lean operates on a "pull" system where work or products are produced based
on customer demand. This means you only produce what the customer needs when they
need it, reducing excess inventory and waste.

5. Seek Perfection: Lean is an ongoing journey toward perfection. It involves a continuous improvement mindset, striving to eliminate waste and improve processes to achieve the highest level of efficiency and value delivery.

Value-Related Concepts:

• Value-Adding (VA) Activities: These are activities that directly contribute to the product or
service and are considered valuable from the customer's perspective.

• Non-Value-Adding (NVA) Activities: These are activities that do not add value to the product
or service. Identifying and reducing NVA activities is a key goal of Lean.

• Value-Enabling Activities (VE): These activities, while not directly adding value, are
necessary to complete value-adding activities. They support the value stream.

Key Time Metrics:


• Cycle Time (CT): The time it takes to complete one cycle of operation in a process.

• Lead Time (LT): The total time from the start of a process to its completion, including both
value-adding and non-value-adding activities.

• Takt Time (TT): The rate at which one must complete an operation to meet customer
requirements. It's calculated by dividing the available time by the customer demand.

For example, if the goal is to produce 1,000 units in 10 days, the Takt Time would be 0.01 days per
unit, which is equivalent to 14.4 minutes per unit. This means you need to complete one unit every
14.4 minutes to meet customer demand.

Process Cycle Efficiency (PCE):

• PCE is a metric that measures the efficiency of a process. It is calculated as the ratio of total value-adding time to total lead time, where value-adding time is the time spent in value-adding activities and lead time is the total time from start to finish.
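A minimal Python sketch of the PCE ratio (the 45-minute and 300-minute times are hypothetical):

def pce(value_added_minutes, lead_time_minutes):
    return value_added_minutes / lead_time_minutes

print(f"{pce(45, 300):.0%}")  # 15% of the lead time adds value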

Lean Principles:

• Flow: Creating a smooth and continuous flow of work where value-adding steps occur in
sequence, ensuring that the product or service moves efficiently towards the customer.

• Pull: A "pull" system means that products or services are produced or delivered only when
there's a demand from the customer. This helps in reducing excess inventory and waste.

• Just in Time (JIT): JIT is a Lean approach that focuses on delivering the right quantity at the
right time to meet customer demand, minimizing inventory and waste.

• Perfection: Lean is a continuous improvement process that involves striving for perfection. It
means always looking for ways to eliminate waste and improve processes.

Types of Wastes (7 Wastes):

1. Transportation: Unnecessary movement or handling of materials, products, or information.

2. Inventory: Excess inventory that ties up resources and can lead to waste.

3. Motion: Unnecessary physical or mental actions by employees that don't add value to the
process.

4. Human Intellect: Not tapping into the full potential of employees and their ideas.

5. Waiting: Idle time during the process when work is not being done.

6. Overproduction: Producing more than is required at a given time.

7. Overprocessing: Doing more work on a product or service than what is required by the customer.

(Note: the classic list of seven wastes also includes Defects; unused Human Intellect, listed above, is usually counted as an eighth waste.)

Control Impact Matrix:

This matrix helps in prioritizing and deciding which changes or improvements to make in a process. It
considers the impact and difficulty of each change, categorizing them as high or low impact and easy
or difficult to implement. Based on this, actions can be prioritized as "Just do it," "Target it now," or
"Strategize."

Hypothesis Testing:

Hypothesis testing is a statistical method used to assess the validity of a claim or statement about a
population parameter, such as the mean or variance. The process typically involves the following
steps:

1. Formulate Null and Alternate Hypotheses:

• Null Hypothesis (H0): This is a statement of no change or no difference. It is often
denoted as Xold = Xnew, indicating that there is no significant change or difference.

• Alternate Hypothesis (Ha): This is a statement of change or difference, indicating
that the parameter is not equal to a specific value. It is often denoted as Xold ≠ Xnew.

2. Set the Level of Significance (α):

• The level of significance, denoted as α, represents the probability of making a Type I
error, which is rejecting the null hypothesis when it is true. It is typically set at 0.05,
corresponding to a 95% confidence level.

3. Collect Data:

• Data is collected to test the hypothesis. The data can be continuous (variable data) or
discrete (attribute data), depending on the nature of the analysis.

4. Select the Hypothesis Test:

• Based on the data type and the nature of the hypothesis, an appropriate statistical
test is selected. For example:

• For continuous data: t-tests (1-sample or 2-sample), ANOVA, etc.

• For discrete data: Tests for proportions, chi-squared tests, etc.

5. Inference Based on P-Value:

• The hypothesis test calculates a p-value, which represents the probability of observing
data at least as extreme as the sample under the null hypothesis. If the p-value is less
than the level of significance (α), the null hypothesis is rejected in favor of the
alternate hypothesis. If the p-value is greater than α, the null hypothesis is not rejected.

It's important to note that the null hypothesis (H0) and the alternate hypothesis (Ha) are mutually
exclusive and together cover all possibilities. H0 assumes equality, while Ha assumes a difference,
and the choice between them depends on the specific research question.
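
For instance, a 2-sample t-test on continuous data might look like the following minimal sketch
using SciPy; the cycle-time samples are invented purely for illustration:

```python
from scipy import stats

# Hypothetical cycle-time samples (in days) before and after a process change.
old = [5.1, 4.8, 5.4, 5.0, 5.2, 4.9, 5.3]
new = [4.6, 4.4, 4.9, 4.5, 4.7, 4.8, 4.3]

alpha = 0.05  # level of significance

# 2-sample t-test: H0 assumes the two means are equal.
t_stat, p_value = stats.ttest_ind(old, new)

if p_value < alpha:
    print(f"p = {p_value:.4f} < {alpha}: reject H0 (the means differ)")
else:
    print(f"p = {p_value:.4f} >= {alpha}: fail to reject H0")
```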

Confidence Level (1 - α):

• The confidence level is the complement of the level of significance. A 95% confidence level
corresponds to a level of significance (α) of 0.05. It indicates the level of confidence that the
results are not due to chance.

Non-Parametric Tests for Non-Normal Data:

Non-parametric tests are used when the data does not meet the assumptions of normality required
for parametric tests. These tests do not rely on the specific distribution of the data. Here are some
common non-parametric tests and their applications:

1. Sign Test:

• It is used to test whether the median of a sample is equal to a specified value.

• For example, testing if the median delivery time is equal to a certain benchmark.

2. Wilcoxon Signed-Rank Test:


• This test assesses whether the distribution of differences between paired samples is
symmetric around zero.

• For example, testing if there is a significant difference in delivery times before and
after implementing a new process.

3. Mann-Whitney U Test:

• It is a non-parametric alternative to the independent t-test. It is used to compare
two independent samples.

• For example, comparing the delivery times of two different delivery modes (Mode A
and Mode B).

4. Kruskal-Wallis Test:

• An extension of the Mann-Whitney U test, it is used to compare more than two
independent groups.

• For example, comparing the delivery times of multiple delivery modes (Mode A,
Mode B, Mode C, etc.).

5. Chi-Squared Test:

• This test is used to determine if there is an association between two categorical
variables.

• For example, testing whether there is a significant relationship between the number
of errors in purchase orders and the supplier.

6. Fisher's Exact Test:

• Similar to the Chi-Squared test, it is used for 2x2 contingency tables with small
sample sizes.

• For example, assessing the association between the presence of errors in purchase
orders and the supplier's status (approved or not approved).

7. 1-Proportion Test:

• It is used to compare a sample proportion to a known population proportion.

• For example, testing whether the proportion of purchase orders with errors is
different from a known industry average.

8. 2-Proportion Test:

• This test compares two sample proportions.

• For example, comparing the proportion of errors in purchase orders between two
different time periods.

These non-parametric tests are valuable tools when dealing with data that does not follow a normal
distribution, and they provide reliable statistical results in such cases. The choice of test depends on
the specific research question and the characteristics of the data being analyzed.
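
As a sketch of how two of these tests might be run in practice, the snippet below uses SciPy;
the delivery-time samples and mode names are hypothetical:

```python
from scipy import stats

# Hypothetical delivery times (days) for two delivery modes.
mode_a = [3, 5, 4, 6, 7, 4, 5]
mode_b = [6, 8, 7, 9, 6, 8, 7]

# Mann-Whitney U test: non-parametric comparison of two independent samples.
u_stat, p_value = stats.mannwhitneyu(mode_a, mode_b, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")

# Kruskal-Wallis test for three or more groups, e.g. a hypothetical Mode C.
mode_c = [4, 5, 6, 5, 4, 6, 5]
h_stat, p_kw = stats.kruskal(mode_a, mode_b, mode_c)
print(f"H = {h_stat:.2f}, p = {p_kw:.4f}")
```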
DPMO Calculation Example:

To calculate DPMO (Defects Per Million Opportunities), you need the following information:

1. Number of Defects: 20

2. Total Opportunities: 500 samples * 400 possible defects per sample = 500 * 400 = 200,000

Now, you can calculate DPMO:

DPMO = (Number of Defects / Total Opportunities) × 1,000,000

DPMO = (20 / 200,000) × 1,000,000 = 0.0001 × 1,000,000 = 100

So, the DPMO in this case is 100, which means there are 100 defects per one million opportunities.
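
The same arithmetic can be wrapped in a small helper; the function name is an illustrative
choice, not a standard API:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities."""
    total_opportunities = units * opportunities_per_unit
    return (defects / total_opportunities) * 1_000_000

# The worked example above: 20 defects across 500 samples,
# each with 400 defect opportunities.
print(dpmo(20, 500, 400))  # -> 100.0
```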

Hypothesis Testing:

• Hypothesis testing is a statistical method used to evaluate claims or statements about a
population.

• It involves setting up a null hypothesis (Ho) and an alternate hypothesis (Ha).

• The level of significance (α) is chosen, typically at 0.05 or 5%.

• Data is collected, and an appropriate hypothesis test is selected based on the data type and
research question.

• Inference is made based on the p-value:

• If p-value ≤ α, the null hypothesis is rejected.

• If p-value > α, we fail to reject the null hypothesis (it is not proven true, merely not
rejected).

Type 1 and Type 2 Errors:

• Type 1 Error (α) occurs when we reject the null hypothesis when it's actually true, leading to
a false positive.

• Type 2 Error (β) occurs when we fail to reject the null hypothesis when it's actually false,
leading to a false negative.

• Type 1 errors are also known as the producer's risk.

• Type 2 errors are known as the consumer's risk.

• Consumer's risk is generally considered more serious than producer's risk, since it means a
defective product or service reaches the customer.

Correlation Analysis:

• Correlation analysis is used to measure the relationship between two variables, typically
denoted as X and Y.

• It is used to find logical relationships between variables.

• Scatter plots are often used to visualize the relationship.

• There are different types of correlation:

• Positive correlation: As one variable increases, the other also increases.

• Negative correlation: As one variable increases, the other decreases.

• No correlation: There is no clear relationship between the variables.


Correlation Coefficient:

• A correlation coefficient measures the strength and direction of a linear relationship
between two variables.

• The value of the correlation coefficient ranges from -1 to 1.

• An absolute value close to 1 indicates a strong relationship. For example, 0.907 represents a
strong positive relationship.
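
A minimal sketch of computing a Pearson correlation coefficient with NumPy; the
temperature-versus-defects data are made up for illustration:

```python
import numpy as np

# Hypothetical paired observations: oven temperature (X) vs. defect count (Y).
x = np.array([150, 160, 170, 180, 190, 200])
y = np.array([12, 10, 9, 7, 5, 4])

r = np.corrcoef(x, y)[0, 1]  # Pearson correlation coefficient, between -1 and 1
print(f"r = {r:.3f}")        # a value near -1 indicates a strong negative relationship
```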

Regression Analysis:

• Regression analysis is used to establish causal relationships between variables.

• Linear regression is used when the relationship between variables is linear, especially when
one variable (y) is continuous.

• In simple linear regression, there's one predictor variable, while in multiple linear regression,
there are multiple predictor variables.

• Logistic regression is used when the dependent variable is discrete or binary (e.g., yes/no).

Coefficient of Determination (R²):

• R² measures how well the regression model fits the observed data.

• It ranges from 0 to 1.

• R² shows the proportion of the total variation in the dependent variable (y) explained by the
independent variable (x).

• Higher R² values indicate a better-fitting model.

• There are different forms of R²: R² (plain), R² (adj), and R² (pred).

• R² (pred) measures how well the model can predict new, unseen data.

Interpretation of R²:

• R² (pred) > 0.50: A strong model.

• R² (adj) 0.30-0.50: A moderate model.

• R² (plain) < 0.30: A weak model.

These values help assess the accuracy and strength of the regression model.
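
The sketch below fits a simple linear regression and computes plain R² from its definition;
the x/y data are invented, and a real analysis would typically also examine R² (adj) and
R² (pred):

```python
import numpy as np

# Hypothetical data: advertising spend (x) vs. sales (y).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# Simple linear regression via least squares: y = b0 + b1 * x
b1, b0 = np.polyfit(x, y, deg=1)
y_hat = b0 + b1 * x

# Plain R^2: proportion of total variation in y explained by x.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"y = {b0:.2f} + {b1:.2f}x, R^2 = {r_squared:.3f}")
```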

Improve Phase Steps:

1. Generation of Solid Solutions: Brainstorm and create a pool of potential solutions to the
problem.

2. Prioritize and Proof Solutions: Evaluate the solutions generated to identify the most
promising ones.

3. Test Solutions: Conduct small-scale tests or pilots to assess the feasibility and effectiveness
of the selected solutions.

4. Justify Solutions: Provide clear and data-backed reasoning for selecting particular solutions
over others.

Brainstorming Techniques:

• Structured Brainstorming: Channeling the brainstorming session to focus on specific
categories of solutions.

• Unstructured Brainstorming: Allowing for a more open and creative exchange of ideas. A
combination of both structured and unstructured brainstorming can be effective.

Brainstorming Principles:

• Channeling: Use a structured approach to brainstorm solutions within specific categories.

• Antisolution: Sometimes, thinking about what not to do or the opposite can help identify
the right path.

• Analogy and Similar-Looking Processes: Draw parallels from processes that are similar or
have faced similar challenges.

• Brainwriting: Encourage participants to write down their ideas on paper for shared
discussion.

Selecting Solutions:

• Strategies for selecting solutions can include voting, pay-off matrix, and screening against
specific criteria.

• A Pay-off Matrix helps assess the impact and feasibility of solutions.

• Screening Against Must Be: Ensure solutions align with essential criteria like compliance,
regulations, and customer Critical to Quality (CTQ).

Criteria-Based Matrix:

• Use a matrix that includes various criteria and evaluate each solution against these criteria.

• Criteria can include compliance, effort required, and alignment with project goals.

Voting:

• Allocate votes to participants and have them vote for the solutions they find most favorable.

• Tally up the votes to prioritize solutions.


These methods and principles help ensure that the most appropriate and effective solutions are
chosen for implementation in the Improve Phase of a project.

Nominal Group Technique (NGT): In NGT, participants rank or rate each solution based on various
parameters or implementation factors. The solutions with the highest cumulative ratings are
selected. It's a structured method for group decision-making and encourages active participation
from team members.

Kaizen: Kaizen is a Japanese term that means "change for better" or "continuous improvement." It is
a process of making small, incremental improvements in processes, products, or systems. The
primary goal of Kaizen is to eliminate waste (Muda) and create a culture of continuous improvement
within an organization.

Kaizen Blitz: A Kaizen Blitz, also known as a Kaizen Event, is a focused and intensive approach to
implement radical improvements in a specific process or area in a short period of time. It involves
concentrated efforts and can lead to significant, immediate enhancements.

Kaizen Cycle: The Kaizen Cycle, often represented as the PDCA (Plan-Do-Check-Act) cycle, is a
systematic approach to improvement. It involves documenting the current process, identifying areas
of waste, planning countermeasures, implementing changes, verifying results, and then repeating
the cycle. Continuous improvement is the key focus.

Kanban: Kanban is a visual system that uses cards or signboards to signal the need for certain
actions, such as restocking inventory or initiating a specific task. It is widely used in Lean and "Just in
Time" production systems. Kanban helps optimize processes and reduce waste by ensuring materials
or tasks are pulled only when needed.

Types of Kanban:

• Red Kanban: Signals a critically low inventory level, indicating an urgent need to order
more.

• Yellow Kanban: Suggests that inventory is at a reorder level, and it's advisable to start
preparations.

• Green Kanban: Indicates that there is no need to order more inventory.

5S Methodology: The 5S methodology is a systematic approach to organizing and standardizing the
workplace. It focuses on five principles, each starting with the letter "S":

• Seiri (Sort): Sorting and decluttering the workspace.

• Seiton (Set in Order): Organizing items in a way that makes them easily accessible.

• Seiso (Shine): Keeping the workplace clean and maintaining cleanliness.

• Seiketsu (Standardize): Standardizing processes and procedures.

• Shitsuke (Sustain/Self-Discipline): Creating a culture of continuous improvement and
maintaining the changes.

These Lean and Kaizen principles help organizations become more efficient, reduce waste, and foster
a culture of continuous improvement and quality.
Roles for Implementation:

• Top Management: Provides leadership and support for the implementation of Lean and
Kaizen principles. They set the vision, objectives, and strategic direction for the organization.

• Middle/Line Management: Middle managers play a critical role in facilitating and driving the
implementation process within their specific areas. Line managers are responsible for day-to-
day operations.

• Employees: All employees contribute to the success of Lean and Kaizen by actively
participating in continuous improvement efforts, suggesting improvements, and ensuring
adherence to standards and processes.

Poka Yoke (Mistake Proofing): Poka Yoke is a method for mistake avoidance and error prevention. It
involves identifying potential errors or defects in processes and implementing mechanisms to
prevent them.

FMEA (Failure Modes and Effects Analysis): FMEA is a structured approach for identifying potential
failure modes in a process, assessing their impact, and prioritizing them based on a Risk Priority
Number (RPN). The goal is to address high-priority failure modes to reduce the risk of process
failures.
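
RPN is conventionally the product of the Severity, Occurrence, and Detection ratings, each
usually scored on a 1-10 scale. A minimal sketch, with hypothetical failure modes and ratings:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number = Severity x Occurrence x Detection (each typically 1-10)."""
    return severity * occurrence * detection

# Hypothetical failure modes for a purchase-order process: (name, S, O, D).
failure_modes = [
    ("Wrong part number entered", 8, 4, 3),
    ("Missing approval signature", 5, 6, 2),
    ("Duplicate order placed", 6, 3, 7),
]

# Prioritize failure modes: highest RPN first.
for name, s, o, d in sorted(failure_modes, key=lambda fm: -rpn(*fm[1:])):
    print(f"{name}: RPN = {rpn(s, o, d)}")
```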

Testing and Piloting: Testing and piloting involve verifying the effectiveness of proposed solutions,
often through simulation or pilot projects to ensure that changes will produce the desired results.

Solution Justification: Before implementing changes, solutions should be justified and presented to
an apex team for approval. This involves demonstrating how the solution aligns with the
organization's goals and objectives and outlining the expected benefits, both tangible and intangible.

Control Phase: In the control phase, it is essential to establish and maintain control over the
processes and monitor them for any deviations. This includes defining control limits, documenting
procedures, and preparing for potential failures by creating response plans.

Control Plan: A control plan is a document that outlines the process control measures to ensure that
the improvements made during the implementation phase are maintained over time. It includes a
response plan for addressing any deviations or potential failures.

The control plan should address the following aspects:

• What is being measured (Y)?

• Specification limits for Y.

• Who is the first point of contact for Y-related issues?

• First-responder plans if Y goes out of specification.

• Causes (Xs) that may affect Y.

• Measurement systems for X.

• Control limits for X.

• First point of contact for X-related issues.

• First-responder plans if X goes out of control.


By addressing these aspects in the control plan, organizations can maintain the improvements
achieved during Lean and Kaizen implementation and continue to work toward sustained efficiency
and quality.

Training Recommendations: It is recommended to provide training to employees at all levels, from
top management to front-line workers, in Lean and Kaizen principles. Training should cover the core
concepts, methodologies, and tools used in these approaches. In particular, consider training in
areas such as process improvement, problem-solving techniques, waste reduction, continuous
improvement, and statistical analysis. Training should be ongoing and customized to meet the
specific needs of different roles within the organization.

Document Changes: All process changes and updates should be thoroughly documented. Ensure that
all relevant documents, including procedures, checklists, templates, and manuals, are reviewed and
updated to reflect the new processes and standards. Any changes to document formats or content
should be clearly communicated to all relevant personnel. Additionally, establish a system for
version control to keep track of document revisions.

Statistical Process Control (SPC): Statistical Process Control (SPC) is a critical tool in maintaining and
improving process quality. It involves the use of control charts to monitor processes and detect
variations. The following are some key points related to control charts:

• Control Charts: Control charts are graphical tools used to monitor and control processes.
They typically display process data over time and include control limits to identify when a
process is out of control.

• Test 1: The most common out-of-control signal is a single data point falling beyond the
control limits (either the Upper Control Limit, UCL, or the Lower Control Limit, LCL).

• Run rules: A run of consecutive points on the same side of the centerline (commonly nine
points, as in the Nelson rules) indicates a significant shift or trend in the process; longer
runs are an even stronger signal.

• Types of Control Charts: There are various types of control charts, each suitable for different
types of data and purposes. Common control charts include the X-bar and R-chart for
continuous data, and p-charts and c-charts for attribute data.

Overall, it is essential to implement effective SPC practices to monitor processes continuously and
ensure that any deviations or variations are addressed promptly to maintain process stability and
product or service quality.
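
As one concrete illustration, control limits for an individuals-style chart can be sketched as
below; the measurements are invented, and the simple overall standard deviation is used here,
whereas standard individuals charts usually estimate sigma from the average moving range:

```python
import numpy as np

# Hypothetical individual measurements from a process.
data = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.2])

center = data.mean()
sigma = data.std(ddof=1)  # simple estimate; moving-range methods are also common

ucl = center + 3 * sigma  # Upper Control Limit
lcl = center - 3 * sigma  # Lower Control Limit

# Test 1: flag any point beyond the control limits.
out_of_control = [(i, x) for i, x in enumerate(data) if x > ucl or x < lcl]
print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("Out-of-control points:", out_of_control or "none")
```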
