Objective & Challenge – Set up a quality framework to capture value commensurate with the investment. The client needs a statistical solution, assessed and designed for a dynamic business at the operational level, that resolves inconsistencies and gaps in project timelines and end-to-end operations quality.

The goal is a scalable solution that transforms tactical and operational business activities by integrating the Six Sigma DMAIC and SAS SEMMA methodologies. It is essential to apply statistical modeling and simulation techniques to the business challenges facing the client and to align the quantitative results with decision making. This helps us to “Go To Market First” and compete with the product leaders in global markets.

Create a framework for the entire quality program with defined quality processes and metrics in place to measure Marketing campaign RoI. Decide on the structure of the quality teams so that quality initiatives gain visibility. Lay out metrics that go beyond service levels to monitor and measure marketing database operations and process-level delivery performance to set a continuous improvement program.

Prioritize the opportunities for improvement and implement a few that will yield quick wins.

Follow a five-phase approach –

Discovery

Collection of Metrics

Definition of Metrics

Implementation

Execution

Develop a lean, efficient marketing workflow designed for growth.

Enhance the three marketing arenas for growth: strategic, tactical, and operational.

Identify leading indicators of growth and become proactive about performance improvement.

Strengthen links between customers, products, and profitability.

Redesign marketing work to streamline workflow and reduce variability.

Assess and mitigate cycle-time risk in any marketing initiative or project.

Leverage DMAIC to solve specific problems and improve existing processes.

Use lean techniques to streamline repeatable processes, such as collateral development and trade-show participation.

Benefits – Reduced time, cost and effort by identifying, capturing and fixing defects through improved requirements capture and the implementation of high-performance operational metrics.

Business Transformation:

A quality engagement and effectiveness measurement summit needs to be established between the service provider and the customer to discuss the following initiatives:

Six Sigma DMAIC/SEMMA methodology design and assessment

Integrated quality methodology deployment for e-marketing analytic operations

Process Improvement Execution strategy

Measure execution quotient on a timely basis

Knowledge management

Re-Design of Business Rules Management System (BRMS) and data collection procedures.

Project Selection and planning

Project execution, control and closure

Business Opportunity assessment

Business Improvement and Process Consulting

Workflow Monitoring and Analysis Tool

Six Sigma Business Scorecard

Building Statistical Process Control (SPC) culture across all business processes.

Data mining and direct marketing – XLMiner

Conduct Business Process Modeling, Design, Documentation, Analysis, Simulation, Monitoring, Reporting and Control through leading BPM tools.

E.g.: IBM WebSphere Business Modeler, ARIS (IDS Scheer AG), Savvion BPM, TIBCO iProcess Modeler, etc.

Metrics reporting and visualization – Crystal Xcelsius

Business/IT strategy mapping – SmartDraw software

The software mentioned above helps to define business objectives for enterprise-level operational processes; design the organizational structure, system landscape, and IT infrastructure; resolve strategic issues analytically; and formulate and execute the company's strategic vision while aligning resources.

The given framework and methodology for business analysis help us embed the Six Sigma philosophy with a strong focus on project management, utilizing state-of-the-art IT tools. Integrating BPM, SOA, Six Sigma, analytics and reporting, and project management across core business processes transforms both the customer's business and its people, through change management. This requires strong commitment and vision to deliver outstanding solutions to the customer and to capture new business in the market on the strength of this success story.

I believe this is possible through effective consulting and engagement with the client and an innovative business model built on the defined business process framework.

Transition, stabilization and optimization are key to any kind of service delivery. Comparing global business operating models and transformation approaches to core business shows that optimization strategy and implementation vary from company to company.

The assumptions and study in this proposal can bridge the gap between IT and KPO/BPO.

The following solution approach is built by integrating the Six Sigma DMAIC and SAS SEMMA methodologies to design and deploy an analytical framework at the operational level.

Data Analysis Approach

Data extraction/Gathering

Data Consolidation

Data Integration

Data Measurement criteria

Data Quality Check/Monitoring

Data Presentation

Reporting/Analysis of KPIs

Sampling Plan

Change Management

Change identification and analysis

Change presentation

Change control and documentation

Solution Design & Presentation

Standardized approach for process design and development-Transitioning SOA with BPM.

Documenting process architectures

Value proposition and business case

Presentation of the solution.

DMAIC/SEMMA Roadmap To Operations

Integrating Six Sigma DMAIC with the data mining methodology SEMMA gives us better analytic results.

SEMMA – Sample, Explore, Modify, Model, and Assess – is the standard process for conducting data mining activities.

Step 1 – Integrate the process sampling strategy with the Define phase. This helps us extract critical data from the enterprise warehouse or other databases.
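As a minimal sketch of this Sample step (the record layout and field names here are hypothetical), a reproducible random sample can be drawn from a warehouse extract:

```python
import random

def draw_sample(records, n, seed=42):
    """Draw a simple random sample of n records for the SEMMA Sample step."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return rng.sample(records, n)

# Hypothetical campaign records pulled from a warehouse extract
records = [{"id": i, "response": i % 4 == 0} for i in range(1000)]
sample = draw_sample(records, 100)
print(len(sample))
```

Fixing the seed keeps the extract auditable: rerunning the sampling step against the same source reproduces the same sample.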

Step 2 – Combine the Explore and Measure phases: explore your data by searching for unanticipated trends and anomalies in order to gain understanding and ideas. Exploration helps refine the discovery process. If visual exploration doesn’t reveal clear trends, you can explore the data through statistical techniques including factor analysis, correspondence analysis, and clustering.

Perform Measurement System Analysis (MSA) and drill down the data.

Step 3 – Modify the data by creating, selecting, and transforming the variables to focus the model selection process and analyze the data.

Step 4 – Model the data by applying tree-based models, logistic models, and other statistical models, and improve the business process by following the DMAIC techniques.

Step 5 – Assess the data by evaluating the usefulness and reliability of the findings from the data mining process, and estimate how well the model performs in the Control stage.

The starting point for any Six Sigma project is to fully utilise the DMAIC process. The following describes the process and the key elements, tools and techniques of DMAIC. Remember that Six Sigma tools and techniques help you understand customer requirements, tolerances and specifications, simplify processes and reduce variation.

Define

• Understanding the problem and its financial impact

Problem Statement & Goal Statement

• The purpose of the problem statement is to describe what is wrong

• The goal statement then defines the team‘s improvement objective

SIPOC & Process Mapping

• SIPOC diagrams describe the flow of information or materials from Supplier and Input through the Process and Output to the Customer

• Process mapping, using a tool such as ARIS, charts the process along the actual path of data or material and clarifies the start and stop boundaries of the process

• A typical method of process mapping uses ‘brown paper’ and ‘post-its’, people from within the process, and a large wall

• It’s very important to verify the process map and to reach consensus between all parties

Voice of the Customer

• The term Voice of the Customer (VOC) is used to describe customers’ needs and their perceptions of the product or service

• Remember customer requirements change constantly and specifications tend to focus on technical data only

• An affinity diagram is a tool that organises language data into related groups. It can prove particularly useful when dealing with complex problems or issues

SWOT Analysis

• SWOT analysis is a very simple four-box model which enables the identification of Strengths, Weaknesses, Opportunities and Threats during the project definition stage. It can be used to identify why we need the project or improvement initiative and the consequences of not doing it

Stakeholder Analysis

• Stakeholders are people who have an interest in the outcome of a business or process; for example, stakeholders of a business include customers, employees, and shareholders. Like the SWOT analysis, stakeholder analysis uses a very simple four-box model, which maps levels of interest against power, rated from high to low

Measure

• Identify what you need to measure, why you need to measure it, and what the desired characteristics of the data are

• Sufficient, relevant, representative, contextual

• The identification of process measures i.e. what’s the KPI (key performance indicator). By using the SIPOC diagram you can see where measures and data are likely to be required

• The following table relates to the Data Worlds of Six Sigma:

When you can: | The Data World is: | The Data Type is: | A useful model is:
Classify | Defectives: yes/no, pass/fail, on time/not on time | Attribute | Binomial
Count | Defects: number of things missing, faults/errors/rejects | Attribute | Poisson
Measure | Continuous: time, volume (any value within physical limits) | Variable | Normal
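The three data worlds map onto standard probability models. A small illustration in Python (the parameter values below are hypothetical):

```python
import math
from statistics import NormalDist

# Classify (defectives, yes/no) -> Binomial: P(k defectives in n units)
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Count (defects per unit) -> Poisson: P(k defects) given mean rate lam
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Measure (continuous) -> Normal: P(measurement falls below x)
def normal_cdf(x, mu, sigma):
    return NormalDist(mu, sigma).cdf(x)

print(round(binomial_pmf(2, 20, 0.05), 4))   # chance of exactly 2 defectives
print(round(poisson_pmf(0, 1.2), 4))         # chance a unit has zero defects
print(round(normal_cdf(10.5, 10.0, 0.2), 4)) # chance a value is below 10.5
```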

Measurement System Analysis – MSA

• Measurement System variation is an integral and often significant part of process variation

• You cannot differentiate it from other sources of variation unless you measure it

• Always complete a Measurement System Analysis first; before you commit resources to reducing variation elsewhere

Gage R&R – Gauge Repeatability & Reproducibility

• Gage R&R allows you to measure the level of variation caused by the equipment and the appraisers

• This can be achieved by inputting your data into Minitab via stat> quality tools> gage R&R study (crossed).

• Remember, as the process improves, the measurement-system variance must always remain dramatically lower than the process variance.
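That adequacy rule can be checked with a simple variance comparison (the variance figures here are hypothetical; a full crossed Gage R&R study, e.g. in Minitab, decomposes measurement variation further into repeatability and reproducibility):

```python
def gage_contribution(var_measurement, var_total):
    """% of total observed variance contributed by the measurement system."""
    return 100.0 * var_measurement / var_total

# Hypothetical variance components from a measurement study
pct = gage_contribution(var_measurement=0.04, var_total=1.6)

# Common rule of thumb for %Contribution: < 1% excellent,
# 1-9% may be acceptable, > 9% needs improvement
print(f"{pct:.1f}% of variance comes from the measurement system")
```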

Data Collection

• Key questions: what are the objectives of the project, what are the key process measurables, what are the customer’s key measures, and what process inputs do you want to monitor?

• Develop data collection plans

• Consider your methods of sampling and consider sample sizes

Graphical Data Exploration

• Graphical tools are some of the most powerful (and simplest to apply) in the Six Sigma toolbox

• They are the starting point for understanding your data

• The following table gives an overview of some graphical tools and the route via Minitab:

Tool | Minitab menu | Usage
Frequency diagrams (Histogram, Dot plot) | Graph > Histogram; Graph > Dotplot | Occurrence, shape, continuous
Pareto diagram | Stat > Quality Tools > Pareto Chart | Prioritise
Scatter diagram | Graph > Plot | Relationships, x/y continuous
Time series plot | Graph > Time Series Plot | Trend/pattern over time
Control charts | Stat > Control Charts | Trend/pattern, in control

Process Capability

• The performance level of a process under control, compared with the performance level required to meet customer specifications.
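As a sketch, process capability is commonly summarised with the Cp and Cpk indices (the data and spec limits below are hypothetical):

```python
from statistics import mean, stdev

def cp_cpk(data, lsl, usl):
    """Cp compares spec width to process spread; Cpk also penalises
    a process that is off-centre relative to the spec limits."""
    mu, sigma = mean(data), stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical cycle-time measurements against spec limits of 8 and 12 days
data = [9.8, 10.1, 10.0, 9.9, 10.3, 10.2, 9.7, 10.0, 10.1, 9.9]
cp, cpk = cp_cpk(data, lsl=8.0, usl=12.0)
print(round(cp, 2), round(cpk, 2))
```

When the process mean sits exactly on the centre of the spec, Cp and Cpk coincide; Cpk falls below Cp as the process drifts off-centre.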

First Time Yield

• First Time Yield (FTY) measures the units that avoid the hidden factory, i.e. the percentage of units that are produced without any ‘defectives’ – first time around

• First time yield (FTY) is an excellent indicator of process performance and quality

Defects Per Unit – DPU

• The total number of defects or nonconformities found in a sample, divided by the number of units sampled. Also known as the defect rate

• Pareto charts can be used effectively to identify which defects are most significant and to focus improvement efforts on the areas where the largest gains can be made. Pareto charts can be generated in Minitab via Stat > Quality Tools > Pareto Chart

Relationship Between DPU & FTY

• Defects data generally follows the Poisson distribution model. Therefore it is possible to convert mathematically between dpu and FTY. The equations are: FTY = e^(−dpu) and dpu = −ln(FTY)
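These Poisson-based conversions can be verified numerically (the defect and unit counts below are illustrative):

```python
import math

def fty_from_dpu(dpu):
    """Under the Poisson model, FTY = e^(-dpu)."""
    return math.exp(-dpu)

def dpu_from_fty(fty):
    """Inverse relationship: dpu = -ln(FTY)."""
    return -math.log(fty)

# 250 defects found across 1000 units -> dpu = 0.25
dpu = 250 / 1000
fty = fty_from_dpu(dpu)
print(f"FTY = {fty:.1%}")  # share of units expected defect-free first time
assert abs(dpu_from_fty(fty) - dpu) < 1e-12  # the conversions invert each other
```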

Analyse

The following tools, although not exhaustive, are the most commonly used during the Analyse phase of any Six Sigma project or improvement initiative. A large proportion of the analysis involves ‘Hypothesis Testing’; the relevant tools are included where appropriate.

Where Minitab is used, a table demonstrating the route will be included in this section

Confidence Intervals

• Understanding Confidence Intervals is key to:

• Understanding the limitations of point estimate data

• Being able to quickly and efficiently screen a series of point estimate data for significance

• For those who completed Blackbelt training, remember the 95% confidence levels that were used and how Minitab outputs referred to being 95% confident; where ‘life and death’ situations are involved, a higher level of confidence is needed

P-value

• P-values are often used in hypothesis tests, where you either reject or fail to reject a null hypothesis – in other words, does the data confirm your ‘gut feeling’ or not? The smaller the p-value, the smaller the probability that you would be making a mistake by rejecting the null hypothesis. A commonly used cut-off is p = 0.05: reject the null hypothesis when the p-value is less than 0.05. See Minitab help for further details under Hypothesis Testing
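As an illustration of that decision rule, here is a simple two-sided test (a z-test is used for brevity, assuming a known population standard deviation; with an unknown SD you would use the one-sample t-test, e.g. Minitab's 1-Sample t; the data are hypothetical):

```python
from statistics import NormalDist, mean

def one_sample_z_test(sample, pop_mean, pop_sd):
    """Two-sided z-test: could this sample share the known population mean?"""
    n = len(sample)
    z = (mean(sample) - pop_mean) / (pop_sd / n**0.5)
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p

# Hypothetical cycle times, tested against a claimed mean of 10.0 days
cycle_times = [10.4, 10.6, 10.2, 10.8, 10.5, 10.7, 10.3, 10.6]
z, p = one_sample_z_test(cycle_times, pop_mean=10.0, pop_sd=0.5)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Reject the null hypothesis at the 5% level")
```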

Normality Testing

• The normality test (the ‘normality of the data’) checks for a normal distribution of data, which is important for the use of further statistical tools in the analysis and improvement phases

• There are a number of choices for normality testing, particularly during hypothesis testing within the Analyse phase of DMAIC.

• The most commonly used is the Anderson-Darling test (the default test).

• There are two other tests: the Ryan-Joiner test, a correlation-based test with similar power for detecting non-normality, and the Kolmogorov-Smirnov test, which has lesser power. Again, see Minitab help for further detail

T & F – tests

• T test

• The one sample t-test is used to decide if the average of your (continuous data) sample could be the same as a known average of a population. The known average is known as the ‘test mean’ in Minitab.

• F Test

• If your data samples are Normally distributed, then the F-test is appropriate. The F-test is used to decide if two (continuous data) samples could have the same level of variation. Note: it doesn’t matter if the two samples have different averages, since this test looks only at variation, i.e. is one sample/process more consistent than the other?
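A minimal sketch of the F statistic itself (hypothetical before/after samples; in practice Minitab also reports the p-value for the test):

```python
from statistics import variance

def f_statistic(sample_a, sample_b):
    """F = larger sample variance / smaller sample variance; a ratio
    near 1 suggests the two processes have similar levels of variation."""
    va, vb = variance(sample_a), variance(sample_b)
    return max(va, vb) / min(va, vb)

# Hypothetical before/after samples from a campaign-turnaround process
before = [12.1, 13.4, 11.8, 14.0, 12.6, 13.1]
after = [12.4, 12.6, 12.5, 12.7, 12.3, 12.6]
print(round(f_statistic(before, after), 2))  # well above 1: variation reduced
```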

ANOVA

• The one-way ANOVA (analysis of variance) is used to decide if three or more (continuous data) samples could share the same average
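The one-way ANOVA F statistic can be computed directly as the ratio of between-group to within-group variance (the data below are hypothetical):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_values = [x for g in groups for x in g]
    grand = mean(all_values)
    k, n = len(groups), len(all_values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical turnaround times from three teams
f = one_way_anova_f([10, 11, 10, 12], [13, 14, 13, 15], [10, 10, 11, 11])
print(round(f, 2))  # large F suggests the team averages differ
```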

Correlation

• The degree of relationship, often linear, between two variables. This also sets the scene for potential cause and effect
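A sketch of the correlation coefficient (Pearson's r) computed from first principles (the spend/response data are hypothetical):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: campaign spend (units) vs responses
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
responses = [110, 205, 290, 410, 495]
r = pearson_r(spend, responses)
print(round(r, 3))  # close to +1: strong positive linear relationship
```

A high r sets the scene for potential cause and effect, but does not prove it on its own.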

Chi Square-Test

• A tool used to test process or product improvement against a benchmark when only attribute data are available, e.g. to decide whether three or more (‘defective’ data) samples could share the same percentages.
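A minimal sketch of the chi-square statistic for comparing defective rates across samples (the counts are hypothetical; compare the result against the chi-square critical value with k − 1 degrees of freedom):

```python
def chi_square_proportions(defectives, totals):
    """Chi-square statistic for testing whether several samples of
    pass/fail (attribute) data could share the same defective rate."""
    overall = sum(defectives) / sum(totals)
    chi2 = 0.0
    for d, n in zip(defectives, totals):
        # compare observed vs expected counts for defective and good units
        for observed, expected in ((d, n * overall), (n - d, n * (1 - overall))):
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical defectives out of units checked at three sites
chi2 = chi_square_proportions(defectives=[12, 30, 18], totals=[200, 200, 200])
print(round(chi2, 2))  # exceeds the 5% critical value for df = 2 (5.99)
```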

FMEA (Failure Mode & Effects Analysis)

• A structured approach to identify, estimate, prioritise and evaluate risk

• Aims at failure prevention

• Primarily used to limit the risk involved in changing the process

• Can also be used to focus the data collection effort on those input and process variables that are critical for the current process

Fishbone Diagram

• A graphic tool used to isolate an effect visually and to diagram possible related causes. Can be used to identify root causes of problems. Also known as a cause-and-effect or Ishikawa diagram

Summary

The following table gives an overview of some Hypothesis Testing tools and the route via Minitab:

Tool | Minitab menu | Usage
Normality test | Stat > Basic Stats > Display Descriptive Stats > Graphs > Graphical Summary (or > Histogram of data, with Normal curve) | Generates a normal probability plot and performs a hypothesis test to examine whether the observations follow a normal distribution
Anderson-Darling test | See above | A hypothesis test that helps determine whether the data are Normal
T test | Stat > Basic Statistics > 1-Sample t | Performs a one-sample t-test or t-confidence interval for the mean
Two-sample t-test | Stat > Basic Statistics > 2-Sample t | Decides whether two (continuous data) samples could share the same average
F test | Stat > Basic Statistics > 2 Variances | Decides whether two (continuous data) samples could have the same level of variation
ANOVA | Stat > ANOVA > One-Way, or Stat > ANOVA > One-Way (Unstacked) | Decides whether three or more (continuous data) samples could share the same average
Correlation | Stat > Basic Statistics > Correlation | The degree of relationship, often linear, between two variables
Chi-square test | Stat > Tables > Chi-Square Test | Decides whether three or more (‘defective’ data) samples could share the same percentages

Improve

Design Of Experiments (DoE)

• The term describes a Design of Experiments in the Improve phase, where we want to find the Vital Few X’s and eliminate the Trivial Many X’s (X’s are causes)

• This particular tool can be quite complicated, therefore, it is recommended that detail relating to DoE should be sought from the Blackbelt training material (week 3 session 3).

Mistake Proofing

• Preventing or detecting defects where they occur is the best way of improving, and it is vital to sustain the gains from improvements

• Understand how defects originate

• Recognise Red Flag conditions

• Identify key mistake proofing devices/tools

• Link mistake proofing approaches to projects


Control

The state of stability, normal variation, and predictability. The process of regulating and guiding operations and processes using quantitative data. Control mechanisms are also used to detect and avoid potential adverse effects of change

• There are two main tools for Control

• Control Charts & Control Plans

• Control charts can be generated in Minitab via Stat > Control Charts; however, you will need to determine the type of control chart best suited to your improvement/project. Again, see Minitab help for further information.

• The intent of an effective control plan strategy is to:

• Operate processes consistently on target with minimum variation

• Minimise process tampering/interference

• Assure that the process improvements that have been identified and implemented become institutionalised

• Provide for adequate training in all procedures

• Include required maintenance schedules
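As a sketch of the control-chart side of this phase, the limits for an Individuals (I) chart can be computed from the average moving range (the data are hypothetical; Minitab automates this via Stat > Control Charts):

```python
def i_chart_limits(data):
    """Control limits for an Individuals (I) chart: centre line +/- 2.66 x
    average moving range (2.66 = 3/d2 with d2 = 1.128 for subgroups of 2)."""
    centre = sum(data) / len(data)
    mr_bar = sum(abs(a - b) for a, b in zip(data, data[1:])) / (len(data) - 1)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Hypothetical daily campaign cycle times
data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0, 10.3]
lcl, cl, ucl = i_chart_limits(data)
print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")
# Points outside LCL/UCL signal special-cause variation to investigate
```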

DMAIC Tools:

Define

Project Charter.

Stakeholder Analysis.

Communication Plan.

Identify and segment Key Customers.

Critical to Quality (CTQ) Requirements.

Verifying CTQs.

High-level process map.

Process Vision.

Project Plan.

Quality Function Deployment

Cost of Quality Trend Analysis

Cause-and-Effect Diagrams

Process Maps

Measure

Measurement Basics.

Measurement process and plan.

Selecting Measures; Measuring Value, Cost of Poor Quality.

Data definition and sources.

Sampling.

Measurement system studies.

Measuring yields and capability.

Implementing measurement plan.

Scatterplots

Exploratory Plots

Time Sequence Plots

Gage Studies for Variables and Attributes

Sample Size Determination

Analyse

Data Analysis; Pareto charts, Frequency charts, Run charts, Common and special cause variation.

Process Mapping and Analysis; Value Analysis.

Cause and Effect Analysis.

Verifying causes; Scatter diagrams, Design of experiments.

One Variable Analysis

Capability Analysis for Variables

Capability Analysis for Attributes

Multivariate Capability Analysis

Distribution Fitting

Two Sample Comparisons

Multiple Sample Comparisons

Comparison of Rates and Proportions

Outlier Identification

Multivariate Methods

Reliability and Life Data Analysis


Improve

Process Vision.

Brain storming.

Lean Principles; 5Ss, Push versus Pull, Little’s Law, Visibility, Setup reduction.

Evaluating solutions.

Selecting solutions.

Developing solution options.

Business scenarios.

Pilot testing.

Failure Mode and Effects Analysis.

Implementation planning; Force field analysis.

Regression Analysis for Measurement Data

Regression Analysis for Attribute Data

Life Data Regression

Analysis of Variance

Design of Experiments

Screening, Response Surface, and Mixture Designs

D-Optimal Designs

Inner and Outer Arrays

Designs for Categorical Variables

Multiple Response Optimization

Control

Simple / appropriate documentation.

Statistical Control; Variation, Control Charts, I, X Bar and R Charts.

Response Charts.

Process Management.

Process Scorecards.

Project Close and Handover.

Phase II Control Charts

Multivariate Control Charts

Acceptance Sampling

Classification Methods

…………………………………………………………………………………………………………………………………………………

Forecast

Descriptive Time Series Methods

Smoothing

Seasonal Decomposition

Forecasting

SnapStats

One Sample Analysis

Two Sample Comparison

Paired Sample Comparison

Multiple Sample Comparison

Curve Fitting

Capability Assessment (Individuals)

Capability Assessment (Grouped Data)

Gage R&R

Automatic Forecasting

Tools

Expression Evaluator

Six Sigma Calculator

Probability Distributions

Interpolation

Surface and Contour Plots

Custom Charts
