This curriculum spans the full lifecycle of a Six Sigma project, covering charter development, customer-driven scoping, data validation, statistical analysis, solution implementation, and sustained control within complex organizational environments. In structure and rigor it is comparable to multi-phase improvement initiatives led by internal process excellence teams or external consultants.
Define Phase: Project Charter Development
- Selecting measurable business outcomes aligned with organizational KPIs to justify project initiation
- Drafting a problem statement that isolates the specific process gap without assigning root causes prematurely
- Negotiating project scope boundaries with stakeholders to prevent scope creep while ensuring meaningful impact
- Identifying primary process owners and securing their commitment to resource allocation and decision rights
- Establishing baseline performance metrics from existing data systems or designing data collection protocols
- Documenting assumptions about process stability and data availability that may affect later analysis
- Defining the project timeline with milestone reviews tied to DMAIC phase gates
Define Phase: Voice of the Customer (VOC) Analysis
- Conducting structured interviews with internal and external customers to extract critical-to-quality (CTQ) requirements
- Translating qualitative feedback into quantifiable CTQs using a requirements hierarchy matrix
- Resolving conflicts between competing customer needs by prioritizing based on business impact and feasibility (a scoring sketch follows this list)
- Selecting appropriate data collection methods (surveys, focus groups, transaction logs) based on customer accessibility
- Mapping customer requirements to current process outputs to identify gaps in delivery
- Validating CTQs with operational teams to ensure measurability and alignment with process capabilities
- Documenting VOC limitations, such as sample bias or low response rates, in the project risk log
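As a rough sketch of the prioritization step noted above, the snippet below scores hypothetical customer needs on business impact and feasibility and ranks them; the needs, weights, and scores are invented for illustration and would in practice come from the VOC data and team consensus.

```python
# Hypothetical CTQ prioritization: weight each customer need by business
# impact and feasibility (both scored 1-5 by the team), then rank.
# All names, weights, and scores below are illustrative placeholders.

WEIGHTS = {"impact": 0.6, "feasibility": 0.4}  # assumed team-agreed weights

ctq_candidates = [
    {"need": "Order confirmation within 1 hour", "impact": 5, "feasibility": 3},
    {"need": "Invoice accuracy above 99.5%",     "impact": 4, "feasibility": 4},
    {"need": "Same-day support callback",        "impact": 3, "feasibility": 5},
]

for ctq in ctq_candidates:
    ctq["priority"] = (WEIGHTS["impact"] * ctq["impact"]
                       + WEIGHTS["feasibility"] * ctq["feasibility"])

# Highest-priority CTQs first
for ctq in sorted(ctq_candidates, key=lambda c: c["priority"], reverse=True):
    print(f'{ctq["priority"]:.1f}  {ctq["need"]}')
```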
Measure Phase: Process Mapping and Baseline Metrics
- Constructing a detailed SIPOC (Suppliers, Inputs, Process, Outputs, Customers) diagram with cross-functional input
- Identifying and validating key process steps through observation and workflow analysis, not just documentation
- Selecting primary and secondary metrics based on data availability, sensitivity, and alignment with CTQs
- Assessing current data collection systems for reliability, frequency, and granularity gaps
- Designing and pilot-testing data collection forms to minimize operator error and ensure consistency
- Calculating baseline process performance using yield, cycle time, and defect rates with confidence intervals (see the sketch after this list)
- Conducting a measurement system analysis (MSA) for critical metrics to evaluate repeatability and reproducibility
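A minimal sketch of the baseline calculation referenced in this list, assuming a simple pass/fail inspection with one defect opportunity per unit; the defect and unit counts are invented for illustration.

```python
import math

# Hypothetical baseline: 87 defective units out of 2,400 inspected.
defects, units = 87, 2400
p_hat = defects / units
z = 1.96  # ~95% confidence

# Wilson score interval for the defect proportion (more stable than the
# normal approximation when p is small).
denom = 1 + z**2 / units
center = (p_hat + z**2 / (2 * units)) / denom
half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / units + z**2 / (4 * units**2))
lo, hi = center - half, center + half

# Defects per million opportunities, assuming one opportunity per unit.
dpmo = p_hat * 1_000_000

print(f"Baseline defect rate: {p_hat:.3%}  (95% CI {lo:.3%} to {hi:.3%})")
print(f"DPMO: {dpmo:,.0f}")
```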
Measure Phase: Data Collection and Validation
- Assigning data collectors with documented training and calibration to maintain consistency
- Implementing controls to prevent data manipulation or selective reporting during collection
- Addressing missing or outlier data through predefined imputation or exclusion rules
- Validating data integrity by cross-referencing multiple sources or systems
- Documenting deviations from the original data plan and their impact on metric validity
- Storing collected data in a secure, version-controlled repository with access logs
- Generating time-ordered run charts to assess process stability before statistical analysis, as sketched below
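As a sketch of the stability check in the last item, the snippet below draws a time-ordered run chart with a median reference line; the cycle-time data are simulated purely for illustration.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical daily cycle times (hours), in time order; replace with real data.
rng = np.random.default_rng(7)
cycle_times = rng.normal(loc=4.2, scale=0.6, size=30)

median = np.median(cycle_times)

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(range(1, len(cycle_times) + 1), cycle_times, marker="o")
ax.axhline(median, linestyle="--", label=f"median = {median:.2f} h")
ax.set_xlabel("Observation (time order)")
ax.set_ylabel("Cycle time (h)")
ax.set_title("Run chart: process stability check before analysis")
ax.legend()
plt.tight_layout()
plt.show()
```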
Analyze Phase: Root Cause Identification
- Selecting root cause analysis tools (e.g., fishbone diagrams, 5 Whys) based on problem complexity and team expertise
- Facilitating cross-functional workshops to surface potential causes while managing group bias
- Prioritizing potential causes using Pareto analysis or failure mode and effects analysis (FMEA), as in the Pareto sketch after this list
- Distinguishing between correlation and causation when interpreting process data patterns
- Designing hypothesis tests to statistically validate suspected root causes
- Challenging assumptions about cause-effect relationships with counterfactual reasoning
- Documenting rejected causes with rationale to prevent redundant investigation later
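A minimal Pareto sketch for the prioritization step above, using invented defect counts by suspected cause; a real analysis would pull counts from the validated Measure-phase data.

```python
import matplotlib.pyplot as plt

# Hypothetical defect counts by suspected cause; values are illustrative only.
cause_counts = {
    "Missing customer data": 48,
    "Manual entry error": 31,
    "System timeout": 12,
    "Late supplier input": 7,
    "Other": 4,
}

causes = sorted(cause_counts, key=cause_counts.get, reverse=True)
counts = [cause_counts[c] for c in causes]
total = sum(counts)
cum_pct = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots(figsize=(8, 4))
ax1.bar(causes, counts)
ax1.set_ylabel("Defect count")
ax1.tick_params(axis="x", labelrotation=30)

ax2 = ax1.twinx()
ax2.plot(causes, cum_pct, marker="o", color="tab:red")
ax2.axhline(80, linestyle="--", color="gray")  # classic 80% reference line
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 105)

plt.title("Pareto chart of potential causes")
plt.tight_layout()
plt.show()
```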
Analyze Phase: Statistical Validation of Causes
- Selecting appropriate statistical tests (t-tests, ANOVA, regression) based on data type and distribution (see the worked sketch after this list)
- Verifying assumptions of normality, independence, and homogeneity of variance before test execution
- Adjusting significance thresholds when conducting multiple comparisons to control Type I error
- Interpreting p-values and effect sizes in the context of practical significance, not just statistical significance
- Using control charts to determine if process variation is due to common or special causes
- Generating residual plots to diagnose model fit issues in regression analyses
- Communicating statistical findings to non-technical stakeholders using visualizations and plain language
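The sketch below illustrates the test-selection and assumption-checking items in this list with a hypothetical two-group comparison: normality and variance checks, a pooled or Welch t-test, a Bonferroni-adjusted threshold for multiple planned comparisons, and Cohen's d for practical significance. The data and the number of comparisons are assumptions for the example.

```python
import numpy as np
from scipy import stats

# Hypothetical cycle-time samples (hours) under two conditions tied to a
# suspected cause; data are simulated for illustration only.
rng = np.random.default_rng(11)
group_a = rng.normal(4.5, 0.5, 40)   # e.g. orders routed through legacy system
group_b = rng.normal(4.1, 0.5, 40)   # e.g. orders routed through new system

# Check assumptions before choosing the test.
_, p_norm_a = stats.shapiro(group_a)          # normality, group A
_, p_norm_b = stats.shapiro(group_b)          # normality, group B
_, p_levene = stats.levene(group_a, group_b)  # homogeneity of variance

# Fall back to Welch's t-test when equal variances are doubtful.
equal_var = p_levene > 0.05
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=equal_var)

# If this is one of several planned comparisons, control Type I error,
# e.g. with a Bonferroni-adjusted threshold.
n_comparisons = 3          # assumed number of planned tests
alpha_adjusted = 0.05 / n_comparisons

# Effect size (Cohen's d) to judge practical significance, not just p-value.
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_a.mean() - group_b.mean()) / pooled_sd

print(f"Shapiro p-values: {p_norm_a:.3f}, {p_norm_b:.3f}; Levene p: {p_levene:.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f} (adjusted alpha = {alpha_adjusted:.4f})")
print(f"Cohen's d = {cohens_d:.2f}")
```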
Improve Phase: Solution Design and Risk Assessment
- Generating alternative solutions using structured ideation techniques while staying within technical and budgetary constraints
- Evaluating proposed changes using a weighted decision matrix that includes implementation effort and risk (a scoring sketch follows this list)
- Conducting a pilot test in a controlled environment to assess impact on key metrics
- Identifying unintended consequences on related processes or downstream operations
- Updating process documentation and work instructions to reflect proposed changes
- Securing approvals from compliance, safety, and quality functions before full rollout
- Developing a rollback plan in case the solution fails to deliver expected outcomes
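A minimal sketch of the weighted decision matrix mentioned above; the criteria, weights, candidate solutions, and 1-5 scores are placeholders that a real team would agree on during solution evaluation.

```python
# Hypothetical weighted decision matrix for comparing candidate solutions.
# Weights sum to 1; scores run 1-5, with higher always meaning "better"
# (so effort, risk, and cost are scored inversely).

criteria_weights = {
    "metric_impact": 0.40,          # expected improvement on the primary metric
    "implementation_effort": 0.25,  # higher score = lower effort
    "risk": 0.20,                   # higher score = lower risk
    "cost": 0.15,                   # higher score = lower cost
}

solutions = {
    "Automate data entry":      {"metric_impact": 5, "implementation_effort": 2, "risk": 3, "cost": 2},
    "Revise work instructions": {"metric_impact": 3, "implementation_effort": 5, "risk": 4, "cost": 5},
    "Add inspection step":      {"metric_impact": 4, "implementation_effort": 3, "risk": 4, "cost": 3},
}

scores = {
    name: sum(criteria_weights[c] * score for c, score in ratings.items())
    for name, ratings in solutions.items()
}

for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.2f}  {name}")
```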
Control Phase: Sustaining Process Improvements
- Transferring ownership of the improved process to the operational team with documented handover criteria
- Implementing control charts or dashboards to monitor key metrics in real time (a control chart sketch follows this list)
- Establishing response plans for out-of-control signals with defined escalation paths
- Training process operators and supervisors on new procedures and control mechanisms
- Integrating updated metrics into performance management systems for accountability
- Conducting periodic audits to verify adherence to revised standards
- Scheduling follow-up reviews at 30, 60, and 90 days post-implementation to assess sustainability
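As a sketch of the monitoring item above, the snippet below builds an individuals (I) control chart with limits estimated from the average moving range and flags points outside the 3-sigma limits; the daily defect-rate data are simulated for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical post-improvement daily defect rates (%); simulated for illustration.
rng = np.random.default_rng(3)
values = rng.normal(2.0, 0.3, 30)

# Individuals chart limits from the average moving range (d2 = 1.128 for n = 2).
moving_range = np.abs(np.diff(values))
center = values.mean()
sigma_est = moving_range.mean() / 1.128
ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

# Simplest out-of-control rule: any point beyond the control limits.
out_of_control = np.where((values > ucl) | (values < lcl))[0]

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(range(1, len(values) + 1), values, marker="o")
for level, label in [(center, "center"), (ucl, "UCL"), (lcl, "LCL")]:
    ax.axhline(level, linestyle="--")
    ax.annotate(label, xy=(len(values), level))
if len(out_of_control):
    ax.plot(out_of_control + 1, values[out_of_control], "rs", label="out-of-control signal")
    ax.legend()
ax.set_xlabel("Day")
ax.set_ylabel("Defect rate (%)")
ax.set_title("Individuals control chart for the monitored metric")
plt.tight_layout()
plt.show()
```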
Project Governance and Stakeholder Management
- Presenting phase-gate reviews to executive sponsors with clear recommendations and decision options
- Updating the project risk register with new issues and mitigation status at each phase transition
- Managing stakeholder expectations when project scope or timeline adjustments are required
- Resolving conflicts between functional departments over process ownership or resource allocation
- Documenting lessons learned in a standardized format for organizational knowledge reuse
- Ensuring compliance with internal audit and regulatory requirements throughout the project lifecycle
- Archiving project artifacts in a central repository with metadata for future retrieval