Optimizing Pharmaceutical Production Processes Using Quality by Design Methods

Bernd Heinen, SAS
WHITE PAPER

Table of Contents

Abstract
The situation
Case study and database
Step 1: Building a model
Step 2: Monte Carlo simulation
Step 3: Description of the design space
Conclusion

Abstract

Quality by design is an approach that aims to ensure the quality of medicines by employing statistical, analytical and risk management methodology in their design, development and manufacturing. In the course of the design process, all sources of variability are identified and evaluated with respect to quality parameters of the finished product. The influence of critical variables is often assessed by running designed experiments. The evaluation requires multivariate analysis, model building and optimization. A Monte Carlo simulation study can be used to find and describe the optimal area of operation. This paper uses a case study to demonstrate that quality by design goals can be derived in a straightforward way, that the results are easy to verify and that this method allows for further improvement of the process without the need for re-registration.

The situation

Pharmaceutical production is subject to constraints that other industries do not face. There are strict requirements with respect to transparency, quality and reliability, and authorities have to approve every change. To give pharmaceutical companies more flexibility and freedom of decision, the concept of quality by design was developed. If companies can demonstrate that the items they produce meet specifications within a wider range of production settings, then no registration procedures are required as long as the process is varied within these boundaries. Typically the optimization of a process is targeted toward meeting a quality criterion, maximizing throughput or minimizing costs. These goals remain important, but in addition it is now required to find the maximum area of production parameters that delivers these targets under stable conditions. Sometimes that insight may be derived from data collected during the development phase, but often extra experiments are needed. The extra effort yields extended freedom of choice without the overhead of continuing registration. In pharmaceutical production, one must not only deliver the product according to specification but also mitigate risks and side effects. Determining what must be considered to end up with a safe and efficient product is a discussion in its own right. For this paper, in the interest of clarity and brevity, we'll take these aspects as a given and use a simple case study to demonstrate the concept.

Case study and database

A small set of nearly 100 batch processes from a tablet production builds the base for this analysis. Some data is disguised for confidentiality reasons. There are 17 production parameters available that are potentially decisive. One critical output parameter, dissolution after two hours, is analyzed. The whole concept can easily be extended to more complicated situations.

The critical output characteristic, dissolution, has a lower specification limit of 70 percent, as displayed in Figure 1. All the batches below this limit are selected, and Screen Size (top row, fourth from left) shows a clear correlation with these unsuccessful batches: at Screen Size 5, a significantly larger proportion of the bad batches is produced than with the other screen sizes. All other parameters can be inspected in the same way to reveal the most influential ones.

Figure 1: Distributions of the target variable (Dissolution) and the 17 production parameters: API Particle Size, Mill Time, Screen Size, Mag. Stearate Supplier, Lactose Supplier, Sugar Supplier, Talc Supplier, Blend Time, Blend Speed, Compressor, Force, Coating Supplier, Coating Viscosity, Inlet Temp, Exhaust Temp, Spray Rate and Atomizer Pressure

This inspection is important for understanding the data, the factors and their correlation with specific outcomes of the target variable. But it is limited to pairwise comparisons: one factor with one outcome. To define a safe production environment, one needs to know how a set of influential factors simultaneously affects the quality of the outcome. A statistical model describes this influence more formally and is the base for further analysis such as optimization.
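The pairwise inspection itself can also be scripted. The paper performs this step interactively in JMP; the sketch below is a minimal Python equivalent, assuming the batch data sits in a CSV file. The file name and column names are hypothetical placeholders for the (confidential) batch database.

```python
# Minimal sketch of the pairwise inspection: flag batches below the lower
# specification limit and compare the share of failures across screen sizes.
# "tablet_batches.csv" and the column names are hypothetical placeholders.
import pandas as pd

LSL = 70.0  # lower specification limit for dissolution, in percent

batches = pd.read_csv("tablet_batches.csv")       # ~100 batch records
batches["failed"] = batches["Dissolution"] < LSL  # True for unsuccessful batches

# Share of failing batches per screen size; a markedly higher share at one
# level (Screen Size 5 in the paper) marks that factor as influential.
print(batches.groupby("Screen Size")["failed"].mean().sort_values())
```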

Step 1: Building a model

Building a model is the effort to find the most influential factors and then to describe how they affect the target variable. There may be synergistic or antagonistic interactions; factors may have a linear or nonlinear influence; there may be boundaries or tipping points. There is no silver bullet that produces the proper model right away; there is not even a single best model. The approach taken here uses a screening procedure that combines all possible effects and weights their influence on the target variable. The highlighted entries in Figure 2 confirm that not all factors are really influential. Six factors are shown to have a significant contribution:

- Screen Size
- Mill Time
- Blend Time
- Spray Rate
- Coating Viscosity
- Compressor

Two interactions and a squared effect of Mill Time also appear to be influential.

Figure 2: Screening for modeling terms

With these terms, we can build a model that generates further insights. The best and easiest way to look at a model is to examine the interactive prediction profiler. The curvature in the effect of Mill Time and the steepness of the effect of Spray Rate are eye-catching, whereas the two levels of Compressor don't make a real difference: Compress2 accounts for an average yield increase of only 0.5 percent over Compress1. Due to this marginal contribution, this effect is excluded from further analyses. A comparable model fit is sketched in code below.
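The following sketch expresses such a fit in Python with statsmodels. It is not the JMP screening procedure itself but a least-squares model with the terms identified above: the six main effects, a quadratic term for Mill Time and the two interactions named in the text (Spray Rate with Blend Time and with Coating Viscosity). Column names are assumptions carried over from the previous sketch.

```python
# Sketch of a least-squares model with the screened terms: six main effects,
# a quadratic term for Mill Time and the two interactions of Spray Rate.
# Column names are hypothetical; C() marks categorical factors.
import pandas as pd
import statsmodels.formula.api as smf

batches = pd.read_csv("tablet_batches.csv").rename(columns={
    "Screen Size": "ScreenSize", "Mill Time": "MillTime",
    "Blend Time": "BlendTime", "Spray Rate": "SprayRate",
    "Coating Viscosity": "CoatingViscosity",
})

model = smf.ols(
    "Dissolution ~ C(ScreenSize) + MillTime + I(MillTime**2)"
    " + BlendTime + SprayRate + CoatingViscosity + C(Compressor)"
    " + SprayRate:BlendTime + SprayRate:CoatingViscosity",
    data=batches,
).fit()
print(model.summary())  # coefficients mirror what the prediction profiler shows
```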

We can also see that a Screen Size of 3 gives the best result, so this one will certainly be chosen.

Figure 3: Prediction profiler for the influential terms that explain Dissolution

The interactions of Spray Rate with Blend Time and Coating Viscosity produce some interesting consequences. At lower Spray Rates, Coating Viscosity has no effect at all, whereas Blend Time has a clear positive influence. At higher Spray Rates, the opposite is true.

Figure 4: Visual inspection of the interaction of Spray Rate with Blend Time and Coating Viscosity

In this example, we want a higher dissolution. Figure 5 shows the recommended settings for the production parameters that yield the maximum dissolution. With a lower boundary of 79.9 percent for the 95 percent confidence interval of the predicted dissolution, the expected result keeps a reassuring distance from the minimum requirement of 70 percent.

Figure 5: Optimal settings that maximize Dissolution
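Given a fitted model, settings like those in Figure 5 can be found by a numeric search over the continuous factors, with Screen Size held at 3 and Compressor dropped as discussed. The sketch below uses scipy; the predict() function and the factor bounds are hypothetical stand-ins for the real model and its design region.

```python
# Sketch: search the continuous factor ranges for settings that maximize
# the predicted dissolution. predict() and the bounds are hypothetical
# stand-ins for the fitted model and the true design region.
import numpy as np
from scipy.optimize import minimize

def predict(x):
    """Hypothetical response surface over (MillTime, BlendTime, SprayRate,
    CoatingViscosity) at Screen Size 3; replace with the Step 1 model."""
    mill, blend, spray, visc = x
    return (60.0 + 1.2 * mill - 0.04 * mill**2 + 0.8 * blend
            + 0.05 * spray + 0.003 * spray * blend - 0.001 * spray * visc)

bounds = [(10, 30), (0, 12), (380, 400), (85, 110)]  # assumed factor ranges
start = np.array([20.0, 6.0, 390.0, 100.0])
result = minimize(lambda x: -predict(x), start, bounds=bounds)
print("settings:", result.x, "predicted dissolution:", -result.fun)
```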

Step 2: Monte Carlo simulation

Statistics always deals with variation. Variation is the reason for confidence intervals, the subject of statistical tests and the rationale behind all modeling efforts. One way to explore the effects of variation is a Monte Carlo simulation; it is not by chance that the method is named after a gambling hot spot, since randomization is at the core of gambling. Here the input parameters of the model derived earlier are not taken as fixed constants but are allowed to vary randomly. The variation of every parameter follows a uniform distribution with its mean at the respective optimal value and a standard deviation as estimated from the data. For each set of randomly chosen parameters, the target variable is predicted using the model equation. The prediction itself is also subject to random variation: a normal distribution is assumed with mean zero and a standard deviation that corresponds to the residual error of the model. This process of randomly varying the parameters and the predicted results is repeated many times. The outcome is a distribution of the target variable that gives a very good impression of the variation to be expected in real runs of the process. In this case study there is just a lower specification limit, and from the Monte Carlo simulation we can estimate how many batches would fail to surpass it.

Figure 6 shows each parameter set to its best value, the random distribution that controls its variation and the resulting distribution of dissolution over 5,000 simulation runs. Although the parameters are set to values that maximize the predicted dissolution, a failure rate of 0.08 percent is still to be expected. On the way to the exploration of the design space, the failure rate becomes the new target variable. The results of the simulation runs are stored in a table that is the base for the final analysis.

Figure 6: Monte Carlo simulation of 5,000 runs
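The simulation logic can be sketched in a few lines of Python. The paper runs this step in JMP's simulator; below, the optimal settings, the standard deviations, the residual error and the stand-in prediction function are all assumed placeholders. The uniform input draws are centered on the optimum with half-width sqrt(3) times the stated standard deviation, which matches the uniform-distribution description above.

```python
# Sketch of the Monte Carlo step: draw each input around its optimal
# setting, predict dissolution with the model, add residual noise and
# estimate the failure rate against the LSL. All values are placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 5_000    # number of simulation runs, as in the paper
LSL = 70.0   # lower specification limit for dissolution

best = {"mill": 24.0, "blend": 8.0, "spray": 388.0, "visc": 86.0}  # assumed optima
sd = {"mill": 1.0, "blend": 0.5, "spray": 2.0, "visc": 1.5}        # assumed
resid_sd = 1.8  # residual error of the model, assumed

def predict(mill, blend, spray, visc):
    # hypothetical stand-in for the Step 1 model equation
    return (60.0 + 1.2 * mill - 0.04 * mill**2 + 0.8 * blend
            + 0.05 * spray + 0.003 * spray * blend - 0.001 * spray * visc)

# Uniform input variation centered on the optimum; a uniform distribution
# with half-width sqrt(3)*sd has standard deviation sd.
half = {k: np.sqrt(3.0) * sd[k] for k in sd}
draws = {k: rng.uniform(best[k] - half[k], best[k] + half[k], N) for k in best}
dissolution = predict(**draws) + rng.normal(0.0, resid_sd, N)
print(f"estimated failure rate: {np.mean(dissolution < LSL):.4%}")
```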

Step 3: Description of the design space

Again we build a model, this time one that describes the surface of the defect rate over the whole parameter space. This is a specific modeling approach called a Gaussian process. It fits a spatial correlation model to the data, in which the correlation of the response between two observations decreases as the values of the independent variables become more distant. It often fits all points perfectly and is used for further analysis and optimization. It is now more illustrative to look at contour graphs that display the areas of minimum defect rate as a function of pairs of parameters.

Figure 7: Contours of areas with low defect rates at different parameter values

The graphs in Figure 7 show the iso-contours of the log10 defect rate for the two variables Mill Time and Blend Time. The white area shows the part of the surface where the log10 defect rate is less than -2.5. The two areas of low defect rates differ because different values have been chosen for a third variable, Coating Viscosity.
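A surface model of this kind can be sketched with scikit-learn's Gaussian process regressor. The simulation-results file, the column names and the grid ranges below are assumptions, and JMP's spatial correlation model may differ in kernel details; this is an illustration of the technique, not a reproduction of the paper's fit.

```python
# Sketch of the Step 3 surface model: fit a Gaussian process to the log10
# defect rates stored from the simulation runs, then evaluate it on a grid
# to trace low-defect regions. File, columns and ranges are assumptions.
import numpy as np
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

runs = pd.read_csv("simulation_results.csv")  # one row per simulated parameter set
X = runs[["MillTime", "BlendTime", "CoatingViscosity"]].to_numpy()
y = np.log10(runs["defect_rate"].clip(lower=1e-6))  # guard against log10(0)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 1.0, 1.0]),
                              normalize_y=True).fit(X, y)

# Evaluate a Mill Time x Blend Time grid at a fixed Coating Viscosity; cells
# with predicted log10 defect rate below -2.5 form the white area of Figure 7.
mill, blend = np.meshgrid(np.linspace(10, 30, 50), np.linspace(0, 12, 50))
grid = np.column_stack([mill.ravel(), blend.ravel(),
                        np.full(mill.size, 100.0)])  # viscosity held fixed
low_defect = gp.predict(grid).reshape(mill.shape) < -2.5
print(f"share of grid in the low-defect region: {low_defect.mean():.1%}")
```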

Figure 8: Synchronized graphs for factor profiling and defect contours

Comparing the contour plot of the defect rate with the profiling graphs for Dissolution clearly reveals that the region with the lowest defect rate is larger than the parameter values for maximum dissolution would imply.

Conclusion

It pays to know exactly which production parameters are influential for the critical outcomes of a process. It is not enough to fit a model of the measured results that can be used for optimization; a separate look at the surface of the failure rate over the whole parameter space is critical, because the area of robust processes with the lowest failure rates does not necessarily contain the point of optimal predicted target value. It is important to spend some effort on finding the right model(s) to describe the critical process parameters; but once such a model is developed, it is straightforward to build the framework that identifies the design area for the process. For ease of explanation the process was described using one target variable and five influential factors; the analytical procedure is the same for more complex situations. The benefit of gaining the freedom to further optimize a production process without repeated registration applications clearly exceeds the effort of defining the design space.

About SAS and JMP

JMP is a software solution from SAS that was first launched in 1989. John Sall, SAS co-founder and Executive Vice President, is the chief architect of JMP. SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions, SAS helps customers at more than 80,000 sites improve performance and deliver value by making better decisions faster. Since 1976 SAS has been giving customers around the world THE POWER TO KNOW.

SAS Institute Inc. World Headquarters: +1 919 677 8000. To learn more about SAS, visit sas.com. For JMP sales in the US and Canada, call 877 594 6567 or go to jmp.com.

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies.