Design of Experiments - A Simple Approach
K. Gautami*, CH. S. Vijaya Vani, V. Umamaheswara Rao
CMR College of Pharmacy, Kandlakoya village, Medchal Road, Hyderabad - 501401, INDIA.

ABSTRACT
DoE is a systematic approach to investigate a system or process. A series of structured tests is designed, and planned changes are made to the input variables of a process. The effects of these changes on a pre-defined output are then assessed. As per USFDA requirements, formulation development and process development should be carried out such that the desired quality is fulfilled consistently. To achieve this target it is essential to work scientifically and in a designed manner. DoE serves both the fulfilment of the desired quality and future development. Different types of models or designs are available for use in formulation development as well as in process development, e.g. Full Factorial, Fractional Factorial (Resolutions III, IV, V), Plackett-Burman, Box-Behnken, and Central Composite designs. Each design is specific to its use, so it is essential to understand a design before applying it so that it can help achieve the target. This review article provides basic knowledge about the different types of designs and their use in specific cases.
Key words: Full Factorial, Fractional Factorial, Resolutions III, IV, V, Plackett-Burman, Box-Behnken, Central Composite designs

INTRODUCTION
Design of experiments (DoE) or experimental design is the design of any information-gathering exercise where variation is present, whether under the full control of the experimenter or not. DoE helps find the best configuration of factor values to minimize variation in a response. The major advantage1 of using DoE to develop formulations as well as to screen process parameters for pharmaceutical products is that it allows all potential factors to be evaluated simultaneously, systematically, and quickly.
Using DoE, we can evaluate the effect of each formulation or process factor on each response and identify the critical factors based on statistical analysis. Once the critical factors have been identified, the optimal formulation can be defined by using a proper DoE to optimize the levels of all critical factors. The DoE approach can make scale-up and process validation very efficient due to the robustness of the formulation and manufacturing process.

DEFINITION OF DOE:
A series of tests in which purposeful changes are made to input factors, so that the causes of significant changes in the output responses can be identified.2 A structured, organised method for determining the relationship between factors affecting a process and the output of that process. Also known as "Design of Experiment" (ICH Pharmaceutical Development Q8).
(Figure: Input factors → Process → Responses)

DOE APPROACH:
Study multiple factors changing at once (parallel processing). Accounts for interactions between variables. Maximizes information with a minimum number of runs.

PURPOSES OF DOE3:
a. To determine which factors are most critical or influential.
b. To discover potential interactions between critical factors.
c. To determine optimal levels of operation for critical factors.
d. To determine whether controllable input levels help minimize the effect of uncontrollable input variables.

APPLICATIONS:
Design of a new product. Development of a new manufacturing process. Optimization of process parameters. Process control and improvement.

DOE TERMINOLOGIES4:
1. Variables (Factors): Input variables are referred to as factors or predictors.
Types of variables:
a. Quantitative, e.g. temperature, pressure, volume, weight, concentration, etc.
b. Qualitative, e.g. manufacturer (A, B, C), lubricant (X, Y), catalyst (A, B).
Variables may be controllable or uncontrollable (noise). Therefore it is important to identify all the possible variables affecting the response, using cause-and-effect relationships as well as experience.
2. Response: The response of an experiment should be quantitative.
In case it is not possible to quantify, binary values like 1 and 0 can be used for acceptable or not acceptable. E.g.: Dissolution (80% release in 30 minutes), PSD (55% retained on #60 mesh), Weight variation (NMT ±5%), etc.
3. Levels: Levels are the values of the factors at which the experiment is to be performed.
a) Continuous variables: e.g. temperature with a variable range of -10 °C to +10 °C; the high level will be +10 °C and the low level will be -10 °C. Normally 2 to 3 levels are chosen for a factor.
b) Categorical variables: e.g. the grade of a particular material, or the use of a particular screen size (1.0 mm or 0.8 mm).

TYPES OF EXPERIMENTAL DESIGN5
A. Screening Designs: To find the most influential variables. Only main effects can be studied. Can also detect the existence of interactions (except Plackett-Burman). E.g.: a) Plackett-Burman b) Fractional Factorial
B. Advanced Screening Designs: Study of the most influential variables with intermediate precision. Study of two-variable interactions. Detection of non-linearity. E.g.: a) Fractional Factorial b) Full Factorial
C. Optimization Designs: Stepwise achievement of the target: first use a full factorial to get close to the target, then reach the target by means of RSM. E.g.: a) Full Factorial b) Box-Behnken (response surface) c) Central Composite (response surface)

Plackett-Burman
It is a class of resolution III, two-level fractional factorial designs, used to study main effects. In a resolution III design, main effects are confounded with two-way interactions. It can be used to eliminate insignificant factors and arrive at the few important factors for a full factorial or response surface design. As it does not show interactions of factors, the decision to eliminate a factor should be taken consciously. The minimum number of runs is K + 1, where K is the number of factors; run counts are multiples of 4, e.g. 8, 12, 16, 20, 24, 28, 32, and so on. All the designs are orthogonal.
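The Plackett-Burman run-count rule just stated (at least K + 1 runs, rounded up to a multiple of 4) can be sketched in Python; the function name `plackett_burman_runs` is a hypothetical helper for illustration, not part of any library:

```python
def plackett_burman_runs(k):
    """Smallest Plackett-Burman run count for k factors:
    at least k + 1 runs, rounded up to a multiple of 4."""
    n = k + 1
    return n if n % 4 == 0 else n + (4 - n % 4)

# e.g. 7 factors fit in 8 runs; 9 factors need 12 runs
print(plackett_burman_runs(7))   # 8
print(plackett_burman_runs(9))   # 12
```

This is why Plackett-Burman designs fill the gap between the two-level fractional factorial sizes, which are powers of two.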
When there are large numbers of factors to screen in order to eliminate unimportant ones, we can use a PBD; it has to be assumed that the large observed effects are main effects and not interactions. It is suitable for two-level multi-factor experiments. It can be used to demonstrate the ruggedness or robustness of equipment or processes, assuming the factors involved can be classified as critical or not critical, or to find a set of conditions that performs better than those currently used when empirical relationships are not a concern. Plackett-Burman designs are an alternative to fractional factorials for screening. One useful characteristic is that the sample size is a multiple of four rather than a power of two. There are no two-level fractional factorial designs with sizes between 16 and 32 runs; however, there are 20-run, 24-run, and 28-run Plackett-Burman designs.

Factorial designs6: (Full Factorial and Fractional Factorial)
Full Factorial: A full design consists of all possible combinations of all levels of two or more factors.
Fractional Factorial design: A fractional design consists of a carefully chosen subset (fraction) of the experimental runs of a full factorial design.

Table 1 (Types of factorial designs)
DoE type | No. of factors | Objectives | Estimation
Full Factorial | 2-5 | Optimization/Screening | All main effects and interactions
Fractional Factorial | 5-10 | Screening | Main effects and some interactions

Table 2 (Formulas for full factorial and fractional factorial)
Full Factorial | 2^p, where 2 = no. of levels and p = no. of factors
Fractional Factorial | 2^(p-n), where n = no. of fractions

Construction: A full factorial with 3 factors consists of 8 trials, as follows.

Table 3 (Runs for a full factorial design)
Run | A | B | C
1 | - | - | -
2 | - | - | +
3 | - | + | -
4 | - | + | +
5 | + | - | -
6 | + | - | +
7 | + | + | -
8 | + | + | +

These 8 trials can be shown as the 8 corners of a cube, where all the 8 points are orthogonal to each other.
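The enumeration of runs in Table 3 is simply the Cartesian product of the two factor levels over all factors; a minimal Python sketch (the function name `full_factorial` is illustrative, with -1/+1 coding standing in for the -/+ signs):

```python
from itertools import product

def full_factorial(k):
    """All 2**k combinations of k two-level factors, coded -1/+1."""
    return list(product([-1, 1], repeat=k))

runs = full_factorial(3)
print(len(runs))  # 8 runs, matching Table 3
for run in runs:
    print(run)
```

The enumeration order matches Table 3 when -1 is read as "-" and +1 as "+".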
A fractional factorial with 3 factors can be given by 4 trials, as follows.

Table 4 (Runs for a fractional factorial design)
Run | A | B | C
1 | - | - | +
2 | - | + | -
3 | + | - | -
4 | + | + | +

These trials are derived from the full factorial; due to the smaller number of trials we cannot identify interactions of two or more factors.

Confounding7: Fractional factorial designs use fewer experiments, which causes confounding. This means that some effects cannot be studied independently of others. The degree of confounding is described by the confounding pattern and the resolution. Effects that cannot be estimated separately from one another are said to be confounded. Sometimes an experiment is constructed in such a way that two variables (x1 and x2) have the same levels for each run in the experiment, i.e. x1 = x2. Under this condition, the effects of x1 and x2 cannot be estimated separately. Confounding normally occurs in fractional factorial experiments. The resolution of a design expresses its degree of confounding: the less confounding, the higher the resolution. Before selecting a fractional factorial design, it is necessary to ascertain its confounding pattern.

Table 5 (Experiment with confounding relationships)
A | B | C | AB | BC | CA
-1 | -1 | 1 | 1 | -1 | -1
1 | -1 | -1 | -1 | 1 | -1
-1 | 1 | -1 | -1 | -1 | 1
1 | 1 | 1 | 1 | 1 | 1
Here B is confounded with CA, and A is confounded with BC.

Resolution of fractional designs: The resolution of a design is expressed as a Roman numeral.
Resolution III designs: main effects are confounded with 2-factor interactions.
Resolution IV designs: main effects are free of confounding with 2-factor interactions, but 2-factor interactions are confounded with each other.
Resolution V designs: main effects and 2-factor interactions are free of confounding.

Table 6 (Resolution types)
Resolution III | 1+2 (one main effect + a 2-factor interaction)
Resolution IV | 1+3 or 2+2
Resolution V | 1+4 or 2+3

Optimization designs: Two-level full and fractional factorial designs are not suitable for estimating curvature, if present, in the response.
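The aliasing in the half-fraction of Table 4 can be verified numerically: in those four runs the C column is identical to the element-wise product of the A and B columns, so the main effect of C is confounded with the AB interaction. A small Python check:

```python
# Half-fraction runs from Table 4, coded -1/+1
runs = [(-1, -1, +1),
        (-1, +1, -1),
        (+1, -1, -1),
        (+1, +1, +1)]

# In every run the C level equals A*B, so the C column and the AB
# interaction column are indistinguishable: C is aliased with AB.
for a, b, c in runs:
    assert c == a * b
print("C = AB in every run: main effect C is aliased with interaction AB")
```

The same column-product check can be applied to any candidate fraction to discover its confounding pattern before running experiments.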
A variable must have at least 3 levels in order to fit a model that can resolve curvature in the response. Designs capable of resolving the curvature in the response associated with each design variable are called response surface designs, or designs for quadratic models. Two-level factorial designs with centre points will not help, as they cannot estimate the quadratic effects. Three-level full factorial designs need 3^k experiments, which results in too many experiments as k increases: e.g. 27 experiments for 3 factors, 81 experiments for 4 factors, and so on. Response surface designs are more efficient, requiring fewer experimental runs: Box-Behnken designs and central composite designs.

Optimization can be done by the following means: Screening (determination of the most influential factors). Improvement (approaching the optimum by repeatedly changing the factors). Determination of the optimum (finding the optimal settings of the factors by means of RSM).

Response surface designs: Response Surface Methodology (RSM) is an experimental technique which can find the optimal response within specified ranges of the factors. By using a quadratic equation it can evaluate curvature as the true response. If a maximum or minimum exists inside the factor region, RSM can estimate it, i.e. it can give an operating window for the experiment. Since the number of runs increases dramatically with the number of factors, it is better to use RSM with 2 to 8 factors. It can be run sequentially: the first stage estimates linear and two-factor interaction effects, and the second stage estimates curvature effects.

Factors to consider: Critical factors are known. The region of interest, where the factor levels influence the product, is known. Factors vary throughout the experimentally tested range. A mathematical function relates the factors to the measured response. The response defined by the function is a smooth curve.
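As a sketch of the second-order model fitting that RSM relies on, the following (assuming NumPy is available; the coefficient values are invented for illustration) fits a full quadratic model to a synthetic response on a 3^2 grid and recovers the known coefficients by least squares:

```python
import numpy as np
from itertools import product

# 3^2 grid of two coded factors (-1, 0, +1)
X = np.array(list(product([-1, 0, 1], repeat=2)), dtype=float)
x1, x2 = X[:, 0], X[:, 1]

# Synthetic response from a known quadratic:
# y = 5 + 2*x1 - x2 + 1.5*x1^2 + 0*x2^2 + 0.5*x1*x2
y = 5 + 2*x1 - x2 + 1.5*x1**2 + 0.5*x1*x2

# Design matrix for the full second-order (quadratic) model
D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print(np.round(coef, 3))  # recovers [5, 2, -1, 1.5, 0, 0.5]
```

Because every factor takes three distinct values, all six terms of the quadratic model are estimable, which is exactly what a two-level design cannot provide.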
Uses of RSM: To determine how a specific response is affected by changes in the levels of the factors over the specified levels of interest. To determine the optimum combination of factors that yields a desired response, and to describe the response near the optimum. To determine the factor levels that will simultaneously satisfy a set of desired specifications. Response surface designs are useful for modeling a curved quadratic surface in continuous factors. A response surface model can pinpoint a minimum or maximum response, if one exists inside the factor region. Three distinct values for each factor are necessary to fit a quadratic function, so the standard two-level designs cannot fit curved surfaces.

The following designs are widely used for fitting a quadratic model:
1. Central Composite Design (uniform precision of effect estimates)
2. Box-Behnken Design (almost uniform precision of effect estimates, but usually fewer runs required than for a CCD).
The choice between these models is usually decided by the availability of these designs for a given number of runs and number of factors.

1. Central Composite Design: It models the response surface precisely, and there are 5 levels for each variable. Models can be linear, quadratic, or higher-order polynomials, and include terms for interaction and quadratic effects; insignificant terms are discarded. A CCD is often executed by adding points to an already performed 2^p design. A CCD consists of 3 parts:
a. Factorial (cube) points: a two-level full or fractional factorial (resolution V or higher), which estimates the main effects and two-factor interactions.
b. Centre points: all factor values at the zero (or midrange) value; these estimate pure error and tie blocks together.
c. Axial (star) points: factors at outer (axial) values; these allow estimation of the quadratic (curvature) effects.
(Figure: points of a CCD.)
Based on the number of centre points and the axial values, CCDs are divided into two types of design.
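The three parts of a CCD just described can be generated programmatically. This is a minimal sketch, not a library routine: the function name is illustrative, and the axial distance uses the common rotatable choice alpha = (2^k)^(1/4), which is an assumption since the text does not fix alpha:

```python
from itertools import product

def central_composite(k, n_center=3):
    """CCD = 2**k factorial (cube) points + 2*k axial (star) points
    + n_center centre points; rotatable alpha = (2**k) ** 0.25."""
    alpha = (2 ** k) ** 0.25
    cube = [tuple(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(tuple(pt))
    center = [(0.0,) * k] * n_center
    return cube + axial + center

design = central_composite(2)
print(len(design))  # 4 cube + 4 axial + 3 centre = 11 runs (Table 7: 8+3)
```

For 3 factors this gives 8 + 6 + 3 = 17 runs, matching the 14+3 entry in Table 7.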
These properties of central composite designs relate to the number of centre points in the design and to the axial values:
a) Uniform precision: the number of centre points is chosen so that the prediction variance near the centre of the design space is very flat.
b) Orthogonality: the number of centre points is chosen so that the second-order parameter estimates are minimally correlated with the other parameter estimates.
Some of the common types of CCD:
a. Circumscribed central composite design (CCC)
b. Faced central composite design (FCCD)
c. Inscribed central composite design (ICCD)
d. Planar response surface

Table 7 (Number of experiments for a CCD, by number of factors)
No. of factors | No. of experiments
2 | 8+3
3 | 14+3
4 | 24+3
5 | 26+3
6 | 44+3
7 | 78+3

2. Box-Behnken Design8: Box-Behnken designs are response surface designs specially made to require only 3 levels, coded as -1, 0, and +1. They are applicable for 3-21 factors. Box-Behnken provides strong estimation near the centre of the design space (the presumed optimum) but is weaker at the corners of the cube, as there are no design points there.
Properties: Second-order (quadratic) model for response variables. Also known as an orthogonally balanced incomplete block design. The smallest response surface design for fewer than 5 factors. Factors are studied at 3 levels (-1, 0, +1). Does not contain an embedded factorial or fractional factorial design. Efficient (fewer runs than a CCD). All points lie on a sphere. Consists of points that are vertices of a polygon and equidistant from the centre. Not sequential in nature. Either rotatable or nearly rotatable. No corner points of the hypercube (these are extreme conditions which are often hard to set); extreme situations are avoided (cut-off corners).
(Figure: BBD for three factors.)

Table 8 (Comparison of the number of runs for CCD and BBD)
No. of factors | CCD | BBD
2 | 13 (5 centre-point runs) | --
3 | 20 (6 centre-point runs) | 15
4 | 30 (6 centre-point runs) | 27
5 | 33 (fractional factorial) or 52 (full factorial) | 46
6 | 54 (fractional factorial) or 91 (full factorial) | 54

Taguchi Designs9,10: This is a type of fractional factorial design. Taguchi designs cannot support response surface models and are limited to predicting only at the points where data were taken. These are pre-created, catalogued designs intended to quickly find a set of conditions that meets the criteria of success. Taguchi developed a method for designing experiments to investigate how different parameters affect the mean and variance of a process performance characteristic that defines how well the process is functioning. The experimental design proposed by Taguchi uses orthogonal arrays to organize the parameters affecting the process and the levels at which they should be varied. Instead of testing all possible combinations, as in a factorial design, the Taguchi method tests pairs of combinations. This allows the collection of the data necessary to determine which factors most affect product quality with a minimum amount of experimentation, thus saving time and resources. The Taguchi method is best used when there is an intermediate number of variables (3 to 50), few interactions between variables, and only a few variables that contribute significantly.
The general steps involved in the Taguchi method are as follows11,12:
Definition of the process objective (target value), e.g. flow rate, temperature, etc. It may be a minimum or a maximum; for example, the goal may be to maximize the output flow rate. The deviation of the performance characteristic from the target value is used to define the loss function for the process.
Determination of the design parameters affecting the process.
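Returning to the Box-Behnken construction described above (each pair of factors at ±1 with the remaining factors held at the centre, plus centre points), it reproduces the BBD run counts in Table 8; a hypothetical Python sketch, with an assumed 3 centre points:

```python
from itertools import combinations, product

def box_behnken(k, n_center=3):
    """Box-Behnken points: for each pair of factors, all +/-1
    combinations with the remaining factors held at 0,
    plus n_center centre points."""
    points = []
    for i, j in combinations(range(k), 2):
        for a, b in product([-1, 1], repeat=2):
            pt = [0] * k
            pt[i], pt[j] = a, b
            points.append(tuple(pt))
    points += [(0,) * k] * n_center
    return points

design = box_behnken(3)
print(len(design))  # 3 pairs * 4 + 3 centre = 15 runs, as in Table 8
```

For 4 factors this gives 6 pairs × 4 + 3 = 27 runs, again matching Table 8.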
Parameters are variables within the process that affect the performance measure, such as temperatures, pressures, etc., that can be easily controlled. The number of levels at which the parameters should be varied must be specified; for example, a temperature might be varied between a low value of 40 °C and a high value of 80 °C.
Creation of orthogonal arrays: the array indicates the number of experiments and the conditions for each. The selection of an orthogonal array is based on the number of parameters and the levels of variation for each parameter.
Running of experiments: the experiments are run with their respective parameter settings.
Complete data analysis: to determine the effect of the different parameters on the performance measure.

Taguchi method of orthogonal arrays13: Key feature: instead of testing all possible combinations of variables, all pairs of combinations are tested in a more efficient way. All pairwise combinations of factor levels are evaluated, while repetition of experiments is avoided.
Table 9: All the possible combinations with 4 factors and 2 levels.
For a full factorial design in the same case we would have to run 16 trials; the trade-off is that with the Taguchi method of DoE we will not be able to identify interactions that have a significant impact on the process.

Advantages of Taguchi design14: It emphasizes a mean performance characteristic value close to the target value rather than a value within certain specification limits, thus improving product quality. It is straightforward and easy to apply to many engineering situations, making it a powerful and simple tool. It can be used to quickly narrow down the scope of a research project or to identify problems in a manufacturing process from data already in existence.
Fewer experiments are required compared with a full factorial design; this allows the identification of the key parameters that have the most effect on the performance characteristic value, so that further experimentation on these parameters can be performed, while parameters that have little effect can be ignored. Noise may be present in the process but should have no effect on the output: this is the primary aim of Taguchi experiments, to minimize variation in the output even though noise is present in the process. The process is then said to have become robust.

Disadvantages of Taguchi design: The results obtained are only relative and do not exactly indicate which parameter has the highest effect on the performance characteristic value. Since orthogonal arrays do not test all variable combinations, this method should not be used when the relationships between all variables are needed. There is difficulty in accounting for interactions between parameters. Since Taguchi methods deal with designing quality in rather than correcting for poor quality, they are applied most effectively at early stages of process development; after the design variables are specified, use of experimental design may be less cost-effective.

Overall Conclusion: A good formulation must also be easy to manufacture and must produce good products consistently. To achieve this target, DoE is utilized in an efficient manner during formulation development as well as during optimization of the process parameters, which provides robustness and rigour to the formulation and the process so that the critical quality attributes are achieved consistently. DoE is a systematic approach to investigate a system or process. A series of structured tests is designed, and planned changes are made to the input variables of a process. The effects of these changes on a pre-defined output are then assessed.
Based on the criticality and significance of the variable or non-variable factors, a suitable DoE design is applied to obtain the optimum design space. If there are few factors we can use a full factorial design; if there are many factors we use a fractional factorial design followed by an RSM design, to avoid a large number of experiments and also to evaluate the interactions between the factors. A CCD gives 5 levels of each variable, and the models can be linear, quadratic, or higher-order polynomial; it can resolve interactions and quadratic effects, and its advantage is that insignificant terms can be discarded. Box-Behnken designs are response surface designs specially made to require only 3 levels, applicable for 3-21 factors; Box-Behnken provides strong estimation near the centre of the design space (the presumed optimum) but is weaker at the corners of the cube, as there are no design points there. The Taguchi design is a type of fractional factorial design; Taguchi designs cannot support response surface models and are limited to predicting only at the points where data were taken. The Taguchi method is best used when there is an intermediate number of variables (3 to 50), few interactions between variables, and only a few variables that contribute significantly. ANOVA plays a vital role in the statistical evaluation of the design outputs; by means of it we can evaluate every factor and their interactions in a very efficient and precise manner.

References:
1. Food and Drug Administration. Final report on pharmaceutical cGMPs for the 21st century - A risk-based approach: http://www.fda.gov/cder/gmp/gmp2004/GMP_finalreport2004.htm
2. Lawrence X, Raw A, Lionberger R, Rajagopalan R, Lee L, Holcombe F, Patel R, Fang F, Sayeed V, Schwartz P, Adams P, Buehler G. U.S. FDA question-based review for generic drugs: A new pharmaceutical quality assessment system. J. Generic Med 2007; 4:239-248.
3. Food and Drug Administration CDER.
Guidance for industry, Q8 pharmaceutical development; 2006.
4. Nasr M. Risk-based CMC review paradigm. Advisory committee for pharmaceutical science meeting; 2004.
5. Food and Drug Administration CDER. Guidance for industry: Immediate release solid oral dosage forms scale-up and post approval changes: Chemistry, manufacturing, and controls, in vitro dissolution testing, and in vivo bioequivalence documentation; 1995.
6. Food and Drug Administration CDER. Guidance for industry: Modified release solid oral dosage forms scale-up and post approval changes: Chemistry, manufacturing, and controls, in vitro dissolution testing, and in vivo bioequivalence documentation; 1997.
7. Food and Drug Administration CDER. Guidance for industry: Non sterile semisolid dosage forms scale-up and post approval changes: Chemistry, manufacturing, and controls, in vitro dissolution testing, and in vivo bioequivalence documentation; 1997.
8. Food and Drug Administration CDER. Guidance for industry: Changes to an approved NDA or ANDA; 2004.
9. Woodcock J. The concept of pharmaceutical quality. Am. Pharm. Rev 2004:1-3.
10. Food and Drug Administration Office of Generic Drugs. White paper on question-based review: http://www.fda.gov/cder/OGD/QbR.htm
11. Food and Drug Administration CDER. Guidance for industry, Q6A specifications for new drug substances and products: Chemical substances; 1999.
12. Nasr M. FDA's quality initiatives: An update. http://www.gmpcompliance.com/daten/download/FDAs_Quality_Initiative.pdf; 2007.
13. Trivedi B. Quality by design (QbD) in pharmaceuticals. International Journal of Pharmacy and Pharmaceutical Sciences.
14. Anderson MJ, Whitcomb PJ. DoE Simplified: Practical Tools for Effective Experimentation.