
EVALUATING LINEAR MODEL REPRESENTATIONS OF CUBIC SPLINES USING PROC REG

Hina Mehta, Thomas Capizzi
Merck Sharp & Dohme Research Laboratories

1. INTRODUCTION

Spline functions are defined as piecewise continuous polynomials of degree n with n-1 continuous derivatives (1-13). The abscissa values corresponding to the join points of the spline are called knots. Quadratic and cubic splines seem to be the most popular among statistical practitioners because of their computational simplicity. At least five papers presented at previous SUGI conferences have discussed applications of spline functions (1-5). Once the number and location of the knots have been specified, the spline can be fit to data using ordinary least squares.

There are several linear model representations of a spline function. They are: (a) '+' functions (6;7;8), (b) ANW-splines (9;10;11), and (c) B-splines (12;13). The '+' function representation is the simpler approach, both for understanding and for statistical hypothesis testing. If the number of knots (i.e., polynomial pieces) is large or if the knots are placed close together, this representation is thought to be an ill-conditioned linear system because of multicollinearity (cf. 7;8;11). However, no formal work seems to have been done to determine the specific circumstances that produce ill-conditioning. The other two representations are considered more stable and computationally efficient bases for fitting splines with linear models. Their motivation is analogous to that of orthogonal polynomials and of transformations of the predictor variables that make the columns of the design matrix orthogonal in ordinary polynomial or multiple regression.

The purpose of this paper is to show, by applying the collinearity diagnostics available in PROC REG to a published data set (14), that 1) the '+' function linear model approach suffers from degrading collinearity with just two interior knots, whereas the ANW- and B-spline bases are well-conditioned, and 2) this degrading collinearity has potentially harmful effects on tests for structural change (9;10); and to discuss the ineffectiveness of one remedy for the collinearity problem. Collinearity and its diagnostic techniques are reviewed in Section 2. In Section 3, the linear model formulations of spline functions, a test for structural change, and end conditions are discussed. The results of our collinearity analysis of the published data are given in Section 4. Section 5 contains conclusions, recommendations, and areas for further research.

2. COLLINEARITY IN REGRESSION

2.1 Definition

Consider the usual linear model

   Y = Xβ + e,   (2.1)

where Y is the N x 1 vector of observations, X is the N x p matrix of known predictor (or explanatory) variables, β is the p x 1 vector of regression coefficients to be estimated, and e is the N x 1 vector of errors with the usual assumptions. Collinearity denotes the presence of near linear relationships among the predictor variables (i.e., the columns of X). Such relationships cause the matrix X'X to be ill-conditioned (i.e., almost singular). The potentially harmful effects of collinear data are twofold: (a) numerical instability of the least squares solution, b, and of its variance matrix, and (b) decreased precision (inflated variances) for the estimated parameters. Inflated variances are quite harmful to the hypothesis testing, estimation, and forecasting aspects of linear regression.
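As a toy illustration of (2.1) with a near linear dependency among the columns of X, the following DATA step (the data set and variable names are made up for illustration) generates two predictors that are almost exact multiples of one another; the diagnostics of Section 2.2 are applied to it below.

   data toy;                                 /* hypothetical illustration of (2.1) */
      do i = 1 to 50;
         x1 = ranuni(1);                     /* first predictor                    */
         x2 = 2*x1 + 0.01*rannor(1);         /* nearly an exact multiple of x1     */
         y  = 1 + x1 + x2 + 0.5*rannor(1);   /* response Y = X*beta + e            */
         output;
      end;
   run;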
2.2 Collinearity Diagnostics in SAS

Options in PROC REG allow the computation of two diagnostics for collinearity: (a) variance inflation factors (VIF) (15) and (b) the diagnostic procedure of Belsley, Kuh, and Welsch (BKW) (16). Variance inflation factors are the diagonal elements of the inverse of the correlation matrix of X.
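Both diagnostics are requested as MODEL statement options; a minimal sketch applied to the toy data generated above:

   proc reg data=toy;
      model y = x1 x2 / vif collin;   /* VIFs, condition indices, and       */
   run;                               /* variance-decomposition proportions */

The COLLIN option includes the intercept in the eigenanalysis; the COLLINOINT option gives the intercept-adjusted version.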

The collinearity diagnostic potential of the VIF stems from the relation

   VIF_i = 1/(1 - R_i^2),   (2.4)

where R_i is the multiple correlation coefficient of the ith predictor variable regressed on the remaining predictor variables. A large VIF (greater than 10) indicates an R_i^2 near unity and hence points to collinearity. BKW (16) used the singular value decomposition of the matrix X (scaled to unit column length) to provide a diagnostic procedure. This procedure involves (a) the calculation of the condition indices of the matrix, defined as the ratios of the largest singular value of X to each individual singular value (or, equivalently, the square roots of the ratios of the largest eigenvalue of X'X to each individual eigenvalue), and (b) the proportion of the variance of each regression estimate b_j, j = 1, ..., p, accounted for by each eigenvector of X'X. BKW claimed that degrading collinearity exists if the following two conditions hold: 1) a singular value has a high condition index (greater than 30 or 100), and 2) that singular value is associated with high variance-decomposition proportions (greater than 50%) for two or more estimated regression coefficient variances. The number of large condition indices gives the number of near dependencies among the columns of X, and the variance-decomposition proportions identify the variables involved in those near dependencies.

2.3 Remedies for Collinearity

There are several possible corrective measures for collinearity. They include: (a) collecting new data with a well-designed experiment (which often is not practical), (b) Bayesian or mixed estimation (16), (c) a biased estimation procedure such as ridge regression (15), or (d) restricted least squares (RLS) (17).

3. SPLINE REGRESSION

We limit our discussion to cubic splines.

3.1 Linear Model Representations

Suppose the data are given by

   Y_i = S(t_i) + e_i,   i = 1, ..., n,   (3.1)

where S(t) is a cubic spline with knots at a1 = k1 < k2 < ... < k_{l-1} < k_l = a2, and the e_i are independent, identically distributed random variables with mean 0 and variance σ^2. It can easily be shown that the expression

   S(t) = β0 + β1 t + β2 t^2 + β3 t^3 + Σ_{j=2}^{l-1} β_{j+2} (t - k_j)+^3,   (3.2)

where

   (t - k_j)+ = t - k_j if t > k_j, and 0 otherwise,   j = 2, ..., l-1,

is a cubic spline over the interval (a1, a2). The '+' function formulation of a cubic spline can therefore be set up as a linear regression model with the elements of the design matrix given by

   x_ij = t_i^(j-1),            j = 1, 2, 3, 4,
   x_ij = (t_i - k_{j-3})+^3,   j = 5, 6, ..., l+2,   i = 1, 2, ..., n.   (3.3)

It is believed (7;8;11) that the '+' function representation has severe collinearity when there are large numbers of interior knots k2, ..., k_{l-1}. In this case two alternative bases for splines, ANW-splines (9;11) and B-splines (12;13), are recommended because they yield more stable linear systems. These representations are just linear reparameterizations of the '+' function approach, i.e.,

   Y = Xβ + e = Zα + e,   with Z = XT^-1 and α = Tβ,   (3.4)

where T is a p x p nonsingular matrix and Z is the reparameterized matrix of predictors. For ANW-splines the parameter vector α represents the ordinate values of the spline at the knots and at the two pseudo-knots which estimate the boundary conditions of the spline (10;11). For B-splines it is difficult to interpret what the parameter vector α represents (8). The choice of the right linear model formulation of a spline depends on the specific problem at hand.
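As a concrete sketch of the '+' function design matrix in (3.3), the DATA step below builds the basis for a cubic spline with two interior knots at 7.5 and 33.5 (the knots used in the Indy 500 example of Section 4); the input data set indy500 and the variable t (= Year - 1910) are assumed names.

   data plus_basis;                        /* hypothetical data set names           */
      set indy500;                         /* assumed to contain t and the response */
      t2 = t**2;                           /* t squared                             */
      t3 = t**3;                           /* t cubed                               */
      p1 = (max(t - 7.5,  0))**3;          /* (t - 7.5)+ cubed                      */
      p2 = (max(t - 33.5, 0))**3;          /* (t - 33.5)+ cubed                     */
   run;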
If it is desired to test hypotheses on the parameters β in (3.3), then the '+' function basis should be used and assessed for any collinearity problem. Likewise, if the objective of the analysis involves testing hypotheses on the linear combinations α = Tβ, then the appropriately reparameterized model (3.4) should be employed and also examined for collinearity. Using a well-conditioned basis to fit a spline, when one is actually interested in estimating the parameters of an equivalent ill-conditioned parameterization, does little good and provides a false sense of security.
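A small PROC IML sketch, with a made-up design matrix and an arbitrary nonsingular reparameterization T (neither taken from the paper's data), illustrates the point: backtransforming the estimates from the well-conditioned basis reproduces exactly the covariance matrix, and hence the inflated variances, of the '+' function fit.

   proc iml;
   /* toy '+' style basis: columns 1, t, t^2, (t-5)+^3 for t = 1,...,10 */
   tt   = do(1, 10, 1);
   tt   = tt`;                                        /* column vector               */
   X    = j(10,1,1) || tt || tt##2 || ((tt-5)<>0)##3;
   Tmat = {1 0 0 0, 1 1 0 0, 0 1 1 0, 0 0 1 1};       /* some nonsingular T          */
   Z    = X * inv(Tmat);                              /* reparameterized predictors  */
   covb = inv(X`*X);                                  /* var(b)/sigma^2              */
   cova = inv(Z`*Z);                                  /* var(a)/sigma^2              */
   back = inv(Tmat) * cova * inv(Tmat)`;              /* var(T^-1 a)/sigma^2         */
   diff = max(abs(back - covb));                      /* essentially zero            */
   print diff;
   quit;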

3.2 Testing for Structural Change

Spline functions have been used to test for structural change (10;11), that is, to test for changes in the regression function at the interior knots k2, ..., k_{l-1}. To check for structural change one tests the hypothesis

   H0: β_{j+2} = 0,   j = 2, ..., l-1,   (3.5)

where β_{j+2} is given in (3.2). In this case the '+' function linear model should be the procedure of choice. However, the usual approach has been to use ANW-splines and then backtransform the estimated parameters to obtain the test statistic for structural change. As we show in Section 4, this strategy hides potentially harmful collinearity in tests for structural change and thus causes problems in the interpretation of results.

3.3 Specifying End Conditions

Cubic splines are completely determined if the values of the spline at the knots and either (a) the values of the first derivative or (b) the values of the second derivative at the end points are known. For the linear model formulations discussed in Section 3.1, the end point conditions are implicitly estimated by ordinary least squares. Another method, however, is to specify the end conditions in advance, i.e., to incorporate them in the model formulation or as a set of linear restrictions on the regression parameters (RLS). There are several ways to express end conditions (9;10;11;18), e.g.,

   S''(k1) = π1 S''(k2)   and   S''(k_l) = π_l S''(k_{l-1}),   (3.8)

where S''(k) is the second derivative and π1 and π_l are proportionality constants to be specified. Recalling that RLS (or, equivalently, model reformulation) is a possible remedy for collinearity, specifying end conditions in advance could have the beneficial effect of more precise estimates in terms of mean squared error. Unfortunately, there are few guidelines concerning the choice of end conditions, and incorrect specification could result in biased estimates and, hence, affect the interpretation of results.

4. COLLINEARITY ANALYSIS OF THE INDY 500 DATA

This example originates from an article by Barzel (14), which examined the growth of technical progress in the Indianapolis 500 race from its inception in 1911 and the effect that the non-racing war years had on this progress. Poirier (10) examined whether there were structural changes in the Indy 500 winning speed vs. time (Year - 1910) relationship due to the two wars by employing an ANW-spline regression model with knots {1, 7.5, 33.5, 61} (the interior knots were placed midway through the non-racing years) and end condition constants π1 = π4 = 2 (given without justification). Like Barzel, Poirier concluded that the non-racing war years hampered technical progress. Sampson (18) showed that the results of the structural change tests were sensitive to the specification of end conditions. He used ANW-splines with (a) no end conditions and (b) π1 = 2 and π4 = 4, and found that the results and conclusions differed from those obtained by Poirier. Further analysis led Sampson to conclude that a quadratic spline with one interior knot at 33.5 provided the most parsimonious description of the data.

The purpose of our reanalysis was twofold. First, we wanted to provide an example in which the '+' function approach suffers from degrading collinearity with just a few interior knots whereas ANW-splines and B-splines do not. Second, we wanted to show that the tests for structural change are degraded by '+' function collinearity and that this cannot be ascertained by employing ANW-splines. To this end we present the '+' function formulation of the three cubic spline models described above in Table 2.
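A minimal PROC REG sketch of the '+' function model for these data, assuming the basis variables constructed in the earlier DATA step and a response variable named speed (winning speed); the VIF and COLLIN options produce diagnostics of the kind summarized in Table 1a, and the labeled TEST statement gives the joint test of the no-structural-change hypothesis (3.5).

   proc reg data=plus_basis;
      /* '+' function cubic spline with interior knots at 7.5 and 33.5 */
      model speed = t t2 t3 p1 p2 / vif collin;
      /* joint test that the WWI and WWII '+' coefficients are zero    */
      wars: test p1 = 0, p2 = 0;
   run;

The individual t tests on p1 and p2 correspond to the WWI and WWII structural change estimates (b4 and b5) discussed below.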
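Restricted least squares for the end conditions of Section 3.3 can be sketched the same way. With k1 = 1, k2 = 7.5, and π1 = 2, the left-hand condition in (3.8) reduces to the linear restriction β2 + 42β3 = 0, which is consistent with the combined variable C = -42t^2 + t^3 appearing in Table 2; the right-hand condition would add a second restriction involving p1 and p2 and is omitted from this sketch.

   proc reg data=plus_basis;
      /* restricted '+' function fit: left-hand end condition pi1 = 2 */
      model speed = t t2 t3 p1 p2;
      restrict t2 + 42*t3 = 0;     /* imposes beta2 + 42*beta3 = 0    */
   run;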
The coefficients of the variables containing P1 = (t - 7.5)+^3 and P2 = (t - 33.5)+^3 are the estimates of structural change due to WWI and WWII, respectively. The collinearity diagnostics for model (a) in Table 2 and for the equivalent ANW- and B-spline formulations are given in Table 1. For the '+' function representation (Table 1a) there are two condition indices greater than 100, indicating two near dependencies. The variance proportions, as well as the large variance inflation factors, show that all of the regression estimates suffer from degrading collinearity. Also, the WWI structural change estimate (b4) appears to be degraded more than the WWII estimate (b5). The diagnostics applied to the corresponding ANW-spline and B-spline formulations (Tables 1b and 1c, respectively) show that, as expected, these representations are well-conditioned bases for this cubic spline function. Table 3 contains the collinearity diagnostics for the restricted models (b) and (c) in Table 2. Also as advertised, the restricted models, while still exhibiting degrading collinearity according to the usual rules of thumb, provide a large reduction in the degree of collinearity. This reanalysis of the Indy 500 data has shown that, in addition to the problems associated with the specification of end conditions (18), the potentially harmful effects of collinearity in assessing structural change must be considered when interpreting results.

5. CONCLUDING REMARKS

We have shown that, for a specific cubic spline applied to a published data set, the '+' function approach with just two interior knots is an ill-conditioned linear system. This should not be a surprising result, since it is well known that simple quadratic and cubic polynomial regression is collinear (19) (see Appendix A). Unpublished work has shown that degrading collinearity for '+' functions exists in many applications, including the use of linear splines. Thus, the use of '+' functions should always be accompanied by a collinearity analysis. As advertised, the ANW- and B-spline bases are well-conditioned. Obviously, the choice of basis depends on the purpose of the application. If the purpose is to test for structural change, the '+' function basis should be employed. To apply the collinearity diagnostics to a well-conditioned basis and then backtransform its parameter estimates to obtain the test statistics for structural change would yield misleading information, in that the potentially harmful effects of collinearity would be hidden (since var(T^-1 a) = var(T^-1 T b) = var(b) = σ^2 (X'X)^-1, where a denotes the least squares estimate of α). Further, based on the results discussed in Section 4, it would appear that the use of spline functions to test for structural change has limited practical value. If the test statistic is not statistically significant, it is difficult to ascertain whether this is because of collinearity or because there actually is no structural change (although a procedure developed by Belsley (20) might help in deciding whether the degrading collinearity is harmful). Even when a statistically significant result is obtained, the magnitude of the structural change could be affected by collinearity.

Employing a restricted model through the specification of end conditions appeared to give a large reduction in the degree of degrading collinearity (equivalent to that of a cubic polynomial). However, the lack of guidelines for specifying these restrictions also makes this approach unattractive. Finding a data-based strategy for selecting end conditions (such as cross-validation) might make the spline approach more appealing. In the absence of such a strategy, however, other statistical techniques such as intervention analysis (21) would appear to be more appropriate methods for testing structural change.

Splines have also been used extensively for numerical differentiation and integration. Since derivative and integral estimates are linear combinations of the regression estimates, investigating the effects of collinearity on these estimates would also seem warranted.

6. REFERENCES

1. Brunelle, R. L. and Johnson, D. W., 1980, Proceedings of the 5th Annual SUGI Conference.
2. Carmer, S. G., Proceedings of the 5th Annual SUGI Conference.
3. Hwang, I., Proceedings of the 3rd Annual SUGI Conference.
4. Kral, K., Capizzi, T. P., and Small, R. D., Proceedings of the 4th Annual SUGI Conference.
5. Mehta, H., Capizzi, T. P., and Oppenheimer, L., Proceedings of the 6th Annual SUGI Conference, pp. 211-216.
6. Fuller, W., Introduction to Statistical Time Series, John Wiley, New York.
7. Lenth, R. V., Commun. Statist. (A), 6.
8. Smith, P. L., Amer. Statist., 33.

9. Ahlberg, J. H., Nilson, E. N., and Walsh, J. L., The Theory of Splines and Their Applications, Academic Press, New York.
10. Poirier, D. J., J. Amer. Statist. Assoc., 68.
11. Poirier, D. J., The Econometrics of Structural Change, American Elsevier Publishing Co., New York.
12. de Boor, C., A Practical Guide to Splines, Springer-Verlag, New York.
13. Wold, S., J. Econ. Theory, 4.
14. Barzel, Y., J. Econ. Theory, 4.
15. Marquardt, D. W., Technometrics, 12.
16. Belsley, D. A., Kuh, E., and Welsch, R. E., Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, John Wiley & Sons, New York.
17. Searle, S. R., Linear Models, John Wiley, New York.
18. Sampson, P. D., J. Amer. Statist. Assoc., 74.
19. Mosteller, F. and Tukey, J. W., Data Analysis and Regression, Addison-Wesley, Reading, Mass.
20. Belsley, D. A., MIT Technical Report.
21. Box, G. E. P. and Tiao, G., J. Amer. Statist. Assoc., 70.

TABLE 1. COLLINEARITY DIAGNOSTICS OF CUBIC SPLINE REGRESSION MODEL WITH NO END CONDITIONS
   a) '+' function: VIFs, condition indices, and variance-decomposition proportions for b0-b5
   b) ANW-splines: VIFs, condition indices, and variance-decomposition proportions for W0-W5
   c) B-splines: VIFs, condition indices, and variance-decomposition proportions
   (Numerical entries are not legible in this copy.)

TABLE 2. FITTED '+' FUNCTION SPLINES FOR INDY 500 WINNING SPEED VS. TIME DATA, BY END-CONDITION MODEL
   a) No end conditions;  b) π1 = π4 = 2;  c) π1 = 2, π4 = 4
   where P1 = (t - 7.5)+^3, P2 = (t - 33.5)+^3, and C = -42t^2 + t^3.
   Note: Numbers in parentheses are standard errors for the tests of structural change.
   (Fitted coefficients and standard errors are not legible in this copy.)

TABLE 3. COLLINEARITY DIAGNOSTICS FOR RESTRICTED CUBIC SPLINE MODELS (Eqs. (4.2) and (4.3))
   a) π1 = π4 = 2;  b) π1 = 2, π4 = 4
   VIFs and condition diagnostics for b0-b5. (Numerical entries are not legible in this copy.)

APPENDIX A. COLLINEARITY ANALYSIS FOR A CUBIC POLYNOMIAL FIT TO INDY 500 DATA (P(t) = β0 + β1 t + β2 t^2 + β3 t^3)
   VIFs and estimates for b0-b3. (Numerical entries are not legible in this copy.)
