The Piecewise Regression Model as a Response Modeling Tool


NESUG 17

Eugene Brusilovskiy
University of Pennsylvania
Philadelphia, PA

Abstract

The general problem in response modeling is to identify a response curve and estimate the diminishing returns effect. There are different approaches to response modeling in SAS (OLS segmented regression, robust regression, neural nets, and nonparametric regression), each with its own caveats. In this paper, we formulate a new problem statement for response modeling: a concave piecewise approximation of the response curve. We use artificial data to illustrate the approach. As the accuracy of the solution depends on the signal/(signal + noise) ratio, we can obtain the exact solution when the ratio is close to one, or when the knots are known. For the situation when the data are significantly contaminated and/or the knots are unknown, a three-step heuristic approach is suggested. First, we run a dummy regression with PROC REG to estimate the parameters of the dummy variables. Then, we test for a structural break in the series of estimated parameters from the dummy regression using the Chow test in PROC AUTOREG or PROC MODEL. If a change point is identified, it is treated as a knot, or as a first approximation of the knot coordinates, in the piecewise regression. Finally, we fit a piecewise concave regression with PROC NLP.

Introduction

Promotion response modeling (PRM) is a necessary decision support tool in today's highly competitive market. Since the consequences (cost, for instance) of wrong decisions are increasing, so is the role of promotion response modeling. PRM is industry-specific: for example, response modeling in the credit card industry is essentially different from that in the pharmaceutical industry.
In this paper, we concentrate on the latter, where a response curve is used for evaluating the effectiveness of a promotion campaign, allocating the sales force, developing the optimal marketing mix, etc. Each of these problems may require its own definition of the response curve. In general, however, a promotion response curve is the result of PRM and can be defined as a mathematical construct (depending on the nature of its application) that relates a promotional effort to its response. In the pharmaceutical industry, the response could be the number of new prescriptions, and the promotion effort could be doctor detailing, controlling for all other promotion efforts. As stated above, an adequate definition of the response curve is very important. Real promotion campaign data are very noisy, and the relationship between promotional efforts and the responses to them is very weak. Moreover, it is necessary to take into account the diminishing returns and the monotonically increasing nature of the response curve. In other words, we assume that the higher the promotion effort, the higher the response, up to some point where the over-promotion effect may kick in. Nonparametric regression would not be helpful here, because the resulting response curve will, as a rule, have multiple maxima and minima. Therefore, we formulate response modeling as a problem of nonlinear optimization with linear and nonlinear constraints. Specifically, we have to find a concave piecewise linear approximation of the relationship between the response and the promotion effort. Concavity and monotonicity are necessary to reflect the diminishing returns in the resulting response curve. The piecewise linearity condition has to hold for the sake of simplicity of the subsequent decision-support steps, which require optimization.
Problem Statement

To make things easier, we will consider only three pieces in the concave piecewise linear approximation of the response curve (see Graph 1). (We know that a product is over-promoted when the marginal response becomes negative.) Many authors have considered the problem of piecewise linear approximation (for example, see (2),

(3)). In this paper, we impose a concavity restriction that is not present in other works, and we want to solve the problem in SAS. Let Y be the response, X the promotion effort, and S1 and S2 the first and second knots, respectively. Then, based on the promotional data, we need to find the set of unknown parameters B0, B1, B2, B3, S1, S2 that minimizes the objective function (the sum of squared residuals) for the model

(1)  Y = B0 + B1*X,                                  X <= S1
     Y = B0 + B1*S1 + B2*(X - S1),                   S1 < X <= S2
     Y = B0 + B1*S1 + B2*(S2 - S1) + B3*(X - S2),    X > S2

where B1 > 0, B2 > 0, B1 - B2 > 0, B2 - B3 > 0, S1 > 0, and S2 - S1 > 0. This formulation allows B3 to be negative. If it is, the response goes down as the promotion effort goes up; this is the over-promotion effect we mentioned above. If we believe that there is no over-promotion effect, then we have to impose the additional restriction that B3 be nonnegative. By its definition, the response curve Y is continuous but not differentiable at the knots. Mathematically, the problem of finding the response curve Y, formulated above, is a nonlinear programming problem with a continuous but non-differentiable objective function and with linear and nonlinear constraints. This type of problem can be solved by the SAS/OR Non-Linear Programming procedure (PROC NLP). PROC NLP offers several optimization algorithms, but we can use only one of them, the Nelder-Mead Simplex Method (NMSIMP): it is the only algorithm that requires neither first- nor second-order derivatives and allows boundary, linear, and nonlinear constraints (see (1)).

Caveats and the Solution

The formulation of the problem of finding the concave piecewise linear response curve seems very simple; however, this is an illusion. Even when the data contain no noise, the estimation of the concave piecewise linear response curve does not always lead to a precise solution.
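The objective and constraints above can be stated compactly in code. What follows is a minimal Python sketch, purely for illustration alongside the paper's SAS implementation; the function names are mine, not part of the paper.

```python
def response(x, b0, b1, b2, b3, s1, s2):
    """Piecewise linear response curve (1) with knots s1 < s2."""
    if x <= s1:
        return b0 + b1 * x
    elif x <= s2:
        return b0 + b1 * s1 + b2 * (x - s1)
    else:
        return b0 + b1 * s1 + b2 * (s2 - s1) + b3 * (x - s2)

def ssr(params, data):
    """Objective: sum of squared residuals over (x, y) pairs."""
    return sum((response(x, *params) - y) ** 2 for x, y in data)

def feasible(b0, b1, b2, b3, s1, s2):
    """Monotonicity and concavity constraints from the problem statement."""
    return (b1 > 0 and b2 > 0 and b1 - b2 > 0
            and b2 - b3 > 0 and s1 > 0 and s2 - s1 > 0)
```

By construction the curve is continuous at the knots (the left and right pieces agree at s1 and at s2), but the slope changes there, which is exactly why the objective is non-differentiable in S1 and S2.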
Let us consider the following example, where we assume that the response curve consists of three linear pieces:

(2)  Y = A0 + A1*X,                                  X <= T1
     Y = A0 + A1*T1 + A2*(X - T1),                   T1 < X <= T2
     Y = A0 + A1*T1 + A2*(T2 - T1) + A3*(X - T2),    X > T2

where A0 = 0, A1 = 4, A2 = 1, A3 = -1, T1 = 8, T2 = 16. Substituting, we get:

(3)  Y = 4X,                                   X <= 8
     Y = 32 + (X - 8) = 24 + X,                8 < X <= 16
     Y = 32 + (16 - 8) - (X - 16) = 56 - X,    X > 16

Our goal is to use PROC NLP (see the code below) to find estimates B0, B1, B2, B3, S1, S2 of the parameters A0, A1, A2, A3, T1, T2, based only on the data for X and Y generated from model (2) by the program in Appendix 1. As can be seen from that code, we consider the situation without the over-promotion effect: A3 = 0.1 > 0. At first, the PROC NLP code did not include any initial values for the parameters. In this situation, PROC NLP automatically assigns random initial values, and the results of the parameter estimation vary with those initial values. If the randomly assigned initial values are not close to the actual values, PROC NLP is not able to find the exact solution. The code below
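The substitution in (3) is easy to verify mechanically. A small Python check, using the example's values A0 = 0, A1 = 4, A2 = 1, A3 = -1, T1 = 8, T2 = 16 (the function name is mine):

```python
def curve(x):
    """Example response curve (2) with A0=0, A1=4, A2=1, A3=-1, T1=8, T2=16."""
    A0, A1, A2, A3, T1, T2 = 0, 4, 1, -1, 8, 16
    if x <= T1:
        return A0 + A1 * x
    elif x <= T2:
        return A0 + A1 * T1 + A2 * (x - T1)
    else:
        return A0 + A1 * T1 + A2 * (T2 - T1) + A3 * (x - T2)

# Closed forms from (3): 4x, then 24 + x, then 56 - x.
for x in range(1, 31):
    expected = 4 * x if x <= 8 else (24 + x if x <= 16 else 56 - x)
    assert curve(x) == expected
```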

PROC NLP DATA=DATA_NO_NOISE OUTEST=STATS TECH=NMSIMP MAXFUNC=500000;
  PARAMETERS B0, B1, B2, B3, S1, S2;
  * PARAMETERS B0, B1, B2, B3, S1=5, S2=9;
  MIN F;
  IF X <= S1 THEN DO;
    YY=B0 + B1*X;
    F=(YY-Y)**2;
  END;
  ELSE IF X <= S2 THEN DO;
    YY=B0 + B1*S1 + B2*(X-S1);
    F=(YY-Y)**2;
  END;
  ELSE DO;
    YY=B0 + B1*S1 + B2*(S2-S1) + B3*(X-S2);
    F=(YY-Y)**2;
  END;
  LINCON S1 > 3, S2 > S1 + 3, B1 > 0, B2 > 0, B1 - B2 > 0, B2 - B3 > 0;
  NLINCON B3 >= 0;  /**** NO OVER-PROMOTION EFFECT ****/
RUN;

produced very different results each of the ten times it was run. The best solution was:

[PROC NLP: Nonlinear Minimization, Optimization Results table: parameter estimates for B0, B1, B2, B3, S1, S2, gradients, and the value of the objective function; the numeric values did not survive transcription]

where the initial values for B0, B1, B2, B3, S1, and S2 were the randomly assigned ones. When we supplied plausible initial values for the knots, S1 = 5 and S2 = 9, PROC NLP immediately found almost the exact solution:

[PROC NLP: Nonlinear Minimization, Optimization Results table: parameter estimates and the value of the objective function; the numeric values did not survive transcription]

Since the objective function potentially possesses many local minima and is non-differentiable at a number of points, the strong dependence of the accuracy of the solution on the initial values is the general problem of nonlinear optimization. The situation becomes more complex when PROC NLP tries to estimate a response curve without an over-promotion effect from data with an over-promotion effect. Even when there is no noise in the data, this is still a very complicated problem.
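The sensitivity to initial knot values reflects a useful fact: once the knots are fixed, model (1) is linear in B0, B1, B2, B3 and can be fit by ordinary least squares, so the knots can also be found by grid search, as in the approach of Lerman (2). The sketch below illustrates this in Python under that assumption; the helper names are mine, and it is not the paper's PROC NLP method.

```python
def basis(x, s1, s2):
    # For fixed knots, (1) is linear in the coefficients via this basis:
    # Y = b0 + b1*min(x,s1) + b2*clip(x-s1) + b3*max(x-s2, 0).
    z1 = min(x, s1)
    z2 = min(max(x - s1, 0), s2 - s1)
    z3 = max(x - s2, 0)
    return [1.0, z1, z2, z3]

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_fixed_knots(data, s1, s2):
    # Ordinary least squares on the knot basis; returns (coefficients, SSR).
    rows = [basis(x, s1, s2) for x, _ in data]
    ys = [y for _, y in data]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(4)] for i in range(4)]
    Atb = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(4)]
    coefs = solve(AtA, Atb)
    ssr = sum((sum(c * v for c, v in zip(coefs, r)) - y) ** 2
              for r, y in zip(rows, ys))
    return coefs, ssr

def grid_search(data, knots):
    # Try every admissible knot pair; keep the fit with the smallest SSR.
    best = None
    for s1 in knots:
        for s2 in knots:
            if s2 <= s1:
                continue
            coefs, ssr = fit_fixed_knots(data, s1, s2)
            if best is None or ssr < best[2]:
                best = (s1, s2, ssr, coefs)
    return best
```

On the noise-free data of the example, the grid search recovers the true knots (8, 16) exactly, which is consistent with the paper's observation that the exact solution is obtainable when the knots are known.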

In real response data, the Signal/(Signal + Noise) ratio is very small, and thus the problem frequently becomes intractable. Noisy data (see Graph 2) were generated by the code in Appendix 2, with the exact same parameters A0 = 0, A1 = 4, A2 = 1.5, A3 = 0.1, S1 = 8, S2 = 16 as in Appendix 1. Since the real parameters are unknown, it is very difficult to evaluate the results, but the dependence of the solution on the initial values is even stronger for noisy data. Thus, to overcome this problem, we offer a three-step heuristic approach. In the first step, we use a dummy regression (PROC REG) to estimate the parameters of dummy variables for X = 1 to X = max[X]. These regression parameters are treated as a time series, where the number of the dummy variable is used instead of the time index. (The code for the dummy regression is in Appendix 3, and the graph of the dummy regression parameter estimates as a series is in Graph 3.) Second, if we have some expert knowledge about the number of knots and their locations, we can apply the Chow test (PROC AUTOREG) to test the hypothesis about the breakpoints, i.e., the knots. The last step involves estimating the parameters of the piecewise concave response curve from the series data using PROC NLP; here, we can set the breakpoints from the second step as the initial values. Although the problem of assigning the initial values remains, the optimization problem becomes significantly simpler. In our example, we do not know the optimal number of segments: it could be one, two, or three. Thus, we need to run PROC NLP three times and compare the values of the objective function in the three cases with zero, one, and two knots. The final response curve consists of the number of segments that yields the smallest objective function. Comparing the values of the objective function, the optimal number of segments in our example is 2.
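For intuition on step two: for a single candidate breakpoint, the Chow test compares the sum of squared residuals of one pooled line against two separate lines fit on either side of the break. The sketch below is a simplified Python illustration of that statistic (k = 2 parameters per segment), not the PROC AUTOREG implementation; the function names are mine.

```python
def ols_line(pts):
    # Simple linear regression y = a + b*x; returns the SSR of the fit.
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    b = sxy / sxx
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in pts)

def chow_f(series, brk):
    # Chow F statistic for a single break after x = brk.
    left = [(x, y) for x, y in series if x <= brk]
    right = [(x, y) for x, y in series if x > brk]
    ssr_pooled = ols_line(series)
    ssr_split = ols_line(left) + ols_line(right)
    k, n = 2, len(series)
    return ((ssr_pooled - ssr_split) / k) / (ssr_split / (n - 2 * k))
```

Applied to a series with a genuine slope change (such as the dummy-regression estimates of Graph 3), the F statistic peaks near the true knot, which is what makes the breakpoint a good initial value for the final PROC NLP run.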
Summary

We formulate a new problem statement for response modeling: a concave piecewise approximation of the response curve, and we use artificial data to illustrate the approach. As the accuracy of the solution depends on the Signal/(Signal + Noise) ratio, we can obtain the exact solution when the ratio is close to one, or when the knots are known. For the situation when the data are significantly contaminated and/or the knots are unknown, a three-step heuristic approach is suggested.

References

(1) SAS Institute Inc. "Chapter 5: The NLP Procedure." SAS/OR User's Guide: Mathematical Programming, Version 8. Cary, NC: SAS Institute Inc., 1999.
(2) Lerman, P.M. "Fitting Segmented Regression Models by Grid Search." Applied Statistics, Vol. 29, No. 1 (1980), pp. 77-84.
(3) McGee, Victor E. and Willard T. Carleton. "Piecewise Regression." Journal of the American Statistical Association, Vol. 65, No. 331 (Sep. 1970), pp. 1109-1124.

APPENDIX 1

Macro for generation of the piecewise linear response curve.

%MACRO DDATA_NO_NOISE(A0=, A1=, A2=, A3=, S1=, S2=);
DATA DATA_NO_NOISE;
  %DO XX=1 %TO 25;
    X=&XX;
    /**** ONLY ONE PIECE ****/
    %IF &S1=0 %THEN %DO;
      Y=%SYSEVALF(&A0 + &A1*&XX);
    %END;
    /**** TWO PIECES ****/
    %ELSE %IF &S2=0 %THEN %DO;
      %IF &XX<=&S1 %THEN %DO;
        Y=%SYSEVALF(&A0 + &A1*&XX);
      %END;
      %ELSE %DO;
        Y=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&XX-&S1));
      %END;
    %END;
    /**** THREE PIECES ****/
    %ELSE %IF &XX<=&S1 %THEN %DO;
      Y=%SYSEVALF(&A0 + &A1*&XX);
    %END;
    %ELSE %IF &XX<=&S2 %THEN %DO;
      Y=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&XX-&S1));
    %END;
    %ELSE %DO;
      Y=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&S2-&S1) + &A3*(&XX-&S2));
    %END;
    OUTPUT;
  %END;
RUN;
PROC PRINT DATA=DATA_NO_NOISE; RUN;
PROC GPLOT DATA=DATA_NO_NOISE;
  PLOT Y*X;
RUN;
%MEND DDATA_NO_NOISE;
%DDATA_NO_NOISE(A0=0, A1=4, A2=1.5, A3=0.1, S1=8, S2=16)

APPENDIX 2

Macro for generation of data with normal noise.

%MACRO DDATA_WITH_NORNOISE(A0=, A1=, A2=, A3=, S1=, S2=);
DATA DATA_WITH_NORNOISE;
  %DO XX=1 %TO 25;
    X=&XX;
    /**** ONLY ONE PIECE ****/
    %IF &S1=0 %THEN %DO;
      YY=%SYSEVALF(&A0 + &A1*&XX);
      DO I=1 TO 5;
        Y=YY + SQRT(5)*RANNOR(345+I*4);
        OUTPUT;
      END;
    %END;
    /**** TWO PIECES ****/
    %ELSE %IF &S2=0 %THEN %DO;
      %IF &XX<=&S1 %THEN %DO;
        YY=%SYSEVALF(&A0 + &A1*&XX);
        DO I=1 TO 5;
          Y=YY + SQRT(5)*RANNOR(345+I*4);
          OUTPUT;
        END;
      %END;
      %ELSE %DO;
        YY=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&XX-&S1));
        DO I=1 TO 5;
          Y=YY + SQRT(5)*RANNOR(345+I*100);
          OUTPUT;
        END;
      %END;
    %END;
    /**** THREE PIECES ****/
    %ELSE %IF &XX<=&S1 %THEN %DO;
      YY=%SYSEVALF(&A0 + &A1*&XX);
      DO I=1 TO 5;
        Y=YY + 1.8*SQRT(5)*RANNOR(345+I*4);
        OUTPUT;
      END;
    %END;
    %ELSE %IF &XX<=&S2 %THEN %DO;
      YY=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&XX-&S1));
      DO I=1 TO 50;
        Y=YY + 4*SQRT(5)*RANNOR(345+I*100) + 2*SQRT(5)*RANNOR(345+I*2);
        OUTPUT;
      END;
    %END;
    %ELSE %DO;
      YY=%SYSEVALF(&A0 + &A1*&S1 + &A2*(&S2-&S1) + &A3*(&XX-&S2));
      DO I=1 TO 5;
        Y=YY + 3*SQRT(5)*RANNOR(345+I*5);
        OUTPUT;
      END;
    %END;
  %END;
RUN;

DATA DATA_WITH_NORNOISE;
  SET DATA_WITH_NORNOISE (WHERE=(Y>=0));
  DROP YY I;
RUN;
PROC PRINT DATA=DATA_WITH_NORNOISE; RUN;
PROC GPLOT DATA=DATA_WITH_NORNOISE;
  PLOT Y*X;
RUN;
%MEND DDATA_WITH_NORNOISE;
%DDATA_WITH_NORNOISE(A0=0, A1=4, A2=1.5, A3=0.1, S1=8, S2=16)

APPENDIX 3

Dummy Regression.

DATA DATA_FOR_DUMMY_REG;
  /***** CREATE A DUMMY VARIABLE FOR EACH VALUE OF THE PROMOTION EFFORT *****/
  ARRAY DUMMY(25);
  SET DATA_WITH_NORNOISE;
  DO I=1 TO 25;
    * IF X = I THEN DUMMY(I)=1;
    * ELSE DUMMY(I)=0;
    DUMMY(I)=(X=I);
  END;
  DROP I;
RUN;
PROC PRINT DATA=DATA_FOR_DUMMY_REG (OBS=50); RUN;
ODS OUTPUT PARAMETERESTIMATES=PPP;
PROC REG DATA=DATA_FOR_DUMMY_REG;  /***** DUMMY REGRESSION *****/
  MODEL Y=DUMMY1-DUMMY24;
QUIT;
DATA PPP;
  SET PPP;
  KEEP Variable Estimate;
RUN;
PROC PRINT DATA=PPP; RUN;
PROC TRANSPOSE DATA=PPP OUT=TTT PREFIX=COEFF;  /*** ONE COLUMN PER COEFFICIENT ***/
RUN;
PROC PRINT DATA=TTT; RUN;  /*** COEFF1=INTERCEPT ***/
DATA COEFS;  /**** COEFFICIENT ADJUSTMENT ****/
  ARRAY PARMS(24) PARMS1-PARMS24;
  ARRAY COEFF(25) COEFF1-COEFF25;
  SET TTT;
  PARMS0=COEFF1;
  DO I=1 TO 24;
    PARMS(I)=PARMS0+COEFF(I+1);
  END;
  DROP COEFF1-COEFF25 I _NAME_ _LABEL_ PARMS0;
RUN;
PROC PRINT DATA=COEFS; RUN;
PROC TRANSPOSE DATA=COEFS OUT=SERIES;  /**** COEFFICIENTS AS A "TIME SERIES" ****/
RUN;
PROC PRINT DATA=SERIES; RUN;
DATA SERIES;
  SET SERIES (RENAME=(COL1=Y));
  X+1;
  DROP _NAME_;
RUN;
PROC PRINT DATA=SERIES; RUN;
PROC GPLOT DATA=SERIES;
  PLOT Y*X;
RUN;
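For intuition on the coefficient adjustment above: with one dummy per promotion level and the last level as the OLS reference, the intercept is the reference group's mean response, and intercept + dummy coefficient recovers each level's mean. The following Python sketch illustrates that arithmetic under this assumption; it is not a translation of the SAS code, and the names are mine.

```python
from statistics import mean

def dummy_series(data, levels):
    """Recover per-level mean responses the way the adjustment step does:
    intercept = mean of the reference (last) level, each dummy coefficient
    = level mean - intercept, adjusted estimate = intercept + coefficient."""
    groups = {lv: [y for x, y in data if x == lv] for lv in levels}
    intercept = mean(groups[levels[-1]])                       # reference level
    coefs = [mean(groups[lv]) - intercept for lv in levels[:-1]]
    return [intercept + c for c in coefs]                      # adjusted series
```

Because replicated observations share the same X, the adjusted series is simply the mean response at each promotion level, which is what gets plotted as the "time series" in Graph 3.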

Graph 1: The three-piece response curve
(The x-axis is the promotion effort and the y-axis is the response.)

Graph 2: Simulated data similar to real-world data
(The x-axis is the promotion effort and the y-axis is the response.)

Graph 3: Series of dummy regression parameter estimates
(The x-axis is the promotion effort and the y-axis is the response.)


Globally Stabilized 3L Curve Fitting Globally Stabilized 3L Curve Fitting Turker Sahin and Mustafa Unel Department of Computer Engineering, Gebze Institute of Technology Cayirova Campus 44 Gebze/Kocaeli Turkey {htsahin,munel}@bilmuh.gyte.edu.tr

More information

CSC 411: Lecture 02: Linear Regression

CSC 411: Lecture 02: Linear Regression CSC 411: Lecture 02: Linear Regression Raquel Urtasun & Rich Zemel University of Toronto Sep 16, 2015 Urtasun & Zemel (UofT) CSC 411: 02-Regression Sep 16, 2015 1 / 16 Today Linear regression problem continuous

More information

Simulation of Imputation Effects Under Different Assumptions. Danny Rithy

Simulation of Imputation Effects Under Different Assumptions. Danny Rithy Simulation of Imputation Effects Under Different Assumptions Danny Rithy ABSTRACT Missing data is something that we cannot always prevent. Data can be missing due to subjects' refusing to answer a sensitive

More information

186 Statistics, Data Analysis and Modeling. Proceedings of MWSUG '95

186 Statistics, Data Analysis and Modeling. Proceedings of MWSUG '95 A Statistical Analysis Macro Library in SAS Carl R. Haske, Ph.D., STATPROBE, nc., Ann Arbor, M Vivienne Ward, M.S., STATPROBE, nc., Ann Arbor, M ABSTRACT Statistical analysis plays a major role in pharmaceutical

More information

Parameter optimization model in electrical discharge machining process *

Parameter optimization model in electrical discharge machining process * 14 Journal of Zhejiang University SCIENCE A ISSN 1673-565X (Print); ISSN 1862-1775 (Online) www.zju.edu.cn/jzus; www.springerlink.com E-mail: jzus@zju.edu.cn Parameter optimization model in electrical

More information

Civil Engineering Systems Analysis Lecture XV. Instructor: Prof. Naveen Eluru Department of Civil Engineering and Applied Mechanics

Civil Engineering Systems Analysis Lecture XV. Instructor: Prof. Naveen Eluru Department of Civil Engineering and Applied Mechanics Civil Engineering Systems Analysis Lecture XV Instructor: Prof. Naveen Eluru Department of Civil Engineering and Applied Mechanics Today s Learning Objectives Sensitivity Analysis Dual Simplex Method 2

More information

2014 Stat-Ease, Inc. All Rights Reserved.

2014 Stat-Ease, Inc. All Rights Reserved. What s New in Design-Expert version 9 Factorial split plots (Two-Level, Multilevel, Optimal) Definitive Screening and Single Factor designs Journal Feature Design layout Graph Columns Design Evaluation

More information

Online Algorithm Comparison points

Online Algorithm Comparison points CS446: Machine Learning Spring 2017 Problem Set 3 Handed Out: February 15 th, 2017 Due: February 27 th, 2017 Feel free to talk to other members of the class in doing the homework. I am more concerned that

More information

Constructing Hidden Units using Examples and Queries

Constructing Hidden Units using Examples and Queries Constructing Hidden Units using Examples and Queries Eric B. Baum Kevin J. Lang NEC Research Institute 4 Independence Way Princeton, NJ 08540 ABSTRACT While the network loading problem for 2-layer threshold

More information

Introduction to Machine Learning CMU-10701

Introduction to Machine Learning CMU-10701 Introduction to Machine Learning CMU-10701 Clustering and EM Barnabás Póczos & Aarti Singh Contents Clustering K-means Mixture of Gaussians Expectation Maximization Variational Methods 2 Clustering 3 K-

More information

Generalized Network Flow Programming

Generalized Network Flow Programming Appendix C Page Generalized Network Flow Programming This chapter adapts the bounded variable primal simplex method to the generalized minimum cost flow problem. Generalized networks are far more useful

More information

Spatial Interpolation & Geostatistics

Spatial Interpolation & Geostatistics (Z i Z j ) 2 / 2 Spatial Interpolation & Geostatistics Lag Lag Mean Distance between pairs of points 1 Tobler s Law All places are related, but nearby places are related more than distant places Corollary:

More information

CS 188: Artificial Intelligence

CS 188: Artificial Intelligence CS 188: Artificial Intelligence Constraint Satisfaction Problems II and Local Search Instructors: Sergey Levine and Stuart Russell University of California, Berkeley [These slides were created by Dan Klein

More information

Today is the last day to register for CU Succeed account AND claim your account. Tuesday is the last day to register for my class

Today is the last day to register for CU Succeed account AND claim your account. Tuesday is the last day to register for my class Today is the last day to register for CU Succeed account AND claim your account. Tuesday is the last day to register for my class Back board says your name if you are on my roster. I need parent financial

More information

STAT 705 Introduction to generalized additive models

STAT 705 Introduction to generalized additive models STAT 705 Introduction to generalized additive models Timothy Hanson Department of Statistics, University of South Carolina Stat 705: Data Analysis II 1 / 22 Generalized additive models Consider a linear

More information

Applied Regression Modeling: A Business Approach

Applied Regression Modeling: A Business Approach i Applied Regression Modeling: A Business Approach Computer software help: SAS code SAS (originally Statistical Analysis Software) is a commercial statistical software package based on a powerful programming

More information

HW 10 STAT 672, Summer 2018

HW 10 STAT 672, Summer 2018 HW 10 STAT 672, Summer 2018 1) (0 points) Do parts (a), (b), (c), and (e) of Exercise 2 on p. 298 of ISL. 2) (0 points) Do Exercise 3 on p. 298 of ISL. 3) For this problem, try to use the 64 bit version

More information

Functions and Graphs. The METRIC Project, Imperial College. Imperial College of Science Technology and Medicine, 1996.

Functions and Graphs. The METRIC Project, Imperial College. Imperial College of Science Technology and Medicine, 1996. Functions and Graphs The METRIC Project, Imperial College. Imperial College of Science Technology and Medicine, 1996. Launch Mathematica. Type

More information

Excerpt from "Art of Problem Solving Volume 1: the Basics" 2014 AoPS Inc.

Excerpt from Art of Problem Solving Volume 1: the Basics 2014 AoPS Inc. Chapter 5 Using the Integers In spite of their being a rather restricted class of numbers, the integers have a lot of interesting properties and uses. Math which involves the properties of integers is

More information

Linear and quadratic Taylor polynomials for functions of several variables.

Linear and quadratic Taylor polynomials for functions of several variables. ams/econ 11b supplementary notes ucsc Linear quadratic Taylor polynomials for functions of several variables. c 016, Yonatan Katznelson Finding the extreme (minimum or maximum) values of a function, is

More information

MEI GeoGebra Tasks for AS Pure

MEI GeoGebra Tasks for AS Pure Task 1: Coordinate Geometry Intersection of a line and a curve 1. Add a quadratic curve, e.g. y = x 2 4x + 1 2. Add a line, e.g. y = x 3 3. Use the Intersect tool to find the points of intersection of

More information

Constrained and Unconstrained Optimization

Constrained and Unconstrained Optimization Constrained and Unconstrained Optimization Carlos Hurtado Department of Economics University of Illinois at Urbana-Champaign hrtdmrt2@illinois.edu Oct 10th, 2017 C. Hurtado (UIUC - Economics) Numerical

More information

Practical Guidance for Machine Learning Applications

Practical Guidance for Machine Learning Applications Practical Guidance for Machine Learning Applications Brett Wujek About the authors Material from SGF Paper SAS2360-2016 Brett Wujek Senior Data Scientist, Advanced Analytics R&D ~20 years developing engineering

More information

Lecture 24: Generalized Additive Models Stat 704: Data Analysis I, Fall 2010

Lecture 24: Generalized Additive Models Stat 704: Data Analysis I, Fall 2010 Lecture 24: Generalized Additive Models Stat 704: Data Analysis I, Fall 2010 Tim Hanson, Ph.D. University of South Carolina T. Hanson (USC) Stat 704: Data Analysis I, Fall 2010 1 / 26 Additive predictors

More information

Today. Golden section, discussion of error Newton s method. Newton s method, steepest descent, conjugate gradient

Today. Golden section, discussion of error Newton s method. Newton s method, steepest descent, conjugate gradient Optimization Last time Root finding: definition, motivation Algorithms: Bisection, false position, secant, Newton-Raphson Convergence & tradeoffs Example applications of Newton s method Root finding in

More information

Want to Do a Better Job? - Select Appropriate Statistical Analysis in Healthcare Research

Want to Do a Better Job? - Select Appropriate Statistical Analysis in Healthcare Research Want to Do a Better Job? - Select Appropriate Statistical Analysis in Healthcare Research Liping Huang, Center for Home Care Policy and Research, Visiting Nurse Service of New York, NY, NY ABSTRACT The

More information

Machine Learning. Decision Trees. Manfred Huber

Machine Learning. Decision Trees. Manfred Huber Machine Learning Decision Trees Manfred Huber 2015 1 Decision Trees Classifiers covered so far have been Non-parametric (KNN) Probabilistic with independence (Naïve Bayes) Linear in features (Logistic

More information

Cover Page. The handle holds various files of this Leiden University dissertation.

Cover Page. The handle   holds various files of this Leiden University dissertation. Cover Page The handle http://hdl.handle.net/1887/22055 holds various files of this Leiden University dissertation. Author: Koch, Patrick Title: Efficient tuning in supervised machine learning Issue Date:

More information

Spatial Interpolation - Geostatistics 4/3/2018

Spatial Interpolation - Geostatistics 4/3/2018 Spatial Interpolation - Geostatistics 4/3/201 (Z i Z j ) 2 / 2 Spatial Interpolation & Geostatistics Lag Distance between pairs of points Lag Mean Tobler s Law All places are related, but nearby places

More information

Chapter 1 Introduction to Optimization

Chapter 1 Introduction to Optimization Chapter 1 Introduction to Optimization Chapter Contents OVERVIEW................................... 15 DATA FLOW................................... 16 PROC LP................................... 17 PROC

More information

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW Thorsten Thormählen, Hellward Broszio, Ingolf Wassermann thormae@tnt.uni-hannover.de University of Hannover, Information Technology Laboratory,

More information

APPLICATION OF FUZZY REGRESSION METHODOLOGY IN AGRICULTURE USING SAS

APPLICATION OF FUZZY REGRESSION METHODOLOGY IN AGRICULTURE USING SAS APPLICATION OF FUZZY REGRESSION METHODOLOGY IN AGRICULTURE USING SAS Himadri Ghosh and Savita Wadhwa I.A.S.R.I., Library Avenue, Pusa, New Delhi 110012 him_adri@iasri.res.in, savita@iasri.res.in Multiple

More information

PORTFOLIO OPTIMISATION

PORTFOLIO OPTIMISATION PORTFOLIO OPTIMISATION N. STCHEDROFF Abstract. Portfolio optimisation is computationally intensive and has potential for performance improvement. This paper examines the effects of evaluating large numbers

More information

For Monday. Read chapter 18, sections Homework:

For Monday. Read chapter 18, sections Homework: For Monday Read chapter 18, sections 10-12 The material in section 8 and 9 is interesting, but we won t take time to cover it this semester Homework: Chapter 18, exercise 25 a-b Program 4 Model Neuron

More information

Announcements. CS 188: Artificial Intelligence Fall Reminder: CSPs. Today. Example: 3-SAT. Example: Boolean Satisfiability.

Announcements. CS 188: Artificial Intelligence Fall Reminder: CSPs. Today. Example: 3-SAT. Example: Boolean Satisfiability. CS 188: Artificial Intelligence Fall 2008 Lecture 5: CSPs II 9/11/2008 Announcements Assignments: DUE W1: NOW P1: Due 9/12 at 11:59pm Assignments: UP W2: Up now P2: Up by weekend Dan Klein UC Berkeley

More information

CS 188: Artificial Intelligence Fall 2008

CS 188: Artificial Intelligence Fall 2008 CS 188: Artificial Intelligence Fall 2008 Lecture 5: CSPs II 9/11/2008 Dan Klein UC Berkeley Many slides over the course adapted from either Stuart Russell or Andrew Moore 1 1 Assignments: DUE Announcements

More information

Quadratic Equations Group Acitivity 3 Business Project Week #5

Quadratic Equations Group Acitivity 3 Business Project Week #5 MLC at Boise State 013 Quadratic Equations Group Acitivity 3 Business Project Week #5 In this activity we are going to further explore quadratic equations. We are going to analyze different parts of the

More information

HW 10 STAT 472, Spring 2018

HW 10 STAT 472, Spring 2018 HW 10 STAT 472, Spring 2018 1) (0 points) Do parts (a), (b), (c), and (e) of Exercise 2 on p. 298 of ISL. 2) (0 points) Do Exercise 3 on p. 298 of ISL. 3) For this problem, you can merely submit the things

More information

Solution Methods Numerical Algorithms

Solution Methods Numerical Algorithms Solution Methods Numerical Algorithms Evelien van der Hurk DTU Managment Engineering Class Exercises From Last Time 2 DTU Management Engineering 42111: Static and Dynamic Optimization (6) 09/10/2017 Class

More information

Solving for dynamic user equilibrium

Solving for dynamic user equilibrium Solving for dynamic user equilibrium CE 392D Overall DTA problem 1. Calculate route travel times 2. Find shortest paths 3. Adjust route choices toward equilibrium We can envision each of these steps as

More information

Mapping of Hierarchical Activation in the Visual Cortex Suman Chakravartula, Denise Jones, Guillaume Leseur CS229 Final Project Report. Autumn 2008.

Mapping of Hierarchical Activation in the Visual Cortex Suman Chakravartula, Denise Jones, Guillaume Leseur CS229 Final Project Report. Autumn 2008. Mapping of Hierarchical Activation in the Visual Cortex Suman Chakravartula, Denise Jones, Guillaume Leseur CS229 Final Project Report. Autumn 2008. Introduction There is much that is unknown regarding

More information

Optimization and least squares. Prof. Noah Snavely CS1114

Optimization and least squares. Prof. Noah Snavely CS1114 Optimization and least squares Prof. Noah Snavely CS1114 http://cs1114.cs.cornell.edu Administrivia A5 Part 1 due tomorrow by 5pm (please sign up for a demo slot) Part 2 will be due in two weeks (4/17)

More information

Stephen Scott.

Stephen Scott. 1 / 33 sscott@cse.unl.edu 2 / 33 Start with a set of sequences In each column, residues are homolgous Residues occupy similar positions in 3D structure Residues diverge from a common ancestral residue

More information

Getting Correct Results from PROC REG

Getting Correct Results from PROC REG Getting Correct Results from PROC REG Nate Derby Stakana Analytics Seattle, WA, USA SUCCESS 3/12/15 Nate Derby Getting Correct Results from PROC REG 1 / 29 Outline PROC REG 1 PROC REG 2 Nate Derby Getting

More information

Chapter 1 Section 1 Lesson: Solving Linear Equations

Chapter 1 Section 1 Lesson: Solving Linear Equations Introduction Linear equations are the simplest types of equations to solve. In a linear equation, all variables are to the first power only. All linear equations in one variable can be reduced to the form

More information