APPLICATION OF FUZZY REGRESSION METHODOLOGY IN AGRICULTURE USING SAS
Himadri Ghosh and Savita Wadhwa
I.A.S.R.I., Library Avenue, Pusa, New Delhi

Multiple linear regression modelling is a very powerful technique and is extensively used in agricultural research (Lalitha et al. 1999, Guo and Sun 2001). The technique estimates a linear relationship between a dependent (response) variable and independent (explanatory) variables. If Xi, i = 1, 2, ..., n, are the explanatory variables and Y is the response variable, the model is expressed as:

Y = b0 + b1 X1 + ... + bn Xn + e    (1)

where the b's are parameters and e is an error term assumed to follow a normal distribution. The parameters are generally estimated using the method of least squares. A good description of the various aspects of multiple linear regression methodology is given in Draper and Smith (1998). One drawback of this methodology is that the underlying relationship is assumed to be crisp or precise: it gives a single precise value of the response for a given set of values of the explanatory variables. In a realistic situation, however, the underlying relationship is not a crisp function of a given form; it contains some vagueness or impreciseness, so by assuming a crisp relationship some vital information may be lost (Slowinski 1998). Fuzzy regression is a very promising technique developed to handle this situation, and it can be applied to solve agricultural research problems. A fuzzy regression model corresponding to equation (1) can be written as:

Y = A0 + A1 X1 + ... + An Xn    (2)

Here the explanatory variables Xi are, as before, assumed to be precise. However, as mentioned above, the response variable Y is not crisp but fuzzy in nature, which implies that the parameters Ai are also fuzzy. Our aim is to estimate these parameters. In the subsequent discussion it is assumed that the Ai are symmetric fuzzy numbers (i.e. the vagueness is expressible as equidistant from the centre), so each can be represented by an interval.
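As a companion to the SAS examples that follow, the least-squares fit of model (1) can be sketched in a few lines of Python; the data below are synthetic and purely illustrative, not taken from any of the illustrations.

```python
import numpy as np

# Synthetic example: two explanatory variables and a known linear relationship
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(30, 2))      # 30 observations of X1, X2
y = 5.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1]   # exact (noise-free) response

# Add an intercept column and solve for b = (b0, b1, b2) by least squares
A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
print(b)  # recovers approximately [5.0, 2.0, -1.5]
```

With noise-free data the estimates reproduce the true coefficients; with a noise term e added, they would scatter around them, as the normal-error assumption in (1) describes.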
For example, Ai can be expressed as the fuzzy set

Ai = < aic, aiw >    (3)

where aic is the centre (the belief value of the regression coefficient) and aiw is the radius, i.e. the vagueness associated with it. This fuzzy set describes Ai in terms of a symmetric triangular membership function. It is also to be noted that the methodology is applied when the underlying phenomenon is fuzzy, which means that the response variable is fuzzy and the relationship is also considered to be fuzzy. Equation (3) is sometimes also written as

Ai = [ aiL, aiR ]    (4)

where aiL = aic - aiw and aiR = aic + aiw (Kacprzyk and Fedrizzi 1992).
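The symmetric triangular membership function behind equation (3) is easy to state concretely. The sketch below, in Python rather than SAS, evaluates mu(x) = max(0, 1 - |x - aic| / aiw) for an illustrative centre and radius (the values 10 and 4 are arbitrary examples, not from the text).

```python
def membership(x, center, width):
    """Symmetric triangular membership: 1 at the centre,
    falling linearly to 0 at center +/- width."""
    if width == 0:  # a crisp number: the support is a single point
        return 1.0 if x == center else 0.0
    return max(0.0, 1.0 - abs(x - center) / width)

# Illustrative fuzzy coefficient A = <10, 4>, i.e. the interval [6, 14]
print(membership(10, 10, 4))  # 1.0 at the centre
print(membership(14, 10, 4))  # 0.0 at the right endpoint
print(membership(12, 10, 4))  # 0.5 halfway out
```

The interval form (4) is simply the support of this membership function: [center - width, center + width].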
The method of estimating the parameters of equation (2) differs from that of equation (1). In fuzzy regression methodology, the parameters are estimated by minimizing the total vagueness in the model, i.e. the sum of the radii of the predicted intervals. From equation (2):

Yj = A0 + A1 x1j + ... + An xnj

Using equation (3),

yj = < a0c, a0w > + < a1c, a1w > x1j + ... + < anc, anw > xnj = < yjc, yjw >, say.

Thus

yjc = a0c + a1c x1j + ... + anc xnj    (5a)
yjw = a0w + a1w |x1j| + ... + anw |xnj|    (5b)

As yjw represents a radius, it cannot be negative; therefore, on the right-hand side of equation (5b), absolute values of the xij are taken. Suppose there are m data points, each comprising the observed response together with its n explanatory values. Then the parameters Ai are estimated by minimizing the total vagueness of the model-data set combination, subject to the constraint that each data point must fall within the estimated interval of the response variable. This can be visualized as the following linear programming problem (Tanaka 1987):

Minimize  sum over j = 1, ..., m of [ a0w + a1w |x1j| + ... + anw |xnj| ]    (6)

subject to, for each j = 1, ..., m,

a0c + sum over i of aic xij - ( a0w + sum over i of aiw |xij| ) <= Yj
a0c + sum over i of aic xij + ( a0w + sum over i of aiw |xij| ) >= Yj

and aiw >= 0 for all i. To solve this linear programming problem, the Simplex procedure (Taha 1997) is generally employed.

ILLUSTRATION 1: Data given in the article of Sengupta et al. (2001) are considered. They studied the effect of sulphur-containing fertilizers on the productivity of rainfed greengram (Phaseolus radiatus L.). The response variable is dry-matter accumulation (Y) and the explanatory variables are plant height (X1) and leaf area index (X2). Only the data pertaining to the maturity level, i.e. 60 days after sowing (DAS), are considered for the analysis, and they are presented in Table 1 for ready reference.
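Before turning to SAS, the linear program (6) can be prototyped with any off-the-shelf LP solver. The Python sketch below does this with scipy.optimize.linprog for a single explanatory variable; the data are synthetic stand-ins, not the Table 1 values, and the variable names follow equations (5a)-(5b).

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (not Table 1): one predictor, m = 5 points
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
m = len(x)

# Decision variables: [a0c, a1c, a0w, a1w]; only the widths are sign-restricted
# Objective (6): total vagueness = m*a0w + (sum of |x_j|)*a1w
c = np.array([0.0, 0.0, m, np.abs(x).sum()])

# Each y_j must lie inside the predicted interval:
#   centre - width <= y_j   and   centre + width >= y_j
A_ub, b_ub = [], []
for xj, yj in zip(x, y):
    A_ub.append([1.0, xj, -1.0, -abs(xj)])    # centre - width <= y_j
    b_ub.append(yj)
    A_ub.append([-1.0, -xj, -1.0, -abs(xj)])  # -(centre + width) <= -y_j
    b_ub.append(-yj)

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0, None), (0, None)])
a0c, a1c, a0w, a1w = res.x
print(res.status, a0c, a1c, a0w, a1w)
```

The structure mirrors the PROC OPTMODEL program given later for Illustration 1: the same objective and the same pair of covering constraints per observation.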
Table 1: Data on dry-matter accumulation, plant height and leaf area index for the effect of sulphur on growth of the greengram crop

Dry-matter accumulation (g/m2)    Plant height (cm)    Leaf area index

The multiple linear regression model and the fuzzy regression model are fitted to the above data using the SAS software package, version 9.2. The SAS codes and the results obtained are as follows:

Method of multiple linear regression (MLR)

title 'Method of least square';
ods csv file='resultls.csv';
data plant;
input y x1 x2;
cards;
;
proc reg;
model y=x1 x2;
output out=all;
proc print data=all;
run;
quit;
ods csv close;

Method of fuzzy regression (FR) (OPTMODEL)

title 'Linear programming';
data plant;
input y x1 x2;
datalines;
;
run;
ods rtf file='result_ex1.rtf';
proc optmodel;
set j = 1..8;
number y{j}, x1{j}, x2{j};
read data plant into [_n_] y x1 x2;
/* Print y x1 x2 */
print y x1 x2;
number n init 8;   /* Total number of observations */
/* Decision variables */
var aw{1..3} >= 0; /* These three variables are bounded below */
var ac{1..3};      /* These three variables are not bounded */
/* Objective function */
min z1 = aw[1]*n + sum{i in j} x1[i]*aw[2] + sum{i in j} x2[i]*aw[3];
/* Linear constraints */
con c{i in 1..n}:  ac[1] + x1[i]*ac[2] + x2[i]*ac[3] - aw[1] - x1[i]*aw[2] - x2[i]*aw[3] <= y[i];
con c1{i in 1..n}: ac[1] + x1[i]*ac[2] + x2[i]*ac[3] + aw[1] + x1[i]*aw[2] + x2[i]*aw[3] >= y[i];
expand; /* This prints all the equations */
solve;
print ac aw;
quit;
ods rtf close;

RESULTS: Partial SAS output:

Method of multiple linear regression (MLR)

Variable    DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept
x1
x2
Method of fuzzy regression (FR) (OPTMODEL)

[1]    y    x1    x2

Problem Summary
Objective Sense            Minimization
Objective Function         z1
Objective Type             Linear
Number of Variables        6
  Bounded Above            0
  Bounded Below            3
  Bounded Below and Above  0
  Free                     3
  Fixed                    0
Number of Constraints      16
  Linear LE (<=)           8
  Linear EQ (=)            0
  Linear GE (>=)           8
  Linear Range             0
Constraint Coefficients    96
Solution Summary
Solver                 Dual Simplex
Objective Function     z1
Solution Status        Optimal
Iterations             7
Primal Infeasibility   0
Dual Infeasibility     0
Bound Infeasibility    0

[1]    ac    aw

From the above results: ac1 = 217.08, ac2 = -3.06, ac3 = 59.73, aw1 = 7.97, aw2 = 0, aw3 = 0.

The fitted model for MLR is

Y = ... + ... X1 + ... X2    (7)
Standard errors: (107.63) (2.16) (7.57)

The fitted model for FR is

Y = < 217.08, 7.97 > + < -3.06, 0 > X1 + < 59.73, 0 > X2    (8)

In order to compare the performance of the above two approaches, viz. multiple linear regression methodology and fuzzy regression methodology, the width of the prediction interval corresponding to each observed value of the response variable is computed. For the former, the upper limits of the prediction interval are computed from prediction equation (7) by taking each coefficient at its estimated value plus its standard error, i.e. using the equation

Y = (b0 + 107.63) + (b1 + 2.16) X1 + (b2 + 7.57) X2

where b0, b1, b2 denote the estimates in equation (7). Similarly, the lower limits of the prediction interval for the multiple linear regression model are computed using the equation

Y = (b0 - 107.63) + (b1 - 2.16) X1 + (b2 - 7.57) X2
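Because both slope coefficients in the fitted FR model (8) have zero vagueness, the FR interval width is the same at every observation. The short Python check below makes this concrete; the coefficient values are taken from equation (8), while the (x1, x2) point is an arbitrary illustration.

```python
# Fuzzy coefficients <centre, width> from the fitted FR model (8)
coeffs = [(217.08, 7.97), (-3.06, 0.0), (59.73, 0.0)]

def fr_interval(x1, x2):
    """Lower and upper prediction limits of the FR model at (x1, x2)."""
    xs = [1.0, x1, x2]  # leading 1 for the intercept term
    centre = sum(c * v for (c, _), v in zip(coeffs, xs))
    width = sum(w * abs(v) for (_, w), v in zip(coeffs, xs))
    return centre - width, centre + width

lo, hi = fr_interval(30.0, 2.0)  # illustrative (x1, x2) values
print(hi - lo)  # 2 * 7.97 = 15.94 (up to floating-point rounding), for any input
```

Up to the rounding used in the text, this constant width matches the average width of 15.93 reported for the FR model in Table 2.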
Further, for the fuzzy regression model, the prediction equations for computing the upper and lower limits, obtained from equation (8), are respectively

Y = (217.08 + 7.97) + (-3.06 + 0) X1 + (59.73 + 0) X2
and
Y = (217.08 - 7.97) + (-3.06 - 0) X1 + (59.73 - 0) X2

The width of the prediction interval for the multiple linear regression model and for the fuzzy regression model, corresponding to each set of observed explanatory variables, is computed in MS Excel (the above SAS results can be opened in MS Excel directly) and the results are reported in Table 2. From this table, the average width for the former was found to be considerably larger, while that for the latter was only 15.93, indicating the superiority of the fuzzy regression methodology.

Table 2: Fitting of MLR and FR models

    Multiple Linear Regression (MLR) Model    Fuzzy Regression (FR) Model
    Lower limit    Upper limit    Width       Lower limit    Upper limit    Width

Average width

In reality the underlying phenomenon is fuzzy; therefore, as emphasized above, the correct methodology for obtaining the relationship between the response and explanatory variables is fuzzy regression rather than multiple linear regression.

ILLUSTRATION 2: Length (L) and weight (W) data of a fish species are given below:

Length (mm):
Weight (g):
Length (mm):
Weight (g):

Assuming the underlying phenomenon to be fuzzy, fit a fuzzy linear regression using the method of fuzzy least squares (FLS) to the deterministic allometric model W = a L^b. For estimating the length-weight relationship, the statistical form of the above deterministic allometric model is:

log W = log a + b log L + e    (i)

Also compare the results with those obtained through fitting by least squares (LS).
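The log-linear form (i) is an ordinary least-squares problem once both variables are log-transformed. As a cross-check outside SAS, the Python sketch below fits it with numpy; the (log L, log W) pairs are read off the constraint list of the FLS program given later in this illustration, to the two-decimal precision shown there.

```python
import numpy as np

# (log L, log W) pairs for the 16 fish, as they appear in the FLS constraints
logl = np.array([4.38, 4.44, 4.50, 4.55, 4.61, 4.65, 4.70, 4.74,
                 4.79, 4.83, 4.87, 4.91, 4.94, 4.98, 5.01, 5.04])
logw = np.array([1.12, 1.12, 1.30, 1.52, 1.55, 1.81, 1.89, 2.03,
                 2.21, 2.32, 2.34, 2.56, 2.67, 2.73, 2.85, 3.04])

# Fit log W = log a + b log L by least squares (degree-1 polynomial)
b, loga = np.polyfit(logl, logw, 1)
print(round(loga, 2), round(b, 2))  # close to the reported -11.99 and 2.96
```

Back-transforming gives a = exp(log a), which is how the deterministic form W = a L^b is recovered from the fit.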
Note: The OPTMODEL procedure can also be used for this illustration.

SAS Codes:

/* Method of least squares (LS) */
data LW;
input l logl w logw;
cards;
;
ods rtf file='result.rtf';
proc reg;
model logw=logl/p;
run;
quit;
ods rtf close;

Partial SAS output:

The REG Procedure
Model: MODEL1
Dependent Variable: logw

Number of Observations Read    16
Number of Observations Used    16

Analysis of Variance
Source             DF    Sum of Squares    Mean Square    F Value    Pr > F
Model                                                                <.0001
Error
Corrected Total
Root MSE          R-Square
Dependent Mean    Adj R-Sq
Coeff Var

Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept                                                             <.0001
logl                                                                  <.0001

Substituting the values of the parameter estimates in model (i) of Illustration 2:

log W = -11.99 + 2.96 log L
Standard errors: (0.38) (0.08)

Now we substitute the values of the parameters, i.e. a = exp(-11.99) and b = 2.96, and their corresponding standard errors in the deterministic allometric model W = a L^b, and calculate the width as

Width = exp(-11.99 + 0.38) * L^(2.96 + 0.08) - exp(-11.99 - 0.38) * L^(2.96 - 0.08)

for the different values of L (length) given in Illustration 2.

/* Method of fuzzy least squares (FLS) */
proc nlp;
min Y;
decvar ar br ac bc;
bounds ar>=0, br>=0, ac=-11.99, bc=2.96;
lincon ac+4.38*bc-ar-4.38*br<=1.12;
lincon ac+4.44*bc-ar-4.44*br<=1.12;
lincon ac+4.50*bc-ar-4.50*br<=1.30;
lincon ac+4.55*bc-ar-4.55*br<=1.52;
lincon ac+4.61*bc-ar-4.61*br<=1.55;
lincon ac+4.65*bc-ar-4.65*br<=1.81;
lincon ac+4.70*bc-ar-4.70*br<=1.89;
lincon ac+4.74*bc-ar-4.74*br<=2.03;
lincon ac+4.79*bc-ar-4.79*br<=2.21;
lincon ac+4.83*bc-ar-4.83*br<=2.32;
lincon ac+4.87*bc-ar-4.87*br<=2.34;
lincon ac+4.91*bc-ar-4.91*br<=2.56;
lincon ac+4.94*bc-ar-4.94*br<=2.67;
lincon ac+4.98*bc-ar-4.98*br<=2.73;
lincon ac+5.01*bc-ar-5.01*br<=2.85;
lincon ac+5.04*bc-ar-5.04*br<=3.04;
lincon ac+4.38*bc+ar+4.38*br>=1.12;
lincon ac+4.44*bc+ar+4.44*br>=1.12;
lincon ac+4.50*bc+ar+4.50*br>=1.30;
lincon ac+4.55*bc+ar+4.55*br>=1.52;
lincon ac+4.61*bc+ar+4.61*br>=1.55;
lincon ac+4.65*bc+ar+4.65*br>=1.81;
lincon ac+4.70*bc+ar+4.70*br>=1.89;
lincon ac+4.74*bc+ar+4.74*br>=2.03;
lincon ac+4.79*bc+ar+4.79*br>=2.21;
lincon ac+4.83*bc+ar+4.83*br>=2.32;
lincon ac+4.87*bc+ar+4.87*br>=2.34;
lincon ac+4.91*bc+ar+4.91*br>=2.56;
lincon ac+4.94*bc+ar+4.94*br>=2.67;
lincon ac+4.98*bc+ar+4.98*br>=2.73;
lincon ac+5.01*bc+ar+5.01*br>=2.85;
lincon ac+5.04*bc+ar+5.04*br>=3.04;
Y=16*ar+75.94*br;
run;

Partial SAS output:

Newton-Raphson Ridge Optimization Without Parameter Scaling
Parameter Estimates    4
Lower Bounds           4
Upper Bounds           2
Linear Constraints     32
Using Sparse Hessian

Optimization Start: Active Constraints 2

All parameters are actively constrained. Optimization cannot proceed.

PROC NLP: Nonlinear Minimization
Optimization Results
N    Parameter    Estimate    Gradient Objective Function    Active Bound Constraint
1    ar
2    br                                                      Lower BC
Optimization Results
N    Parameter    Estimate    Gradient Objective Function    Active Bound Constraint
3    ac                                                      Equal BC
4    bc                                                      Equal BC

Value of Objective Function = ...

Substituting the values of the parameter estimates in model (i) of Illustration 2:

log W = -11.99 + 2.96 log L
Standard errors: (0.15) (0.00)

Now we substitute the values of the parameters, i.e. a = exp(-11.99) and b = 2.96, and their corresponding standard errors in the deterministic allometric model W = a L^b, and calculate the width as

Width = exp(-11.99 + 0.15) * L^2.96 - exp(-11.99 - 0.15) * L^2.96

for the different values of L (length) given in Illustration 2.

Table 3: Width of predicted interval by the LS and FLS approaches
Length (mm)    Weight (g)    Estimated weight (g)    Width of predicted interval (g): LS    FLS

Average width

Conclusion: The predicted intervals computed using the method of fuzzy least squares have a much shorter average width than those obtained using the method of least squares. This implies that the former procedure is more efficient than the latter. The main
message emerging from this illustration is that the correct methodology for determining the length-weight relationship in fish is fuzzy least squares rather than ordinary least squares.

ILLUSTRATION 3: Wheat crop yield and the spectral vegetation indices Normalized Difference Vegetation Index (NDVI) and Ratio Vegetation Index (RVI) (95 days after sowing) were observed in an experimental study at the IARI farms, New Delhi, and are reproduced below:

Y (Qtl/Hec.):
NDVI:
RVI:

Use fuzzy regression methodology (FRM) to fit the data using the linear programming approach available in the SAS software package and show its superiority over the corresponding multiple linear regression (MLR) model. The models are the same as those given in equations (1) and (2) above.

Note: The OPTMODEL procedure can also be used for this illustration.

SAS CODES:

Method of least squares (LS)

data plant;
input Y NDVI RVI;
cards;
;
proc reg;
model Y=NDVI;
proc reg;
model Y=RVI;
proc reg;
model Y=NDVI RVI;
run;
quit;
Partial SAS output:

Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept
NDVI

Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept
RVI

Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept
NDVI
RVI

So our equations are:

i)   Y = ... + ... NDVI              Standard errors: (13.46) (27.04)
ii)  Y = ... + ... RVI               Standard errors: (11.51) (3.82)
iii) Y = ... + ... NDVI + ... RVI    Standard errors: (25.04) (238.14) (32.11)

Method of fuzzy linear regression (FLR)

/* FOR NDVI */
proc nlp;
min Y;
decvar a0c a0w a1c a1w;
bounds a0w>=0, a1w>=0;
lincon a0c+.52*a1c-a0w-.52*a1w<=52.29;
lincon a0c+.54*a1c-a0w-.54*a1w<=43.05;
lincon a0c+.52*a1c-a0w-.52*a1w<=44.20;
lincon a0c+.48*a1c-a0w-.48*a1w<=44.05;
lincon a0c+.48*a1c-a0w-.48*a1w<=36.08;
lincon a0c+.48*a1c-a0w-.48*a1w<=40.04;
lincon a0c+.54*a1c-a0w-.54*a1w<=46.93;
lincon a0c+.52*a1c-a0w-.52*a1w<=47.64;
lincon a0c+.53*a1c-a0w-.53*a1w<=46.62;
lincon a0c+.49*a1c-a0w-.49*a1w<=46.13;
lincon a0c+.38*a1c-a0w-.38*a1w<=29.57;
lincon a0c+.46*a1c-a0w-.46*a1w<=45.17;
lincon a0c+.52*a1c+a0w+.52*a1w>=52.29;
lincon a0c+.54*a1c+a0w+.54*a1w>=43.05;
lincon a0c+.52*a1c+a0w+.52*a1w>=44.20;
lincon a0c+.48*a1c+a0w+.48*a1w>=44.05;
lincon a0c+.48*a1c+a0w+.48*a1w>=36.08;
lincon a0c+.48*a1c+a0w+.48*a1w>=40.04;
lincon a0c+.54*a1c+a0w+.54*a1w>=46.93;
lincon a0c+.52*a1c+a0w+.52*a1w>=47.64;
lincon a0c+.53*a1c+a0w+.53*a1w>=46.62;
lincon a0c+.49*a1c+a0w+.49*a1w>=46.13;
lincon a0c+.38*a1c+a0w+.38*a1w>=29.57;
lincon a0c+.46*a1c+a0w+.46*a1w>=45.17;
Y=a0w*12+5.94*a1w;
run;

Partial SAS output:

PROC NLP: Nonlinear Minimization
Optimization Results
N    Parameter    Estimate    Gradient Objective Function
1    a0c
2    a0w
3    a1c
4    a1w

Value of Objective Function = ...

So our estimates are:
parameters:    a0c    a0w    a1c    a1w
estimates:

/* For RVI */
proc nlp;
min Y;
decvar a0c a0w a2c a2w;
bounds a0w>=0, a2w>=0;
lincon a0c+3.18*a2c-a0w-3.18*a2w<=52.29;
lincon a0c+3.36*a2c-a0w-3.36*a2w<=43.05;
lincon a0c+3.20*a2c-a0w-3.20*a2w<=44.20;
lincon a0c+2.87*a2c-a0w-2.87*a2w<=44.05;
lincon a0c+2.88*a2c-a0w-2.88*a2w<=36.08;
lincon a0c+2.88*a2c-a0w-2.88*a2w<=40.04;
lincon a0c+3.35*a2c-a0w-3.35*a2w<=46.93;
lincon a0c+3.19*a2c-a0w-3.19*a2w<=47.64;
lincon a0c+3.24*a2c-a0w-3.24*a2w<=46.62;
lincon a0c+2.95*a2c-a0w-2.95*a2w<=46.13;
lincon a0c+2.20*a2c-a0w-2.20*a2w<=29.57;
lincon a0c+2.67*a2c-a0w-2.67*a2w<=45.17;
lincon a0c+3.18*a2c+a0w+3.18*a2w>=52.29;
lincon a0c+3.36*a2c+a0w+3.36*a2w>=43.05;
lincon a0c+3.20*a2c+a0w+3.20*a2w>=44.20;
lincon a0c+2.87*a2c+a0w+2.87*a2w>=44.05;
lincon a0c+2.88*a2c+a0w+2.88*a2w>=36.08;
lincon a0c+2.88*a2c+a0w+2.88*a2w>=40.04;
lincon a0c+3.35*a2c+a0w+3.35*a2w>=46.93;
lincon a0c+3.19*a2c+a0w+3.19*a2w>=47.64;
lincon a0c+3.24*a2c+a0w+3.24*a2w>=46.62;
lincon a0c+2.95*a2c+a0w+2.95*a2w>=46.13;
lincon a0c+2.20*a2c+a0w+2.20*a2w>=29.57;
lincon a0c+2.67*a2c+a0w+2.67*a2w>=45.17;
Y=a0w*12+35.97*a2w;
run;

Partial SAS output:

PROC NLP: Nonlinear Minimization
Optimization Results
N    Parameter    Estimate    Gradient Objective Function    Active Bound Constraint
1    a0c
2    a0w
3    a2c
4    a2w                                                     Lower BC

Value of Objective Function = ...

So our estimates are:
parameters:    a0c    a0w    a2c    a2w
estimates:
/* For NDVI and RVI */
proc nlp;
min Y;
decvar a0c a0w a1c a1w a2c a2w;
bounds a0w>=0, a1w>=0, a2w>=0;
lincon a0c+.52*a1c+3.18*a2c-a0w-.52*a1w-3.18*a2w<=52.29;
lincon a0c+.54*a1c+3.36*a2c-a0w-.54*a1w-3.36*a2w<=43.05;
lincon a0c+.52*a1c+3.20*a2c-a0w-.52*a1w-3.20*a2w<=44.20;
lincon a0c+.48*a1c+2.87*a2c-a0w-.48*a1w-2.87*a2w<=44.05;
lincon a0c+.48*a1c+2.88*a2c-a0w-.48*a1w-2.88*a2w<=36.08;
lincon a0c+.48*a1c+2.88*a2c-a0w-.48*a1w-2.88*a2w<=40.04;
lincon a0c+.54*a1c+3.35*a2c-a0w-.54*a1w-3.35*a2w<=46.93;
lincon a0c+.52*a1c+3.19*a2c-a0w-.52*a1w-3.19*a2w<=47.64;
lincon a0c+.53*a1c+3.24*a2c-a0w-.53*a1w-3.24*a2w<=46.62;
lincon a0c+.49*a1c+2.95*a2c-a0w-.49*a1w-2.95*a2w<=46.13;
lincon a0c+.38*a1c+2.20*a2c-a0w-.38*a1w-2.20*a2w<=29.57;
lincon a0c+.46*a1c+2.67*a2c-a0w-.46*a1w-2.67*a2w<=45.17;
lincon a0c+.52*a1c+3.18*a2c+a0w+.52*a1w+3.18*a2w>=52.29;
lincon a0c+.54*a1c+3.36*a2c+a0w+.54*a1w+3.36*a2w>=43.05;
lincon a0c+.52*a1c+3.20*a2c+a0w+.52*a1w+3.20*a2w>=44.20;
lincon a0c+.48*a1c+2.87*a2c+a0w+.48*a1w+2.87*a2w>=44.05;
lincon a0c+.48*a1c+2.88*a2c+a0w+.48*a1w+2.88*a2w>=36.08;
lincon a0c+.48*a1c+2.88*a2c+a0w+.48*a1w+2.88*a2w>=40.04;
lincon a0c+.54*a1c+3.35*a2c+a0w+.54*a1w+3.35*a2w>=46.93;
lincon a0c+.52*a1c+3.19*a2c+a0w+.52*a1w+3.19*a2w>=47.64;
lincon a0c+.53*a1c+3.24*a2c+a0w+.53*a1w+3.24*a2w>=46.62;
lincon a0c+.49*a1c+2.95*a2c+a0w+.49*a1w+2.95*a2w>=46.13;
lincon a0c+.38*a1c+2.20*a2c+a0w+.38*a1w+2.20*a2w>=29.57;
lincon a0c+.46*a1c+2.67*a2c+a0w+.46*a1w+2.67*a2w>=45.17;
Y=a0w*12+5.94*a1w+35.97*a2w;
run;

Partial SAS output:

PROC NLP: Nonlinear Minimization
Optimization Results
N    Parameter    Estimate    Gradient Objective Function    Active Bound Constraint
1    a0c
2    a0w
3    a1c
4    a1w                                                     Lower BC
5    a2c
Optimization Results
N    Parameter    Estimate    Gradient Objective Function    Active Bound Constraint
6    a2w

Value of Objective Function = ...

So our estimates are:
parameters:    a0c    a0w    a1c    a1w    a2c    a2w
estimates:

For NDVI: the upper and lower limits of the prediction interval for the multiple linear regression model are computed respectively as

Y = (...) + (...) NDVI    (a)
Y = (...) + (...) NDVI    (b)

where, as before, the coefficients are taken at their estimated values plus and minus one standard error. The upper and lower limits of the prediction interval for the fuzzy linear regression model are computed respectively as

Y = (...) + (...) NDVI    (c)
Y = (...) + (...) NDVI    (d)

By subtracting equation (b) from (a) and then averaging, we get the average width for the multiple linear regression model; by subtracting equation (d) from (c) and then averaging, we get the average width for the fuzzy linear regression model. Similarly we can obtain the average widths for RVI and for NDVI and RVI together. The following table shows the average widths for the three sets of predictor variables.

Table 4: Average width for the fitted regression models
Predictor Variable    Method of Least Squares (LS)    Fuzzy Linear Regression (FLR) Model
NDVI
RVI
Both (NDVI & RVI)

Conclusion: Table 4 shows that the average widths for the linear regression models are much higher than those of their fuzzy counterparts. Thus fuzzy regression methodology is more efficient than the multiple linear regression technique. It may also be pointed out that, for the fuzzy approach, the average widths when both NDVI and RVI are taken into account are generally smaller than when only one of them is considered, which is quite logical. This clearly shows that, unlike the multiple linear regression technique, fuzzy regression methodology is capable of handling situations in which the predictor variables are highly correlated.
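The closing remark about correlated predictors is easy to verify from the data themselves. The Python snippet below computes the sample correlation between the NDVI and RVI series as they appear in the FLR constraints above.

```python
import numpy as np

# NDVI and RVI values for the 12 plots, as listed in the FLR constraints
ndvi = np.array([0.52, 0.54, 0.52, 0.48, 0.48, 0.48,
                 0.54, 0.52, 0.53, 0.49, 0.38, 0.46])
rvi = np.array([3.18, 3.36, 3.20, 2.87, 2.88, 2.88,
                3.35, 3.19, 3.24, 2.95, 2.20, 2.67])

r = np.corrcoef(ndvi, rvi)[0, 1]
print(round(r, 3))  # very close to 1: the two indices are nearly collinear
```

This is expected, since RVI is a monotone transformation of NDVI (RVI = (1 + NDVI) / (1 - NDVI)), and it is exactly the near-collinear setting in which least-squares estimates become unstable while the fuzzy formulation remains well behaved.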
More informationFuzzy time series forecasting of wheat production
Fuzzy time series forecasting of wheat production Narendra kumar Sr. lecturer: Computer Science, Galgotia college of engineering & Technology Sachin Ahuja Lecturer : IT Dept. Krishna Institute of Engineering
More informationSimplifying Algebraic Expressions Involving Exponents
Simplifying Algebraic Expressions Involving Exponents GOAL Simplify algebraic expressions involving powers and radicals. LEARN ABOUT the Math The ratio of the surface area to the volume of microorganisms
More informationBeta-Regression with SPSS Michael Smithson School of Psychology, The Australian National University
9/1/2005 Beta-Regression with SPSS 1 Beta-Regression with SPSS Michael Smithson School of Psychology, The Australian National University (email: Michael.Smithson@anu.edu.au) SPSS Nonlinear Regression syntax
More informationGraphical Analysis of Data using Microsoft Excel [2016 Version]
Graphical Analysis of Data using Microsoft Excel [2016 Version] Introduction In several upcoming labs, a primary goal will be to determine the mathematical relationship between two variable physical parameters.
More information14.2 The Regression Equation
14.2 The Regression Equation Tom Lewis Fall Term 2009 Tom Lewis () 14.2 The Regression Equation Fall Term 2009 1 / 12 Outline 1 Exact and inexact linear relationships 2 Fitting lines to data 3 Formulas
More informationST Lab 1 - The basics of SAS
ST 512 - Lab 1 - The basics of SAS What is SAS? SAS is a programming language based in C. For the most part SAS works in procedures called proc s. For instance, to do a correlation analysis there is proc
More informationPolymath 6. Overview
Polymath 6 Overview Main Polymath Menu LEQ: Linear Equations Solver. Enter (in matrix form) and solve a new system of simultaneous linear equations. NLE: Nonlinear Equations Solver. Enter and solve a new
More information13.1 2/20/2018. Conic Sections. Conic Sections: Parabolas and Circles
13 Conic Sections 13.1 Conic Sections: Parabolas and Circles 13.2 Conic Sections: Ellipses 13.3 Conic Sections: Hyperbolas 13.4 Nonlinear Systems of Equations 13.1 Conic Sections: Parabolas and Circles
More informationWeek 4: Simple Linear Regression III
Week 4: Simple Linear Regression III Marcelo Coca Perraillon University of Colorado Anschutz Medical Campus Health Services Research Methods I HSMP 7607 2017 c 2017 PERRAILLON ARR 1 Outline Goodness of
More informationExercise: Graphing and Least Squares Fitting in Quattro Pro
Chapter 5 Exercise: Graphing and Least Squares Fitting in Quattro Pro 5.1 Purpose The purpose of this experiment is to become familiar with using Quattro Pro to produce graphs and analyze graphical data.
More informationLab #9: ANOVA and TUKEY tests
Lab #9: ANOVA and TUKEY tests Objectives: 1. Column manipulation in SAS 2. Analysis of variance 3. Tukey test 4. Least Significant Difference test 5. Analysis of variance with PROC GLM 6. Levene test for
More informationChapter 1 Introduction to Optimization
Chapter 1 Introduction to Optimization Chapter Contents OVERVIEW................................... 3 LINEAR PROGRAMMING PROBLEMS.................. 4 PROC OPTLP................................. 4 PROC
More informationNOTATION AND TERMINOLOGY
15.053x, Optimization Methods in Business Analytics Fall, 2016 October 4, 2016 A glossary of notation and terms used in 15.053x Weeks 1, 2, 3, 4 and 5. (The most recent week's terms are in blue). NOTATION
More informationIntroduction to Hierarchical Linear Model. Hsueh-Sheng Wu CFDR Workshop Series January 30, 2017
Introduction to Hierarchical Linear Model Hsueh-Sheng Wu CFDR Workshop Series January 30, 2017 1 Outline What is Hierarchical Linear Model? Why do nested data create analytic problems? Graphic presentation
More informationALGEBRA 1 NOTES. Quarter 3. Name: Block
2016-2017 ALGEBRA 1 NOTES Quarter 3 Name: Block Table of Contents Unit 8 Exponent Rules Exponent Rules for Multiplication page 4 Negative and Zero Exponents page 8 Exponent Rules Involving Quotients page
More informationA Nonlinear Presolve Algorithm in AIMMS
A Nonlinear Presolve Algorithm in AIMMS By Marcel Hunting marcel.hunting@aimms.com November 2011 This paper describes the AIMMS presolve algorithm for nonlinear problems. This presolve algorithm uses standard
More information1 Downloading files and accessing SAS. 2 Sorting, scatterplots, correlation and regression
Statistical Methods and Computing, 22S:30/105 Instructor: Cowles Lab 2 Feb. 6, 2015 1 Downloading files and accessing SAS. We will be using the billion.dat dataset again today, as well as the OECD dataset
More informationCHAPTER 2 LITERATURE REVIEW
22 CHAPTER 2 LITERATURE REVIEW 2.1 GENERAL The basic transportation problem was originally developed by Hitchcock (1941). Efficient methods of solution are derived from the simplex algorithm and were developed
More informationSection 1.4 Equations and Graphs of Polynomial Functions soln.notebook September 25, 2017
Section 1.4 Equations and Graphs of Polynomial Functions Sep 21 8:49 PM Factors tell us... the zeros of the function the roots of the equation the x intercepts of the graph Multiplicity (of a zero) > The
More informationMulti objective linear programming problem (MOLPP) is one of the popular
CHAPTER 5 FUZZY MULTI OBJECTIVE LINEAR PROGRAMMING PROBLEM 5.1 INTRODUCTION Multi objective linear programming problem (MOLPP) is one of the popular methods to deal with complex and ill - structured decision
More informationGrade 9 Math Terminology
Unit 1 Basic Skills Review BEDMAS a way of remembering order of operations: Brackets, Exponents, Division, Multiplication, Addition, Subtraction Collect like terms gather all like terms and simplify as
More informationCLUSTER ANALYSIS. V. K. Bhatia I.A.S.R.I., Library Avenue, New Delhi
CLUSTER ANALYSIS V. K. Bhatia I.A.S.R.I., Library Avenue, New Delhi-110 012 In multivariate situation, the primary interest of the experimenter is to examine and understand the relationship amongst the
More informationNUMERICAL ADJUSTMENT OF PROFILE LINE ON MATHEMATICALLY APPROXIMATED TERRAIN SURFACE
Proceedings, 11 th FIG Symposium on Deformation Measurements, Santorini, Greece, 2003. NUMERIAL ADJUSTMENT OF PROFILE LINE ON MATHEMATIALLY APPROXIMATED TERRAIN SURFAE Zbigniew Piasek 1, Monika Siejka
More information0_PreCNotes17 18.notebook May 16, Chapter 12
Chapter 12 Notes BASIC MATRIX OPERATIONS Matrix (plural: Matrices) an n x m array of elements element a ij Example 1 a 21 = a 13 = Multiply Matrix by a Scalar Distribute scalar to all elements Addition
More informationCHAPTER 4 FREQUENCY STABILIZATION USING FUZZY LOGIC CONTROLLER
60 CHAPTER 4 FREQUENCY STABILIZATION USING FUZZY LOGIC CONTROLLER 4.1 INTRODUCTION Problems in the real world quite often turn out to be complex owing to an element of uncertainty either in the parameters
More informationCBSE Sample Papers for Class 10 SA2 Maths Solved 2016 (Set 2)
CBSE Sample Papers for Class 10 SA2 Maths Solved 2016 (Set 2) Code-LNCBSE Roll No. Please check that this question paper contains 5 printed pages. Code number given on the right hand side of the question
More informationStat 5100 Handout #11.a SAS: Variations on Ordinary Least Squares
Stat 5100 Handout #11.a SAS: Variations on Ordinary Least Squares Example 1: (Weighted Least Squares) A health researcher is interested in studying the relationship between diastolic blood pressure (bp)
More informationRobust Regression. Robust Data Mining Techniques By Boonyakorn Jantaranuson
Robust Regression Robust Data Mining Techniques By Boonyakorn Jantaranuson Outline Introduction OLS and important terminology Least Median of Squares (LMedS) M-estimator Penalized least squares What is
More informationIntroduction. Linear because it requires linear functions. Programming as synonymous of planning.
LINEAR PROGRAMMING Introduction Development of linear programming was among the most important scientific advances of mid-20th cent. Most common type of applications: allocate limited resources to competing
More informationIntroduction to SAS proc calis
Introduction to SAS proc calis /* path1.sas */ %include 'SenicRead.sas'; title2 'Path Analysis Example for 3 Observed Variables'; /************************************************************************
More informationSAS/OR 14.2 User s Guide: Mathematical Programming. The Quadratic Programming Solver
SAS/OR 14.2 User s Guide: Mathematical Programming The Quadratic Programming Solver This document is an individual chapter from SAS/OR 14.2 User s Guide: Mathematical Programming. The correct bibliographic
More informationnotes13.1inclass May 01, 2015
Chapter 13-Coordinate Geometry extended. 13.1 Graphing equations We have already studied equations of the line. There are several forms: slope-intercept y = mx + b point-slope y - y1=m(x - x1) standard
More informationWe have already studied equations of the line. There are several forms:
Chapter 13-Coordinate Geometry extended. 13.1 Graphing equations We have already studied equations of the line. There are several forms: slope-intercept y = mx + b point-slope y - y1=m(x - x1) standard
More informationNew Directions in Linear Programming
New Directions in Linear Programming Robert Vanderbei November 5, 2001 INFORMS Miami Beach NOTE: This is a talk mostly on pedagogy. There will be some new results. It is not a talk on state-of-the-art
More informationBCN Decision and Risk Analysis. Syed M. Ahmed, Ph.D.
Linear Programming Module Outline Introduction The Linear Programming Model Examples of Linear Programming Problems Developing Linear Programming Models Graphical Solution to LP Problems The Simplex Method
More informationPart 4. Decomposition Algorithms Dantzig-Wolf Decomposition Algorithm
In the name of God Part 4. 4.1. Dantzig-Wolf Decomposition Algorithm Spring 2010 Instructor: Dr. Masoud Yaghini Introduction Introduction Real world linear programs having thousands of rows and columns.
More informationCS450 - Structure of Higher Level Languages
Spring 2018 Streams February 24, 2018 Introduction Streams are abstract sequences. They are potentially infinite we will see that their most interesting and powerful uses come in handling infinite sequences.
More informationRecall the expression for the minimum significant difference (w) used in the Tukey fixed-range method for means separation:
Topic 11. Unbalanced Designs [ST&D section 9.6, page 219; chapter 18] 11.1 Definition of missing data Accidents often result in loss of data. Crops are destroyed in some plots, plants and animals die,
More informationReals 1. Floating-point numbers and their properties. Pitfalls of numeric computation. Horner's method. Bisection. Newton's method.
Reals 1 13 Reals Floating-point numbers and their properties. Pitfalls of numeric computation. Horner's method. Bisection. Newton's method. 13.1 Floating-point numbers Real numbers, those declared to be
More informationP1 REVISION EXERCISE: 1
P1 REVISION EXERCISE: 1 1. Solve the simultaneous equations: x + y = x +y = 11. For what values of p does the equation px +4x +(p 3) = 0 have equal roots? 3. Solve the equation 3 x 1 =7. Give your answer
More informationC A S I O f x S UNIVERSITY OF SOUTHERN QUEENSLAND. The Learning Centre Learning and Teaching Support Unit
C A S I O f x - 1 0 0 S UNIVERSITY OF SOUTHERN QUEENSLAND The Learning Centre Learning and Teaching Support Unit MASTERING THE CALCULATOR USING THE CASIO fx-100s Learning and Teaching Support Unit (LTSU)
More informationA New Fuzzy Neural System with Applications
A New Fuzzy Neural System with Applications Yuanyuan Chai 1, Jun Chen 1 and Wei Luo 1 1-China Defense Science and Technology Information Center -Network Center Fucheng Road 26#, Haidian district, Beijing
More informationSparse matrices, graphs, and tree elimination
Logistics Week 6: Friday, Oct 2 1. I will be out of town next Tuesday, October 6, and so will not have office hours on that day. I will be around on Monday, except during the SCAN seminar (1:25-2:15);
More informationAM 121: Intro to Optimization Models and Methods Fall 2017
AM 121: Intro to Optimization Models and Methods Fall 2017 Lecture 10: Dual Simplex Yiling Chen SEAS Lesson Plan Interpret primal simplex in terms of pivots on the corresponding dual tableau Dictionaries
More informationAssignment 4 (Sol.) Introduction to Data Analytics Prof. Nandan Sudarsanam & Prof. B. Ravindran
Assignment 4 (Sol.) Introduction to Data Analytics Prof. andan Sudarsanam & Prof. B. Ravindran 1. Which among the following techniques can be used to aid decision making when those decisions depend upon
More informationStatistics & Analysis. A Comparison of PDLREG and GAM Procedures in Measuring Dynamic Effects
A Comparison of PDLREG and GAM Procedures in Measuring Dynamic Effects Patralekha Bhattacharya Thinkalytics The PDLREG procedure in SAS is used to fit a finite distributed lagged model to time series data
More informationSection 3.2: Multiple Linear Regression II. Jared S. Murray The University of Texas at Austin McCombs School of Business
Section 3.2: Multiple Linear Regression II Jared S. Murray The University of Texas at Austin McCombs School of Business 1 Multiple Linear Regression: Inference and Understanding We can answer new questions
More informationUsing the OPTMODEL Procedure in SAS/OR to Solve Complex Problems. Rob Pratt, Senior R&D Manager, SAS/OR
Using the OPTMODEL Procedure in SAS/OR to Solve Complex Problems Rob Pratt, Senior R&D Manager, SAS/OR Outline 1 Recent Features in PROC OPTMODEL 2 Graph Partitioning with Connectivity Constraints 2 /
More informationOne Factor Experiments
One Factor Experiments 20-1 Overview Computation of Effects Estimating Experimental Errors Allocation of Variation ANOVA Table and F-Test Visual Diagnostic Tests Confidence Intervals For Effects Unequal
More informationEXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY
EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 2015 MODULE 4 : Modelling experimental data Time allowed: Three hours Candidates should answer FIVE questions. All questions carry equal
More informationWe have already studied equations of the line. There are several forms:
Chapter 13-Coordinate Geometry extended. 13.1 Graphing equations We have already studied equations of the line. There are several forms: slope-intercept y = mx + b point-slope y - y1=m(x - x1) standard
More information4751 (C1) Introduction to Advanced Mathematics
475 Mark Scheme 475 (C) Introduction to Advanced Mathematics [ a = ]c b www o.e. for each of complete correct steps, ft from previous error if equivalent difficulty condone = used for first two Ms 5x
More informationOpen Access Research on the Prediction Model of Material Cost Based on Data Mining
Send Orders for Reprints to reprints@benthamscience.ae 1062 The Open Mechanical Engineering Journal, 2015, 9, 1062-1066 Open Access Research on the Prediction Model of Material Cost Based on Data Mining
More information