Trajectory Optimization with Memetic Algorithms: Time-to-Torque Minimization of Turbocharged Engines

Dan Simon *, Yan Wang **, Oliver Tiber *, Dawei Du *, Dimitar Filev **, John Michelini **
* Cleveland State University, Cleveland, USA
** Ford Motor Company, Dearborn, USA
d.j.simon@csuohio.edu

Abstract

A general memetic trajectory optimization method is introduced. The method is comprised of an evolutionary algorithm (EA) for global optimization, followed by local optimization. The global optimization algorithm is biogeography-based optimization (BBO), an EA motivated by the migratory behavior of biological organisms. For local optimization, we identify a local linearized model within the region of the BBO solution by approximating it with a Jacobian matrix, and then optimize the trajectory using a gradient method. The process iterates Jacobian learning and optimization until an optimal trajectory is identified. We apply this memetic algorithm to a time-to-torque minimization problem for a gasoline turbocharged direct injection automotive engine. The optimized trajectory demonstrates significant improvement over the intuitive bang-bang controls that were originally thought to deliver the fastest transient torque response. Simulation results show that BBO decreases time-to-torque by 48% relative to bang-bang controls, and adaptive optimization decreases time-to-torque by an additional 26%. These results have significant implications for improved automotive engine performance.

I. INTRODUCTION

Trajectory optimization is a broad field of research that began in the 1960s with the advent of optimal control and its implementation in digital computers. Trajectory optimization can be viewed as the problem of determining a set of time-varying control inputs for a dynamic system that minimizes a given cost function. The cost function may also be referred to as a performance index or return function. The problem may include constraints on the states or controls. Necessary mathematical conditions can be readily derived for broad classes of trajectory optimization problems. However, analytical solutions are available only for special types of problems, such as linear quadratic (LQ) problems, in which the system dynamics are linear and the cost function is quadratic. If analytic solutions are not available, as is typical in real-world problems, then solutions can be found using numerical methods.

Numerical trajectory optimization methods can be categorized as direct or indirect. Direct methods parameterize the controls or the states. The time points at which the controls and states are sampled are called collocation points, and direct trajectory optimization methods are sometimes called collocation methods. Direct methods have the advantage that they can be initialized with guesses for not only the controls but also the states, which are often easier to guess than initial estimates for the controls. Because of the conversion of the dynamics into constraints at each time point, direct methods can result in huge optimization problems, but this drawback can be alleviated by the sparseness of the matrices that are involved. One direct method is direct shooting, which parameterizes only the control and then integrates the system dynamics to satisfy the differential constraints [1].
Another direct method is state and control parameterization, which parameterizes both states and controls and converts the system dynamics into a set of constraints, thus removing the need for dynamic simulation [2]. Indirect methods simulate the dynamic system and then check whether the necessary optimality conditions are satisfied. If not, the initial conditions or parameters of the simulation are modified with some numerical method, and the process is repeated. Indirect methods can suffer from poor numerical properties, such as high sensitivity of the optimality conditions with respect to the initial conditions. However, because of the use of simulation, every iteration results in a trajectory that satisfies the dynamic constraints, whereas in direct methods the constraints need to be satisfied before optimality can be pursued. Some popular indirect methods are steepest descent [3], conjugate gradient [4], and evolutionary algorithms [5].

This paper presents a framework for trajectory optimization using memetic algorithms. Our method is a direct method because we parameterize the control trajectories and then optimize those parameters to obtain optimal trajectories. First we parameterize the trajectories using Fourier series or Gaussian kernels, and then optimize the Fourier series parameters or Gaussian kernel gains using an evolutionary algorithm (EA) called biogeography-based optimization (BBO). Then we further improve the trajectories by optimizing the Fourier series parameters or Gaussian kernel gains using a local optimization algorithm called adaptive optimization, which is based on iterative Jacobian learning. Finally, we apply our methods to a practical and important automotive problem: the minimization of time-to-torque of gasoline turbocharged direct injection (GTDI) engines.

This paper is organized as follows. Section 2 discusses the proposed approach for trajectory optimization, including parameterization of the control profiles using two different methods (Fourier series and Gaussian kernels); global optimization using BBO as a first step to find the neighborhood of the global optimum; and local optimization to search for the best solution within the globally optimal neighborhood using adaptive optimization with Jacobian learning. Section 3 discusses the case study for our approach, which is a time-to-torque minimization problem for a GTDI engine. Section 4 contains conclusions and suggestions for future work.

II. MEMETIC TRAJECTORY OPTIMIZATION

Researchers have found EAs to be attractive options for solving difficult control problems. For trajectory optimization problems, we can parameterize the control trajectories and then use the EA for parameter optimization. Previous research along these lines includes

GAs for nonlinear, second-order, two-point boundary value problems [6], trajectory optimization [7], missile guidance [8], and robot control [9]. However, a weakness of EAs is their inability to home in on the global optimum. EAs have been shown to be good global optimizers, but they only find candidate solutions that are in the neighborhood of the global optimum. Therefore, further optimization within that neighborhood is required to find a more exact optimum. An algorithm that combines evolutionary search with local search in this way is called a memetic algorithm.

In this section we introduce our memetic algorithm for trajectory optimization. The first part of this section discusses parameterization of control trajectories using static parameters, the second part discusses global optimization of the static parameters with an EA to find the neighborhood of the global optimum, and the last part discusses local optimization to home in on the exact optimum within the neighborhood of the global optimum.

A. Parameterization of Transient Profile

The key to the approach in this section is the conversion of the trajectory optimization problem to a parameter optimization problem. We introduce two approaches to parameterize control profiles: Fourier series and Gaussian kernels.

1) Fourier Series

The Fourier series is a general approach to parameterization that was first used for the optimization of structural systems [10]. We assume that the optimal profile of each control signal is continuous on the interval [0, T], where T is the fixed final time of the transient response. Although T is fixed here, the optimization cost function that we describe later is the time at which the transient response reaches steady state, which, in general, is less than T. So even though T must be fixed in this approach, the approach can still be used for free-final-time problems. In order to solve the trajectory optimization problem, we parameterize each candidate control solution as a Fourier cosine series:

u_k(t) = a_{k0} + \sum_{j=1}^{M} a_{kj} \cos(j \pi t / T), \quad k = 1, \ldots, N    (1)

where N is the number of control signals, M+1 is the number of Fourier coefficients per control, and N(M+1) is the total number of independent variables. M is a tradeoff between search resolution and problem size, and T is half of the control duration. We use the Fourier cosine series rather than the full Fourier series to reduce the dimension of the optimization problem by almost 50% [11]. That is, u_k(t) is defined as an even function on the time interval t \in [-T, +T], but we only use the control signal between time 0 and T when simulating the system. The value of T that we use is somewhat arbitrary; if we are trying to minimize time, then we set it to some value that is comfortably larger than our expected minimum transient response time.

2) Gaussian Kernel

Another approach to parameterizing trajectories is to use a Gaussian kernel, which is defined in one dimension as

K(t, c) = \exp\!\left( -\frac{(t - c)^2}{2 \sigma^2} \right)    (2)

where c is the kernel center and \sigma defines the width of the Gaussian. As described in [12], any trajectory u_k(t) can be approximated by the sum of multiple kernels:

u_k(t) = \sum_{j=1}^{M_k} A_{kj} \, K(t, c_{kj})    (3)

where k is the control index. For the k-th control, M_k is the number of kernels, A_{kj} is the j-th kernel gain, and \sigma_k is the smoothness parameter. With this representation, the total number of tuning parameters for the optimal control problem is \sum_{k=1}^{N} M_k, where N is the number of control profiles. Note that the Fourier series above could use the same approach, in which we have a different number of coefficients for each control.
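To make the two parameterizations concrete, the sketch below (in Python, with illustrative coefficient values that are not taken from the paper) evaluates a single control profile from Fourier cosine coefficients and from Gaussian kernel gains:

```python
import numpy as np

def fourier_control(t, a, T):
    """Fourier cosine parameterization: u(t) = a[0] + sum_j a[j] cos(j*pi*t/T)."""
    j = np.arange(1, len(a))
    return a[0] + np.sum(a[1:, None] * np.cos(j[:, None] * np.pi * t[None, :] / T), axis=0)

def gaussian_kernel_control(t, gains, centers, sigma):
    """Gaussian kernel parameterization: u(t) = sum_j A_j exp(-(t - c_j)^2 / (2*sigma^2))."""
    diff = t[None, :] - centers[:, None]
    return np.sum(gains[:, None] * np.exp(-diff**2 / (2.0 * sigma**2)), axis=0)

# Example usage with made-up numbers: horizon T = 3 s, M = 4 cosine terms per control.
T = 3.0
t = np.linspace(0.0, T, 301)
u_fourier = fourier_control(t, np.array([45.0, 10.0, -5.0, 2.0, 1.0]), T)
u_kernel = gaussian_kernel_control(t, np.array([20.0, 40.0, 30.0]),
                                   np.linspace(0.0, T, 3), sigma=0.5)
```

In either case, the optimizer only sees the coefficient vector (the a_{kj} or A_{kj} values); the time-domain trajectory is reconstructed whenever the simulation is run.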
B. Global Optimization

With the parameterization approaches discussed above, the transient optimization problem has been converted to a parameter optimization problem. An EA can optimize the parameters to find the neighborhood of the optimum. In this paper, BBO is selected as the global optimization method. It has shown good performance compared with many other EAs for a variety of problems [13], including benchmarks and real-world problems such as ECG classification [14], power system optimization [15], and image classification [16].

BBO is based on the science of species migration between habitats. Habitats have different degrees of suitability for species habitation, quantified by the habitat suitability index (HSI). Habitats with a high HSI tend to have a large number of species, and habitats with a low HSI tend to have a small number of species. Species immigrate to, and emigrate from, a habitat with a probability that is determined by how many species reside in the habitat. A habitat with a large number of species (high HSI) will tend to have a low immigration rate and a high emigration rate. Conversely, a habitat with a small number of species (low HSI) will tend to have a high immigration rate and a low emigration rate. Figure 1 shows the migration curves for BBO.

Figure 1. BBO migration curves for two candidate solutions, S1 and S2, of the same problem. S1 is a relatively poor solution, and S2 is a relatively good solution. Poor solutions are likely to receive features from other solutions, and unlikely to share features with other solutions. Good solutions are likely to share features with other solutions, and unlikely to receive features from other solutions.

BBO treats each candidate solution as a biological habitat, and treats each species as a specific feature of that candidate solution. Candidate solutions are also called individuals, or simply solutions, and solution features are also called independent variables or decision variables. The number of solution features in each habitat is the dimension of the problem.
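A minimal sketch of one BBO migration sweep with linear migration curves like those in Figure 1 (rank-based rates; the population array, fitness vector, and random generator rng, e.g. numpy.random.default_rng(), are placeholders):

```python
import numpy as np

def bbo_migration(pop, fitness, rng):
    """One BBO migration sweep with linear immigration/emigration curves.

    pop is an (N, D) array of candidate solutions; fitness is an (N,) array
    where lower is better (e.g., time-to-torque).
    """
    N, D = pop.shape
    order = np.argsort(fitness)          # ascending: best (lowest cost) first
    ranks = np.empty(N, dtype=int)
    ranks[order] = np.arange(N)          # rank 0 = best solution
    lam = (ranks + 1) / N                # immigration rate: worst solution -> 1
    mu = 1.0 - lam                       # emigration rate: best solution -> high
    new_pop = pop.copy()                 # temporary population, updated in place
    for i in range(N):
        for d in range(D):
            if rng.random() < lam[i]:    # solution i imports this feature
                # roulette-wheel selection of an emigrating solution by mu
                j = rng.choice(N, p=mu / mu.sum())
                new_pop[i, d] = pop[j, d]
    return new_pop
```

Elitism (retaining the best individuals unchanged) and mutation would be applied around this sweep, as in the algorithm outline discussed next.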

Each candidate solution shares its features with other candidate solutions, and this sharing process is analogous to migration in biogeography. As migration occurs, the habitats become more suitable for their species, which corresponds to the improvement of candidate solutions. One iteration is often called a generation. We also implement common EA concepts in BBO such as elitism and mutation (Simon 2013b). Figure 2 shows the outline of the BBO algorithm.

Figure 2. Outline of BBO. N is the population size, that is, the number of candidate solutions; y is the population of candidate solutions; and z is a temporary population of candidate solutions so that migration can complete before the population is replaced for the next generation. z_k(s) is the s-th feature of z_k.

C. Local Optimization

Here we use adaptive optimization, which involves Jacobian learning. The main idea is an iterative process that includes the learning of a local surrogate model, which is a linearized time-varying approximation of a nonlinear process, followed by constrained gradient-based optimization with the surrogate model.

1) Surrogate Model Based Adaptive Optimization

Surrogate model based adaptive optimization is illustrated in Figure 3. Controlling the plant output Y to match the reference trajectory Y_d is achieved by an iterative process of online learning of the plant model, based on the input U and output Y, followed by optimization of the input U, until given stopping criteria are met.

Figure 3. Model based adaptive optimization.

Consider the optimization problem as a static system with the following representation:

Y(k) = F(U(k))    (4)

where k is the time step index, U(k) is the n-element input vector at time k, Y(k) is the q-element output vector, which includes optimization objectives as well as constraints, and F is a nonlinear, smooth function that maps the inputs to the outputs. The objective of the closed-loop controller is to find the input vector U that minimizes the error

e(k) = Y_d(k) - Y(k)    (5)

where Y_d(k) is the desired output vector at time step k.

2) Surrogate Model Learning

The core of adaptive optimization is an algorithm for accurate and accelerated estimation of the Jacobian that correlates the change of the trajectories (parameterized as multiple calibrations) with the time-to-torque output and the constraints. For MIMO systems, the linearized system can be represented as a Jacobian matrix between the inputs and outputs:

\Delta Y = J \, \Delta U    (6)

where J is the Jacobian. Learning and updating the Jacobian can be viewed as a real-time least squares minimization problem:

\hat{J}(k) = \arg\min_{J} \sum_{i=1}^{k} \| \Delta Y(i) - J \, \Delta U(i) \|^2    (7)

where k is the time step index. For the purpose of learning, the MIMO linearized model is decomposed into q MISO subsystems, where q is the output dimension:

\Delta Y_s(k) = J_s^T(k) \, \Delta U(k), \quad s = 1, \ldots, q    (8)

The n-element vector J_s(k) is the s-th row of the Jacobian matrix, for s \in [1, q]. We apply the Kalman filter implementation of the recursive least squares method to estimate each individual row of the Jacobian:

J_s(k+1) = J_s(k) + w_s(k)
\Delta Y_s(k) = J_s^T(k) \, \Delta U(k) + v_s(k)    (9)

where the vector w_s(k) represents the inaccuracy of the linearized model of Equation (8) with covariance Q_s, and v_s(k) is measurement noise with zero mean and variance R_s.

The Kalman filter based expressions for recursively learning the rows of the Jacobian estimate are given as

\hat{J}_s(k) = \hat{J}_s(k-1) + K_s(k) \, e_s(k)    (10)
K_s(k) = \frac{P_s(k-1) \, \Delta U(k)}{\Delta U^T(k) \, P_s(k-1) \, \Delta U(k) + R_s}    (11)
P_s(k) = P_s(k-1) - K_s(k) \, \Delta U^T(k) \, P_s(k-1) + Q_s    (12)

The learning algorithm of Equations (10)-(12) minimizes the expected squared value of e_s(k), the deviation between the predicted output change and the actual output change:

e_s(k) = \Delta Y_s(k) - \hat{J}_s^T(k-1) \, \Delta U(k)    (13)

Q_s in Equation (12) represents the drift factor that controls the rate of forgetting old data.
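As an illustration of Equations (9)-(13), the following sketch performs one Kalman-filter update of each Jacobian row; the measurement-noise variance R and drift covariance Q are placeholder tuning parameters:

```python
import numpy as np

def kalman_jacobian_update(J_hat, P, dU, dY, R, Q):
    """One Kalman-filter update of a q x n Jacobian estimate (Equations (10)-(13)).

    J_hat : (q, n) current Jacobian estimate, one row per output.
    P     : list of q (n, n) covariance matrices, one per row.
    dU    : (n,) input change; dY : (q,) observed output change.
    R     : scalar measurement-noise variance; Q : (n, n) drift covariance.
    """
    q, n = J_hat.shape
    for s in range(q):
        e = dY[s] - J_hat[s] @ dU                      # innovation, Eq. (13)
        denom = dU @ P[s] @ dU + R
        K = P[s] @ dU / denom                          # Kalman gain, Eq. (11)
        J_hat[s] = J_hat[s] + K * e                    # estimate update, Eq. (10)
        P[s] = P[s] - np.outer(K, dU @ P[s]) + Q       # covariance plus drift, Eq. (12)
    return J_hat, P
```

Adding Q at every step keeps P from collapsing to zero, which is what allows the estimate to keep tracking a slowly changing Jacobian.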

The drift factor Q_s is analogous to the forgetting factor in RLS [17] and can be estimated from the expected changes in the Jacobian. The advantage of using the drift factor instead of an exponential forgetting factor lies in cases when the system is not excited [18]: it forces the covariance matrix P_s (which controls the variable learning rate of the Kalman filter) to grow linearly rather than exponentially.

3) Constrained Optimization

Given the Jacobian estimate above, the next step is to find optimal inputs that minimize the tracking error of Equation (5). An optimal input update can be calculated from the pseudoinverse of the Jacobian:

U(k+1) = U(k) + J^+(k) \, [Y_d(k) - Y(k)]    (14)
J^+(k) = [J^T(k) \, J(k) + \rho I]^{-1} J^T(k)    (15)

where I is the identity matrix and \rho is a small positive constant which is analogous to the Tychonoff regularization matrix [19] and which improves the numerical conditioning of the inverse system. Directly updating the input with Equation (14) from the pseudoinverse of the Jacobian is not practical since it can affect the convergence of the mapping algorithm. Limiting the discussion to the more common case of n \geq q, we modify Equation (14) with a gain factor as follows:

U(k+1) = U(k) + G \, \hat{J}^+(k) \, [Y_d(k) - Y(k)], \quad \hat{J}^+(k) = [\hat{J}^T(k) \, \hat{J}(k) + \rho I]^{-1} \hat{J}^T(k)    (16)

where the diagonal matrix G represents the gain of the input update, and \hat{J}(k) is the estimated Jacobian from the learning algorithm of Equation (10). The input update of Equation (16) resembles the one-step calculation of the optimal direction of input change in the Levenberg-Marquardt method. One disadvantage of this input update rule is that it does not explicitly consider constraints on the inputs. Our solution to this problem is to write the input update as a constrained optimization problem:

U(k) = \arg\min_{U} \left\{ \| Y_d(k) - \hat{Y}(k) \|^2 + \gamma \, \| U - U(k-1) \|^2 \right\}, \quad \hat{Y}(k) = Y(k-1) + \hat{J}(k) \, [U - U(k-1)]    (17)

subject to the input constraints. The penalty \gamma in the input update of Equation (17) is analogous to the gain matrix G in Equation (16); it improves the robustness of the algorithm because \hat{J}(k) is only a local approximation, and any large change of the inputs may lead to output oscillation for highly nonlinear functions of U(k). Since U(k) = U(k-1) is always a feasible solution to Equation (17), a sufficiently small value of \gamma guarantees that the mean square error will not increase from one iteration to the next, and consequently the Jacobian estimate \hat{J}(k) will approach the true Jacobian J(k). The update of Equation (17) can be performed using readily available QP solvers, such as FMINCON and LSQLIN in the Matlab Optimization Toolbox.

III. CASE STUDY: TIME-TO-TORQUE OPTIMIZATION

In this section, we apply the proposed approach to a time-to-torque optimization problem for GTDI engines.

A. Gasoline Turbocharged Direct Injection Engines

GTDI engine technology delivers higher specific power and torque than more traditional engines. The direct injection hardware cools the cylinder air charge as the injected fuel evaporates. Charge cooling reduces spark knock and allows spark timing closer to the optimal time, so GTDI designs can utilize a high compression ratio (CR). GTDI engine technology increases specific output, which translates to I4 engines with peak torque that rivals V6 engines. However, the turbo hardware can take longer to reach full output torque; hence the term turbo lag. Therefore, engine performance metrics include transient response, and specifically time-to-torque. This metric is obtained from engine hardware on dynamometers early in the design process to ensure acceptable performance. Increasing the size of a turbocharger increases peak power, but also increases transient response lag. The combination of transient and steady-state metrics forces engine designs to consider both effects.
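Before moving on to the engine model, here is a compact sketch of the constrained input update of Equation (17). The paper performs this step with QP solvers such as FMINCON or LSQLIN; the SciPy call below is an analogous bound-constrained linear least-squares solver, and all variable names are illustrative:

```python
import numpy as np
from scipy.optimize import lsq_linear

def constrained_input_update(U_prev, Y_prev, Y_des, J_hat, gamma, lb, ub):
    """Solve min ||Y_des - (Y_prev + J_hat (U - U_prev))||^2 + gamma ||U - U_prev||^2
    subject to lb <= U <= ub, as in Equation (17)."""
    n = U_prev.size
    # Stack the tracking term and the gamma-weighted damping term into one
    # linear least-squares problem in dU = U - U_prev.
    A = np.vstack([J_hat, np.sqrt(gamma) * np.eye(n)])
    b = np.concatenate([Y_des - Y_prev, np.zeros(n)])
    res = lsq_linear(A, b, bounds=(lb - U_prev, ub - U_prev))
    return U_prev + res.x
```

A larger gamma keeps each input step small, which is what protects the update against the Jacobian estimate being only locally valid.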
B. Gasoline Turbocharged Direct Injection Engine Model

A Simulink engine model is used in this work. The model represents a typical I4 GTDI engine with low-pressure (LP) exhaust gas recirculation (EGR) (Wheeler 2013). It models the effects of the actuators on brake engine torque output in both steady-state and transient conditions. The model does not attempt to replicate the exact performance of any given engine, but provides a simulation with the correct actuator effects on output torque. The model includes the following controls and ranges:

throttle angle (degrees): [2, 90]
spark timing (degrees): [-10, 55]
intake variable cam timing (degrees): [-50, 10]
exhaust variable cam timing (degrees): [-10, 30]
exhaust gas recirculation valve (mm): [0, 9]
air induction system throttle (degrees): [2, 90]
waste gate valve position (closed/open): [0, 1]

The model is depicted in Figure 4 and consists of induction system manifolds, where each manifold pressure derives from the ideal gas law and does not include heat transfer. The model integrates the sum of the mass flows entering (+) and exiting (-) the manifolds to determine instantaneous mass. The induction system includes the manifold between the air induction system (AIS) throttle and the turbo compressor, the manifold between the compressor and the main throttle, and the manifold between the main throttle and the intake valves (referred to as the intake manifold). The induction system includes the compressor bypass valve (CBV), which opens during a back-out, when the desired torque decreases. Opening the CBV allows the pressure downstream of the compressor to decrease as the main throttle closes to reduce cylinder charge. The model uses the Heywood throttle flow model for the AIS, CBV, and main throttle [20]. The induction system includes a charge air cooler (CAC) in the manifold between the turbo compressor and the main throttle, and in the intake manifold. The exhaust system consists of the manifold between the exhaust valves and the turbocharger, and a back-pressure model of the remaining exhaust.
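For later reference, the actuator ranges listed above can be collected into lower and upper bound vectors for the optimizer (ordering follows the list; this encoding is illustrative and is not part of the Simulink model):

```python
import numpy as np

# [throttle, spark, intake VCT, exhaust VCT, EGR valve, AIS throttle, waste gate]
control_names = ["throttle_deg", "spark_deg", "vct_intake_deg", "vct_exhaust_deg",
                 "egr_valve_mm", "ais_throttle_deg", "wastegate"]
lb = np.array([2.0, -10.0, -50.0, -10.0, 0.0, 2.0, 0.0])
ub = np.array([90.0, 55.0, 10.0, 30.0, 9.0, 90.0, 1.0])

def clip_controls(u):
    """Saturate a 7-element control vector to the actuator ranges."""
    return np.minimum(np.maximum(u, lb), ub)
```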

Figure 4. Gasoline turbocharged direct injection engine. The schematic labels the seven actuators: 1) main throttle, 2) spark timing, 3) intake VCT, 4) exhaust VCT, 5) EGR valve, 6) air induction system throttle (AIS), 7) waste gate; along with the compressor, turbine, charge air coolers (CAC), intake manifold, and exhaust manifold.

The turbocharger model calculates turbine speed from the turbine and compressor maps, the power balance between the compressor and the turbine, and the turbine inertia. The turbocharger model also includes a waste gate to provide boost pressure control. The LP-EGR system consists of an EGR valve between the AIS and the compressor. Exhaust flows from downstream of the turbo through a CAC into the induction system via the EGR valve. The model includes the transport delay of the EGR mass from the valve through the induction system to the cylinders.

The actuators in the model include the AIS, waste gate, EGR valve, main throttle, intake variable cam timing (VCT), exhaust VCT, spark timing, and CBV. All of the actuators include characteristic response delays and time constants. Actuator position therefore lags behind commanded values during transients and converges to commanded values at steady-state conditions.

C. Time-to-Torque Minimization

In order to solve the minimum time-to-torque problem, we parameterize each candidate control solution as a Fourier cosine series, as in Equation (1):

u_k(t) = a_{k0} + \sum_{j=1}^{M} a_{kj} \cos(j \pi t / T)    (18)

where T is half the control duration, k \in [1, 7] (seven control signals), and M = 4. There are thus a total of 7(M+1) = 35 independent variables in the control problem. M is a tradeoff between search resolution and problem size.

BBO solves the minimum time-to-torque problem by randomly generating a population {y_k} of individuals for k = 1, ..., N_p, where N_p is the population size (see Figure 2). Each individual y_k is a 35-element vector of Fourier coefficients. Each y_k is used to generate a time-varying control vector. Based on the torque output from the simulation, time-to-torque is measured for each y_k. After performing the above process for each of the N_p BBO individuals, we have a list of N_p time-to-torque values. We sort the values in order of increasing time-to-torque, and sort the y_k individuals correspondingly, from y_1 (smallest time-to-torque) to y_{N_p} (largest time-to-torque). We assign emigration rates to each individual as follows:

\mu_k = \frac{N_p - k}{N_p}, \quad k = 1, \ldots, N_p    (19)

so that the individuals with the smallest time-to-torque have the highest emigration rates.

We demonstrate these methods on the problem of a minimum-time torque transition from 27 Nm to 374 Nm. Although not discussed in this paper, methods for determining steady-state control solutions for the GTDI engine are available as a baseline for the techniques used in this paper. We find the steady-state controls as follows (listed in the same order as the controls above):

T_1 = 27 Nm: u_1 = [7, 52, 10, 30, 6, 7, 1]
T_q = 374 Nm: u_q = [90, 9.5, -50, 0, 0, 90, 0]

Suppose that we want to transition from torque T_1 to torque T_q in minimum time. Intuitively, the quickest way is to step, or ramp very quickly, from u_1 to u_q. This gives a 95% time-to-torque of 2.90 seconds, as shown in Figure 5.

Figure 5. If we use ramp controls starting at 27 Nm, it takes 2.90 seconds to reach 95% of the goal torque of 374 Nm.

We initialize one of the BBO individuals to the ramp controls, which means that at the first generation, the best BBO individual will have a time-to-torque of 2.90 seconds or better. The rest of the BBO individuals are initialized to random Fourier series.
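A sketch of how one 35-element BBO individual could be mapped to control trajectories and scored; simulate_engine_torque is a hypothetical stand-in for the Simulink engine model, and the horizon value is illustrative:

```python
import numpy as np

N_CONTROLS, M = 7, 4           # seven controls, M = 4 cosine terms per control
T_HORIZON = 3.0                # illustrative control duration for the transient

def controls_from_individual(y, t):
    """Map a 35-element coefficient vector to a 7 x len(t) control trajectory, Eq. (18)."""
    coeffs = y.reshape(N_CONTROLS, M + 1)
    j = np.arange(1, M + 1)
    basis = np.cos(j[:, None] * np.pi * t[None, :] / T_HORIZON)   # (M, len(t))
    return coeffs[:, :1] + coeffs[:, 1:] @ basis                  # (7, len(t))

def time_to_torque(y, t, torque_goal=374.0, frac=0.95):
    """Simulate the candidate controls and return the 95% time-to-torque."""
    u = controls_from_individual(y, t)
    torque = simulate_engine_torque(u, t)   # placeholder for the Simulink engine model
    above = np.nonzero(torque >= frac * torque_goal)[0]
    return t[above[0]] if above.size else np.inf
```

The returned time-to-torque is the fitness value that drives the sorting and emigration-rate assignment of Equation (19).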
Figure 6 shows the improvement of the best time-to-torque achieved over 30 generations with a population of 60 individuals, where a time-to-torque of 0.82 seconds was achieved. This is a 72% decrease relative to the time-to-torque achieved by the intuitive ramp controls.

Figure 6. BBO reduces time-to-torque to 0.82 seconds after 30 generations.

However, Figure 7 shows that the corresponding torque response has significant overshoot, undershoot, and torque fluctuations. Such a torque response is not acceptable during driving. In order to alleviate this undesired behavior, we modify the BBO cost function by adding penalty terms:

Cost = t_{tt} + W_1 (peak overshoot above the goal torque) + W_2 (peak undershoot)

where t_{tt} is the time-to-torque and the W_i are user-defined weights. This approach results in a time-to-torque of 1.52 seconds, worse than the previous value of 0.82 seconds, but it reduces the fluctuations.
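A sketch of a penalized cost of this form, assuming simple peak-overshoot and peak-undershoot penalties (the paper's exact penalty terms and weight values are not reproduced here):

```python
import numpy as np

def penalized_cost(torque, t_tt, torque_start=27.0, torque_goal=374.0,
                   W_overshoot=1.0, W_undershoot=1.0):
    """Time-to-torque plus weighted penalties on torque overshoot and undershoot.

    torque : simulated torque trace (Nm); t_tt : measured time-to-torque (s).
    """
    overshoot = max(0.0, float(np.max(torque)) - torque_goal)
    undershoot = max(0.0, torque_start - float(np.min(torque)))
    return t_tt + W_overshoot * overshoot + W_undershoot * undershoot
```

Larger weights trade away some raw speed in exchange for a smoother, drivable torque response, which is exactly the tradeoff reported above (0.82 s versus 1.52 s).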

Figure 7. BBO reduces time-to-torque to 0.82 seconds, but at the expense of undesirable fluctuations. These undesirable behaviors can be reduced or eliminated by modifying the BBO cost function.

Next we use adaptive optimization, which converges to a time-to-torque of 1.13 seconds. We see that BBO was effective at finding the neighborhood of the global optimum, and local optimization was effective at finding a more precise optimum. BBO decreased time-to-torque by 48% relative to bang-bang controls, and adaptive optimization decreased time-to-torque by an additional 26%.

Figure 8. Engine torque versus time for the baseline, BBO, and BBO plus adaptive optimization, with the modified cost function. BBO achieves a rise time of 1.52 seconds, and adaptive optimization reduces the rise time to 1.13 seconds.

IV. CONCLUSION

We have presented a memetic algorithm for general trajectory optimization. We parameterized the controls with Fourier series or Gaussian kernels, transforming the problem into a parameter optimization problem. Our memetic algorithm includes an evolutionary global optimizer called biogeography-based optimization (BBO) to find controls that are in the neighborhood of the global optimum. The global optimizer is followed by local optimization, which uses Gaussian kernels to approximate the BBO-generated controls. The method iteratively identifies the local characteristics between the calibration parameters and the objectives and uses gradient-based optimization to optimize all calibration parameters. We applied our methods to the minimization of time-to-torque for a gasoline turbocharged direct injection (GTDI) engine. Compared to intuitive bang-bang controls, BBO found control trajectories that decreased time-to-torque by 48%, and local optimization decreased it by an additional 26%. We included soft constraints in the cost function to penalize undesirable factors such as overshoot, undershoot, and non-minimum-phase behavior.

Given the high computational expense of BBO (like other EAs), we suggest exploring methods to reduce the number of simulations required for BBO convergence [Simon 2013b, Section 21.1]. BBO variations, or other EAs besides BBO, might provide better performance. In view of the multiple objectives in the optimization problem (for example, time-to-torque, overshoot, and undershoot), multi-objective optimization might provide good results. Parameterizing the controls with functions other than sinusoids might also provide better results.

REFERENCES

[1] D. Kraft, "On converting optimal control problems into nonlinear programming codes," in Comp. Math. Programming, vol. 15, K. Schittkowski, Ed., Springer, 1985.
[2] C. R. Hargraves and S. W. Paris, "Direct trajectory optimization using nonlinear programming and collocation," J. of Guidance, Control, and Dynamics, vol. 10, no. 4.
[3] R. McGill, "Optimal control, inequality state constraints, and the generalized Newton-Raphson algorithm," SIAM Journal on Control, Series A, vol. 3, no. 2.
[4] L. Lasdon, S. Mitter, and A. Warren, "The conjugate gradient method for optimal control problems," IEEE Transactions on Automatic Control, vol. AC-12, no. 2.
[5] N. Yokoyama and S. Suzuki, "Modified genetic algorithm for constrained trajectory optimization," Journal of Guidance, Control, and Dynamics, vol. 28, no. 1.
[6] Z. Abo-Hammou, M. Yusuf, N. Mirza, S. Mirza, M. Arif, and J. Khurshid, "Numerical solution of second-order, two-point boundary value problems using continuous genetic algorithms," Int. J. for Num. Methods in Engr., vol. 6.
[7] Y. Crispin, "An evolutionary approach to nonlinear discrete-time optimal control with terminal constraints," in Informatics in Control, Automation and Robotics I, J. Braz, A. Vieira, and B. Encarnacao, Eds., Springer Netherlands, 2006.
[8] Z. Yang, J. Fang, and Z. Qi, "Flight midcourse guidance control based on genetic algorithm," Genetic and Evolutionary Computation Conference, Washington, DC, 2005.
[9] Y. Yokose and T. Izumi, "Non-linear two-point boundary value problem obtaining the expansion coefficients by the dynamic GA and its application," IEEJ Transactions on Electronics, Information and Systems, vol. 124.
[10] V. Yen and M. Nagurka, "Fourier-based optimal control approach for structural systems," AIAA Journal of Guidance, Control, and Dynamics, vol. 13.
[11] G. Konidaris, S. Osentoski, and P. Thomas, "Value function approximation in reinforcement learning using the Fourier basis," Conf. on Artificial Intelligence, San Francisco, CA, 2011.
[12] D. Filev and P. Angelov, "Algorithms for real time clustering and generation of rules from data," in Advances in Fuzzy Clustering and Its Applications, J. V. de Oliveira and W. Pedrycz, Eds., John Wiley & Sons, 2007.
[13] D. Simon, A. Shah, and C. Scheidegger, "Distributed learning with biogeography-based optimization: Markov modeling and robot control," Swarm and Evolutionary Computation, vol. 10.
[14] M. Ovreiu and D. Simon, "Biogeography-based optimization of neuro-fuzzy system parameters for diagnosis of cardiac disease," Genetic and Evolutionary Computation Conference, Portland, OR, 2010.
[15] P. Roy, S. Ghoshal, and S. Thakur, "Optimal VAR control for improvements in voltage profiles and for real power loss minimization using biogeography based optimization," Int. J. of Electrical Power & Energy Systems, vol. 43.
[16] V. Panchal, P. Singh, N. Kaur, and H. Kundra, "Biogeography based satellite image classification," International Journal of Computer Science and Information Security, vol. 6, 2009.
[17] K. Astrom and B. Wittenmark, Adaptive Control, Addison-Wesley.
[18] R. Kalman, "A new approach to linear filtering and prediction problems," Transactions of the ASME - Journal of Basic Engineering, 1960.
[19] A. Tychonoff and V. Arsenin, Solution of Ill-posed Problems, Winston & Sons, 1977.
[20] J. Heywood, Internal Combustion Engine Fundamentals, McGraw-Hill, 1988.


More information

DESIGN AND MODELLING OF A 4DOF PAINTING ROBOT

DESIGN AND MODELLING OF A 4DOF PAINTING ROBOT DESIGN AND MODELLING OF A 4DOF PAINTING ROBOT MSc. Nilton Anchaygua A. Victor David Lavy B. Jose Luis Jara M. Abstract The following project has as goal the study of the kinematics, dynamics and control

More information

Modern Methods of Data Analysis - WS 07/08

Modern Methods of Data Analysis - WS 07/08 Modern Methods of Data Analysis Lecture XV (04.02.08) Contents: Function Minimization (see E. Lohrmann & V. Blobel) Optimization Problem Set of n independent variables Sometimes in addition some constraints

More information

A Markov model of biogeography-based optimization for complex systems

A Markov model of biogeography-based optimization for complex systems Abstract A Markov model of biogeography-based optimization for complex systems Dawei Du, Dan Simon Cleveland State University Email: d.du@csuohio.edu, d.j.simon@csuohio.edu Biogeography-based optimization

More information

HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS

HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS NABEEL AL-MILLI Financial and Business Administration and Computer Science Department Zarqa University College Al-Balqa'

More information

Hartley - Zisserman reading club. Part I: Hartley and Zisserman Appendix 6: Part II: Zhengyou Zhang: Presented by Daniel Fontijne

Hartley - Zisserman reading club. Part I: Hartley and Zisserman Appendix 6: Part II: Zhengyou Zhang: Presented by Daniel Fontijne Hartley - Zisserman reading club Part I: Hartley and Zisserman Appendix 6: Iterative estimation methods Part II: Zhengyou Zhang: A Flexible New Technique for Camera Calibration Presented by Daniel Fontijne

More information

LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL SEARCH ALGORITHM

LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL SEARCH ALGORITHM International Journal of Innovative Computing, Information and Control ICIC International c 2013 ISSN 1349-4198 Volume 9, Number 4, April 2013 pp. 1593 1601 LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL

More information

Multi-Objective Memetic Algorithm using Pattern Search Filter Methods

Multi-Objective Memetic Algorithm using Pattern Search Filter Methods Multi-Objective Memetic Algorithm using Pattern Search Filter Methods F. Mendes V. Sousa M.F.P. Costa A. Gaspar-Cunha IPC/I3N - Institute of Polymers and Composites, University of Minho Guimarães, Portugal

More information

CS 229 Midterm Review

CS 229 Midterm Review CS 229 Midterm Review Course Staff Fall 2018 11/2/2018 Outline Today: SVMs Kernels Tree Ensembles EM Algorithm / Mixture Models [ Focus on building intuition, less so on solving specific problems. Ask

More information

Cellular Learning Automata-Based Color Image Segmentation using Adaptive Chains

Cellular Learning Automata-Based Color Image Segmentation using Adaptive Chains Cellular Learning Automata-Based Color Image Segmentation using Adaptive Chains Ahmad Ali Abin, Mehran Fotouhi, Shohreh Kasaei, Senior Member, IEEE Sharif University of Technology, Tehran, Iran abin@ce.sharif.edu,

More information

Active contour: a parallel genetic algorithm approach

Active contour: a parallel genetic algorithm approach id-1 Active contour: a parallel genetic algorithm approach Florence Kussener 1 1 MathWorks, 2 rue de Paris 92196 Meudon Cedex, France Florence.Kussener@mathworks.fr Abstract This paper presents an algorithm

More information

A Brief Look at Optimization

A Brief Look at Optimization A Brief Look at Optimization CSC 412/2506 Tutorial David Madras January 18, 2018 Slides adapted from last year s version Overview Introduction Classes of optimization problems Linear programming Steepest

More information

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM Journal of Al-Nahrain University Vol.10(2), December, 2007, pp.172-177 Science GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM * Azhar W. Hammad, ** Dr. Ban N. Thannoon Al-Nahrain

More information

Humanoid Robotics. Least Squares. Maren Bennewitz

Humanoid Robotics. Least Squares. Maren Bennewitz Humanoid Robotics Least Squares Maren Bennewitz Goal of This Lecture Introduction into least squares Use it yourself for odometry calibration, later in the lecture: camera and whole-body self-calibration

More information

MATH3016: OPTIMIZATION

MATH3016: OPTIMIZATION MATH3016: OPTIMIZATION Lecturer: Dr Huifu Xu School of Mathematics University of Southampton Highfield SO17 1BJ Southampton Email: h.xu@soton.ac.uk 1 Introduction What is optimization? Optimization is

More information

Optimal boundary control of a tracking problem for a parabolic distributed system using hierarchical fuzzy control and evolutionary algorithms

Optimal boundary control of a tracking problem for a parabolic distributed system using hierarchical fuzzy control and evolutionary algorithms Optimal boundary control of a tracking problem for a parabolic distributed system using hierarchical fuzzy control and evolutionary algorithms R.J. Stonier, M.J. Drumm and J. Bell Faculty of Informatics

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Second Order Optimization Methods Marc Toussaint U Stuttgart Planned Outline Gradient-based optimization (1st order methods) plain grad., steepest descent, conjugate grad.,

More information

Author's personal copy. Information Sciences

Author's personal copy. Information Sciences Information Sciences 220 (2013) 492 506 Contents lists available at SciVerse ScienceDirect Information Sciences journal homepage: www.elsevier.com/locate/ins Variations of biogeography-based optimization

More information

Parameter Estimation in Differential Equations: A Numerical Study of Shooting Methods

Parameter Estimation in Differential Equations: A Numerical Study of Shooting Methods Parameter Estimation in Differential Equations: A Numerical Study of Shooting Methods Franz Hamilton Faculty Advisor: Dr Timothy Sauer January 5, 2011 Abstract Differential equation modeling is central

More information

Fast Automated Estimation of Variance in Discrete Quantitative Stochastic Simulation

Fast Automated Estimation of Variance in Discrete Quantitative Stochastic Simulation Fast Automated Estimation of Variance in Discrete Quantitative Stochastic Simulation November 2010 Nelson Shaw njd50@uclive.ac.nz Department of Computer Science and Software Engineering University of Canterbury,

More information

Available online at ScienceDirect. Procedia Technology 21 (2015 ) SMART GRID Technologies, August 6-8, 2015

Available online at   ScienceDirect. Procedia Technology 21 (2015 ) SMART GRID Technologies, August 6-8, 2015 Available online at www.sciencedirect.com ScienceDirect Procedia Technology 21 (2015 ) 611 618 SMART GRID Technologies, August 6-8, 2015 Comparison of Harmony Search Algorithm, Improved Harmony search

More information

Semi-Supervised Clustering with Partial Background Information

Semi-Supervised Clustering with Partial Background Information Semi-Supervised Clustering with Partial Background Information Jing Gao Pang-Ning Tan Haibin Cheng Abstract Incorporating background knowledge into unsupervised clustering algorithms has been the subject

More information