Introduction to ANSYS DesignXplorer

Lecture 4, Release 14.5
2013 ANSYS, Inc. September 27, 2013

Response Surfaces
Response surfaces are functions that describe the output parameters in terms of the input parameters. They provide approximated values of the output parameters everywhere in the analyzed design space, without the need to perform a complete solution. The response surface methods described here are suitable for problems using roughly 10-15 input parameters:
Standard Response Surface (2nd-order polynomial)
Kriging
Non-parametric Regression
Neural Network

Outline
1. Procedure
2. Types

Procedure
1. Create response surface (3D response chart)

Procedure
1. Create response surface
2D: 1 output parameter vs. 1 input parameter
2D Slices: 1 output parameter vs. 2 input parameters (each curve represents a slice of the response)

Procedure
2. Check goodness of fit by displaying design points on the response chart

Procedure
3. Check goodness of fit by reviewing goodness of fit metrics
Coefficient of Determination (R² measure): measures how well the response surface represents the variability of the output parameter. Should be as close to 1.0 as possible.
Adjusted Coefficient of Determination: takes the sample size into consideration when computing the Coefficient of Determination. Usually more reliable than the usual Coefficient of Determination when the number of samples is small (< 30).
Maximum Relative Residual: a similar measure for response surfaces using an alternate mathematical representation. Should be as close to 0.0 as possible.
(Equations in Appendix)

Procedure
3. Check goodness of fit by reviewing goodness of fit metrics (continued)
Root Mean Square Error: square root of the average squared residual at the DOE points, for regression methods.
Relative Root Mean Square Error: square root of the average squared residual scaled by the actual output values at the DOE points, for regression methods.
Relative Maximum Absolute Error: absolute maximum residual value relative to the standard deviation of the actual outputs.
Relative Average Absolute Error: the average error relative to the standard deviation of the actual output data. Useful when the number of samples is low (< 30).
(Equations in Appendix)
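The metrics above can be sketched in a few lines of plain Python. This is an illustrative stand-in, not DesignXplorer's implementation; the function name, the sample data, and the exact set of metrics returned are made up for the example.

```python
# Hypothetical sketch of several goodness-of-fit metrics from the slides,
# computed for observed vs. predicted output values at the DOE points.
import math

def goodness_of_fit(observed, predicted, n_terms=1):
    n = len(observed)
    mean = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    r2 = 1.0 - ss_res / ss_tot                       # Coefficient of Determination
    # Adjusted R^2 penalizes small sample sizes (n_terms = regression terms).
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - n_terms - 1)
    rmse = math.sqrt(ss_res / n)                     # Root Mean Square Error
    std = math.sqrt(ss_tot / n)                      # std. dev. of actual outputs
    residuals = [abs(o - p) for o, p in zip(observed, predicted)]
    rel_max_abs = max(residuals) / std               # Relative Maximum Absolute Error
    rel_avg_abs = (sum(residuals) / n) / std         # Relative Average Absolute Error
    return {"R2": r2, "adjR2": r2_adj, "RMSE": rmse,
            "RelMaxAbs": rel_max_abs, "RelAvgAbs": rel_avg_abs}

obs = [1.0, 2.0, 3.0, 4.0, 5.0]
pred = [1.1, 1.9, 3.0, 4.2, 4.9]
print(goodness_of_fit(obs, pred, n_terms=1))
```

A good fit shows R² near 1.0 and the residual-based measures near 0.0, matching the acceptance criteria on the slide.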

Procedure
4. Check goodness of fit by reviewing the Predicted versus Observed chart (2nd-order polynomial vs. Kriging)

Procedure
5. Check goodness of fit by creating verification points
Compare the predicted and observed values of the output parameters at different locations in the design space. Verification points can be added when defining the response surface or by right-clicking on the response surface plot. They are needed for Kriging and Sparse Grid response surfaces, since the standard goodness of fit metrics overpredict the goodness of fit.

Procedure
6. Improve the response surface
Select a more appropriate response surface type (discussed later), or manually add Refinement Points: points which are solved to improve the response surface quality in that area of the design space.

Procedure
7. Extract data
Response Point: a snapshot of parameter values where output parameter values were calculated in ANSYS DesignXplorer from a response surface.
Spider plot: a visual representation of the output parameter values relative to the range of each parameter, given a specified scenario (input parameter values).

Procedure
7. Extract data
Local Sensitivity bar plot: the change of the output based on the change of each input independently (computed from Output_max, Output_avg and Output_min).
Local Sensitivity pie chart: the relative impact of the input parameters on the local sensitivity.
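The bar plot's idea, varying each input independently while the others stay at the reference point, can be sketched as below. The response function is a made-up analytic stand-in for a real response surface; all names are hypothetical.

```python
# Hypothetical local-sensitivity sketch: for each input, sweep its range
# while holding all other inputs at the reference point, and record the
# resulting output range (Output_max - Output_min).
def local_sensitivity(response, ref_point, bounds, n_samples=10):
    sens = {}
    for name, (lo, hi) in bounds.items():
        outputs = []
        for k in range(n_samples + 1):
            x = dict(ref_point)                     # all other inputs fixed
            x[name] = lo + (hi - lo) * k / n_samples
            outputs.append(response(x))
        sens[name] = max(outputs) - min(outputs)
    return sens

# Made-up response: the output depends strongly on x1, weakly on x2.
resp = lambda x: 10.0 * x["x1"] + 1.0 * x["x2"]
s = local_sensitivity(resp, {"x1": 0.5, "x2": 0.5},
                      {"x1": (0.0, 1.0), "x2": (0.0, 1.0)})
print(s)  # x1 dominates the local sensitivity
```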

Procedure
7. Extract data
Local Sensitivity Curves: show the sensitivity of one or two output parameters to the variation of one input parameter while all other input parameters are held fixed. When plotting two output parameters, the circle corresponds to the lowest value of each parameter. Manufacturable values are supported (green squares).

Procedure
7. Extract data
Min-Max Search: examines the entire output parameter space from a response surface to approximate the minimum and maximum values of each output parameter.
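Because a response surface is cheap to evaluate, a Min-Max search can simply sample the input space densely and read off the extrema. The sketch below does this on a grid for a made-up two-input response; it is illustrative only, not DesignXplorer's search algorithm.

```python
# Hypothetical Min-Max search: evaluate a cheap response surface on a
# dense grid and keep the approximate minimum and maximum of the output.
def min_max_search(response, x_range, y_range, n=50):
    best_min, best_max = None, None
    for i in range(n + 1):
        for j in range(n + 1):
            x = x_range[0] + (x_range[1] - x_range[0]) * i / n
            y = y_range[0] + (y_range[1] - y_range[0]) * j / n
            v = response(x, y)
            if best_min is None or v < best_min[0]:
                best_min = (v, x, y)                # (output, x, y) at minimum
            if best_max is None or v > best_max[0]:
                best_max = (v, x, y)                # (output, x, y) at maximum
    return best_min, best_max

resp = lambda x, y: (x - 0.3) ** 2 + (y - 0.7) ** 2   # minimum near (0.3, 0.7)
lo, hi = min_max_search(resp, (0.0, 1.0), (0.0, 1.0))
print(lo, hi)
```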

Review
Design Point: a scenario to be solved, selected either manually or automatically by Parameter Correlation and Design of Experiments.
Verification Point: a scenario to be solved that is used to determine the accuracy of a response surface.
Refinement Point: a scenario to be solved that is used to improve a response surface.
Response Point: the predicted behavior for a scenario, based on the response surface.

Types
There are five response surface types in DX:
1. Standard (2nd-order polynomial) [default]
2. Kriging
3. Non-parametric Regression
4. Neural Network
5. Sparse Grid

Standard Full 2nd-Order Polynomial
This is the default response surface type and a good starting point. It is based on a modified quadratic formulation: Output = f(inputs), where f is a second-order polynomial. It will provide satisfactory results when the variation of the output parameters is mild/smooth.
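As a toy illustration of the idea (not DesignXplorer's modified quadratic solver), a full quadratic basis in two inputs can be fitted to DOE samples by least squares; the coefficients and sample data below are invented:

```python
# Minimal sketch of a full 2nd-order polynomial response surface in two
# inputs, fitted by ordinary least squares over the DOE samples.
import numpy as np

def quad_features(x1, x2):
    # Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 20)      # 20 DOE points
x2 = rng.uniform(0.0, 1.0, 20)
# Invented "true" quadratic output sampled at the DOE points.
y = 2.0 + 3.0 * x1 - 1.0 * x2 + 0.5 * x1**2 + 4.0 * x1 * x2

coef, *_ = np.linalg.lstsq(quad_features(x1, x2), y, rcond=None)
pred = quad_features(np.array([0.5]), np.array([0.5])) @ coef
print(coef.round(3), pred)
```

Because the sampled output is itself quadratic and smooth, the fit recovers it essentially exactly, which is the regime where the slide says this type performs well.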

Kriging
A multidimensional interpolation combining a polynomial model, similar to the one of the standard response surface, which provides a global model of the design space, plus local deviations determined so that the Kriging model interpolates the DOE points: Output = f(inputs) + Z(inputs), where f is a second-order polynomial (which dictates the global behaviour of the model) and Z is a perturbation term (which dictates the local behaviour, i.e. the localized deviations). Since Kriging fits the response surface through all design points, the goodness of fit metrics will always look good.
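The decomposition Output = f(inputs) + Z(inputs) can be sketched in 1D with a constant global trend plus Gaussian-correlated local deviations. This is a deliberately simplified stand-in (fixed correlation parameter theta, constant trend instead of a polynomial), not DesignXplorer's Kriging:

```python
# Simplified Kriging-style interpolator: global trend mu (the "f" part)
# plus Gaussian-correlated deviations (the "Z" part) chosen so the model
# passes through every DOE point exactly.
import numpy as np

def kriging_fit(x, y, theta=10.0):
    K = np.exp(-theta * (x[:, None] - x[None, :]) ** 2)   # Gaussian correlation
    mu = y.mean()                                         # constant global model
    w = np.linalg.solve(K, y - mu)                        # local deviation weights
    return lambda xq: mu + np.exp(-theta * (xq[:, None] - x[None, :]) ** 2) @ w

x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])                 # DOE points
y = np.sin(2 * np.pi * x) + x                             # sampled response
model = kriging_fit(x, y)
print(model(x))   # reproduces y at the DOE points (exact interpolation)
```

The exact reproduction at the DOE points is precisely why the standard goodness-of-fit metrics always look good for Kriging, and why verification points are needed instead.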

Kriging
Kriging will provide better results than the standard response surface when the variation of the output parameters is stronger and non-linear (e.g. EMAG). Do not use it when results are noisy: Kriging interpolates the design points, but oscillations then appear on the response surface.

Kriging Refinement
Refinement allows DX to determine the accuracy of the response surface, as well as the points that would be required to increase the accuracy.
Refinement Type:
Automatic: ANSYS DesignXplorer adds samples to the DOE (the number of additional samples is user controlled).
Manual: the user specifies the samples.

Kriging Refinement
(Figure: initial DOE samples, then the surface with 2 refinement points generated through auto-refinement.)

Non-parametric Regression
Belongs to a general class of Support Vector Method (SVM) type techniques. The basic idea is that a tolerance epsilon creates a narrow envelope around the true output surface, f(x) ± ε, and all or most of the sample points must/should lie inside this envelope (the response surface carries a margin of tolerance).
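The envelope idea corresponds to the epsilon-insensitive loss used in SVM-type regression: residuals inside ± epsilon cost nothing, and residuals outside are charged only for the amount beyond epsilon. A minimal sketch, with invented data:

```python
# Epsilon-insensitive loss: residuals within +/- eps of the surface are
# free; larger residuals are penalized only by the excess beyond eps.
def eps_insensitive_loss(observed, predicted, eps=0.1):
    return sum(max(0.0, abs(o - p) - eps) for o, p in zip(observed, predicted))

obs = [1.0, 2.0, 3.0]
pred = [1.05, 2.3, 2.95]   # two points inside the envelope, one outside
print(eps_insensitive_loss(obs, pred))   # only the 0.3 residual is penalized
```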

Non-parametric Regression
Suited for nonlinear responses; use it when results are noisy (discussed on the next slide). For some problem types (like those dominated by flat surfaces or lower-order polynomials), some oscillations may be noticed between the DOE points. It is usually slow to compute, so it is suggested to use it only when the goodness of fit metrics from the quadratic response surface model are unsatisfactory.

NPR vs. Kriging when results are noisy
NPR approximates the design points with a margin of tolerance. Kriging interpolates the design points, but oscillations appear on the response surface.

Neural Network
A mathematical technique inspired by the natural neural network in the human brain. Each arrow is associated with a weight (this determines whether a hidden function is active). Hidden functions are threshold functions which turn on or off based on the sum of their inputs. With each iteration, the weights are adjusted to minimize the error between the response surface and the design points. (Detailed explanation in Appendix.)

Neural Network
Successful with highly nonlinear responses, but control over the algorithm is very limited; only use it in rare cases.

Sparse Grid
An adaptive response surface (it refines itself automatically). It usually requires more runs than other response surfaces, so use it when the solve is fast. It requires the Sparse Grid Initialization DOE as a starting point. It only refines in the directions necessary, so fewer design points are needed for the same quality of response surface. Refinement continues (design points are added) until the Maximum Relative Error or the Maximum Depth is reached for each response.
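The "refine only until a relative-error target is met" behaviour can be caricatured in 1D with plain midpoint refinement. This is not an actual sparse grid (no hierarchical basis, one dimension only); the response function and thresholds are invented:

```python
# Toy adaptive refinement: keep adding midpoints until the linear surrogate's
# worst relative error at the candidate points drops below the target.
def refine_1d(f, lo, hi, max_rel_err=0.01, max_depth=10):
    xs, ys = [lo, hi], [f(lo), f(hi)]
    for depth in range(max_depth):
        pts = sorted(zip(xs, ys))
        worst, new = 0.0, []
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            xm = 0.5 * (x0 + x1)
            interp = 0.5 * (y0 + y1)                 # linear surrogate at midpoint
            actual = f(xm)                           # "solve" the new design point
            worst = max(worst, abs(interp - actual) / max(abs(actual), 1e-12))
            new.append((xm, actual))
        xs += [p[0] for p in new]
        ys += [p[1] for p in new]
        if worst <= max_rel_err:                     # Max Relative Error reached
            break
    return sorted(xs), depth + 1

pts, n_levels = refine_1d(lambda x: x * x + 1.0, 0.0, 1.0)
print(len(pts), n_levels)
```

If `max_depth` is hit before the error target, refinement stops early, which mirrors the Maximum Depth limit on the next slide.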

Sparse Grid
Maximum Depth: the maximum number of hierarchical interpolation levels to compute in each direction.

Sparse Grid
Adjust the Max Relative Error so that convergence can continue.

Summary
Standard 2nd-Order Polynomial (default): effective when the variation of the output is smooth with regard to the input parameters.
Kriging: efficient in a large number of cases; suited to highly nonlinear responses. Do NOT use when results are noisy: Kriging is an interpolation that matches the points exactly. Always use verification points to check goodness of fit.
Non-Parametric Regression: suited to nonlinear responses; use when results are noisy; typically slow to compute.
Neural Network: suited to highly nonlinear responses; use when results are noisy; control over the algorithm is very limited.
Sparse Grid: suited for studies containing discontinuities; use when the solve is fast.
Good default choice: Kriging with auto-refinement.

Problem Description
This workshop looks deeper into the options available for DOEs, Response Surfaces and Optimization, and exposes you to creating parameters in FLUENT and CFD-Post. The problem to be analyzed is a static mixer in which hot (400 K) and cold (300 K) fluid, entering at variable velocities, mix. The objective of the analysis is to find the inlet velocities which minimize the pressure loss from the cold inlet to the outlet and minimize the temperature spread at the outlet.
Input: hot inlet velocity, cold inlet velocity.
Output: pressure loss, temperature spread.

Appendix

Goodness of fit metrics
Coefficient of Determination
Adjusted Coefficient of Determination
Maximum Relative Residual

Goodness of fit metrics
Root Mean Square Error
Relative Maximum Absolute Error
Relative Root Mean Square Error
Relative Average Absolute Error

Surface approximation method: Non-Parametric Regression (NPR)
Using the Support Vector Machine (SVM) technique:

f(X) = <W, X> + b = Σ_{i=1..N} (A_i − A_i*) K(X, X_i) + b

where W is the weighting vector, X_i are the input samples (DOE), b is the bias, K is a Gaussian kernel (radial basis function), A_i and A_i* are the Lagrange multipliers, and N is the number of DOE points; A and b are the unknown parameters.

The support vectors are the subset of the X_i, i ∈ [1; N], with A_i ≠ 0 or A_i* ≠ 0, which is deemed to represent the output parameter.

Up to a threshold ε the error is considered 0; beyond it, the error is calculated as (error − ε). In this way, highly nonlinear behavior of the outputs with respect to the inputs can be captured. The response surface f(x) carries a margin of tolerance, f(x) ± ε.

Neural Network
A network of weighted, additive values with nonlinear transfer functions. The network output y_k is compared with the design point value y_DP, and the weight functions are adjusted to minimize the error:

u_j = Σ_i w_ji x_i
h_j = tanh(u_j) = (1 − exp(−2 u_j)) / (1 + exp(−2 u_j))

http://www.dtreg.com/mlfn.htm
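The hidden-layer formulas above can be run directly in a few lines of pure Python. The weights and inputs below are made up; this only demonstrates the weighted sum u_j and the tanh transfer function, not a trained network.

```python
# One forward pass through a tanh hidden layer: u_j = sum_i w_ji * x_i,
# then h_j = (1 - exp(-2u_j)) / (1 + exp(-2u_j)), which equals tanh(u_j).
import math

def hidden_layer(x, weights):
    out = []
    for w_row in weights:                 # one weight row per hidden unit
        u = sum(w * xi for w, xi in zip(w_row, x))
        h = (1.0 - math.exp(-2.0 * u)) / (1.0 + math.exp(-2.0 * u))
        out.append(h)
    return out

x = [0.5, -1.0, 2.0]                      # made-up input parameters
W = [[0.1, 0.2, 0.3], [-0.4, 0.5, 0.6]]   # made-up weights, two hidden units
print(hidden_layer(x, W))
```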

Sparse Grid Iteration Procedure (animation)