Reliable Adaptive Algorithms for Integration, Interpolation and Optimization

1 Reliable Adaptive Algorithms for Integration, Interpolation and Optimization

Fred J. Hickernell, Department of Applied Mathematics, Illinois Institute of Technology, mypages.iit.edu/~hickernell

Joint work with Sou-Cheng Choi (NORC & IIT), Yuhan Ding (IIT), Lan Jiang (IIT), Lluís Antoni Jiménez Rugama (IIT), Art Owen (Stanford), Xin Tong (UIC), Yizhi Zhang (IIT), and Xuan Zhou (J. P. Morgan). Supported by NSF-DMS-5392.

Thank you for your kind invitation to visit! U Illinois Chicago, October 5, 2015

2 Why We Want Adaptive Algorithms

% An easy function to integrate numerically:
f = @(x) sin(x.^2); tic, I = integral(f,0,1), toc
Elapsed time is 0.53 seconds.

% A hard function to integrate numerically:
f2 = @(x) 2*sin((2*x).^2); tic, I2 = integral(f2,0,1), toc
Elapsed time is 0.6246 seconds.

Adaptive algorithms exert enough computational effort to get the correct answer, but not much more effort than necessary.
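Since the MATLAB session above is not runnable here, the same contrast can be sketched in Python with a minimal adaptive Simpson rule (our own illustrative implementation, not MATLAB's integral): the number of function evaluations the adaptive method spends grows with the difficulty of the integrand.

```python
import math

def adaptive_simpson(f, a, b, tol):
    """Adaptive Simpson quadrature; returns (integral, number of f-evaluations)."""
    evals = [0]
    def feval(x):
        evals[0] += 1
        return f(x)
    def simpson(fa, fm, fb, a, b):
        return (b - a) * (fa + 4 * fm + fb) / 6
    def recurse(a, b, fa, fm, fb, whole, tol):
        m = (a + b) / 2
        flm, frm = feval((a + m) / 2), feval((m + b) / 2)
        left = simpson(fa, flm, fm, a, m)
        right = simpson(fm, frm, fb, m, b)
        if abs(left + right - whole) <= 15 * tol:   # accept, with Richardson correction
            return left + right + (left + right - whole) / 15
        return (recurse(a, m, fa, flm, fm, left, tol / 2) +
                recurse(m, b, fm, frm, fb, right, tol / 2))
    fa, fm, fb = feval(a), feval((a + b) / 2), feval(b)
    return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, a, b), tol), evals[0]

easy, n_easy = adaptive_simpson(lambda x: math.sin(x ** 2), 0.0, 1.0, 1e-9)
hard, n_hard = adaptive_simpson(lambda x: 2 * math.sin((10 * x) ** 2), 0.0, 1.0, 1e-9)
print(n_easy, n_hard)   # the oscillatory integrand costs many more evaluations
```

The hard integrand here uses a factor of 10 inside the sine as an illustrative stand-in, since the exact constant in the slide's f2 is garbled in the source.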

3 Why Adaptive Algorithms Fail

Three examples: MATLAB's integral returns the wrong value of the integral of f_fluky; Chebfun's approximation of f_spiky and fminbnd's minimization of f_spiky both miss the spike; and meanMC_CLT(Y,...), which aims to estimate E(Y) to a given absolute error tolerance with 99% confidence, gives an answer that is wrong half the time.

4 Why Adaptive Algorithms Fail

Approximating f_spiky with Chebfun and minimizing f_spiky with fminbnd: Chebfun (Hale et al., 2015) and fminbnd stop sampling when they think that they have enough information, but miss the spike.
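This failure mode can be sketched in Python. The talk's f_spiky is not given explicitly, so the notched function below is a hypothetical stand-in, and the naive sample-and-zoom minimizer stands in for fminbnd: it locks onto the smooth bowl near 0.7 and never sees the much deeper, very narrow notch near 1/3.

```python
import math

def f_spiky(x):
    # Smooth bowl with a very narrow, deep notch (a hypothetical stand-in
    # for the talk's f_spiky; the notch is the global minimum).
    return 0.1 * (x - 0.7) ** 2 - math.exp(-((x - 1/3) / 0.005) ** 2)

def naive_minimize(f, a, b, n=10, rounds=8):
    """Sample n+1 equally spaced points, zoom in around the best one, repeat."""
    for _ in range(rounds):
        xs = [a + (b - a) * i / n for i in range(n + 1)]
        x_best = min(xs, key=f)
        h = (b - a) / n
        a, b = max(a, x_best - h), min(b, x_best + h)
    return x_best

x = naive_minimize(f_spiky, 0.0, 1.0)
print(x, f_spiky(x))   # converges to the bowl near 0.7, missing the notch at 1/3
```

No coarse grid point ever lands inside the notch, so every round of refinement chases the wrong minimum; more starting samples, or an algorithm whose stopping rule accounts for possible spikes, would be needed.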

5 Why Adaptive Algorithms Fail

MATLAB's integral returns the wrong value of ∫₀¹ f_fluky(x) dx: integral approximates its error by the difference of two different quadratures, which might coincidentally coincide even though both answers are wrong.
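A minimal sketch of this failure, assuming a hypothetical f_fluky (a narrow bump placed between the sample points of both rules): the two trapezoidal estimates agree to many digits, so the difference-based error estimate is tiny, yet both miss the bump entirely.

```python
import math

# Hypothetical f_fluky: a narrow Gaussian bump centered at 1/3, far (relative
# to its width) from every sample point of both grids below.
bump = lambda x: math.exp(-((x - 1/3) / 0.005) ** 2)

def trapezoid(f, n):
    """Composite trapezoidal rule on [0, 1] with n subintervals."""
    h = 1.0 / n
    return h * (0.5 * f(0.0) + 0.5 * f(1.0) + sum(f(i * h) for i in range(1, n)))

coarse, fine = trapezoid(bump, 8), trapezoid(bump, 16)
err_estimate = abs(fine - coarse)        # "difference of two quadratures"
true_value = 0.005 * math.sqrt(math.pi)  # Gaussian integral; tails negligible on [0, 1]
true_error = abs(fine - true_value)
print(err_estimate, true_error)          # estimate is tiny, actual error is ~9e-3
```

The coincidental agreement of the two rules is exactly the trap Lyness (1983) warned about: agreement of two estimates is evidence of convergence, not proof.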

6 Why Adaptive Algorithms Fail

meanMC_CLT(Y,...) aims to estimate E(Y) to a given absolute error tolerance with 99% confidence, yet the answer is wrong half the time: insufficiently many samples are taken to estimate var(Y) because Y has a heavy-tailed distribution (large kurtosis).
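This failure is easy to reproduce in a sketch. The heavy-tailed Y below (a scaled rare Bernoulli, our own choice, not the talk's example) makes the CLT-style 99% confidence interval fail far more often than the nominal 1%, because most pilot samples never see the rare event and therefore drastically underestimate var(Y).

```python
import random, statistics

random.seed(7)
p = 0.001                         # Y = 1/p with probability p, else 0, so E[Y] = 1;
def sample_Y():                   # the kurtosis of Y is enormous
    return 1.0 / p if random.random() < p else 0.0

failures, trials, n = 0, 1000, 100
for _ in range(trials):
    ys = [sample_Y() for _ in range(n)]
    mu_hat = statistics.fmean(ys)
    sigma_hat = statistics.pstdev(ys)
    half_width = 2.58 * sigma_hat / n ** 0.5   # CLT-style 99% interval
    if abs(mu_hat - 1.0) > half_width:         # true mean is 1
        failures += 1
print(failures / trials)   # far more than the nominal 1% failure rate
```

With n = 100 and p = 0.001, roughly 90% of runs draw only zeros, yielding a zero-width interval centered at 0; no difference-based or CLT-based diagnostic inside the data can detect this without an assumption such as a kurtosis bound.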

8 Why Adaptive Algorithms Fail

Adaptive algorithms fail due to one or both of the following:
- We do not use enough samples to start. How many samples is enough?
- Our error bounds are faulty. What are the alternatives?

9 Making the Trapezoidal Rule Adaptive

The trapezoidal rule for approximating ∫₀¹ f(x) dx is

T_n(f) = (1/(2n)) [f(0) + 2f(1/n) + ... + 2f((n-1)/n) + f(1)].

The quadrature error satisfies

|∫₀¹ f(x) dx - T_n(f)| ≤ Var(f')/(8n²),

where Var(f') is the total variation of f'. Without knowing Var(f'), what is the n that makes this error ≤ ε_a?
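The error bound above is easy to check numerically. A sketch, with f(x) = x³ as our choice of test function (f'(x) = 3x², so Var(f') = 3 on [0, 1], and the exact integral is 1/4):

```python
def trapezoid(f, n):
    """T_n(f) on [0,1]: (1/(2n)) [f(0) + 2f(1/n) + ... + 2f((n-1)/n) + f(1)]."""
    h = 1.0 / n
    return h * (0.5 * f(0.0) + 0.5 * f(1.0) + sum(f(i * h) for i in range(1, n)))

f = lambda x: x ** 3
true_integral = 0.25
for n in (4, 16, 64):
    err = abs(true_integral - trapezoid(f, n))
    bound = 3 / (8 * n ** 2)        # Var(f')/(8 n^2) with Var(f') = 3
    assert err <= bound
    print(n, err, bound)
```

For this f the error is exactly 1/(4n²), comfortably inside the bound 3/(8n²), and both decay at the O(n⁻²) rate the bound predicts.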

10 Making the Trapezoidal Rule Adaptive

The quadrature error satisfies

|∫₀¹ f(x) dx - T_n(f)| ≤ Var(f')/(8n²); we want this ≤ ε_a for some n, (ERR)

without knowing Var(f'). We assume that f is not too spiky, i.e.,

Var(f') ≤ τ ||f' - f(1) + f(0)||₁, (CONE)

and note that

||f' - f(1) + f(0)||₁ ≤ Σ_{i=1}^n |f(i/n) - f((i-1)/n) - (f(1) - f(0))/n| + Var(f')/(2n).

This implies that (ERR) is satisfied for f satisfying (CONE) and n satisfying

[τ / (4n(2n - τ))] Σ_{i=1}^n |f(i/n) - f((i-1)/n) - (f(1) - f(0))/n| ≤ ε_a.
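The stopping rule above can be sketched as code. This is an illustrative Python re-implementation under the stated cone assumption, not GAIL's integral_g; the values of τ, the starting n, and the test function are our own choices.

```python
import math

def adaptive_trapezoid(f, tau, eps_a, n0=8):
    """Illustrative cone-based adaptive trapezoidal rule on [0, 1].

    Doubles n until the data-based bound
        tau/(4n(2n - tau)) * sum_{i=1}^n |f(i/n) - f((i-1)/n) - (f(1)-f(0))/n|
    falls below eps_a; valid when Var(f') <= tau * ||f' - f(1) + f(0)||_1."""
    n = max(n0, int(tau) + 1)          # ensure 2n - tau > 0
    while True:
        h = 1.0 / n
        fx = [f(i * h) for i in range(n + 1)]
        slope = (fx[-1] - fx[0]) * h   # (f(1) - f(0))/n
        s = sum(abs(fx[i] - fx[i - 1] - slope) for i in range(1, n + 1))
        bound = tau * s / (4 * n * (2 * n - tau))
        if bound <= eps_a:
            T = h * (0.5 * fx[0] + 0.5 * fx[-1] + sum(fx[1:-1]))
            return T, n, bound
        n *= 2

# f = exp satisfies the cone condition with tau = 10, since Var(f') = e - 1.
T, n, bound = adaptive_trapezoid(math.exp, 10.0, 1e-6)
print(T, n, abs(T - (math.e - 1)))     # error is within eps_a = 1e-6
```

Because everything in the bound is computed from the data already sampled, the algorithm needs no a priori knowledge of Var(f'), only the cone parameter τ.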

11 integral_g in GAIL

This rigorously justified, data-based error bound is employed in the adaptive algorithm integral_g, part of the Guaranteed Automatic Integration Library (GAIL) (Choi et al., 2013-2015). Clancy et al. (2014) contains details and shows that the computational cost of integral_g is O(√(τ Var(f')/ε_a)) operations. This cost is asymptotically optimal among all possible algorithms. H. et al. (2015+) remove the factor of τ.

Unlike integral, integral_g returns the correct value of ∫₀¹ f_fluky(x) dx. Don't approximate error by the difference of two quadratures (Lyness, 1983).

12 General Recipe for Justified Adaptive Algorithms

For the trapezoidal rule, and in general (function approximation, optimization, etc.):

- Error bound: |∫₀¹ f(x) dx - T_n(f)| ≤ Var(f')/(8n²); in general, |sol(f) - appx_n(f)| ≤ c |f| / n^p for convex sol and appx_n, where |·| is a stronger semi-norm.
- Cone condition: Var(f') ≤ τ ||f' - f(1) + f(0)||₁; in general, |f| ≤ τ ||f|| for a weaker semi-norm ||·||.
- For f ≈ A(f): ||f' - f(1) + f(0)||₁ ≤ ||A(f)' - f(1) + f(0)||₁ + Var(f')/(2n); in general, ||f|| ≤ ||A(f)|| + c₂ |f| / n^q, so |f| ≤ τ ||A(f)|| / (1 - c₂ τ / n^q).
- Data-based error bound: [τ / (4n(2n - τ))] ||A(f)' - f(1) + f(0)||₁; in general, c τ ||A(f)|| / (n^p (1 - c₂ τ / n^q)).

Here A(f) is a computable approximation to f; for the trapezoidal rule it is the piecewise-linear interpolant at the data sites, so ||A(f)' - f(1) + f(0)||₁ is the computable sum on the previous slide.

13 funappx_g and funmin_g in GAIL

GAIL includes rigorously justified, adaptive algorithms for function approximation (funappx_g) and function minimization (funmin_g) (Choi et al., 2013-2015). Unlike Chebfun and fminbnd, they recover the spike of f_spiky and the location of its minimum. The error bounds employed ensure that enough samples are taken. Spiky functions are handled well provided that τ is large enough. See Clancy et al. (2014) and Tong (2014) for details.

14 Making Monte Carlo Adaptive

Monte Carlo algorithms approximate μ = E(Y) = ∫ y ϱ(y) dy by a sample average,

μ̂_n = (1/n) Σ_{i=1}^n Y_i, Y_1, Y_2, ... IID ~ Y,

when the Y_i can be generated but the integral is too hard. The Central Limit Theorem (CLT) suggests choosing n so that 2.58 σ̂/√n ≈ ε_a, since

P[|μ - μ̂_n| ≤ 2.58 σ̂/√n] ≈ 99%,

but this is not rigorous.
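The non-rigorous CLT recipe can be sketched as a two-stage procedure: estimate σ̂ from a pilot sample, then take n with 2.58 σ̂/√n ≤ ε_a. The random variable Y below (Y = U² with U uniform, so E[Y] = 1/3) is our own example.

```python
import math, random, statistics

random.seed(11)
def Y():                        # our example: Y = U^2, U ~ Uniform(0, 1)
    return random.random() ** 2

eps_a = 0.001
pilot = [Y() for _ in range(1000)]            # stage 1: estimate sigma
sigma_hat = statistics.stdev(pilot)
n = math.ceil((2.58 * sigma_hat / eps_a) ** 2)  # 2.58*sigma/sqrt(n) <= eps_a
mu_hat = statistics.fmean(Y() for _ in range(n))  # stage 2: estimate the mean
print(n, mu_hat)                # E[Y] = 1/3; usually, but not provably, within eps_a
```

For this well-behaved Y the recipe works; the previous slides show it failing for heavy-tailed Y, which is why the guarantee on the next slide replaces σ̂ by an inflated, probabilistically justified bound.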

15 Making Monte Carlo Adaptive

μ = E(Y) ≈ μ̂_n = (1/n) Σ_{i=1}^n Y_i, Y_1, Y_2, ... IID ~ Y.

To obtain a guaranteed error bound we need to assume a bound on the kurtosis of Y:

kurt(Y) := E[(Y - μ)⁴] / var(Y)² ≤ κ_max. (CONE)

Cantelli's inequality implies a bound on the true variance in terms of the sample variance:

P[C² σ̂² ≥ var(Y)] ≥ 99.5%, where σ̂² = (1/(n_σ - 1)) Σ_{i=1}^{n_σ} (Y_i - μ̂_{n_σ})²,

and where C > 1 and n_σ depend on κ_max. The Berry-Esseen inequality (a finite-sample version of the CLT) determines the sample size n needed for the sample mean:

Φ(-√n ε_a/(C σ̂)) [CLT part] + Δ_n(√n ε_a/(C σ̂), κ_max) [Berry-Esseen extra part] ≤ 0.25%.

Compute the sample mean, μ̂_n, of an independent sample of size n.
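The conservative two-stage idea can be sketched with illustrative constants: the pilot standard deviation is inflated by a factor C > 1 before setting n. For brevity this sketch keeps only the CLT part of the sample-size condition; meanMC_g's actual C, n_σ, and the Berry-Esseen correction term are determined by κ_max and are not reproduced here.

```python
import math, random, statistics

random.seed(3)
def Y():                          # our example; E[Y] = 1 - cos(1)
    return math.sin(random.random())

eps_a, C, n_sigma = 0.001, 1.5, 1000   # illustrative values; in meanMC_g,
                                       # C and n_sigma follow from kappa_max
pilot = [Y() for _ in range(n_sigma)]
sigma_up = C * statistics.stdev(pilot)  # conservative upper bound on sigma

# Simplified stage two: choose n so that Phi(-sqrt(n)*eps_a/sigma_up) <= 0.5%;
# the full algorithm adds the Berry-Esseen term on the left-hand side.
z = 2.576                               # Phi(-2.576) is about 0.5%
n = math.ceil((z * sigma_up / eps_a) ** 2)
mu_hat = statistics.fmean(Y() for _ in range(n))
print(n, mu_hat)
```

Inflating σ̂ by C costs extra samples for easy problems but is exactly what protects the guarantee when the pilot sample happens to underestimate var(Y).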

16 meanMC_g in GAIL

For the heavy-tailed Y above, meanMC_CLT(Y,...) fails half the time, while meanMC_g(Y,...) succeeds 99% of the time. GAIL includes the rigorously justified algorithm meanMC_g for computing μ = E(Y) to the desired error tolerance (Choi et al., 2013-2015; H. et al., 2014). Heavy-tailed random variables are handled well provided that κ_max is large enough.

One may use meanMC_g to evaluate

∫_{R^d} g(x) dx = ∫_{R^d} f(x) ϱ(x) dx = E[f(X)], f = g/ϱ,

where X ~ ϱ for some probability density function ϱ.
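The reformulation can be sketched directly. A d = 1 example of our own choosing: take g(x) = cos(x) times the standard normal density, so f(x) = cos(x), X ~ N(0, 1), and the integral over R equals E[cos(X)] = e^(-1/2).

```python
import math, random

random.seed(5)
# Integral over R of g(x) = cos(x) * exp(-x^2/2)/sqrt(2*pi):
# write g = f * rho with rho the standard normal density and f(x) = cos(x),
# so the integral equals E[cos(X)] with X ~ N(0, 1); exact value exp(-1/2).
n = 10 ** 6
est = sum(math.cos(random.gauss(0.0, 1.0)) for _ in range(n)) / n
print(est, math.exp(-0.5))
```

The choice of ϱ matters in practice: a density that resembles |g| keeps var(f(X)), and hence the required sample size, small.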

17 Making Quasi-Monte Carlo Adaptive

Quasi-Monte Carlo (QMC) algorithms approximate μ = ∫_{[0,1]^d} f(x) dx by a sample average of f at correlated, evenly placed points. The evenness improves the convergence to the answer (Dick et al., 2014).
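The effect of even placement can be sketched in one dimension with a van der Corput sequence, a simple low-discrepancy point set (Sobol' and lattice points, used by GAIL, generalize this idea to d dimensions):

```python
import math

def van_der_corput(n):
    """First n points of the base-2 van der Corput sequence: the binary
    digits of k are mirrored about the binary point."""
    pts = []
    for k in range(n):
        x, denom, kk = 0.0, 0.5, k
        while kk:
            x += (kk & 1) * denom
            kk >>= 1
            denom /= 2
        pts.append(x)
    return pts

f = lambda x: x * x             # integral of x^2 over [0, 1] is 1/3
n = 1024
qmc_est = sum(f(x) for x in van_der_corput(n)) / n
print(abs(qmc_est - 1/3))       # about 5e-4; a typical IID error at n = 1024
                                # would be around sigma/sqrt(n), roughly 1e-2
```

The first 2^m van der Corput points are exactly the grid {j/2^m}, visited in a balanced order, which is why the sample average converges faster than an IID average for smooth integrands.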

24 cubSobol_g and cubLattice_g in GAIL

Data-driven error bounds for quasi-Monte Carlo are obtained via discrete Fourier transforms of the function values (H. and Jiménez Rugama, 2015+; Jiménez Rugama and H., 2015+): Fourier-Walsh coefficients for Sobol' sampling and Fourier sine/cosine coefficients for lattice sampling. Substantial speedups can be obtained by using QMC sampling rather than IID sampling. For example, pricing an option first by IID Monte Carlo and then by Sobol' sampling:

% Option price class object based on user-defined input
MCopt = optprice(inp);
% meanMC_g computes the price
tic, MCPrice = genoptprice(MCopt), toc

% Now use Sobol' sampling with a PCA construction,
% after changing the parameters in inp
QMCopt = optprice(inp);
tic, QMCPrice = genoptprice(QMCopt), toc

Both runs return the same price to the requested tolerance, but the Sobol' run takes a small fraction of the time.

25 Summary

- Adaptive algorithms are valuable because they determine the computational effort required based on the problem difficulty.
- Most adaptive algorithms are flawed.
- By moving from error analysis based on balls of inputs (functions, random variables, etc.) to cones, we can justify adaptive algorithms.
- Relative error tolerances can be handled.
- More problems are waiting to be tackled: algorithms with higher-order convergence, multi-level Monte Carlo, ...
- Applied and computational mathematicians should learn to write code well, not only prove theorems about their algorithms.

26 Raising the Bar for Numerical Algorithms

A ladder of expectations, from the most developed down to the least:
- Implemented
- Widely available in a common language
- Has a convenient user interface
- Tested for bugs
- Coded efficiently
- Stopping criterion is data-based
- Provides an answer within a user-specified error tolerance
- Runs in a reasonable amount of time on available hardware
- All inputs and parameters can be specified explicitly
- Can be written in pseudo-code
- Implementable

27 Raising the Bar for Numerical Algorithms

A ladder of expectations, from the most developed down to the least:
- Justified
- The stopping criterion to reach a user-specified error tolerance is guaranteed to succeed
- The rate of convergence is asymptotically optimal with respect to all possible algorithms
- The rate of convergence as the computational effort tends to infinity is known
- The error can be bounded in terms of information about the inputs and parameters
- Has a theoretical basis
- Provides a reasonable answer for a wide range of problems
- Justifiable

28 Thank you

29 References

Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou. 2013-2015. GAIL: Guaranteed Automatic Integration Library (versions 1.0-2.1).
Clancy, N., Y. Ding, C. Hamilton, F. J. H., and Y. Zhang. 2014. The cost of deterministic, adaptive, automatic algorithms: Cones, not balls. J. Complexity 30.
Cools, R. and D. Nuyens (eds.) 2015+. Monte Carlo and Quasi-Monte Carlo Methods 2014, Springer-Verlag, Berlin.
Dick, J., F. Kuo, and I. H. Sloan. 2014. High-dimensional integration: the quasi-Monte Carlo way. Acta Numer.
Hale, N., L. N. Trefethen, and T. A. Driscoll. 2015. Chebfun, version 5.2.
H., F. J., L. Jiang, Y. Liu, and A. B. Owen. 2014. Guaranteed conservative fixed width confidence intervals via Monte Carlo sampling. Monte Carlo and Quasi-Monte Carlo Methods 2012.
H., F. J. and Ll. A. Jiménez Rugama. 2015+. Reliable adaptive cubature using digital sequences. Monte Carlo and Quasi-Monte Carlo Methods 2014, to appear. arXiv:1410.8615 [math.NA].
H., F. J., M. Razo, and S. Yun. 2015+. Reliable adaptive numerical integration. Submitted for publication.
Jiménez Rugama, Ll. A. and F. J. H. 2015+. Adaptive multidimensional integration based on rank-1 lattices. Monte Carlo and Quasi-Monte Carlo Methods 2014, to appear.
Lyness, J. N. 1983. When not to use an automatic quadrature routine. SIAM Rev. 25.
Tong, X. 2014. A guaranteed, adaptive, automatic algorithm for univariate function minimization. Master's thesis.


More information

Optimized Analog Mappings for Distributed Source-Channel Coding

Optimized Analog Mappings for Distributed Source-Channel Coding 21 Data Compression Conference Optimized Analog Mappings for Distributed Source-Channel Coding Emrah Akyol, Kenneth Rose Dept. of Electrical & Computer Engineering University of California, Santa Barbara,

More information

M3P1/M4P1 (2005) Dr M Ruzhansky Metric and Topological Spaces Summary of the course: definitions, examples, statements.

M3P1/M4P1 (2005) Dr M Ruzhansky Metric and Topological Spaces Summary of the course: definitions, examples, statements. M3P1/M4P1 (2005) Dr M Ruzhansky Metric and Topological Spaces Summary of the course: definitions, examples, statements. Chapter 1: Metric spaces and convergence. (1.1) Recall the standard distance function

More information

Random and Parallel Algorithms a journey from Monte Carlo to Las Vegas

Random and Parallel Algorithms a journey from Monte Carlo to Las Vegas Random and Parallel Algorithms a journey from Monte Carlo to Las Vegas Google Maps couldn t help! Bogdán Zaválnij Institute of Mathematics and Informatics University of Pecs Ljubljana, 2014 Bogdán Zaválnij

More information

Deterministic Consistent Density Estimation for Light Transport Simulation

Deterministic Consistent Density Estimation for Light Transport Simulation Deterministic Consistent Density Estimation for Light Transport Simulation Alexander Keller and Nikolaus Binder Abstract Quasi-Monte Carlo methods often are more efficient than Monte Carlo methods, mainly,

More information

arxiv:math/ v3 [math.dg] 23 Jul 2007

arxiv:math/ v3 [math.dg] 23 Jul 2007 THE INTRINSIC GEOMETRY OF A JORDAN DOMAIN arxiv:math/0512622v3 [math.dg] 23 Jul 2007 RICHARD L. BISHOP Abstract. For a Jordan domain in the plane the length metric space of points connected to an interior

More information

Fast orthogonal transforms and generation of Brownian paths

Fast orthogonal transforms and generation of Brownian paths Fast orthogonal transforms and generation of Brownian paths G. Leobacher partially joint work with C. Irrgeher MCQMC 2012 G. Leobacher (JKU Linz, Austria) Orthogonal transforms/brownian paths MCQMC 2012

More information

3 Feature Selection & Feature Extraction

3 Feature Selection & Feature Extraction 3 Feature Selection & Feature Extraction Overview: 3.1 Introduction 3.2 Feature Extraction 3.3 Feature Selection 3.3.1 Max-Dependency, Max-Relevance, Min-Redundancy 3.3.2 Relevance Filter 3.3.3 Redundancy

More information

Monte Carlo Simulations

Monte Carlo Simulations Monte Carlo Simulations Lecture 2 December 8, 2014 Outline 1 Random Number Generation 2 Uniform Random Variables 3 Normal Random Variables 4 Correlated Random Variables Random Number Generation Monte Carlo

More information

Lecture 25 Nonlinear Programming. November 9, 2009

Lecture 25 Nonlinear Programming. November 9, 2009 Nonlinear Programming November 9, 2009 Outline Nonlinear Programming Another example of NLP problem What makes these problems complex Scalar Function Unconstrained Problem Local and global optima: definition,

More information

An Exact Algorithm for the Statistical Shortest Path Problem

An Exact Algorithm for the Statistical Shortest Path Problem An Exact Algorithm for the Statistical Shortest Path Problem Liang Deng and Martin D. F. Wong Dept. of Electrical and Computer Engineering University of Illinois at Urbana-Champaign Outline Motivation

More information

The Set-Open topology

The Set-Open topology Volume 37, 2011 Pages 205 217 http://topology.auburn.edu/tp/ The Set-Open topology by A. V. Osipov Electronically published on August 26, 2010 Topology Proceedings Web: http://topology.auburn.edu/tp/ Mail:

More information

Generating random samples from user-defined distributions

Generating random samples from user-defined distributions The Stata Journal (2011) 11, Number 2, pp. 299 304 Generating random samples from user-defined distributions Katarína Lukácsy Central European University Budapest, Hungary lukacsy katarina@phd.ceu.hu Abstract.

More information

Accuracy estimation for quasi-monte Carlo simulations

Accuracy estimation for quasi-monte Carlo simulations Mathematics and Computers in Simulation 54 (2000) 131 143 Accuracy estimation for quasi-monte Carlo simulations William C. Snyder SRI International, 333 Ravenswood Ave., Menlo Park, CA 94025, USA Accepted

More information

Math 3316, Fall 2016 Due Nov. 3, 2016

Math 3316, Fall 2016 Due Nov. 3, 2016 Math 3316, Fall 2016 Due Nov. 3, 2016 Project 3 Polynomial Interpolation The first two sections of this project will be checked in lab the week of Oct. 24-26 this completion grade will count for 10% of

More information

THREE LECTURES ON BASIC TOPOLOGY. 1. Basic notions.

THREE LECTURES ON BASIC TOPOLOGY. 1. Basic notions. THREE LECTURES ON BASIC TOPOLOGY PHILIP FOTH 1. Basic notions. Let X be a set. To make a topological space out of X, one must specify a collection T of subsets of X, which are said to be open subsets of

More information

An Analysis of a Variation of Hit-and-Run for Uniform Sampling from General Regions

An Analysis of a Variation of Hit-and-Run for Uniform Sampling from General Regions An Analysis of a Variation of Hit-and-Run for Uniform Sampling from General Regions SEKSAN KIATSUPAIBUL Chulalongkorn University ROBERT L. SMITH University of Michigan and ZELDA B. ZABINSKY University

More information

GPU Integral Computations in Stochastic Geometry

GPU Integral Computations in Stochastic Geometry Outline GPU Integral Computations in Stochastic Geometry Department of Computer Science Western Michigan University June 25, 2013 Outline Outline 1 Introduction 2 3 4 Outline Outline 1 Introduction 2 3

More information

One-class Problems and Outlier Detection. 陶卿 中国科学院自动化研究所

One-class Problems and Outlier Detection. 陶卿 中国科学院自动化研究所 One-class Problems and Outlier Detection 陶卿 Qing.tao@mail.ia.ac.cn 中国科学院自动化研究所 Application-driven Various kinds of detection problems: unexpected conditions in engineering; abnormalities in medical data,

More information

What the Dual-Dirac Model is and What it is Not

What the Dual-Dirac Model is and What it is Not What the Dual-Dirac Model is and What it is Not Ransom Stephens October, 006 Abstract: The dual-dirac model is a simple tool for estimating total jitter defined at a bit error ratio, TJ(BER), for serial

More information

Instance-based Learning

Instance-based Learning Instance-based Learning Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University February 19 th, 2007 2005-2007 Carlos Guestrin 1 Why not just use Linear Regression? 2005-2007 Carlos Guestrin

More information

Chapter 4: Non-Parametric Techniques

Chapter 4: Non-Parametric Techniques Chapter 4: Non-Parametric Techniques Introduction Density Estimation Parzen Windows Kn-Nearest Neighbor Density Estimation K-Nearest Neighbor (KNN) Decision Rule Supervised Learning How to fit a density

More information

Joint quantification of uncertainty on spatial and non-spatial reservoir parameters

Joint quantification of uncertainty on spatial and non-spatial reservoir parameters Joint quantification of uncertainty on spatial and non-spatial reservoir parameters Comparison between the Method and Distance Kernel Method Céline Scheidt and Jef Caers Stanford Center for Reservoir Forecasting,

More information

Today. Lecture 4: Last time. The EM algorithm. We examine clustering in a little more detail; we went over it a somewhat quickly last time

Today. Lecture 4: Last time. The EM algorithm. We examine clustering in a little more detail; we went over it a somewhat quickly last time Today Lecture 4: We examine clustering in a little more detail; we went over it a somewhat quickly last time The CAD data will return and give us an opportunity to work with curves (!) We then examine

More information

Computation of the multivariate Oja median

Computation of the multivariate Oja median Metrika manuscript No. (will be inserted by the editor) Computation of the multivariate Oja median T. Ronkainen, H. Oja, P. Orponen University of Jyväskylä, Department of Mathematics and Statistics, Finland

More information

Monte Carlo Integration

Monte Carlo Integration Lab 18 Monte Carlo Integration Lab Objective: Implement Monte Carlo integration to estimate integrals. Use Monte Carlo Integration to calculate the integral of the joint normal distribution. Some multivariable

More information

Math 494: Mathematical Statistics

Math 494: Mathematical Statistics Math 494: Mathematical Statistics Instructor: Jimin Ding jmding@wustl.edu Department of Mathematics Washington University in St. Louis Class materials are available on course website (www.math.wustl.edu/

More information

Recursive Estimation

Recursive Estimation Recursive Estimation Raffaello D Andrea Spring 28 Problem Set : Probability Review Last updated: March 6, 28 Notes: Notation: Unless otherwise noted, x, y, and z denote random variables, p x denotes the

More information

Monte Carlo Integration

Monte Carlo Integration Lecture 11: Monte Carlo Integration Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2016 Reminder: Quadrature-Based Numerical Integration f(x) Z b a f(x)dx x 0 = a x 1 x 2 x 3 x 4 = b E.g.

More information

An introduction to design of computer experiments

An introduction to design of computer experiments An introduction to design of computer experiments Derek Bingham Statistics and Actuarial Science Simon Fraser University Department of Statistics and Actuarial Science Outline Designs for estimating a

More information

Hyperparameter optimization. CS6787 Lecture 6 Fall 2017

Hyperparameter optimization. CS6787 Lecture 6 Fall 2017 Hyperparameter optimization CS6787 Lecture 6 Fall 2017 Review We ve covered many methods Stochastic gradient descent Step size/learning rate, how long to run Mini-batching Batch size Momentum Momentum

More information

STATISTICS (STAT) Statistics (STAT) 1

STATISTICS (STAT) Statistics (STAT) 1 Statistics (STAT) 1 STATISTICS (STAT) STAT 2013 Elementary Statistics (A) Prerequisites: MATH 1483 or MATH 1513, each with a grade of "C" or better; or an acceptable placement score (see placement.okstate.edu).

More information

MATH 54 - LECTURE 10

MATH 54 - LECTURE 10 MATH 54 - LECTURE 10 DAN CRYTSER The Universal Mapping Property First we note that each of the projection mappings π i : j X j X i is continuous when i X i is given the product topology (also if the product

More information

Using Arithmetic of Real Numbers to Explore Limits and Continuity

Using Arithmetic of Real Numbers to Explore Limits and Continuity Using Arithmetic of Real Numbers to Explore Limits and Continuity by Maria Terrell Cornell University Problem Let a =.898989... and b =.000000... (a) Find a + b. (b) Use your ideas about how to add a and

More information

This is a good time to refresh your memory on double-integration. We will be using this skill in the upcoming lectures.

This is a good time to refresh your memory on double-integration. We will be using this skill in the upcoming lectures. Chapter 5: JOINT PROBABILITY DISTRIBUTIONS Part 1: Sections 5-1.1 to 5-1.4 For both discrete and continuous random variables we will discuss the following... Joint Distributions (for two or more r.v. s)

More information

A Non-Iterative Approach to Frequency Estimation of a Complex Exponential in Noise by Interpolation of Fourier Coefficients

A Non-Iterative Approach to Frequency Estimation of a Complex Exponential in Noise by Interpolation of Fourier Coefficients A on-iterative Approach to Frequency Estimation of a Complex Exponential in oise by Interpolation of Fourier Coefficients Shahab Faiz Minhas* School of Electrical and Electronics Engineering University

More information

Monte Carlo Integration

Monte Carlo Integration Chapter 2 Monte Carlo Integration This chapter gives an introduction to Monte Carlo integration. The main goals are to review some basic concepts of probability theory, to define the notation and terminology

More information

Random Permutations, Random Sudoku Matrices and Randomized Algorithms

Random Permutations, Random Sudoku Matrices and Randomized Algorithms Random Permutations, Random Sudoku Matrices and Randomized Algorithms arxiv:1312.0192v1 [math.co] 1 Dec 2013 Krasimir Yordzhev Faculty of Mathematics and Natural Sciences South-West University, Blagoevgrad,

More information

Lecture 4: Convexity

Lecture 4: Convexity 10-725: Convex Optimization Fall 2013 Lecture 4: Convexity Lecturer: Barnabás Póczos Scribes: Jessica Chemali, David Fouhey, Yuxiong Wang Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer:

More information

A Local Algorithm for Structure-Preserving Graph Cut

A Local Algorithm for Structure-Preserving Graph Cut A Local Algorithm for Structure-Preserving Graph Cut Presenter: Dawei Zhou Dawei Zhou* (ASU) Si Zhang (ASU) M. Yigit Yildirim (ASU) Scott Alcorn (Early Warning) Hanghang Tong (ASU) Hasan Davulcu (ASU)

More information

Ranking and selection problem with 1,016,127 systems. requiring 95% probability of correct selection. solved in less than 40 minutes

Ranking and selection problem with 1,016,127 systems. requiring 95% probability of correct selection. solved in less than 40 minutes Ranking and selection problem with 1,016,127 systems requiring 95% probability of correct selection solved in less than 40 minutes with 600 parallel cores with near-linear scaling.. Wallclock time (sec)

More information

Monte-Carlo Ray Tracing. Antialiasing & integration. Global illumination. Why integration? Domains of integration. What else can we integrate?

Monte-Carlo Ray Tracing. Antialiasing & integration. Global illumination. Why integration? Domains of integration. What else can we integrate? Monte-Carlo Ray Tracing Antialiasing & integration So far, Antialiasing as signal processing Now, Antialiasing as integration Complementary yet not always the same in particular for jittered sampling Image

More information