Targeted Random Sampling for Reliability Assessment: A Demonstration of Concept


1 Targeted Random Sampling for Reliability Assessment: A Demonstration of Concept. Michael D. Shields, Assistant Professor, Dept. of Civil Engineering, Johns Hopkins University; V.S. Sundar, Postdoctoral Fellow, Dept. of Civil Engineering, Johns Hopkins University. Illinois Institute of Technology; Chicago, IL. May 2014

2 Motivation of Present Work. Monte Carlo Simulation (MCS) is the most robust and accurate (and sometimes the only) means of reliability analysis. MCS is often criticized, rightfully so, for its high computational cost. That cost has been greatly reduced through the advent of sophisticated sampling and simulation methods. We can still do better!

3 Outline of Presentation: overview of reliability methods (emphasis on Monte Carlo Simulation based methods); Refined Stratified Sampling (concept / theory, algorithms / applications); Targeted Random Sampling (concept / theory, algorithm, examples); challenges and future directions.

4 Reliability Analysis. Reliability methods divide into approximate methods (FORM, SORM, response surface) and exact methods (Monte Carlo, numerical integration):
FORM: low accuracy; very efficient; extensible to multiple failure modes; moderate dimensions.
SORM: moderate accuracy; low efficiency; single failure mode only; moderate dimensions.
Response surface: moderate accuracy; moderate efficiency; multiple failure modes; moderate to high dimensions.
Monte Carlo: very robust; accurate; numerous methods available; computationally expensive.
Numerical integration: computationally intractable for many cases; only applicable in low dimension with special limit states.
The Monte Carlo family includes:
Direct MC: robust; accurate; very expensive; high dimensions; multiple failure modes.
Importance sampling: moderate accuracy; moderate efficiency; widely used; many variants.
Subset simulation [1]: accurate; efficient; high dimensions; MCMC can cause problems.
Line sampling [2]: accurate; efficient; high dimensions; may still require hundreds of calculations.
Targeted sampling: very accurate and efficient; high dimensions; multiple failure modes.
1. Au, S.-K. and Beck, J.L. (2001). "Estimation of small failure probabilities in high dimensions by subset simulation." Probabilistic Engineering Mechanics.
2. Koutsourelakis, P.S., Pradlwarter, H.J., Schueller, G.I. (2004). "Reliability of structures in high dimension, part I: algorithms and applications." Probabilistic Engineering Mechanics.
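For reference, the "Direct MC" entry above amounts to nothing more than averaging a failure indicator. A minimal sketch follows, using a hypothetical linear limit state rather than any example from the presentation:

```python
import numpy as np

def g(x):
    """Hypothetical linear limit state in two standard normal variables;
    failure is g(x) <= 0, i.e. x1 + x2 >= 3."""
    return 3.0 - x[:, 0] - x[:, 1]

rng = np.random.default_rng(1)
n = 10**6
x = rng.standard_normal((n, 2))
p_f = np.mean(g(x) <= 0.0)

# Coefficient of variation of the direct MC estimator: sqrt((1 - p_f) / (n * p_f)).
cov = np.sqrt((1.0 - p_f) / (n * p_f))
print(f"p_f ~ {p_f:.2e}  (estimator COV ~ {100.0 * cov:.1f}%)")
```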

5-7 Stratified Sampling. (Figures: random vs. stratified samples of two random variables in the unit probability square, axes P[X_1] and P[X_2].) Random sampling: sample size extension is straightforward; there is no control over where new samples are placed; no sample weights need to be computed. Stratified sampling: almost limitless control over how samples are added; one can carefully select, based on the available information, which strata to add samples to. Is there a better way?
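For later reference, the standard stratified estimator and its variance, in the notation used on the following slides (M strata Omega_i with weights p_i = P(Omega_i), n_i samples x_{ij} per stratum, and sigma_i^2 the variance of the quantity of interest h within Omega_i); these are textbook identities rather than anything specific to the presentation:

\[
\hat{T}_S = \sum_{i=1}^{M} \frac{p_i}{n_i}\sum_{j=1}^{n_i} h\!\left(x_{ij}\right),
\qquad
\operatorname{Var}\!\left[\hat{T}_S\right] = \sum_{i=1}^{M} \frac{p_i^{2}\,\sigma_i^{2}}{n_i}.
\]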

8 Refined Stratified Sampling (illustrated on the unit square, with X and Y denoting the probability spaces of RV1 and RV2; each panel marks the new stratum, the samples carried over from the previous step, and the new sample):
1. Randomly sample.
2. Randomly divide the probability space into strata Omega_i, i = 1, 2.
3. Randomly sample in the empty stratum Omega_i and assign weights w_i = P(Omega_i).
4. Randomly divide stratum Omega_k to obtain strata Omega_i, i = 1, 2, 3.
5. Randomly sample in the empty stratum Omega_i and assign new weights w_i = P(Omega_i).
6. Divide the largest stratum Omega_k to obtain strata Omega_i, i = 1, 2, 3, 4.
7. Randomly sample in the empty stratum Omega_i and assign new weights w_i = P(Omega_i).
8. Randomly divide stratum Omega_k to obtain strata Omega_i, i = 1, 2, 3, 4, 5.
9. Randomly sample in the empty stratum Omega_i and assign new weights w_i = P(Omega_i).
Reference: Shields, M.D., Teferra, K., Hapij, A. and Daddazio, R. (2014). "Refined stratified sampling for efficient Monte Carlo based uncertainty quantification." Reliability Engineering & System Safety (forthcoming).
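A minimal one-dimensional sketch of this refinement loop, assuming X ~ N(0,1), a hypothetical quantity of interest h(X) = X^2, and a simple "divide the largest stratum" rule; the choices of h, the split rule, and the sample counts are illustrative assumptions, not the reference's implementation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def h(x):
    """Hypothetical quantity of interest; E[h(X)] = 1 for X ~ N(0,1)."""
    return x * x

def new_stratum(lo, hi):
    """Stratum = interval [lo, hi) of the unit probability space of X,
    stored as [lo, hi, u, h(x)] with one sample u ~ U(lo, hi), x = Phi^{-1}(u)."""
    u = rng.uniform(lo, hi)
    return [lo, hi, u, h(stats.norm.ppf(u))]

# Steps 1-3: initial two-stratum design with one sample in each stratum.
strata = [new_stratum(0.0, 0.5), new_stratum(0.5, 1.0)]

# Steps 4-9, repeated: split the largest stratum, keep its existing sample in
# the half that contains it, and draw a new sample only in the empty half.
for _ in range(98):
    k = max(range(len(strata)), key=lambda i: strata[i][1] - strata[i][0])
    lo, hi, u, hx = strata[k]
    mid = 0.5 * (lo + hi)
    if u < mid:
        strata[k] = [lo, mid, u, hx]
        strata.append(new_stratum(mid, hi))
    else:
        strata[k] = [mid, hi, u, hx]
        strata.append(new_stratum(lo, mid))

# Stratified estimate with weights w_i = P(Omega_i) = hi - lo (one sample each).
estimate = sum((hi - lo) * hx for lo, hi, u, hx in strata)
print(f"RSS estimate of E[X^2] from {len(strata)} samples: {estimate:.3f}")
```

Note how the sample size is extended without ever discarding or reweighting an existing evaluation: only the stratum bounds and weights change.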

9 Why Refine Strata? Consider N samples drawn from stratum Omega_1. The estimator variance is
\[ \operatorname{Var}\big[T_S^1\big] = \underbrace{\frac{p_1^2}{N}\,\sigma_1^2}_{\Omega_1} + \underbrace{\sum_{j=1}^{M}\frac{p_j^2}{n_j}\,\sigma_j^2}_{\text{other strata}} . \]
Now consider refining Omega_1 into N sub-strata with a single sample in each:
\[ \operatorname{Var}\big[T_S^2\big] = \underbrace{\frac{p_1^2}{N^2}\sum_{i=1}^{N}\sigma_i^2}_{\Omega_1} + \underbrace{\sum_{j=1}^{M}\frac{p_j^2}{n_j}\,\sigma_j^2}_{\text{other strata}} . \]
Evaluate the difference in the variances (assuming a balanced sub-stratification):
\[ \operatorname{Var}\big[T_S^1\big] - \operatorname{Var}\big[T_S^2\big] = \frac{p_1^2}{N}\left(\sigma_1^2 - \frac{1}{N}\sum_{i=1}^{N}\sigma_i^2\right) > 0 , \]
which is positive because, by the law of total variance, sigma_1^2 exceeds the average of the sub-stratum variances sigma_i^2 whenever the sub-stratum means differ.

10 Variance Savings by RSS: Example. Consider samples drawn from X ~ N(0,1) with T_S = E[X]. Starting from a balanced two-stratum design (p_1 = p_2 = 0.5), refine to three strata (p = 0.5, 0.25, 0.25) and then to four equal strata (p_i = 0.25), evaluating the stratum variances sigma_i^2 and Var[T_S] for each design. Result: >30% reduction in variance by adding one stratum; >60% reduction in variance by fully stratifying.
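A quick numerical check of these percentages, under the assumptions (not shown explicitly on the slide) that the strata are equal-probability intervals cut at the quartiles of N(0,1), that every design spends the same total budget of four samples, and that samples are allocated in proportion to the stratum probabilities:

```python
import numpy as np
from scipy import stats

def trunc_var(a, b):
    """Variance of N(0,1) truncated to (a, b), via the standard formulas."""
    Z = stats.norm.cdf(b) - stats.norm.cdf(a)
    pa, pb = stats.norm.pdf(a), stats.norm.pdf(b)
    ta = 0.0 if np.isinf(a) else a * pa   # a*phi(a) -> 0 at infinite endpoints
    tb = 0.0 if np.isinf(b) else b * pb
    mean = (pa - pb) / Z
    return 1.0 + (ta - tb) / Z - mean**2

def strat_var(breaks, n_per_stratum):
    """Var of the stratified estimator of E[X]: sum_i p_i^2 sigma_i^2 / n_i."""
    return sum((stats.norm.cdf(b) - stats.norm.cdf(a))**2 * trunc_var(a, b) / n
               for a, b, n in zip(breaks[:-1], breaks[1:], n_per_stratum))

q1, q2, q3 = stats.norm.ppf([0.25, 0.5, 0.75])
inf = np.inf

v2 = strat_var([-inf, q2, inf], [2, 2])            # p = (0.5, 0.5)
v3 = strat_var([-inf, q2, q3, inf], [2, 1, 1])     # p = (0.5, 0.25, 0.25)
v4 = strat_var([-inf, q1, q2, q3, inf], [1] * 4)   # p = (0.25,) * 4

print(f"one refinement:   {100 * (1 - v3 / v2):.0f}% variance reduction")
print(f"fully stratified: {100 * (1 - v4 / v2):.0f}% variance reduction")
```

Under these assumptions the script reports roughly a 31% and a 62% reduction, consistent with the slide's ">30%" and ">60%" figures.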

11 RSS Algorithms 1. Random Stratum Division 2. Variance Minimization 3. Sensitivity-Based Stratum Division 4. Enhanced Space-Filling 5. Targeted Random Sampling 6. Others (Stochastic Search, Stochastic Field Simulation, etc.)

12-14 (Figures comparing random (iid) sampling with stratified sampling / refined stratified sampling.)

15 Targeted Random Sampling. The biggest problem with Monte Carlo methods is that they are wasteful: many simulations land in regions that are, statistically, of little or no use. This is especially problematic for reliability analysis, where the probability of failure is very low.
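The waste can be quantified with the standard result for the coefficient of variation of the direct Monte Carlo estimator of p_f from N samples (this back-of-the-envelope figure is added for context and is not from the slides):

\[
\delta_{\hat{p}_f} = \sqrt{\frac{1 - p_f}{N\,p_f}} \;\approx\; \frac{1}{\sqrt{N\,p_f}} \quad (p_f \ll 1),
\]

so a 10% COV at p_f = 10^{-4} already requires on the order of N = 10^6 limit-state evaluations.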

16 The Targeted Random Sampling algorithm: 1. Define an initial stratification of the domain and sample. 2. Identify point pairs on opposite sides of the failure surface. 3. Identify the pair with the largest separation and divide the stratum at a point in between (alternatively, the pair with the largest associated probability). 4. Sample in the new stratum. 5. Refine the target space. 6. Repeat.
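A minimal sketch of this loop, under several assumptions that are not from the slides: the limit state G below is a hypothetical stand-in with a deliberately large failure probability so the sketch converges in a few seconds; strata are axis-aligned boxes in the unit probability space with one sample each; adjacency is a loose "touching boxes" test; and the selected stratum is simply bisected along its longest edge rather than divided at a point between the two samples, as the authors describe. It is meant only to make the bookkeeping of steps 1-6 concrete, not to reproduce the authors' implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def G(x):
    """Hypothetical smooth 2-D limit state (failure when G(x) <= 0), chosen so
    the failure probability is large enough for a quick demo; NOT Example 1."""
    return 1.5 + x[0] * np.sin(5.0 * x[0]) - x[1]

class Stratum:
    """Axis-aligned box in the unit probability space [0, 1]^2 holding one sample."""
    def __init__(self, lo, hi):
        self.lo, self.hi = np.asarray(lo, float), np.asarray(hi, float)
        self.p = float(np.prod(self.hi - self.lo))   # weight w_i = P(Omega_i)
        self.u = rng.uniform(self.lo, self.hi)       # uniform sample in the box
        self.x = stats.norm.ppf(self.u)              # map to standard normal space
        self.g = float(G(self.x))

def touching(a, b):
    """True if the closures of two boxes intersect (loose adjacency test)."""
    return bool(np.all(a.lo <= b.hi) and np.all(b.lo <= a.hi))

def split(s):
    """Bisect stratum s along its longest edge, keep its sample in the half that
    contains it, and return a freshly sampled Stratum covering the other half."""
    k = int(np.argmax(s.hi - s.lo))
    mid = 0.5 * (s.lo[k] + s.hi[k])
    new_lo, new_hi = s.lo.copy(), s.hi.copy()
    if s.u[k] < mid:
        new_lo[k] = mid   # new stratum is the upper half
        s.hi[k] = mid
    else:
        new_hi[k] = mid   # new stratum is the lower half
        s.lo[k] = mid
    s.p = float(np.prod(s.hi - s.lo))
    return Stratum(new_lo, new_hi)

# Step 1: initial 3x3 stratification of the unit square, one sample per stratum.
e = np.linspace(0.0, 1.0, 4)
strata = [Stratum([e[i], e[j]], [e[i + 1], e[j + 1]])
          for i in range(3) for j in range(3)]

# Steps 2-6: add samples one at a time, targeting the failure surface.
for _ in range(91):
    fail = [s for s in strata if s.g <= 0.0]
    safe = [s for s in strata if s.g > 0.0]
    pairs = [(a, b) for a in fail for b in safe if touching(a, b)]
    if pairs:   # refine between the most widely separated opposite-sign pair
        a, b = max(pairs, key=lambda ab: np.linalg.norm(ab[0].u - ab[1].u))
        target = a if a.p >= b.p else b   # simplification: bisect the larger one
    else:       # no failure point located yet: fall back to the largest stratum
        target = max(strata, key=lambda s: s.p)
    strata.append(split(target))

p_f = sum(s.p for s in strata if s.g <= 0.0)
print(f"TRS-style estimate with {len(strata)} samples: P_F = {p_f:.3e}")
```

At every stage the estimate is just the sum of the weights w_i of the strata whose single sample lies in the failure domain, so extending the sample size never requires recomputing or reweighting earlier limit-state evaluations.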

17-30 (The six steps above are repeated on each of these slides while the accompanying figures walk through the procedure: identify the strata adjacent to the failure surface; determine the stratum pair to split and add a sample; repeat until the convergence criteria are satisfied.)

31 Notes: TRS constructs a piecewise approximation of the failure surface; at least one failure point must be identified per disjoint failure region; the available information is used to inform the stratification; the loop repeats until the convergence criteria are satisfied.
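The first note can be made concrete with the standard variance expression for the stratified failure-probability estimator with one sample per stratum (stated here in the slide's notation; it is a textbook identity, not taken from the slides themselves):

\[
\hat{p}_f = \sum_{i=1}^{M} w_i \, \mathbb{1}\!\left[G(x_i) \le 0\right],
\qquad
\operatorname{Var}\!\left[\hat{p}_f\right] = \sum_{i=1}^{M} w_i^{2}\, p_{f,i}\left(1 - p_{f,i}\right),
\]

where p_{f,i} is the conditional failure probability within stratum Omega_i. Strata lying wholly in the safe or failure domain have p_{f,i} equal to 0 or 1 and contribute nothing, so only the strata straddling G(U) = 0 drive the variance; this is exactly where TRS concentrates its refinement.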

32 Example 1: Initial Sample. Limit state: G(U) = U_1 sin(5U_1) + 4 - U_2, with U_1, U_2 independent standard normal, U ~ N(0,1); the failure probability p_f is to be estimated. (Figure: the initial samples plotted over contours of the joint density, with the failure surface G(U) = 0.)

33 Example 1: 200 Samples. Same limit state: G(U) = U_1 sin(5U_1) + 4 - U_2; U ~ N(0,1). (Figures: the TRS samples after 200 model evaluations, clustered along G(U) = 0, and the P_F estimate versus the number of added samples for TRS compared with MCS.)

34 Example 1: 500 Samples. Same limit state. (Figures: the 500 TRS samples tracing out G(U) = 0, and the P_F estimate versus the number of added samples for five independent TRS trials compared with MCS.)

35 Example 1: Comparison. Limit state: G(U) = U_1 sin(5U_1) + 4 - U_2; U ~ N(0,1).
Method | Mean P_F | COV
MCS (10^8 samples) | |
FORM | 4.154e-4 | N/A
SORM | | N/A
Importance Sampling (IS) | | N/A
SS(500; 50; 10) (~1000 samples) | e-4 (100) | 66%
TRS (9+491) (500 samples) | e-4 (100) | 4%

36 Example 2: Comparison. Limit state [1]: G(X) = Σ_{i=1}^{5} X_i - C; X_i ~ Exp(λ_i), λ_i = 1. (Figure: P_F estimate versus number of added samples for five independent TRS trials compared with MCS.)
Method | Mean P_F | COV
MCS (10^8 samples) | e-6 | 3%
FORM | |
SORM | |
IS (1000 samples) | |
SS(500; 50; 10) (~1000 samples) | e-6 (100) |
TRS (1000 samples) | e-6 (100) | 20%
1. Engelund, S. and Rackwitz, R. (1993). "A benchmark study on importance sampling techniques in structural reliability." Structural Safety, 12.
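A quick closed-form cross-check of this limit-state family is possible under the assumption made here (all rates equal), since the sum of five iid Exp(1) variables is Gamma(5, 1); the threshold c below is purely a placeholder, because the slide's value of C is not legible in the transcription:

```python
from scipy import stats

c = 0.2                  # placeholder threshold; NOT the slide's value of C
s = stats.gamma(a=5)     # distribution of X_1 + ... + X_5 with X_i ~ Exp(1)

# Failure defined as G(X) = sum(X_i) - C <= 0, i.e. the lower tail of the sum.
print(f"P[sum <= {c}] = {s.cdf(c):.3e}")
```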

37 Challenges & Future Directions. Defining an appropriate initial stratification in high dimension (e.g., three strata per dimension in 20 dimensions already gives 3^20 ≈ 3.5e9 strata); one remedy is to use MCMC to explore the space and post-stratify. Dynamic problems and stochastic processes/fields. Other RSS designs: variance minimization for UQ, sensitivity-based sampling, space-filling sample designs, etc.
