
1 Stat 100a: Introduction to Probability. Outline for the day: 1. Bivariate and marginal density. 2. CLT. 3. CIs. 4. Sample size calculations. 5. Review for Exam 2. Exam 2 is Tue Nov 21. Bring a pencil and a calculator. We will discuss its similarity to Exam 1. HW3 is due Tue Dec 5.

2 1. Bivariate and marginal density. Suppose X and Y are random variables. If X and Y are discrete, we can define the joint pmf f(x,y) = P(X = x and Y = y). Suppose X and Y are continuous for the rest of this page. Define the bivariate or joint pdf f(x,y) as a function with the properties that f(x,y) ≥ 0, and for any a, b, c, d, P(a ≤ X ≤ b and c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x,y) dy dx. The integral ∫_{-∞}^{∞} f(x,y) dy = f(x), the pdf of X, and this function f(x) is sometimes called the marginal density of X. Similarly, ∫_{-∞}^{∞} f(x,y) dx is the marginal pdf of Y. E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_{-∞}^{∞} x [∫_{-∞}^{∞} f(x,y) dy] dx = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x f(x,y) dy dx. Just as P(A|B) = P(AB)/P(B), f(x|y) = f(x,y)/f(y). X and Y are independent iff f(x,y) = f_X(x) f_Y(y). Now E(XY) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} xy f(x,y) dx dy. This can be useful for finding cov(X,Y) = E(XY) − E(X)E(Y). Similarly, E(X²Y + e^Y) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} (x²y + e^y) f(x,y) dx dy.
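As a small illustration of these definitions (a sketch with made-up pmf values, not an example from the lecture), here is how one might compute the marginals, E(XY), cov(X,Y), and an independence check for a discrete joint pmf in Python:

import numpy as np

x_vals = np.array([0, 1])          # possible values of X
y_vals = np.array([0, 1, 2])       # possible values of Y
pmf = np.array([[0.10, 0.20, 0.10],    # rows indexed by x, columns by y
                [0.30, 0.20, 0.10]])   # entries sum to 1

f_x = pmf.sum(axis=1)              # marginal pmf of X
f_y = pmf.sum(axis=0)              # marginal pmf of Y

EX  = (x_vals * f_x).sum()
EY  = (y_vals * f_y).sum()
EXY = sum(x * y * pmf[i, j]
          for i, x in enumerate(x_vals)
          for j, y in enumerate(y_vals))
cov_xy = EXY - EX * EY

# Independence would require f(x,y) = f_X(x) f_Y(y) for every cell.
independent = np.allclose(pmf, np.outer(f_x, f_y))
print(f_x, f_y, EX, EY, cov_xy, independent)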

3 Bivariate and marginal density. Suppose the joint density of X and Y is f(x,y) = a exp(x+y), for (x,y) in (0,1) × (0,1). What is a? What is the marginal density of Y? What type of distribution does X have conditional on Y? What is E(X|Y)? What is the mean of X when Y = 2? Are X and Y independent? ∫_0^1 ∫_0^1 a exp(x+y) dx dy = 1 = a ∫_0^1 ∫_0^1 exp(x)exp(y) dx dy = a ∫_0^1 exp(x) dx ∫_0^1 exp(y) dy = a(e−1)², so a = (e−1)^{-2}. The marginal density of Y is f(y) = ∫_0^1 a exp(x+y) dx = a exp(y) ∫_0^1 exp(x) dx = a exp(y)(e−1) = exp(y)/(e−1). Conditional on Y, the density of X is f(x|y) = f(x,y)/f(y) = a exp(x+y)(e−1)/exp(y) = exp(x)/(e−1). So X|Y is like an exponential(1) random variable restricted to (0,1). E(X|Y) = ∫_0^1 x exp(x)/(e−1) dx = 1/(e−1) [x exp(x) − ∫ exp(x) dx] = 1/(e−1) [x exp(x) − exp(x)]_0^1 = 1/(e−1) [e − e − 0 + 1] = 1/(e−1). When Y = 2, E(X|Y) = 1/(e−1). f(y) = exp(y)/(e−1) and similarly f(x) = exp(x)/(e−1), so f(x)f(y) = exp(x+y)/(e−1)² = f(x,y), so X and Y are independent.
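A quick numerical check of this example (a sketch; the scipy-based integration is just one way to verify the algebra):

import numpy as np
from scipy.integrate import dblquad, quad

a = (np.e - 1) ** -2

# Total probability over (0,1) x (0,1) should be 1.
# dblquad integrates func(y, x) with y as the inner variable.
total, _ = dblquad(lambda y, x: a * np.exp(x + y), 0, 1,
                   lambda x: 0, lambda x: 1)
print(total)                                  # ~1.0

# Marginal density of Y at y = 0.5: integrate f(x, y) over x in (0,1).
y0 = 0.5
fy_numeric, _ = quad(lambda x: a * np.exp(x + y0), 0, 1)
print(fy_numeric, np.exp(y0) / (np.e - 1))    # the two should agree

# E(X | Y): integrate x * f(x | y) = x * exp(x)/(e-1) over (0,1).
EX_given_Y, _ = quad(lambda x: x * np.exp(x) / (np.e - 1), 0, 1)
print(EX_given_Y, 1 / (np.e - 1))             # ~0.582 both ways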

4 2. Central Limit Theorem (CLT), ch 7.4. Sample mean X̄ = Σ X_i / n. iid: independent and identically distributed. Suppose X_1, X_2, etc. are iid with expected value µ and sd σ. LAW OF LARGE NUMBERS (LLN): X̄ ---> µ. CENTRAL LIMIT THEOREM (CLT): (X̄ − µ) / (σ/√n) ---> Standard Normal. Useful for tracking results.

5 [Figure: standard normal density. About 95% of the probability lies between −1.96 and 1.96.]

6 [Figures: sampling distributions of the sample mean. Panel 1 truth: −49 to 51, expected value µ = 1.0. Panel 2 truth: −49 or 51, each with probability 1/2, expected value 1.0.]

7 [Figure: truth is uniform on −49 to 51, µ = 1.0. Estimated using X̄ ± 1.96 s/√n = 0.95 ± ... in this example.]

8 Central Limit Theorem (CLT): if X_1, X_2, ..., X_n are iid with mean µ and SD σ, then (X̄ − µ) / (σ/√n) ---> Standard Normal (mean 0, SD 1). In other words, X̄ has mean µ and a standard deviation of σ/√n. Two interesting things about this: (i) As n --> ∞, X̄ --> normal, even if the X_i are far from normal. e.g. the average number of pairs per hand, out of n hands. The X_i are 0-1 (Bernoulli). µ = p = P(pair) = 3/51 = 5.88%. σ = √(pq) = √(5.88% × 94.12%) ≈ 23.53%. (ii) We can use this to find a range where X̄ is likely to be. About 95% of the time, a standard normal random variable is within −1.96 to 1.96. So 95% of the time, (X̄ − µ)/(σ/√n) is within −1.96 to 1.96. So 95% of the time, (X̄ − µ) is within −1.96(σ/√n) to 1.96(σ/√n). So 95% of the time, X̄ is within µ − 1.96(σ/√n) to µ + 1.96(σ/√n). That is, 95% of the time, X̄ is in the interval µ ± 1.96(σ/√n) = 5.88% ± 1.96(23.53%/√n). For n = 1000, this is 5.88% ± 1.46%. For n = 1,000,000, it is 5.88% ± 0.046%.
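A short simulation sketch of the pairs example (assuming each of the n hands independently contains a pair with probability p = 3/51): it checks that the sample proportion lands in µ ± 1.96 σ/√n about 95% of the time.

import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 3 / 51, 1000, 10_000
sd = np.sqrt(p * (1 - p))                  # ~0.2353 per hand

props = rng.binomial(n, p, size=reps) / n  # sample proportion of pairs, per repetition
lo = p - 1.96 * sd / np.sqrt(n)            # ~0.0443
hi = p + 1.96 * sd / np.sqrt(n)            # ~0.0734
coverage = np.mean((props >= lo) & (props <= hi))
print(lo, hi, coverage)                    # coverage should be roughly 0.95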

9 Another CLT Example. Central Limit Theorem (CLT): if X_1, X_2, ..., X_n are iid with mean µ and SD σ, then (X̄ − µ) / (σ/√n) ---> Standard Normal (mean 0, SD 1). In other words, X̄ is like a draw from a normal distribution with mean µ and standard deviation σ/√n. That is, 95% of the time, X̄ is in the interval µ ± 1.96(σ/√n). Q. Suppose you average $5 profit per hour, with a SD of $60 per hour. If you play 1600 hours, let Y be your average profit over those 1600 hours. Find a range where Y is 95% likely to fall. A. We want µ ± 1.96(σ/√n), where µ = $5, σ = $60, and n = 1600. So the answer is $5 ± 1.96 × $60/√1600 = $5 ± $2.94, or the range [$2.06, $7.94].
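The arithmetic in this answer can be reproduced with a few lines (a minimal sketch):

import math

mu, sd, n = 5.0, 60.0, 1600
margin = 1.96 * sd / math.sqrt(n)          # 1.96 * 60 / 40 = 2.94
print(mu - margin, mu + margin)            # [2.06, 7.94]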

10 3. Confidence Intervals (CIs) for µ, ch 7.5. Central Limit Theorem (CLT): if X_1, X_2, ..., X_n are iid with mean µ and SD σ, then (X̄ − µ) / (σ/√n) ---> Standard Normal (mean 0, SD 1). So, 95% of the time, X̄ is in the interval µ ± 1.96(σ/√n). Typically you know X̄ but not µ. Turning the blue statement above around a bit means that 95% of the time, µ is in the interval X̄ ± 1.96(σ/√n). This range X̄ ± 1.96(σ/√n) is called a 95% confidence interval (CI) for µ. [Usually you don't know σ and have to estimate it using the sample std deviation, s, of your data, and (X̄ − µ)/(s/√n) has a t_{n−1} distribution if the X_i are normal. For n > 30, though, t_{n−1} is very similar to the normal.] 1.96(σ/√n) is called the margin of error.
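A sketch of computing a 95% CI from data, using the sample standard deviation s; the data vector here is made up for illustration, and the t_{n−1} multiplier is shown alongside 1.96:

import numpy as np
from scipy import stats

x = np.array([3.1, -2.0, 5.4, 0.7, 1.2, 4.8, -1.1, 2.9, 0.3, 3.6])
n, xbar, s = len(x), x.mean(), x.std(ddof=1)   # sample mean and sample sd

# Normal (z) interval: xbar +/- 1.96 s/sqrt(n)
z_ci = (xbar - 1.96 * s / np.sqrt(n), xbar + 1.96 * s / np.sqrt(n))

# t interval: replaces 1.96 with the t_{n-1} quantile (~2.26 for n = 10)
t_mult = stats.t.ppf(0.975, df=n - 1)
t_ci = (xbar - t_mult * s / np.sqrt(n), xbar + t_mult * s / np.sqrt(n))
print(z_ci, t_ci)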

11 The range X̄ ± 1.96(s/√n) is a 95% confidence interval for µ, and 1.96(s/√n) is the margin of error. [Figure: Tom Dwan's results, from fulltiltpoker.com.] Based on the data, can we conclude Dwan is a better player? Is his long-term avg. µ > 0? Over these 39,000 hands, Dwan profited $2 million, about $51/hand, with sd ≈ $10,000. A 95% CI for µ is $51 ± 1.96($10,000/√39,000) = $51 ± $99 = (−$48, $150). Results are inconclusive, even after 39,000 hands! 4. Sample size calculation. How many more hands are needed? If Dwan keeps winning $51/hand, then we want n so that the margin of error = $51. 1.96(σ/√n) = $51 means 1.96($10,000)/√n = $51, so n = [(1.96)($10,000)/($51)]² ≈ 148,000, so about 109,000 more hands.
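A minimal sketch reproducing these two calculations (the CI and the sample-size requirement), using the numbers quoted on the slide:

import math

xbar, s, n = 51.0, 10_000.0, 39_000
margin = 1.96 * s / math.sqrt(n)                  # ~ $99 per hand
print(xbar - margin, xbar + margin)               # ~ (-48, 150)

target = 51.0                                     # desired margin of error
n_needed = (1.96 * s / target) ** 2               # ~ 148,000 hands total
print(n_needed, n_needed - n)                     # ~ 109,000 more hands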

12 5. Review problems for Exam 2. What is the probability you will have a full house on the turn? For example, K. There are 13 possibilities for the number on the triplet, and for each such choice, there are C(4,3) choices for the suits on the triplet; for each such choice, there are 12 possibilities for the number on the pair, and for each such choice, there are C(4,2) possibilities for the suits of the pair; and for each such choice, there are 44 possibilities left for the last card to go with the full house. It is tempting to answer 13 × C(4,3) × 12 × C(4,2) × 44. But this would ignore the hands containing two triplets, which also form a full house. So add in C(13,2) × C(4,3) × C(4,3). The answer is [13 × C(4,3) × 12 × C(4,2) × 44 + C(13,2) × C(4,3) × C(4,3)] / C(52,6) ≈ 1 in 123.
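The count can be verified exactly (a short sketch using math.comb):

from math import comb

triplet_pair = 13 * comb(4, 3) * 12 * comb(4, 2) * 44   # triplet + pair + side card
two_triplets = comb(13, 2) * comb(4, 3) * comb(4, 3)    # two three-of-a-kinds
p = (triplet_pair + two_triplets) / comb(52, 6)
print(p, 1 / p)                                          # ~0.00815, i.e. ~1 in 123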

13 5. Review problems for Exam 2. What is the probability you will have a straight on the turn? Include the case where you have a straight and you also have a flush, or where you have a straight flush. For example, Q or Q. First, note there are 10 straights: A2345, 23456, ..., 10JQKA. It is tempting to say 10 × 4^5 × 32: 10 choices for the numbers on the straight, 4×4×4×4×4 choices for the suits on the straight, and 52 − 4×5 = 32 possibilities for the other card. But there are two problems here. We have not counted the hands where the sixth card pairs one of the cards in the straight. And we have double-counted the hands containing two different straights, such as six consecutive numbers. One way to do this is as follows. First, consider cases with no pairs, and count A2345 separately. For any other straight, there are 28 cards to go with it so that the hand does not contain a straight already counted. For 45678, for instance, the last card must not be a 3 and must not pair the 4, 5, 6, 7 or 8, so there are 28 cards left. For A2345, there are 32 cards to go with it. So we have 9 × 4^5 × 28 + 4^5 × 32 combinations with no pairs, plus 10 × 5 × C(4,2) × 4^4 combinations with a pair, and the answer is (9 × 4^5 × 28 + 4^5 × 32 + 10 × 5 × C(4,2) × 4^4) / C(52,6) ≈ 1 in 55.4.
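The same check for the straight count (a short sketch mirroring the three cases above):

from math import comb

no_pair_other = 9 * 4**5 * 28          # 23456 through 10JQKA, side card restricted
no_pair_wheel = 4**5 * 32              # A2345, any non-pairing side card
one_pair      = 10 * 5 * comb(4, 2) * 4**4   # one rank of the straight is paired
p = (no_pair_other + no_pair_wheel + one_pair) / comb(52, 6)
print(p, 1 / p)                        # ~0.0181, i.e. ~1 in 55.4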

14 5. Review problems for Exam 2. Why 10 × 5 × C(4,2) × 4^4 combinations with a pair? There are 10 choices for the numbers on the straight. For each such choice, there are 5 choices for the number to be paired up, like the 5. For each such choice, there are C(4,2) possibilities for the suits on the pair. For each such choice, there are 4 possibilities for the suit of each of the other 4 cards, giving 4^4.

15 5. Review problems for Exam 2. Suppose you have 10 players at the table. What is the expected number of players who have 2 face cards? P(2 face cards) = C(12,2)/C(52,2) ≈ 4.98%. Let X1 = 1 if player 1 has 2 face cards, and X1 = 0 otherwise; X2 = 1 if player 2 has 2 face cards, and X2 = 0 otherwise; etc. X = Σ Xi = total number of players with 2 face cards. E(X) = Σ E(Xi) = 10 × 4.98% ≈ 0.498.
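A two-line check of this calculation:

from math import comb

p_two_face = comb(12, 2) / comb(52, 2)   # ~0.0498 per player
print(p_two_face, 10 * p_two_face)       # expected ~0.498 players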

16 Let X ~ N(0, 0.8²) and e ~ N(0, 0.1²), with e independent of X, and let Y = 7 + 0.2X + e. Find E(X), E(Y), E(Y|X), var(X), var(Y), cov(X,Y), and r = cor(X,Y). E(X) = 0. E(Y) = E(7 + 0.2X + e) = 7 + 0.2 E(X) + E(e) = 7. E(Y|X) = E(7 + 0.2X + e | X) = 7 + 0.2X + E(e|X) = 7 + 0.2X, since e and X are independent. var(X) = 0.64. var(Y) = var(7 + 0.2X + e) = var(0.2X + e) = 0.2² var(X) + var(e) + 2(0.2) cov(X,e) = 0.04(0.64) + 0.01 + 0 = 0.0356. cov(X,Y) = cov(X, 7 + 0.2X + e) = 0.2 var(X) + cov(X,e) = 0.2(0.64) + 0 = 0.128. r = cov(X,Y)/(sd(X) sd(Y)) = 0.128 / (0.8 × √0.0356) ≈ 0.848.
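A simulation sketch checking these moments. The model here, Y = 7 + 0.2X + e with X ~ N(0, 0.8²) and e ~ N(0, 0.1²), is reconstructed from the slide's arithmetic rather than quoted verbatim, so treat the parameter values as assumptions:

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.normal(0, 0.8, n)                # assumed sd(X) = 0.8
e = rng.normal(0, 0.1, n)                # assumed sd(e) = 0.1
Y = 7 + 0.2 * X + e                      # assumed model from the slide's arithmetic

print(Y.mean())                          # ~7
print(X.var(), Y.var())                  # ~0.64 and ~0.0356
print(np.cov(X, Y)[0, 1])                # ~0.128
print(np.corrcoef(X, Y)[0, 1])           # ~0.848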

17 Suppose (X,Y) are bivariate normal with E(X) = 10, var(X) = 9, E(Y) = 30, var(Y) = 4, and r = 0.3. What is the distribution of Y given X = 7? Given X = 7, Y is normal. Write Y = b_1 + b_2 X + e, where e is normal with mean 0 and independent of X. Recall b_2 = r σ_y/σ_x = 0.3 × 2/3 = 0.2. So Y = b_1 + 0.2X + e. To get b_1, note 30 = E(Y) = b_1 + 0.2 E(X) + E(e) = b_1 + 0.2(10) + 0. So 30 = b_1 + 2, and b_1 = 28. So Y = 28 + 0.2X + e, where e is normal with mean 0 and independent of X. What is var(e)? 4 = var(Y) = var(28 + 0.2X + e) = 0.2² var(X) + var(e) + 2(0.2) cov(X,e) = 0.04(9) + var(e) + 0. So var(e) = 4 − 0.04(9) = 3.64 and sd(e) = √3.64 ≈ 1.91. So Y = 28 + 0.2X + e, where e ~ N(0, 1.91²) and is independent of X. Given X = 7, Y = 28 + 0.2(7) + e = 29.4 + e, so Y | X=7 ~ N(29.4, 1.91²).
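A simulation sketch of this conditional distribution (drawing bivariate normal pairs with the stated parameters and conditioning on X near 7; the 0.05 tolerance is an arbitrary choice):

import numpy as np

rng = np.random.default_rng(2)
mu = [10, 30]
cov_xy = 0.3 * 3 * 2                       # r * sd(X) * sd(Y) = 1.8
Sigma = [[9, cov_xy], [cov_xy, 4]]
X, Y = rng.multivariate_normal(mu, Sigma, size=2_000_000).T

sel = np.abs(X - 7) < 0.05                 # keep draws with X close to 7
print(Y[sel].mean(), Y[sel].std())         # ~29.4 and ~1.91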
