Euclidean Space

Definition 1 (Euclidean Space) A Euclidean space is a finite-dimensional vector space over the reals $\mathbb{R}$, with an inner product $\langle \cdot, \cdot \rangle$.


Inner Product

Definition 2 (Inner Product) An inner product $\langle \cdot, \cdot \rangle$ on a real vector space $X$ is a symmetric, bilinear, positive-definite function
$$\langle \cdot, \cdot \rangle : X \times X \to \mathbb{R}, \quad (x, x') \mapsto \langle x, x' \rangle.$$
(Positive-definite means $\langle x, x \rangle > 0$ unless $x = 0$.)

Orthogonal

Definition 3 (Orthogonal) Two vectors $x$ and $x'$ are orthogonal if their inner product is zero: $\langle x, x' \rangle = 0$. Geometrically, orthogonal means perpendicular.

Orthonormal Basis

Definition 4 (Orthonormal Basis) In a Euclidean space, an orthonormal basis is a basis $x_i$ such that
$$\langle x_i, x_j \rangle = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j. \end{cases}$$
Any two distinct basis vectors are orthogonal. A Euclidean space has more than one orthonormal basis.

If $x = \sum_i x^i x_i$ and $x' = \sum_i x'^i x_i$ in an orthonormal basis $x_i$, then $\langle x, x' \rangle = \sum_i x^i x'^i$.

$\mathbb{R}$

For the real numbers $\mathbb{R}$, the inner product is just ordinary multiplication.

Definition 5 The Euclidean space $\mathbb{R}$ of real numbers is defined by the inner product $\langle x, x' \rangle := x x'$.

$\mathbb{R}^n$

The Euclidean space $\mathbb{R}^n := \mathbb{R} \times \cdots \times \mathbb{R}$ ($n$ times), in which the elements are vectors with $n$ real components. By assumption, the $n$ vectors
$$\begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \dots, \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}$$
form an orthonormal basis. The inner product of two vectors is then the sum of the component-by-component products.
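As a quick check of this construction, here is a minimal NumPy sketch (the vectors are made up for illustration): the standard basis vectors are pairwise orthonormal, and the inner product is the sum of component-by-component products.

```python
import numpy as np

# Standard basis of R^3: the columns of the identity matrix.
basis = np.eye(3)

# Orthonormality: <e_i, e_j> = 1 if i == j, else 0.
assert np.allclose(basis.T @ basis, np.eye(3))

# Inner product of two vectors = sum of component-by-component products.
x = np.array([1.0, 2.0, 3.0])
xp = np.array([4.0, -1.0, 0.5])
assert np.isclose(np.dot(x, xp), np.sum(x * xp))
print(np.dot(x, xp))  # 3.5
```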

Isomorphic

In abstract algebra, isomorphic means "the same." If two objects of a given type (group, ring, vector space, Euclidean space, algebra, etc.) are isomorphic, then they are the same when considered as objects of that type. An isomorphism is a one-to-one and onto mapping from one space to the other that preserves all properties defining the space. Any $n$-dimensional Euclidean space is isomorphic to $\mathbb{R}^n$. Two spaces may be isomorphic as Euclidean spaces yet fail to be isomorphic when viewed as objects of another type.

Coordinate-Free Versus Basis

It is useful to think of a vector in a Euclidean space as coordinate-free. Given a basis, any vector can be expressed uniquely as a linear combination of the basis elements. For example, if $x = \sum_i x^i x_i$ for some basis $x_i$, one can refer to the $x^i$ as the coordinates of $x$ in terms of this basis. Many linear algebra textbooks develop all their results in terms of a basis.

In economic theory and econometrics, vectors are typically not seen as coordinate-free. A particular basis is singled out, and one works with coordinates. Commonly there is a natural basis, but the natural basis need not be orthonormal. Despite this tradition, the coordinate-free point of view is superior. Avoiding coordinates reduces the use of subscripts and makes expressions simpler, and theorems are easier to state and to prove.

Linear Transformation

Definition 6 (Linear Transformation) A linear transformation from a Euclidean space $X$ to a Euclidean space $Y$ is a function
$$A : X \to Y, \quad x \mapsto y = Ax$$
such that $A(x_1 + x_2) = A x_1 + A x_2$ and $A(\lambda x) = \lambda A x$ for all $x_1, x_2, x \in X$ and $\lambda \in \mathbb{R}$.

Adjoint

The following proposition is a standard theorem of linear algebra.

Proposition 7 (Adjoint) Given a linear transformation $A : X \to Y$, there exists a unique linear transformation (the adjoint) $A^* : Y \to X$ that preserves the inner product:
$$\langle y, Ax \rangle = \langle A^* y, x \rangle \quad (1)$$
for all $x$ and $y$.

The adjoint is very important in applications, yet it has been underappreciated by economists. The adjoint is independent of any choice of bases, and in many applications one can determine it directly, expressed in a coordinate-free way. The adjoint then becomes a powerful tool, and one can easily obtain valuable results via the adjoint, almost as if by magic. Typically one does not calculate the adjoint directly. Instead one conjectures an expression for the adjoint and then verifies that the adjoint condition (1) holds.
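A minimal sketch of this conjecture-and-verify workflow in NumPy, assuming the standard inner products on $\mathbb{R}^3$ and $\mathbb{R}^2$ (the map $A$ is a made-up example): conjecture that the transpose is the adjoint, then check condition (1) on random vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

A = rng.standard_normal((2, 3))   # a linear map R^3 -> R^2
A_star = A.T                      # conjectured adjoint

# Verify the adjoint condition <y, Ax> = <A* y, x> on random vectors.
for _ in range(100):
    x = rng.standard_normal(3)
    y = rng.standard_normal(2)
    assert np.isclose(y @ (A @ x), (A_star @ y) @ x)
```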

Matrix Representation

A matrix representation for a linear transformation $A : X \to Y$ is a matrix $A_{ij}$ that shows how basis elements $x_j \in X$ map to linear combinations of basis elements $y_i \in Y$:
$$x_j \mapsto A x_j = \sum_i A_{ij} y_i.$$
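One can build such a matrix column by column. Below is a sketch, assuming the basis of $Y$ is orthonormal so the coefficients $A_{ij} = \langle y_i, A x_j \rangle$ can be read off with inner products; the example map (a rotation of $\mathbb{R}^2$) and the helper name matrix_rep are hypothetical.

```python
import numpy as np

def matrix_rep(apply_A, basis_X, basis_Y):
    """Matrix A_ij with A x_j = sum_i A_ij y_i, assuming basis_Y is orthonormal."""
    return np.array([[yi @ apply_A(xj) for xj in basis_X] for yi in basis_Y])

# Hypothetical map on R^2: rotation by 90 degrees.
rotate = lambda v: np.array([-v[1], v[0]])
e = np.eye(2)
print(matrix_rep(rotate, e, e))   # [[0. -1.], [1. 0.]]
```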

Adjoint as Transpose

If the bases for $X$ and $Y$ are each orthonormal, then the matrix representation of the adjoint is the transpose of the matrix representation:
$$A^* y_i = \sum_j A_{ij} x_j.$$

To prove this relationship, verify the adjoint condition (1) for arbitrary basis elements:
$$\langle A^* y_i, x_j \rangle = \Big\langle \sum_k A_{ik} x_k, x_j \Big\rangle = A_{ij} \quad \text{(since the basis $x_j$ is orthonormal)}$$
$$= \Big\langle y_i, \sum_k A_{kj} y_k \Big\rangle \quad \text{(since the basis $y_i$ is orthonormal)} = \langle y_i, A x_j \rangle,$$
as desired.

On the other hand, if the bases are not orthonormal, then the transpose of the matrix representation is not the matrix representation of the adjoint. 17
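To see this numerically, here is a sketch under an assumption not made explicit above: represent the inner products on $X$ and $Y$ in coordinates by Gram matrices $G_X$ and $G_Y$ of the (non-orthonormal) bases, so that $\langle u, v \rangle = u^\top G v$. Solving condition (1) then gives the adjoint's matrix as $G_X^{-1} A^\top G_Y$, which reduces to the transpose only when both Gram matrices are the identity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gram matrices of non-orthonormal bases (symmetric positive-definite).
B_X = rng.standard_normal((3, 3)); G_X = B_X.T @ B_X
B_Y = rng.standard_normal((2, 2)); G_Y = B_Y.T @ B_Y

A = rng.standard_normal((2, 3))              # the map, in coordinates
A_star = np.linalg.solve(G_X, A.T @ G_Y)     # G_X^{-1} A^T G_Y

# Check <y, Ax>_Y = <A* y, x>_X for random coordinate vectors.
x = rng.standard_normal(3)
y = rng.standard_normal(2)
assert np.isclose(y @ G_Y @ (A @ x), (A_star @ y) @ G_X @ x)

# The plain transpose generally fails the adjoint condition here.
print(np.isclose(y @ G_Y @ (A @ x), (A.T @ y) @ G_X @ x))  # typically False
```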

Since we want to view vectors as coordinate-free, however, the matrix representation is of secondary importance. Except in simple cases, it may be difficult to write down the matrix representation explicitly. At the same time, one can often describe the adjoint easily, without reference to any basis.

Riesz Representation

A fundamental theorem states that any linear function $X \to \mathbb{R}$ can be expressed as $x \mapsto \langle y, x \rangle$ for a unique $y$.
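A coordinate sketch of the representation, assuming the standard orthonormal basis of $\mathbb{R}^3$ (the functional f is made up): the representing vector $y$ has as components $f$ applied to the basis vectors.

```python
import numpy as np

# A hypothetical linear functional on R^3.
f = lambda x: 2.0 * x[0] - x[1] + 0.5 * x[2]

# Riesz representer: y_i = f(e_i) over the (orthonormal) standard basis.
y = np.array([f(e) for e in np.eye(3)])

x = np.array([1.0, 4.0, -2.0])
assert np.isclose(f(x), y @ x)   # f(x) = <y, x>
```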

For $y \in X$, the adjoint of the linear transformation
$$y : \mathbb{R} \to X, \quad z \mapsto x = zy$$
is
$$y^* : X \to \mathbb{R}, \quad x \mapsto z = \langle y, x \rangle.$$

Verify that the adjoint condition (1) holds:
$$\langle x, yz \rangle = \langle x, y \rangle z = \langle y, x \rangle z = \langle \langle y, x \rangle, z \rangle = \langle y^* x, z \rangle.$$
Thus $y^* x = \langle y, x \rangle$. The two notations are equivalent, but normally we employ the inner product notation on the right-hand side.

Matrix Representation

Suppose
$$y = \sum_i y^i x_i$$
for a basis $x_i$. Let us use the natural orthonormal basis $1$ for $\mathbb{R}$. The matrix representation of the linear transformation $y$ is given by $1 \mapsto \sum_i y^i x_i$, so the column vector with components $y^i$ is the matrix representation. For the adjoint $y^*$, however, the matrix representation is not the transpose of this vector, unless the basis $x_i$ is orthonormal.

The matrix representation of the adjoint is
$$x_j \mapsto \langle y, x_j \rangle \, 1 = \Big\langle \sum_i y^i x_i, x_j \Big\rangle 1 = \sum_i \langle x_i, x_j \rangle \, y^i \, 1.$$
For a nonorthonormal basis, the matrix representation of the adjoint is not $x_j \mapsto y^j \, 1$.

Fundamental Theorem of Linear Algebra

The fundamental theorem of linear algebra states that the null space $N(A)$ and the range $R(A^*)$ are orthogonal, and any $x \in X$ can be written uniquely as an element of $N(A)$ plus an element of $R(A^*)$. The same relationship holds in $Y$ for the range $R(A)$ and the null space $N(A^*)$.
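A numerical sketch of this decomposition (the matrix is a made-up example): an orthonormal basis of $R(A^*)$ can be read off the singular value decomposition, giving the orthogonal projection of any $x$ onto $R(A^*)$; the remainder lies in $N(A)$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 4))   # rank 2, so N(A) has dimension 2

# Orthonormal basis of R(A*): rows of Vt with nonzero singular values.
U, s, Vt = np.linalg.svd(A)
r = np.sum(s > 1e-12)
P_row = Vt[:r].T @ Vt[:r]         # orthogonal projection onto R(A*)

x = rng.standard_normal(4)
x_row = P_row @ x                 # component in R(A*)
x_null = x - x_row                # component in N(A)

assert np.allclose(A @ x_null, 0)      # x_null lies in the null space
assert np.isclose(x_row @ x_null, 0)   # the two pieces are orthogonal
```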

Moore-Penrose Generalized Inverse

Using the fundamental theorem of linear algebra, we define the Moore-Penrose generalized inverse. Consider a linear transformation
$$A : X \to Y, \quad x \mapsto y = Ax.$$
The generalized inverse $A^+$ is a linear transformation mapping $Y \to X$.

Using the fundamental theorem of linear algebra, one can prove the following.

Proposition 8 (Inverse) The restriction of $A$ to $R(A^*)$,
$$A : R(A^*) \to R(A), \quad x \mapsto y = Ax,$$
is one-to-one and onto, so it has an inverse.

Define the generalized inverse $A^+ : Y \to X$ via this inverse mapping. For $y \in R(A)$, define $A^+ y$ as the image of $y$ under this inverse. For $y \in N(A^*)$, define $A^+ y = 0$. This definition rests on the coordinate-free approach to linear algebra. The relationship between $A$ and $A^+$ is symmetric: a linear transformation is the generalized inverse of its generalized inverse, $(A^+)^+ = A$. Also $A A^+ A = A$. The singular-value decomposition yields further results. If $A$ is onto, then $A A^*$ is invertible, and $A^+ = A^* (A A^*)^{-1}$.
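These properties are straightforward to check numerically; here is a sketch using NumPy's np.linalg.pinv, which computes the Moore-Penrose generalized inverse (the matrix is a made-up example with full row rank, hence onto).

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3))     # full row rank, hence onto

A_plus = np.linalg.pinv(A)

assert np.allclose(A @ A_plus @ A, A)           # A A+ A = A
assert np.allclose(np.linalg.pinv(A_plus), A)   # (A+)+ = A
# For onto A, A+ = A* (A A*)^{-1}; with the standard inner product
# the adjoint is the transpose.
assert np.allclose(A_plus, A.T @ np.linalg.inv(A @ A.T))
```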

Linear Equation

For the linear equation $Ax = y$, there is a solution $x$ if and only if $y \in R(A)$. If there is a solution, then the unique solution in $R(A^*)$ is $A^+ y$. This vector plus any element of $N(A)$ is also a solution. Hence the complete solution set is $\{A^+ y\} + N(A)$.
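A sketch of the complete solution set (matrix and data made up): construct $y \in R(A)$ by applying $A$ to something, take the particular solution $A^+ y$, and perturb it by an arbitrary element of $N(A)$.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((2, 4))
y = A @ rng.standard_normal(4)     # guarantees y is in R(A)

x_part = np.linalg.pinv(A) @ y     # the unique solution in R(A*)

# Null-space basis from the SVD: rows of Vt beyond the rank.
U, s, Vt = np.linalg.svd(A)
N = Vt[np.sum(s > 1e-12):]         # each row is a null vector

# Any particular solution plus a null-space element still solves Ax = y.
x = x_part + N.T @ rng.standard_normal(len(N))
assert np.allclose(A @ x, y)
```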

Order

The concept of a Euclidean space does not involve any notion of order, of one vector being greater than another. Commonly, however, one defines the additional structure of a partial ordering via the representation of a vector in a natural basis.

Definition 9 (Partial Ordering) Given a basis representation $x = \sum_j x^j x_j$, then
$x \geq 0$ if every $x^j \geq 0$;
$x > 0$ if every $x^j \geq 0$ and some $x^i > 0$;
$x \gg 0$ if every $x^j > 0$.
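In coordinates these three relations are simple componentwise tests; a sketch (the helper names are hypothetical):

```python
import numpy as np

def geq(x):  return np.all(x >= 0)                      # x >= 0
def gneq(x): return np.all(x >= 0) and np.any(x > 0)    # x > 0
def gg(x):   return np.all(x > 0)                       # x >> 0

x = np.array([0.0, 2.0, 1.0])
print(geq(x), gneq(x), gg(x))   # True True False
```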

Definition of the Inner Product

Embedding model structure into the inner product simplifies the analysis. In a Euclidean space of random variables, one might define the inner product of two random variables as their covariance. Orthogonality then means zero correlation. A different definition of the inner product derives from a partial ordering: one defines a trace inner product consistent with the ordering.
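A sketch of the covariance inner product on random variables with finitely many, equally likely states (the payoff vectors are made up): the inner product is the mean product of deviations from means, and orthogonality is exactly zero correlation.

```python
import numpy as np

# Random variables as payoff vectors over 4 equally likely states.
u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([1.0, -1.0, -1.0, 1.0])

def cov_inner(a, b):
    """Covariance inner product: <a, b> = E[(a - Ea)(b - Eb)]."""
    return np.mean((a - a.mean()) * (b - b.mean()))

print(cov_inner(u, v))   # 0.0 here: u and v are uncorrelated, hence orthogonal
assert np.isclose(cov_inner(u, v), np.cov(u, v, bias=True)[0, 1])
```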