Overview
By the end of the week:
- Know the basics of git
- Make sure we can all compile and run a C++/OpenGL program
- Understand the OpenGL rendering pipeline
- Understand how matrices are used for geometric transformations
- Understand how the projection from 3D to 2D is encoded in a matrix
- Load and use an image texture

OpenGL Vertex Transformation
Moving a point in 3D space to a 2D screen.

Clip Coordinates
The view frustum is defined from the point of view of the camera.

Clip Coordinates
Defining the view frustum using a perspective transformation.

Projection Matrix
The projection matrix defines how much of the world is seen by the camera. It encodes the following information:
- The near plane and the far plane: the range of depth in the world that the camera can see.
- The field of view: the angle that the camera sees in the y direction.
- The aspect ratio of the screen onto which the world will be projected.

Projection Matrix
- The near plane and far plane define distances along the z axis from the camera origin. The near plane must be at a distance > 0 and the far plane must be < infinity. Common values are 0.1 and 100, but it depends on how you decide to position things in the world.
- The field of view, or fovy, defines the viewing angle in the y direction.
- The aspect ratio (width/height) of the screen then determines the clipping in the x direction.
- Together these values define the view frustum in terms of 6 bounds: the left, right, top, bottom, near, and far bounds of the world.
- The projection matrix transforms the view frustum into a unit cube.

Projection Matrix
The actual projection matrix looks like this:

( 2n/(r-l),       0,         (r+l)/(r-l),        0      )
(     0,      2n/(t-b),      (t+b)/(t-b),        0      )
(     0,          0,        -(f+n)/(f-n),   -2fn/(f-n)  )
(     0,          0,             -1,              0     )

where n and f are the near and far planes, t and b are determined by the fovy, and l and r are further determined by the aspect ratio.
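
As a minimal sketch of how these entries come about (my own illustration, not code from the slides): the frustum bounds t, b, l, r can be computed from fovy, the aspect ratio, and the near plane, and then plugged into the matrix above. The helper name perspectiveFromFrustum is hypothetical.

#include <cmath>
#include <glm/glm.hpp>

// Hypothetical helper: build the perspective matrix by first computing the
// frustum bounds from fovy (in degrees), the aspect ratio, and the near plane.
glm::mat4 perspectiveFromFrustum(float fovyDegrees, float aspect, float n, float f)
{
    float t = n * std::tan(glm::radians(fovyDegrees) / 2.0f); // top bound from fovy
    float b = -t;                                             // symmetric frustum
    float r = t * aspect;                                     // right bound from aspect ratio
    float l = -r;

    glm::mat4 P(0.0f); // GLM matrices are column-major, so P[col][row]
    P[0][0] = 2.0f * n / (r - l);
    P[1][1] = 2.0f * n / (t - b);
    P[2][0] = (r + l) / (r - l);
    P[2][1] = (t + b) / (t - b);
    P[2][2] = -(f + n) / (f - n);
    P[2][3] = -1.0f;
    P[3][2] = -2.0f * f * n / (f - n);
    return P;
}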

Useful GLM methods

// creates a symmetrical perspective projection matrix
// args 1,2,3,4 = fovy, aspect ratio, near plane, far plane
// (newer versions of GLM expect fovy in radians, e.g. glm::radians(60.0f))
glm::mat4 proj = glm::perspective(60.0f, width / height, 0.1f, 100.0f);

glm::vec3 camera_pos = glm::vec3(0, 0, -2);
glm::vec3 camera_look_at = glm::vec3(0, 0, 0);
glm::vec3 camera_up = glm::vec3(0, 1, 0);

// pos = position of the camera in world space
// look_at = position the camera is looking at; defines the view vector emanating out from the camera
// up = the orientation of the camera around the view vector
glm::mat4 view = glm::lookAt(camera_pos, camera_look_at, camera_up);
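
Once built, these matrices get uploaded to the shader uniforms (proj, view, model in the vertex shader later in these notes). A minimal sketch, assuming program here is the raw GLuint program handle (the later slides wrap it in a small helper class) and model is an identity matrix:

#include <glm/gtc/type_ptr.hpp> // for glm::value_ptr

glm::mat4 model = glm::mat4(1.0f); // identity model matrix for this sketch

// Look up the uniform locations once the program is linked and bound.
GLint projLoc  = glGetUniformLocation(program, "proj");
GLint viewLoc  = glGetUniformLocation(program, "view");
GLint modelLoc = glGetUniformLocation(program, "model");

// Upload the matrices; GL_FALSE because GLM matrices are already column-major.
glUniformMatrix4fv(projLoc,  1, GL_FALSE, glm::value_ptr(proj));
glUniformMatrix4fv(viewLoc,  1, GL_FALSE, glm::value_ptr(view));
glUniformMatrix4fv(modelLoc, 1, GL_FALSE, glm::value_ptr(model));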

Example: Transforming a Vertex
To transform a 3D point from object coordinates into 2D window coordinates we do the following operations (see the code sketch after this list):
- Start with a vertex v_o in object coordinates (x_o, y_o, z_o, w_o), where w_o is always 1.
- Put the point into eye coordinates by multiplying it by the MODELVIEW matrix M (which concatenates the transformations from object coordinates → world coordinates → eye coordinates): v_e = M v_o
- Put the vertex into clip coordinates by multiplying it by the PROJECTION matrix P: v_c = P v_e
- Put the vertex into normalized device coordinates by dividing by the w_c component of v_c: v_d = (x_c / w_c, y_c / w_c, z_c / w_c)
- Put the vertex into screen space by scaling x_d and y_d by the width and height of the screen: v_p = (width/2 + x_d * width/2, height/2 + y_d * height/2)
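
The same sequence can be checked numerically with GLM. This is my own sketch, reusing the proj and view matrices from the earlier slide; the vertex value and the width/height variables are just assumed examples.

// Walk one vertex through the whole pipeline by hand.
glm::mat4 model = glm::mat4(1.0f);
glm::mat4 M = view * model;                          // MODELVIEW

glm::vec4 v_o = glm::vec4(0.5f, 0.25f, 0.0f, 1.0f);  // object coordinates, w = 1
glm::vec4 v_e = M * v_o;                             // eye coordinates
glm::vec4 v_c = proj * v_e;                          // clip coordinates

glm::vec3 v_d = glm::vec3(v_c) / v_c.w;              // normalized device coordinates (perspective divide)

// viewport transform to pixel coordinates
float x_p = width  / 2.0f + v_d.x * width  / 2.0f;
float y_p = height / 2.0f + v_d.y * height / 2.0f;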

Textures
Loading textures by hand is kind of a pain, so OpenGL environments generally provide helper methods. We're using Cocoa/iOS methods (for Apple) or FreeImage (for Windows and Linux), which handle most of this. A texture is just an array of data; it can be used for images, depth maps, luminance maps, etc. The basic steps (see the loading sketch below):
1. enable textures and generate texture IDs
2. bind a specific texture ID
3. load the image from disk
4. put it into a texture object, usually 2D, RGBA format
5. set texture attributes (e.g., linear filtering, clamping)
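
As a rough sketch of step 3 using FreeImage (my own illustration of the "helper method" idea, not the exact code used in class; the imgWidth/imgHeight/imgPixelData names are placeholders matching the next slide):

#include <FreeImage.h>

// Load a JPEG from disk and convert it to a 32-bit pixel buffer.
FIBITMAP* bitmap = FreeImage_Load(FIF_JPEG, "texture.jpg");
FIBITMAP* image  = FreeImage_ConvertTo32Bits(bitmap);
FreeImage_Unload(bitmap);

int imgWidth  = FreeImage_GetWidth(image);
int imgHeight = FreeImage_GetHeight(image);
BYTE* imgPixelData = FreeImage_GetBits(image);
// Note: FreeImage stores 32-bit pixels as BGRA on little-endian machines,
// so you may need GL_BGRA as the source format when calling glTexImage2D.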

Textures
Textures are copied directly onto the video card, so drawing them is hardware-accelerated. First, we call our helper method to load, say, a JPEG into a buffer of bytes, say a variable called imgPixelData:

glEnable(GL_TEXTURE_2D);
glGenTextures(1, &texID);               // generate 1 texture ID
glBindTexture(GL_TEXTURE_2D, texID);    // bind that texture ID
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imgWidth, imgHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, imgPixelData);
glBindTexture(GL_TEXTURE_2D, 0);        // unbind texture
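
Step 5 from the earlier list (filtering and clamping) can include a few more parameters. A hedged sketch of common choices, not from the slides; call these while the texture is still bound, before the final unbind:

// Magnification filter and wrap modes for the currently bound texture.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);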

Textures - OpenGL

program.bind();
{
    // pass in uniform data... one of which will be the texture unit for our texture
    glUniform1i(program.uniform("u_tex_id"), 0);
    glActiveTexture(GL_TEXTURE0);           // the unit number here must match the 0 above!
    glBindTexture(GL_TEXTURE_2D, texID);    // bind the texture
    {
        // now pass in vertex data...
        glBindVertexArray(vao);
        {
            glDrawElements(GL_TRIANGLES, 12, GL_UNSIGNED_INT, 0);
        }
        glBindVertexArray(0);
    }
    glBindTexture(GL_TEXTURE_2D, 0);        // unbind texture
}
program.unbind();

Textures - vertex shader

uniform mat4 proj;
uniform mat4 view;
uniform mat4 model;

in vec4 vertexPosition;
in vec3 vertexTexCoord;

out vec2 texCoord;

void main() {
    texCoord = vertexTexCoord.xy;
    gl_Position = proj * view * model * vertexPosition;
}

Textures - fragment shader

uniform sampler2D u_tex_id;

in vec2 texCoord;
out vec4 outputFrag;

void main() {
    outputFrag = texture(u_tex_id, texCoord);
}
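
For completeness, a hedged sketch (not from the slides) of how the shader's in variables get their data when setting up the VAO used in the draw call earlier. The attribute names must match the vertex shader; the interleaved vbo layout, and program being the raw GLuint handle rather than the wrapper class, are assumptions for this example only.

// Assuming an interleaved VBO with 4 position floats followed by 3 texcoord floats per vertex.
GLint posLoc = glGetAttribLocation(program, "vertexPosition");
GLint texLoc = glGetAttribLocation(program, "vertexTexCoord");

glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);

glEnableVertexAttribArray(posLoc);
glVertexAttribPointer(posLoc, 4, GL_FLOAT, GL_FALSE, 7 * sizeof(float), (void*)0);

glEnableVertexAttribArray(texLoc);
glVertexAttribPointer(texLoc, 3, GL_FLOAT, GL_FALSE, 7 * sizeof(float), (void*)(4 * sizeof(float)));

glBindVertexArray(0);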

Homework package #1
I will send the first homework out tonight or tomorrow. It will be due Monday 9/15 in the evening (11:59pm).
1. A small programming project that makes use of the basic OpenGL / GLSL we've learned this week (and will cover next week)
2. Some smaller programming examples
3. A series of (hopefully) simple problem-solving questions that you could do by hand
I'll announce details via Piazza...