Game Graphics & Real-time Rendering


Game Graphics & Real-time Rendering
CMPM 163, W2018
Prof. Angus Forbes (instructor) angus@ucsc.edu
Lucas Ferreira (TA) lferreira@ucsc.edu
creativecoding.soe.ucsc.edu/courses/cmpm163
github.com/creativecodinglab

Last week
- Gave an overview of the rendering pipeline
- Looked at two Three.js programs
- Introduced writing custom shaders in Three.js using the RawShaderMaterial object
- Looked at GLSL syntax: https://www.khronos.org/files/webgl/webgl-reference-card-1_0.pdf

This week
- Continue to explore writing custom shaders in Three.js using the RawShaderMaterial object
- Go over GLSL data types and syntax
- Discuss how to pass data from the CPU to the GPU
- Discuss how to pass data between the vertex shader and the fragment shader
- Introduce GLSL textures
- Introduce offscreen buffers (Frame Buffer Objects)
- Introduce the first homework packet
- Code examples: https://creativecoding.soe.ucsc.edu/courses/cmpm163/code/week2_codeexamples.zip

3D Scene
In most graphics frameworks, a scene consists of:
- Cameras: A camera characterizes the extent of the 3D space that will be projected onto the 2D image plane
- Lights: Lights with different positions, orientations, colors, and types are used to illuminate the scene
- Geometry: Objects made out of triangles populate the scene and are lit by the different light sources
- Materials: The objects in the scene can be characterized by different material properties
- Textures: The objects can also have different textures, or images, placed on them
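To make this concrete, here is a minimal Three.js sketch (not taken from the slides) that creates each of these pieces; the texture file name brick.png is a placeholder:

// Build a scene containing a camera, a light, and a textured, lit mesh
var scene = new THREE.Scene();

// Camera: defines the viewing volume projected onto the 2D image plane
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;

// Light: a white point light placed above and in front of the scene
var light = new THREE.PointLight(0xffffff, 1.0);
light.position.set(2, 3, 4);
scene.add(light);

// Geometry + Material + Texture: a box made of triangles, shaded and textured
var texture = new THREE.TextureLoader().load('brick.png'); // placeholder image file
var material = new THREE.MeshPhongMaterial({ map: texture });
var geometry = new THREE.BoxGeometry(1, 1, 1);
scene.add(new THREE.Mesh(geometry, material));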

Rendering pipeline
The rendering pipeline is the series of operations that transform your programmatically defined 3D scene into an actual 2D image. The programmable stages are written as shaders.
- Vertex shader: responsible for turning 3D geometry into normalized device coordinates (2D values between -1 and +1, plus a depth value)
- Fragment/pixel shader: responsible for coloring each of the pixels covered by that geometry, which are then output to the 2D screen
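As a concrete sketch (not the course's exact shaders), a minimal GLSL vertex/fragment pair of the kind used with RawShaderMaterial might look like this; modelViewMatrix, projectionMatrix, and position are the names Three.js supplies by convention:

// Vertex shader: transform each vertex from model space to clip space
// (the GPU then divides by w to get normalized device coordinates)
precision mediump float;
uniform mat4 modelViewMatrix;   // supplied per object by Three.js
uniform mat4 projectionMatrix;  // supplied by the camera
attribute vec3 position;        // supplied per vertex
void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// Fragment shader: color every covered pixel a constant orange
precision mediump float;
void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}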

Perspective camera
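A typical way to set up such a camera in Three.js (a sketch; the parameter values here are illustrative, not taken from the slides):

var camera = new THREE.PerspectiveCamera(
    60,                                      // vertical field of view in degrees
    window.innerWidth / window.innerHeight,  // aspect ratio
    0.1,                                     // near clipping plane
    1000                                     // far clipping plane
);
camera.position.set(0, 0, 5);               // move the camera back from the origin
camera.lookAt(new THREE.Vector3(0, 0, 0));  // aim it at the origin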

uniform: data that is static over the life of the binding. It will be the same for all geometry passed to the shader. Three.js defines the camera data for you automatically if you use its Camera objects and link them to the scene. Uniform variables are available to both the vertex and the fragment shader.

attribute: data that is defined per vertex for each geometry that you pass in (e.g., position, normal, texture coordinates). Three.js defines most of this for you automatically. Attribute data is only available in the vertex shader.

varying: how you pass data from the vertex shader to the fragment shader.
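In GLSL, the three qualifiers appear as declarations like the following (a sketch; vColor and color are illustrative names):

// In the vertex shader:
uniform mat4 projectionMatrix;   // uniform: same value for every vertex in the draw call
attribute vec3 position;         // attribute: a different value for each vertex
attribute vec3 color;
varying vec3 vColor;             // varying: written per vertex, interpolated across the primitive

// In the fragment shader:
varying vec3 vColor;             // receives the interpolated per-fragment value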

uniforms: To define a uniform in Three.js, you use the following syntax:

var uniforms = {
    x_and_y: { type: "2f", value: arraywithtwovals },
    light1_pos: { type: "v3", value: light1_pos },
    light1_diffuse: { type: "v3", value: light1_diffuse },
    light1_specular: { type: "v3", value: light1_specular },
    timerval: { type: "f", value: time },
    somenumber: { type: "i", value: myinteger },
    specialmatrix: { type: "Matrix4fv", value: fancymatrix },
};

It serves as a kind of contract: you are telling the shader what to expect to be passed in from the CPU. Thus, your shader needs to have matching uniform variables defined so that it is ready to receive this data.

https://github.com/mrdoob/three.js/wiki/uniforms-types
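On the GLSL side, the matching declarations for the object above might look like this (a sketch; the mapping of "2f" to vec2 and "Matrix4fv" to a single mat4 is an assumption):

// GLSL side of the "contract": one uniform declaration per entry passed in from the CPU
uniform vec2 x_and_y;
uniform vec3 light1_pos;
uniform vec3 light1_diffuse;
uniform vec3 light1_specular;
uniform float timerval;
uniform int somenumber;
uniform mat4 specialmatrix;

The uniforms object is handed to the shader when you create the material, e.g. new THREE.RawShaderMaterial({ uniforms: uniforms, vertexShader: vs, fragmentShader: fs }).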

varying: Varying variables provide an interface between the vertex and the fragment shader. Vertex shaders compute values per vertex and fragment shaders compute values per fragment. If you define a varying variable in a vertex shader, its value will be interpolated over the primitive being rendered, and you can access the interpolated value in the fragment shader.

Download code: https://creativecoding.soe.ucsc.edu/courses/cmpm163/code/week2_codeexamples.zip

The code example w2_varying.html shows how to use the varying keyword to pass color attribute data from the vertex shader to the fragment shader. It also introduces Three.js's BufferGeometry, which gives you more control over how you set up the geometry; specifically, it lets you define the attribute data yourself. We will create a single triangle, so we must define the positions of all its points ourselves. We also attach two colors to each point of the triangle.

//instantiate a new BufferGeometry
var geometry = new THREE.BufferGeometry();

//define the data for the geometry
var vertices = new Float32Array( [
    Math.cos(toRad(90.0)) * 2,  Math.sin(toRad(90.0)) * 2,  0.0,
    Math.cos(toRad(210.0)) * 2, Math.sin(toRad(210.0)) * 2, 0.0,
    Math.cos(toRad(330.0)) * 2, Math.sin(toRad(330.0)) * 2, 0.0
] );
var colors1 = new Float32Array( [ 1.0,0.0,0.0,  0.0,1.0,0.0,  0.0,0.0,1.0 ] );
var colors2 = new Float32Array( [ 0.0,1.0,0.0,  0.0,0.0,1.0,  1.0,0.0,0.0 ] );

//link the data as attributes available to the vertex shader
geometry.addAttribute( 'position', new THREE.BufferAttribute( vertices, 3 ) );
geometry.addAttribute( 'color1', new THREE.BufferAttribute( colors1, 3 ) );
geometry.addAttribute( 'color2', new THREE.BufferAttribute( colors2, 3 ) );
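A sketch of shaders that consume these attributes and of the JavaScript that hooks everything together (the actual w2_varying.html code may differ; mix_val is a hypothetical uniform in [0,1], and the script tags with ids 'vertexShader' and 'fragmentShader' are assumed to exist in the page):

// vertex shader: blend the two per-vertex colors and pass the result on as a varying
precision mediump float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform float mix_val;          // hypothetical blend factor passed from the CPU
attribute vec3 position;
attribute vec3 color1;
attribute vec3 color2;
varying vec3 vColor;
void main() {
    vColor = mix(color1, color2, mix_val);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// fragment shader: read the value interpolated across the triangle
precision mediump float;
varying vec3 vColor;
void main() {
    gl_FragColor = vec4(vColor, 1.0);
}

// JavaScript: link the geometry and the shaders via a RawShaderMaterial
var material = new THREE.RawShaderMaterial({
    uniforms: { mix_val: { type: "f", value: 0.5 } },
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent
});
scene.add(new THREE.Mesh(geometry, material));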

Questions?
- Look at the other code examples for week 2 (especially the texture example)
- Lab sessions take place tomorrow and Thursday (led by Lucas)
- Lab will cover how to load in more complex objects
- Make sure you understand the material we've covered so far
