Topic 10: Scene Management, Particle Systems and Normal Mapping. CITS4242: Game Design and Multimedia


Scene Management
Scene management means keeping track of all objects in a scene:
- In particular, keeping track of the positions of all objects while a game is running.
- Scenes are generally defined using scene graphs with a tree hierarchy.
- The position and rotation of each object are defined with respect to a parent object; the map is the parent of top-level objects.
- This is convenient: we can easily move and rotate compound objects built from smaller objects, and just as easily move and rotate parts of them relative to the parent.
- In NeoAxis, scene graphs can be created easily by attaching map objects to each other, either with the map editor or in code.
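The parent-relative idea above can be sketched in a few lines. This is a hypothetical minimal scene-graph node (not the NeoAxis API): each node stores a position and rotation relative to its parent, and the world transform is found by composing up the tree. Rotation is kept as a single 2D angle here to avoid matrix code; real engines use matrices or quaternions.

```python
import math

class Node:
    def __init__(self, local_pos=(0.0, 0.0), local_angle=0.0, parent=None):
        self.local_pos = local_pos      # position relative to the parent
        self.local_angle = local_angle  # rotation (radians) relative to the parent
        self.parent = parent

    def world_transform(self):
        """Return (world_pos, world_angle) by composing with the parent's transform."""
        if self.parent is None:
            return self.local_pos, self.local_angle
        (px, py), pangle = self.parent.world_transform()
        # Rotate the local offset by the parent's world angle, then translate.
        c, s = math.cos(pangle), math.sin(pangle)
        x, y = self.local_pos
        return (px + c * x - s * y, py + s * x + c * y), pangle + self.local_angle

# Moving or rotating the parent moves the whole compound object:
car = Node(local_pos=(10.0, 0.0))
turret = Node(local_pos=(2.0, 0.0), parent=car)
print(turret.world_transform())   # the turret sits 2 units in front of the car
```

Rotating `car` now carries `turret` around with it, which is exactly the convenience the scene graph provides.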

Scene Management - Queries
A scene graph can be directly implemented as a tree data structure. However, we often need an efficient way to answer geometric questions such as:
- What objects are within the region visible from the current camera? (To avoid sending every object to OpenGL/DirectX, which is slow.)
- What objects are within a particular distance of a character? More generally: what objects are in a particular volume?
To answer such queries efficiently, something more than a scene graph is required: it needs to be converted into a form suited to spatial lookups. A simple way of dividing up space is to use portals: a large scene is divided into many smaller cells, each recording the objects visible from within it. This works particularly well for indoor scenes.
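The benefit of dividing space can be shown with a simple cell-based query (a hypothetical data layout, not a real engine API): the map is pre-divided into grid cells, each listing the objects inside it, so a range query only inspects cells that overlap the query region instead of every object in the scene.

```python
import math
from collections import defaultdict

CELL_SIZE = 10.0

def cell_of(pos):
    """The grid cell containing a 2D position."""
    return (int(pos[0] // CELL_SIZE), int(pos[1] // CELL_SIZE))

class CellGrid:
    def __init__(self):
        self.cells = defaultdict(list)

    def insert(self, name, pos):
        self.cells[cell_of(pos)].append((name, pos))

    def objects_within(self, centre, radius):
        """Objects within `radius` of `centre`, checking only overlapping cells."""
        results = []
        cx0, cy0 = cell_of((centre[0] - radius, centre[1] - radius))
        cx1, cy1 = cell_of((centre[0] + radius, centre[1] + radius))
        for cx in range(cx0, cx1 + 1):
            for cy in range(cy0, cy1 + 1):
                for name, pos in self.cells.get((cx, cy), ()):
                    if math.dist(centre, pos) <= radius:
                        results.append(name)
        return results

grid = CellGrid()
grid.insert("barrel", (3.0, 4.0))
grid.insert("tower", (95.0, 95.0))
print(grid.objects_within((0.0, 0.0), 6.0))  # finds the barrel; never touches the tower's cell
```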

Scene Management - Spatial Techniques
More sophisticated techniques divide space up into a hierarchy:
- Binary Space Partitioning (BSP) trees repeatedly divide space with a plane chosen to split it roughly in half.
- Quadtrees are similar, but work in 2D and require the splitting planes to be axis-aligned.
- Octrees extend this idea to 3D, and are suitable for complex scenes involving all three dimensions.
These structures allow a particular small part of the map to be found in roughly O(log n) time, where n is the number of regions. They generally work with bounding volumes or similar, and a game engine will often use a single such data structure for efficient rendering, physics, and game logic alike. E.g., game logic: determining whether an enemy can see the player, or which object a character is standing on.
NeoAxis provides a choice of scene managers via the map system; swapping to a different one may fix some kinds of efficiency issue. From code, the Map.GetObjects method allows queries to find map objects within a number of common kinds of 3D shapes (sphere, box, frustum, ray, ...).
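The hierarchy idea can be sketched as a point quadtree (illustrative only, not an engine data structure): space is split into four axis-aligned quadrants whenever a node fills up, so a range query descends only into quadrants that overlap the query box, giving roughly O(log n) behaviour for well-distributed scenes.

```python
class Quadtree:
    CAPACITY = 4  # points stored in a node before it subdivides

    def __init__(self, x, y, half):
        self.x, self.y, self.half = x, y, half   # centre and half-width of this region
        self.points = []
        self.children = None

    def insert(self, px, py):
        if not (abs(px - self.x) <= self.half and abs(py - self.y) <= self.half):
            return False                          # point lies outside this region
        if self.children is None:
            if len(self.points) < self.CAPACITY:
                self.points.append((px, py))
                return True
            self._subdivide()
        return any(c.insert(px, py) for c in self.children)

    def _subdivide(self):
        h = self.half / 2
        self.children = [Quadtree(self.x + dx * h, self.y + dy * h, h)
                         for dx in (-1, 1) for dy in (-1, 1)]
        for p in self.points:                     # push existing points down
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, x0, y0, x1, y1):
        """All points inside the axis-aligned box [x0, x1] x [y0, y1]."""
        if x1 < self.x - self.half or x0 > self.x + self.half or \
           y1 < self.y - self.half or y0 > self.y + self.half:
            return []                             # box misses this quadrant: prune the subtree
        found = [p for p in self.points if x0 <= p[0] <= x1 and y0 <= p[1] <= y1]
        if self.children:
            for c in self.children:
                found.extend(c.query(x0, y0, x1, y1))
        return found
```

The same pruning step ("does this region overlap the query volume?") is what makes frustum and sphere queries against an octree cheap in a real engine.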

Particle Systems
Particle systems allow modelling objects using sets of moving particles, generally with 2D images attached:
- This provides a way to model many common physical phenomena that can't easily be modelled as meshes, such as explosions, smoke, fireworks and rain.
- The basic idea is to have emitters of particles, plus various ways of controlling how the particles move, etc.
- Particles' appearance can be controlled using materials.
- In NeoAxis there are both point and volume emitters, and many ways of affecting particles after they are emitted.
- The resource editor allows you to build particle systems. Look at the examples to see what is possible; they can add life to static scenes.
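A bare-bones particle system can be sketched as follows (illustrative only, not the NeoAxis API): an emitter spawns particles with randomised velocities, and each update step applies gravity, advances positions, ages the particles, and discards the expired ones.

```python
import random

GRAVITY = (0.0, -9.8, 0.0)

class Particle:
    def __init__(self, pos, vel, lifetime):
        self.pos = list(pos)
        self.vel = list(vel)
        self.age = 0.0
        self.lifetime = lifetime

class Emitter:
    """A point emitter: spawns `rate` particles at `pos` on every update."""
    def __init__(self, pos, rate, lifetime=2.0):
        self.pos, self.rate, self.lifetime = pos, rate, lifetime
        self.particles = []

    def update(self, dt):
        # Spawn new particles with randomised, mostly-upward velocities.
        for _ in range(self.rate):
            vel = (random.uniform(-1, 1), random.uniform(4, 8), random.uniform(-1, 1))
            self.particles.append(Particle(self.pos, vel, self.lifetime))
        # Integrate motion under gravity and age every particle.
        for p in self.particles:
            for i in range(3):
                p.vel[i] += GRAVITY[i] * dt
                p.pos[i] += p.vel[i] * dt
            p.age += dt
        # Cull expired particles.
        self.particles = [p for p in self.particles if p.age < p.lifetime]
```

A renderer would draw a camera-facing textured quad at each `p.pos`; the "affectors" mentioned above correspond to extra per-particle steps in `update` (colour fading, drag, wind, and so on).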

Normal Mapping
Normal mapping is a relatively straightforward technique that can show great detail on objects with few polygons:
- The basic idea is simple: use a texture map to store the directions of the normals, with the red, green and blue channels used as the x, y and z coordinates.
- In tangent-space normal mapping, 0-255 represents -1 to 1 for red and green; the blue (z) channel effectively covers 0 to 1, since tangent-space normals always point away from the surface, which is why normal maps look bluish.
- During rendering, the shader replaces the actual normal with the one derived from the texture when calculating lighting.
- The shading looks very detailed, but the outlines of shapes will still show corners. (The eye tends to be fooled by the shading, though.)
NeoAxis provides normal mapping via the high materials. You can create appropriate normal maps with Blender (see the help-list posts from last year; one colour channel needs inverting).
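The per-pixel computation the shader performs can be sketched in plain code (this is an illustration of the standard decode-and-light step, not actual shader source): decode a texel into a unit normal, then use it in a diffuse (Lambert) lighting calculation.

```python
def decode_normal(r, g, b):
    """Map 8-bit RGB channels to a unit normal: 0-255 -> [-1, 1] per channel."""
    n = [c / 255.0 * 2.0 - 1.0 for c in (r, g, b)]
    length = sum(x * x for x in n) ** 0.5
    return [x / length for x in n]   # renormalise to correct quantisation error

def lambert(normal, light_dir):
    """Diffuse intensity: clamp(dot(N, L), 0, 1) for a unit light direction."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# A "flat" texel (128, 128, 255) decodes to roughly (0, 0, 1), the unperturbed
# surface normal; bumpier texels tilt the normal and so change the shading.
flat = decode_normal(128, 128, 255)
print(lambert(flat, (0.0, 0.0, 1.0)))   # close to 1.0 for head-on light
```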

Parallax Mapping
Normal mapping is a very effective technique, but it doesn't quite offer the same realism as a high-polygon mesh:
- One important limitation is that it doesn't move parts of the surface relative to each other based on which is closer to the viewer.
- Consider a stone wall with gaps between the stones: from a shallow angle you should see less of the recessed parts in the gaps, but with normal mapping the amount you see doesn't change with the viewing angle.
- Parallax mapping aims to fix this by shifting positions within a polygon as it is rendered, based on the viewing direction and each point's height.
- Doing this exactly is hard in real-time rendering, but there are good approximations.
- Parallax mapping needs more than just the normals: it needs the amount by which points are displaced. NeoAxis will perform parallax mapping given a normal map and a displacement map.
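The simplest approximation, parallax offset mapping, can be sketched as follows (a standard textbook technique, not NeoAxis-specific code): the texture coordinate is shifted along the tangent-plane component of the view direction, by an amount proportional to the sampled height, so raised points appear to slide toward the viewer at shallow angles.

```python
def parallax_offset(u, v, view_dir, height, scale=0.05):
    """Shift (u, v) along the view direction's tangent-plane component.

    view_dir: tangent-space view vector (x, y, z), z > 0 toward the viewer.
    height:   displacement-map sample in [0, 1] at (u, v).
    scale:    artist-tuned strength of the effect.
    """
    vx, vy, vz = view_dir
    offset = height * scale
    # Dividing by vz makes the shift grow as the view becomes more grazing.
    return u + vx / vz * offset, v + vy / vz * offset

# Head-on views produce no shift; grazing views shift the lookup noticeably.
print(parallax_offset(0.5, 0.5, (0.0, 0.0, 1.0), 1.0))   # no shift head-on
print(parallax_offset(0.5, 0.5, (0.7, 0.0, 0.7), 1.0))   # shifted in u
```

The `vx / vz` term blows up at extreme grazing angles, which is why practical shaders clamp it or use better approximations such as steep parallax or parallax occlusion mapping.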