CITS4242: Game Design and Multimedia Topic 10: Scene Management, Particle Systems, and Normal Mapping
Scene Management Scene management means keeping track of all objects in a scene. - In particular, keeping track of the positions of all objects while a game is running. - Generally, scenes are defined using scene graphs with a tree hierarchy. - The position and rotation of each object is defined with respect to a parent object. - The map is the parent for top-level objects. - This is convenient: we can easily move and rotate compound objects built from smaller objects. - We can also easily move and rotate parts of them with respect to the parent. - In NeoAxis, scene graphs can be created easily by attaching map objects to each other - either with the map editor or in code.
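The parent-relative idea above can be sketched in a few lines (illustrative Python, not NeoAxis code; the class and field names are made up for the example). Each node stores its position and rotation relative to its parent, and its world-space transform is found by composing transforms up the tree:

```python
import math

class SceneNode:
    """Minimal 2D scene-graph node: position and rotation are stored
    relative to the parent node (the root plays the role of the map)."""
    def __init__(self, name, x=0.0, y=0.0, angle=0.0, parent=None):
        self.name, self.x, self.y, self.angle = name, x, y, angle
        self.parent = parent

    def world_transform(self):
        """Compose this node's local transform with all ancestor transforms."""
        if self.parent is None:
            return self.x, self.y, self.angle
        px, py, pa = self.parent.world_transform()
        c, s = math.cos(pa), math.sin(pa)
        # Rotate the local offset by the parent's angle, then translate by
        # the parent's world position.
        return (px + c * self.x - s * self.y,
                py + s * self.x + c * self.y,
                pa + self.angle)

# A turret attached to a tank: moving or rotating the tank automatically
# moves the turret, which is the convenience the slide describes.
tank = SceneNode("tank", x=10.0, y=5.0, angle=math.pi / 2)
turret = SceneNode("turret", x=2.0, y=0.0, parent=tank)
print(turret.world_transform())
```

Because the tank is rotated 90 degrees, the turret's local offset of (2, 0) ends up two units along the world y-axis from the tank.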
Scene Management - Queries A scene graph can be directly implemented as a tree data structure. However, we often need an efficient way to answer geometric questions such as: What are all the objects within the region visible from the current camera? (To avoid sending all objects to OpenGL/DirectX, which is slow.) What objects are within a particular distance of a character? More generally: what objects are in a particular volume? To address this, something more than a scene graph is required: the scene graph needs to be converted to a form suitable for answering such queries. A simple way of dividing up space for efficiency is using portals: basically dividing a large scene into many smaller ones, each recording the objects visible from any point within it. This works particularly well with indoor scenes.
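To see what spatial structures are accelerating, here is the naive version of a "within a particular distance" query (an illustrative Python sketch; the function name and the dictionary-of-positions representation are assumptions for the example). It checks every object, which is the O(n) scan that portals and hierarchies avoid:

```python
# Naive range query: test every object's position against the query sphere.
# Spatial structures (portals, quadtrees, octrees) exist precisely to avoid
# this linear scan over the whole scene.
def objects_within(objects, centre, radius):
    cx, cy, cz = centre
    result = []
    for name, (x, y, z) in objects.items():
        # Compare squared distances to avoid a square root per object.
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= radius ** 2:
            result.append(name)
    return result

scene = {"barrel": (1, 0, 0), "tree": (50, 0, 0), "enemy": (2, 2, 0)}
print(objects_within(scene, (0, 0, 0), 5.0))  # -> ['barrel', 'enemy']
```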
Scene Management - Spatial Techniques More sophisticated techniques divide space up into a hierarchy. Binary Space Partition (BSP) trees repeatedly divide the space with a plane that splits it roughly in half. (Usually only in 2D.) Quadtrees are similar, except that the dividing planes must be aligned with the axes. Octrees extend this idea to 3D, and are suitable for complex scenes involving all three dimensions. These all allow finding a particular small part of the map in roughly log(n) time, where n is the number of regions. This generally works with bounding volumes, or similar, and often a game engine will use a single data structure for efficient rendering and physics, as well as game logic. E.g., game logic: determining whether an enemy can see the player, or what object a character is standing on. NeoAxis provides a choice of scene managers via the map system; swapping to a different one may fix some kinds of efficiency issues. From code, the Map.GetObjects method allows queries to find map objects within a number of common kinds of 3D shapes (sphere, box, frustum, ray, ...).
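The hierarchical idea can be illustrated with a small point quadtree (an illustrative Python sketch under simplifying assumptions - points rather than bounding volumes, a square region, a box query - not NeoAxis's actual scene manager). Queries skip whole subtrees whose region misses the query box, which is where the log-time behaviour comes from:

```python
class Quadtree:
    """Point quadtree over a square region [x, x+size) x [y, y+size).
    A node splits into four equal quarters once it holds `capacity` points."""
    def __init__(self, x, y, size, capacity=4):
        self.x, self.y, self.size, self.capacity = x, y, size, capacity
        self.points = []
        self.children = None

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.size and
                self.y <= py < self.y + self.size):
            return False  # point lies outside this node's region
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py))
                return True
            self._split()
        return any(c.insert(px, py) for c in self.children)

    def _split(self):
        h = self.size / 2
        self.children = [Quadtree(self.x + dx, self.y + dy, h, self.capacity)
                         for dx in (0, h) for dy in (0, h)]
        for p in self.points:        # push existing points down a level
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx, qy, qsize):
        """All stored points inside the box [qx, qx+qsize) x [qy, qy+qsize)."""
        # Prune: if the query box misses this node's region entirely, the
        # whole subtree is skipped - the key efficiency win of the hierarchy.
        if (qx >= self.x + self.size or qx + qsize <= self.x or
                qy >= self.y + self.size or qy + qsize <= self.y):
            return []
        found = [(px, py) for px, py in self.points
                 if qx <= px < qx + qsize and qy <= py < qy + qsize]
        if self.children:
            for c in self.children:
                found.extend(c.query(qx, qy, qsize))
        return found

tree = Quadtree(0, 0, 100)
for p in [(10, 10), (12, 11), (80, 80), (81, 82), (50, 50)]:
    tree.insert(*p)
print(tree.query(0, 0, 20))  # only the points near the origin
```

An octree is the same construction with eight child cubes instead of four child squares.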
Particle Systems Particle systems allow modeling objects using sets of moving particles, generally with 2D images attached. - This provides a way to model many common physical phenomena that can't easily be modeled as meshes. - This includes explosions, smoke, fireworks and rain. - The basic idea is to have emitters of particles and various ways of controlling how they move, etc. - Particles' appearance can be controlled using materials. - In NeoAxis, there are both point and volume emitters. - There are also many ways of affecting particles after they are emitted. - The resource editor allows you to build particle systems. - Look at the examples to see what is possible; they can add life to static scenes.
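The emit/affect/expire cycle can be sketched as a toy point emitter (illustrative Python; the class names, emission rate, and gravity affector are assumptions for the example, not NeoAxis's API):

```python
import random

class Particle:
    def __init__(self, pos, vel, life):
        self.pos, self.vel, self.life = pos, vel, life

class PointEmitter:
    """Toy point emitter: each update it spawns particles with random upward
    velocities, applies a gravity 'affector', ages them, and removes dead ones."""
    def __init__(self, pos, rate):
        self.pos, self.rate = pos, rate
        self.particles = []

    def update(self, dt, gravity=-9.8):
        # Emit: `rate` new particles per update, each living 2 seconds.
        for _ in range(self.rate):
            vel = [random.uniform(-1, 1), random.uniform(-1, 1),
                   random.uniform(4, 8)]
            self.particles.append(Particle(list(self.pos), vel, life=2.0))
        # Affect: integrate motion; a real engine applies colour/size/force
        # affectors here as well.
        for p in self.particles:
            p.vel[2] += gravity * dt
            for i in range(3):
                p.pos[i] += p.vel[i] * dt
            p.life -= dt
        # Expire: drop particles whose lifetime has run out.
        self.particles = [p for p in self.particles if p.life > 0]

emitter = PointEmitter((0.0, 0.0, 0.0), rate=5)
for _ in range(10):          # simulate one second at 10 updates/second
    emitter.update(0.1)
print(len(emitter.particles))  # 50: all particles are still within their 2s life
```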
Normal mapping Normal mapping is a relatively straightforward technique that can show great detail on objects with few polygons. - The basic idea is simple: use a texture map to store the directions of the normals. - The red, green and blue channels are used as x, y and z coordinates. - 0-255 represents -1 to 1 for red and green, but 0 to 1 for blue (z) in tangent-space normal mapping. - During rendering, the shader replaces the actual normal with the one derived from the texture when calculating lighting. - The shading looks very detailed, but the outlines of shapes will still show corners. (The eye tends to be fooled by the shading, though.)
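The channel mapping above corresponds to the following decode step (an illustrative Python sketch of what the shader does per texel; the function name is made up for the example):

```python
def decode_normal(r, g, b):
    """Decode an 8-bit tangent-space normal-map texel into a unit normal.
    Red and green map 0..255 to -1..1; blue maps 0..255 to 0..1, since z
    always points out of the surface - the convention described above."""
    x = r / 255.0 * 2.0 - 1.0
    y = g / 255.0 * 2.0 - 1.0
    z = b / 255.0
    # Renormalise, since 8-bit quantisation makes the stored vector inexact.
    length = (x * x + y * y + z * z) ** 0.5
    return (x / length, y / length, z / length)

# The typical pale-blue "flat" texel (128, 128, 255) decodes to a normal
# pointing almost exactly straight out of the surface.
print(decode_normal(128, 128, 255))
```

The lighting calculation then uses this decoded normal in place of the interpolated mesh normal.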
Normal mapping (continued) NeoAxis provides normal mapping via its high-level materials. You can create appropriate normal maps with Blender (see the help list posts from last year; one colour channel needs inverting).
Parallax mapping Normal mapping is a very effective technique, but it doesn't quite offer the same realism as a high-polygon mesh. One important limitation of normal mapping is that it doesn't move parts of the mesh relative to each other based on which is closer to the viewer. Consider a stone wall with gaps between the stones: from an angle you shouldn't see as much of the parts in the gaps. But with normal mapping, the amount you see doesn't change based on the angle. Parallax mapping aims to fix this by shifting positions within a polygon when it is rendered, based on the view direction and the surface's height at each point. Doing this perfectly is hard in real-time rendering, but there are good approximations. Parallax mapping needs more than just the normals: it needs the amount that points are moved. NeoAxis will perform parallax mapping given a normal map and a displacement map.
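The simplest of those approximations, offset parallax mapping, shifts the texture lookup along the view direction by an amount proportional to the sampled height (an illustrative Python sketch of the standard technique, not NeoAxis's actual shader; the `scale` parameter and function name are assumptions):

```python
def parallax_uv(u, v, height, view_ts, scale=0.05):
    """Offset parallax mapping: shift the (u, v) texture lookup along the
    view direction expressed in tangent space, proportionally to the height
    sampled from the displacement map. `scale` tunes the apparent depth."""
    vx, vy, vz = view_ts  # normalized view vector, tangent space, vz > 0
    return (u + height * scale * vx / vz,
            v + height * scale * vy / vz)

# Looking straight down the surface normal gives no shift at all...
print(parallax_uv(0.5, 0.5, height=1.0, view_ts=(0.0, 0.0, 1.0)))
# ...while a grazing view shifts the lookup noticeably, so high points
# appear to slide across low points as the camera moves - the parallax
# effect that plain normal mapping lacks.
print(parallax_uv(0.5, 0.5, height=1.0, view_ts=(0.8, 0.0, 0.6)))
```

Note the division by vz: the shallower the viewing angle, the larger the shift, matching the stone-wall intuition above.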