Generation and Control of Game Virtual Environment

International Journal of Automation and Computing 04(1), January 2007, 25-29
DOI: 10.1007/s11633-007-0025-4

Myeong Won Lee 1, Jae Moon Lee 2
1 Department of Internet Information Engineering, The University of Suwon, Gyeonggi-do 445-743, Korea
2 Department of Computers, The University of Suwon, Gyeonggi-do 445-743, Korea

Manuscript received August 3, 2006; revised December 15, 2006. *Corresponding author. E-mail address: mwlee@suwon.ac.kr

Abstract: In this paper, we present a framework for the generation and control of an Internet-based 3-dimensional game virtual environment that allows a character to navigate through the environment. Our framework includes 3-dimensional terrain mesh data processing, a map editor, scene processing, collision processing, and walkthrough control. We also define an environment-specific semantic information editor, which can be applied using specific locations obtained from the real world. Users can insert text information related to the character's real position in the real world during navigation in the game virtual environment.

Keywords: Virtual environment, virtual reality, 3D game, 3D navigation, 3D scene management.

1 Introduction

A series of processes is needed to develop an Internet-based game using 3D virtual environments. The client application program, which provides the end-user interface, includes game logic and engine modules amongst its processes [1]. The game logic module contains the game progress procedures and is concerned not with how objects such as characters, buildings, and terrains are displayed, but with what objects are displayed. In other words, it controls overall program progress and is independent of the computer operating system or application programming interface (API). In order to provide this independence, a game engine module is required as an interface to the operating system and the API [2,3]. The game engine module is in charge of a variety of tasks necessary for the behavior and representation of objects that appear in the game logic application [4,5]. It is composed of several submodules that provide a rendering library [6,7], terrain management, object and resource management, an application framework, and the networking functions necessary for a game program.

Our research has focused on the game engine module, and on providing a method of controlling game progress in real time. We present an overall framework for generating and controlling 3D virtual environments commonly necessary for developing game applications. We also introduce a method of controlling objects through a console command interface and describe the kinds of commands available for controlling them. In addition, we describe the method for generating environment semantic information [8,9].

2 Mesh data management

Normally, we use a 3D modeler when generating a character or a virtual environment during the development of game software. We may import 3D geometric data from the 3D modeler, since it often provides an export function that outputs textual mesh information. We used Autodesk 3ds Max to obtain the geometric data for the characters and virtual environments used in our game. It provides an ASCII scene exporter (ASE), which can output such mesh information as text files (see Fig. 1). Although the data are easy to handle and modify, ASE has the disadvantage of including unnecessary information that is not used directly by the application.

Fig. 1 Mesh data transformation

For example, for a 3D model of a virtual environment representing an ancient palace, using ASE we obtained 12 megabytes of mesh data. Such a large mesh file may cause problems when implementing the game: it may take a long time for the file to be loaded and transformed during reorganization for the graphics libraries that use it. Therefore, we have devised a binary format named MSH, whose file size is smaller than ASE and which also provides a considerable performance improvement in the mesh transformation (see Fig. 1). Usually, the parsing process takes a significant amount of time when loading mesh information, and this can be avoided if we use MSH files. Our mesh management module provides the function of transforming an ASE file into an MSH file so that client applications can use it.
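The paper describes the ASE-to-MSH conversion only at the level of file sizes and load times. The following is a minimal sketch, assuming a hypothetical MSH layout of two element counts followed by raw vertex and index arrays; the Vertex layout and the SaveMsh/LoadMsh names are illustrative, not the authors' actual format.

// Minimal sketch of a binary mesh ("MSH"-style) writer/reader.
// The field layout here is an assumption for illustration; the actual MSH
// format used by the authors' mesh management module is not published.
#include <cstdint>
#include <fstream>
#include <vector>

struct Vertex { float x, y, z, nx, ny, nz, u, v; };   // 32 bytes, no padding

struct MeshData {
    std::vector<Vertex>   vertices;
    std::vector<uint32_t> indices;   // triangle list
};

bool SaveMsh(const char* path, const MeshData& mesh) {
    std::ofstream out(path, std::ios::binary);
    if (!out) return false;
    uint32_t vcount = static_cast<uint32_t>(mesh.vertices.size());
    uint32_t icount = static_cast<uint32_t>(mesh.indices.size());
    out.write(reinterpret_cast<const char*>(&vcount), sizeof(vcount));
    out.write(reinterpret_cast<const char*>(&icount), sizeof(icount));
    out.write(reinterpret_cast<const char*>(mesh.vertices.data()), vcount * sizeof(Vertex));
    out.write(reinterpret_cast<const char*>(mesh.indices.data()), icount * sizeof(uint32_t));
    return static_cast<bool>(out);
}

bool LoadMsh(const char* path, MeshData& mesh) {
    std::ifstream in(path, std::ios::binary);
    if (!in) return false;
    uint32_t vcount = 0, icount = 0;
    in.read(reinterpret_cast<char*>(&vcount), sizeof(vcount));
    in.read(reinterpret_cast<char*>(&icount), sizeof(icount));
    mesh.vertices.resize(vcount);
    mesh.indices.resize(icount);
    in.read(reinterpret_cast<char*>(mesh.vertices.data()), vcount * sizeof(Vertex));
    in.read(reinterpret_cast<char*>(mesh.indices.data()), icount * sizeof(uint32_t));
    return static_cast<bool>(in);
}

Because the vertex and index arrays are stored verbatim, loading reduces to a few bulk reads instead of parsing ASE text, which is where the reported speedup would come from.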
3 Scene management

Generally, there are several different kinds of scenes in a game, such as the logo, main menu, intro, and game progression scenes. Because these scenes are independent of each other, each needs its own logic and must be managed separately. Scenes may change from one to another, and each needs processing to run when it is started and when it is closed. A scene management system processes the life cycle of scenes from start to finish (see Fig. 2). The system provides two classes, SceneManager and Scene. The Scene class is derived, and five functions - Initialize, Dispose, Begin, End, and DoFrame - are overridden and registered in the SceneManager. The SceneManager calls the scenes' functions according to the life cycle. It identifies registered scenes by integer numbers and uses these numbers when activating or deleting scenes. Each scene is in one of two states: registered but inactive, or active. Several scenes can be registered, but only one scene can be active in the SceneManager at a time. The active scene becomes inactive if we try to activate another scene while it is working.

Fig. 2 Life cycle of a scene
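The following is a minimal sketch of the life cycle described above, using the five functions named in the text (Initialize, Dispose, Begin, End, DoFrame), integer scene identifiers, and a single active scene; the exact signatures and the Register/Activate/Remove method names are assumptions.

// Sketch of the scene life cycle: scenes are registered under integer ids,
// only one scene is active, and activation switches the Begin/End calls.
#include <map>
#include <memory>

class Scene {
public:
    virtual ~Scene() {}
    virtual void Initialize() = 0;        // called once when registered
    virtual void Dispose()    = 0;        // called once when removed
    virtual void Begin()      = 0;        // called when the scene becomes active
    virtual void End()        = 0;        // called when the scene becomes inactive
    virtual void DoFrame(float dt) = 0;   // called every frame while active
};

class SceneManager {
public:
    void Register(int id, std::unique_ptr<Scene> scene) {
        scene->Initialize();
        scenes_[id] = std::move(scene);
    }
    void Activate(int id) {
        if (active_) active_->End();      // deactivate the currently working scene
        active_ = scenes_.count(id) ? scenes_[id].get() : nullptr;
        if (active_) active_->Begin();
    }
    void Remove(int id) {
        if (!scenes_.count(id)) return;
        if (scenes_[id].get() == active_) { active_->End(); active_ = nullptr; }
        scenes_[id]->Dispose();
        scenes_.erase(id);
    }
    void DoFrame(float dt) { if (active_) active_->DoFrame(dt); }
private:
    std::map<int, std::unique_ptr<Scene>> scenes_;
    Scene* active_ = nullptr;
};

A game application would derive concrete scenes (logo, menu, game progression), register each under an id, and call DoFrame once per frame from the main loop.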

4 3D terrain mesh generation

We consider buildings, natural features of the earth, and terrains [10] as the components for organizing a virtual environment. We generate the buildings and natural features using an external graphics tool and manage them with the mesh management module. The terrains, however, are not generated with such a tool. Instead, they are rendered by a special method called real-time optimally adapting meshes (ROAM) [7]. ROAM is an algorithm for implementing level of detail (LOD) when representing terrains, and it controls the number of polygons dynamically. ROAM performs tessellation, which divides each mesh into triangles every frame. A mesh can break (crack) if neighboring areas of the mesh are tessellated to different levels. ROAM addresses this problem by also tessellating a neighboring triangle whenever a triangle is tessellated.
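The forced-split rule just described (whenever a triangle is tessellated, its neighbor is tessellated too, so no crack appears) can be sketched as follows. The binary-triangle-tree node and the recursion are a simplified illustration in the spirit of ROAM, not the authors' implementation; the priority-queue-driven split/merge decisions and the neighbor re-linking are omitted.

// Simplified binary-triangle-tree split with forced neighbor splitting.
struct TriNode {
    TriNode* leftChild     = nullptr;
    TriNode* rightChild    = nullptr;
    TriNode* baseNeighbor  = nullptr;   // neighbor across the hypotenuse
    TriNode* leftNeighbor  = nullptr;
    TriNode* rightNeighbor = nullptr;
};

// Splitting a triangle whose base neighbor is coarser would create a
// T-junction (the "mesh break" mentioned in the text), so the coarser
// neighbor is recursively split first, then both halves of the diamond.
void Split(TriNode* t) {
    if (!t || t->leftChild) return;                 // already split

    if (t->baseNeighbor && t->baseNeighbor->baseNeighbor != t)
        Split(t->baseNeighbor);                     // bring the neighbor to our level

    if (t->leftChild) return;                       // may have been split by the recursion

    t->leftChild  = new TriNode();
    t->rightChild = new TriNode();
    // (re-linking the children's neighbor pointers is omitted for brevity)

    if (t->baseNeighbor && !t->baseNeighbor->leftChild)
        Split(t->baseNeighbor);                     // split the diamond partner in the same step
}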
5 Map editor

In game development, it is difficult to generate a continuous series of varied scenes for game content after constructing virtual environments, including characters and backgrounds, with commercial graphics tools. A map editor is required so that characters can move according to the game scenario and so that collisions occurring during navigation in the virtual environment can be processed [11]. The editor makes it easy for a map designer to author virtual environments. The map designer can author two kinds of spaces. One is an indoor space, which can be generated by traditional graphics tools and then read into the game program, which we refer to as the game engine. The indoor space is stored in the game engine with optimized data structures, and static objects can be placed into this internal space using the tool. In addition, in order to render the internal space more efficiently, binary space partitioning (BSP) and potentially visible sets (PVS) are included with the game engine. The other kind of space the map designer can author is an outdoor space. It can be generated by setting up the heights and textures of terrain tiles, by random terrain generation, or from raw bitmap files, and static objects can be placed in the outdoor space just as in the indoor space. The map editor uses ROAM technology to represent and manage a large outdoor space.
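As one possible realization of the outdoor authoring path that takes heights from raw bitmap files, the sketch below builds a regular grid of terrain vertices from an 8-bit heightmap; the grid resolution, height scale, and vertex layout are assumptions for illustration only.

// Builds a regular terrain grid from an 8-bit raw heightmap (row-major,
// size*size samples, size >= 2), as one possible realization of the
// "heights from raw bitmap files" authoring path.
#include <cstdint>
#include <vector>

struct TerrainVertex { float x, y, z, u, v; };

std::vector<TerrainVertex> BuildTerrainGrid(const std::vector<uint8_t>& heights,
                                            int size, float cellSize, float heightScale) {
    std::vector<TerrainVertex> verts;
    verts.reserve(static_cast<size_t>(size) * size);
    for (int z = 0; z < size; ++z) {
        for (int x = 0; x < size; ++x) {
            TerrainVertex v;
            v.x = x * cellSize;
            v.z = z * cellSize;
            v.y = heights[z * size + x] * heightScale;   // sample the bitmap height
            v.u = static_cast<float>(x) / (size - 1);    // stretch one texture
            v.v = static_cast<float>(z) / (size - 1);    // over the whole tile
            verts.push_back(v);
        }
    }
    return verts;
}

The index buffer (two triangles per grid cell) and texture blending are left out; in the described system such a grid would then be handed to the ROAM-based terrain renderer of Section 4.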

6 Collision processing

A collision processing method must be provided so that the game can resolve problems that occur when objects collide during navigation. We used an octree representation for detecting collisions, since detecting collisions polygon by polygon is inefficient. With this method, spaces that contain no part of the target object are exempted from collision detection. Then an axis-aligned bounding box (AABB), the bounding box of a mesh whose faces are parallel to the XYZ axes of the coordinate system in the virtual environment, is applied [12]. Collision detection is performed not on each polygon but on the bounding box surrounding an object. In addition, after a bounding-box collision is detected, we apply a polygon-to-bounding-box test. We use a separating axis when detecting collisions between a polygon and a bounding box. A separating axis is an axis that separates the two primitives composing the objects, so that their projections onto it do not overlap, as shown in Fig. 3. For example, the primitives projected onto such an axis do not overlap if the primitives are apart from each other. Therefore, we can determine that two primitives have not collided if such an axis exists; conversely, they have collided if no such axis exists. In our system, a character is not allowed to proceed further in the virtual environment once a collision has been detected.

Fig. 3 Separated axis
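A minimal sketch of the two box-level tests implied above: an AABB-against-AABB overlap check, and a generic separating-axis interval test that reports separation when the projections of two primitives onto a candidate axis do not overlap. The specific set of candidate axes used by the triangle-box test of [12] is not reproduced here.

// Box-level collision tests: AABB overlap, and a generic separating-axis
// interval test (two convex primitives are disjoint if, on some candidate
// axis, their projection intervals do not overlap).
#include <algorithm>
#include <cfloat>
#include <cstddef>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct AABB { Vec3 min, max; };

bool AabbOverlap(const AABB& a, const AABB& b) {
    return a.min.x <= b.max.x && a.max.x >= b.min.x &&
           a.min.y <= b.max.y && a.max.y >= b.min.y &&
           a.min.z <= b.max.z && a.max.z >= b.min.z;
}

// Projects a point set onto 'axis' and returns its [min, max] interval.
static void Project(const Vec3* pts, size_t n, const Vec3& axis,
                    float& outMin, float& outMax) {
    outMin = FLT_MAX; outMax = -FLT_MAX;
    for (size_t i = 0; i < n; ++i) {
        float d = Dot(pts[i], axis);
        outMin = std::min(outMin, d);
        outMax = std::max(outMax, d);
    }
}

// True if 'axis' separates the two primitives (their projections do not overlap).
bool IsSeparatingAxis(const Vec3& axis,
                      const Vec3* a, size_t na, const Vec3* b, size_t nb) {
    float aMin, aMax, bMin, bMax;
    Project(a, na, axis, aMin, aMax);
    Project(b, nb, axis, bMin, bMax);
    return aMax < bMin || bMax < aMin;
}

If IsSeparatingAxis returns true for any candidate axis, the two primitives have not collided; if every candidate axis fails, they intersect, which is exactly the rule stated in the text.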

7 Environment semantic information editor

Generally, many games treat virtual environments as independent of the characters and focus on modeling and rendering. They do not consider information within the environment, or how the environment relates to the characters. In contrast, we have defined semantic information tied to real positions in the virtual environment (see Fig. 4), and characters are able to recognize this semantic information at the locations they pass through.

The virtual environments in our research can contain semantic information specific to an object's geographical location. In order to synchronize geographical locations and semantic information, a function is necessary for inputting semantic information at the locations a character walks through. So, we added an editor for inputting location-specific information wherever an explanatory description is required. In the editor, the virtual environment is displayed as a 2-dimensional scene in which objects are represented by their x and z coordinate values, and semantic information can be inserted using a dialog box. In Fig. 4, the game client system requests environment semantic information from the server; the server retrieves the information from the database, and the editor stores what the user has chosen to insert. Fig. 5 shows the environment semantic information editor. We included the editor because an information editor becomes even more of a necessity when we consider mobile game programs with information services in a ubiquitous environment.

Fig. 4 Environment semantic information
Fig. 5 Environment semantic information editor

8 Game control interface

We used a facade design pattern for the points where our game system executes a job or modifies an attribute of the virtual environment; the facade pattern here is the same concept as that described for object-oriented languages. An interface is required to access the facade class in order to provide control functions during game execution. To accomplish this, we developed a console library and provided a character user interface (CUI) through which users access the game system. In other words, the console is a user interface that exposes the various control functions of the game system during execution. From a functional point of view, the advantage is that users can invoke these functions themselves while a game is running; from a systematic point of view, functions can be executed collectively: various setup functions can be batched in separate files, and setup controls can then be modified flexibly. We have defined and implemented several console commands, as shown in Table 1. The setup functions usable in the console window include the delay time for keystroke input, a console visibility flag, rendering mode selection for polygon filling, gravity assignment, and semantic information retrieval. Read and write functions are also used to retrieve and store game progress during game play, and several other functions are provided as well.

Table 1 Console commands

  set key delay         Set the delay time for keystroke input.
  set console show      Set the flag for the console interface.
  set rs fill           Specify the rendering mode fill option.
  set s gravity         Specify the gravity amount.
  set info              Read the map information.
  set play cam record   Apply the view according to stored camera information.
  set record cam        Record the current camera information.
  set play cam speed    Specify the speed of camera movement.
  write cam             Store camera information.
  read cam              Read the stored camera information.
  exec <file path>      Execute the commands stored in the batch file at the given path.
  exit [cam records]    Close the console, or delete the stored camera information.
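The console maps command strings to operations on a facade class. The sketch below shows one way such a dispatcher could look: the command strings follow Table 1, while GameFacade, its methods, and the argument parsing are illustrative assumptions rather than the authors' console library.

// Sketch of a console dispatcher in front of a facade class, in the spirit
// of the CUI described in Section 8. GameFacade and its methods are
// illustrative assumptions, not the authors' actual classes.
#include <functional>
#include <iostream>
#include <map>
#include <string>

struct GameFacade {                        // facade over the game subsystems
    void SetKeyDelay(int ms)  { std::cout << "key delay = " << ms << "\n"; }
    void SetGravity(float g)  { std::cout << "gravity = "   << g  << "\n"; }
    void ShowConsole(bool on) { std::cout << "console = "   << on << "\n"; }
};

class Console {
public:
    explicit Console(GameFacade& f) : game_(f) {
        handlers_["set key delay"]    = [this](const std::string& a) { game_.SetKeyDelay(std::stoi(a)); };
        handlers_["set s gravity"]    = [this](const std::string& a) { game_.SetGravity(std::stof(a)); };
        handlers_["set console show"] = [this](const std::string& a) { game_.ShowConsole(a != "0"); };
    }
    // Accepts lines such as "set s gravity 9.8"; the last token is the argument.
    bool Execute(const std::string& line) {
        std::string::size_type cut = line.find_last_of(' ');
        if (cut == std::string::npos) return false;
        const std::string cmd = line.substr(0, cut);
        const std::string arg = line.substr(cut + 1);
        auto it = handlers_.find(cmd);
        if (it == handlers_.end()) return false;
        it->second(arg);
        return true;
    }
private:
    GameFacade& game_;
    std::map<std::string, std::function<void(const std::string&)>> handlers_;
};

An exec-style command would then simply read lines from a batch file and feed them to Execute, which is how setup functions can be applied collectively.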

9 Implementation results

We implemented the overall system using the Visual C++ programming language, Microsoft DirectX 9, and 3ds Max. Fig. 6 briefly describes the classes composing our game system. The system starts from Ruin Application, which manages the loop of the game's procedures. Ruin System manages all the functions that provide the game to the user. Ruin PhysicsSystem manages the interaction between characters and the surrounding environments; it includes the collision detection and response algorithms. Ruin Frame is a class for generating and managing windows using the Win32 API. Ruin Renderer manages the functions and devices related to Direct3D for rendering the 3D characters and environments. Ruin Console manages the console interface for controlling game setup, such as key delay, console show, set gravity, info, and record camera, as described in Section 8. Ruin InputMap enables the input of text information and provides the functions of the environment semantic information editor. Ruin SceneManager manages the various scenes according to the user's selection and the game progress. Ruin Scene is an abstract class that defines each scene; it is derived and then used for implementing the contents of a game application. Fig. 7 shows sample screen shots from the system.

Fig. 6 Classes for our game virtual environment framework
Fig. 7 Game samples based on our game VR framework

10 Conclusions

In this paper, we presented a framework for generating and controlling 3D virtual environments in the development of an Internet game. The framework includes the generation of virtual environments, control of character movement, a real-time virtual environment information generator, and a console interface for controlling game progress. In organizing the framework, we defined a systematic game programming procedure, including the environment semantic information editor, and we focused on the 3D characters and environments that affect the visual representation of a game. Compared with conventional game engines, our system has the following advantages. First, the framework provides a total solution for generating and controlling 3D objects in a game. Second, we have solved the collision problems between characters and environments during game progress by combining and improving conventional collision detection algorithms. Third, the framework is capable of controlling game progress through the convenient console command interface that we have developed; game progress can be stored partially or totally and retrieved whenever it is needed for replay, and several convenient setup functions are also provided. Lastly, our framework includes the environment semantic information editor, which can insert valuable information at specific locations, enhancing the usability of the virtual environment. Future work will include a client system for mobile environments, so that such environment semantic information can be collected in a ubiquitous virtual environment.

References

[1] C. Faisstnauer, W. Purgathofer, M. Gervautz, J. D. Gascuel. Construction of an Open Geometry Server for Client-server Virtual Environments. In Proceedings of IEEE Conference on Virtual Reality 2001, Yokohama, Japan, pp. 105-114, 2001.
[2] L. Bishop, B. Eberty, T. Whitted, M. Finch, M. Shantz. Designing a PC Game Engine. IEEE Computer Graphics and Applications, vol. 18, no. 1, pp. 46-53, 1998.
[3] R. Darken, P. McDowell, E. Johnson. The Delta3D Open Source Game Engine. IEEE Computer Graphics and Applications, vol. 25, no. 3, pp. 10-12, 2005.
[4] K. Kanev, S. Kimura. Integrating Dynamic Full-body Motion Devices in Interactive 3D Entertainment. IEEE Computer Graphics and Applications, vol. 22, no. 4, pp. 76-86, 2002.
[5] T. K. Capin, H. Noser, D. Thalmann, I. S. Pandzic, N. M. Thalmann. Virtual Human Representation and Communication in VLNET. IEEE Computer Graphics and Applications, vol. 17, no. 2, pp. 42-53, 1997.
[6] F. D. Luna. Introduction to 3D Game Programming with DirectX 9.0, Paperback, USA, 2003.
[7] T. Moller, E. Haines, T. A. Moller. Real-time Rendering, 2nd ed., A K Peters Ltd., USA, 2002.
[8] M. E. Latoschik, P. Biermann, I. Wachsmuth. High-level Semantics Representation for Intelligent Simulative Environments. In Proceedings of IEEE Conference on Virtual Reality 2005, Bonn, Germany, pp. 283-284, 2005.
[9] D. A. Bowman, C. North, J. Chen, N. F. Polys, P. S. Pyla, U. Yilmaz. Information-rich Virtual Environments: Theory, Tools, and Research Agenda. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology 2003, Osaka, Japan, pp. 81-90, 2003.
[10] G. Snook. Real-time 3D Terrain Engines Using C++ and DirectX 9, Paperback, USA, 2003.
[11] R. P. Darken, J. L. Sibert. A Toolset for Navigation in Virtual Environments. In Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology, Georgia, USA, pp. 157-165, 1993.
[12] A. M. Tomas. Fast 3D Triangle-box Overlap Testing. Journal of Graphics Tools, vol. 6, no. 1, pp. 29-33, 2001.
Myeong Won Lee received her B. Sc. degree from Seoul National University, Korea, in 1981, the M. Sc. degree from Seoul National University in 1984, and the Ph. D. degree from the University of Tokyo, Japan, in 1990. She is currently an associate professor at the Department of Internet Information Engineering, the University of Suwon. Her research interests include computer graphics applications, Web-based virtual reality, computer animation, and multimedia communication.

Jae Moon Lee received his B. Sc. degree from the University of Suwon, Korea, in 2005, and was a researcher at the VR&M Laboratory at the University of Suwon. He is currently a researcher at ESTsoft Corp. His research interests include computer graphics applications and Web-based virtual reality.