Augmented Feedback for Childhood Apraxia of Speech Design Specification Document

Augmented Feedback for Childhood Apraxia of Speech
Design Specification Document

Document Version: 1.0
Status: Under revision
Department of Computer Science & Engineering, York University
October 2013

Version History

Version   Name         Date       Description
1.0       D. Raymond   10/27/13   First release

Table of Contents

1. INTRODUCTION
   1.1 Scope and Purpose
   1.2 Project Design Constraints
2. DEVELOPMENT ENVIRONMENT
   2.1 Windows
   2.2 Unity3D
       2.2.1 Assets
       2.2.2 Scripts
   2.3 OpenCV
       2.3.1 cvblob
       2.3.2 OpenCVSharp
3. DESIGN
   3.1 Visuals
       3.1.1 Main Screen
       3.1.2 Configure Screen
       3.1.3 Play Screen
4. IMPLEMENTATION
   4.1 Overview
   4.2 Details
       4.2.1 BlobScript
       4.2.2 SphereTrack
       4.2.3 Hit
5. FUTURE WORK
APPENDIX: SOURCE CODE
   Main Screen
   Configure Screen
   Play Screen

1. INTRODUCTION

1.1 Scope and Purpose

The purpose of this design specification document is to outline the procedures and implementation practices used when developing the game titled Rock Climber. Rock Climber is the implemented by-product of the document titled Augmented Feedback for Childhood Apraxia of Speech Requirements Analysis Document. Though the model digresses from the intended use presented in that document, it serves as a foundation upon which our final product is intended to be developed.

Rock Climber is an interactive game designed for children with speech disorders, namely Apraxia of Speech. Though still a prototype, the game is designed to provide jaw exercises for children through a series of incentive-based actions. The input to the game consists of frames rendered from the webcam, which are processed by tracking algorithms. The tracked positions are then mapped to the vertical movement of a game object, emulating the movement of the mandible along the sagittal plane.

This document outlines the user interface design and implementation details of the game. It lists the platform-specific controls, features, and libraries used to implement the game. It also serves as a reference for future development of the prototype.

1.2 Project Design Constraints

Rock Climber provides an initial prototype upon which our intended model is to be built. Our final envisioned product, as discussed in the Requirements Analysis Document, is to be developed on a portable device, preferably an iPad or similar tablet. A tablet will provide a more user-friendly platform for its patients, the children. Because Rock Climber is developed on a desktop, the user is restricted to the location(s) in which he or she can play the game.

Rock Climber runs on a desktop computer, so the only form of input comes from the machine's built-in webcam. The webcam has a limited frame rate, so the precision and accuracy of tracking are limited when collecting feedback data. Our current development procedures therefore do not treat tracking accuracy as a focal point. All in all, Rock Climber is presented as a basic working model for future development.

2. DEVELOPMENT ENVIRONMENT

2.1 Windows

Rock Climber is targeted for the Windows desktop platform. Users of the game must therefore be present in front of a monitor on which output can be seen. A webcam, either built-in or external, is additionally needed to provide input in real time. These input images are processed to provide the main control input for the game. As noted earlier, because a webcam is being used, tracking accuracy was not a main consideration when developing this prototype.

2.2 Unity3D

Unity is a game engine designed to provide an easy interface for 3D interactive game development. It acts as a cross-platform game engine for developing games for operating systems such as Windows, iOS, and Android. Unity was chosen as the game engine for this project due to its easy interface for creating 3D games. Its popularity has also grown, giving its large community of developers easy access to tutorials and forums. Unity has a drag-and-drop style interface that allows easy development of games. With a powerful 3D rendering and physics engine, Unity has the features needed for our prototype.

Before discussing the implementation, it is important to understand the structure of objects and the terminology that Unity uses in development. These concepts are essential to understanding the overall picture and the linkage of all code and classes. We will not give an elaborate overview of every feature used in Unity. Rather, we will give a general introduction to the conceptual features needed in our game, sufficient to understand its development. Curious readers can find additional references on Unity's main web page.

2.2.1 Assets

Every object in Unity can be thought of under the general umbrella of objects, namely assets. An asset, with regard to Unity, can be anything, both physical and conceptual. It can represent a tree, a character, a camera, or even the lighting. In the development of our game, every object interacted with (and not) is an asset. This includes the rock face, the terrain, the buttons, the tracking object, a movement, and the position of the camera (from which the user views the scene). Assets represent objects with which a developer or user can interact. As a user, one can move a character (an asset). As a developer, one can assign certain actions to characters (both assets).

2.2.2 Scripts

Scripts are the code that provides functionality to our objects. In our context, they are C# or JavaScript classes that, when instantiated, create some dynamic change on screen. We note, however, that a script in Unity is not defined in the traditional sense. When a developer thinks of the word script, certain keywords come to mind: JavaScript, Perl, or Python. In Unity's definition, any piece of code, whether a traditional script or a defined class, is a script.

2.3 OpenCV

Unity was used to create the static version of our game, Rock Climber. To implement the tracking algorithms that were proposed, a library known as OpenCV was used. The Open Source Computer Vision Library, abbreviated OpenCV, is a library of programming functions implemented in C++ aimed at the field of computer vision. It supports built-in calls for real-time image processing and manipulation techniques. The majority of the code used to process the frames rendered from the webcam consists of calls to the OpenCV library.

One factor that may cause confusion is the tracking rate within Unity. The OpenCV library functions manipulate images on a frame-by-frame basis, so we need some way to incorporate that within Unity. When implementing a Unity script, two special functions are almost always introduced: void Start() and void Update(). The Start() method acts as a point of initialization for all code within the module. The Update() function is called every frame. This is where the majority of our OpenCV library calls are made, since a sequence of manipulated frames is, in essence, a moving picture.

2.3.1 cvblob

The input to our game comes from the rendered frames of the webcam. Additionally, a specific tracking object is used to act as the on-screen controller. This tracking object needs to be chosen with a colour not prevalent in the background of the input frame. In other words, the tracking object needs to stand out on screen. Ideally, the tracking object is placed on the user's chin and is used to control the object in our game. A separate open source library extending OpenCV's features is used to detect this tracking object. This library is known as cvblob. cvblob provides basic blob tracking through image processing, requiring the definition of contours and binary labeling.

2.3.2 OpenCVSharp

The main difficulty in our project was integrating the OpenCV library with Unity. Because Unity scripts are developed in C#, using the MonoDevelop IDE, a wrapper library was needed to call OpenCV functions. The wrapper library used in our case was OpenCVSharp. OpenCVSharp provides the majority of OpenCV functions for .NET using C# syntax and semantics. An example of the correspondence between an OpenCV library call (1) and the equivalent OpenCVSharp call (2) is as follows:

IplImage *frame = cvCreateImage();   // 1: native OpenCV (C/C++)
IplImage frame = Cv.CreateImage();   // 2: OpenCVSharp (.NET)

Note the absence of pointers in (2), since we are working in managed .NET code.
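To illustrate how the Start()/Update() lifecycle described above combines with OpenCVSharp, the following minimal sketch queries and displays one webcam frame per Unity frame. It is not part of the project's code; the class name WebcamLoop is illustrative, and the calls mirror those used in the appendix source code.

using UnityEngine;
using OpenCvSharp;

// Minimal sketch: one webcam frame is queried and displayed per Unity frame.
public class WebcamLoop : MonoBehaviour {

    private CvCapture capture;

    void Start() {
        // Initialization happens once, before the first frame.
        capture = Cv.CreateCameraCapture(0);
        Cv.NamedWindow("Preview");
    }

    void Update() {
        // Called once per rendered frame; this is where the OpenCV work belongs.
        IplImage frame = Cv.QueryFrame(capture);
        if (frame == null)
            return;
        Cv.ShowImage("Preview", frame);
        Cv.WaitKey(1);
    }

    void OnDestroy() {
        // Release the camera and close the preview window when the script is destroyed.
        Cv.ReleaseCapture(capture);
        Cv.DestroyAllWindows();
    }
}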

3. DESIGN

3.1 Visuals

Below is a description of the game's features, discussing its static visual appearance. The implementation details are left to the remainder of the document.

3.1.1 Main Screen

The opening screen of Rock Climber is the Main Screen. Its background is set to look like a rock face, to stay consistent with the theme of our game. The rock face holds three buttons: Play, Configure, and Exit. The Play button leads to the game screen, referred to as the Play Screen, and the Configure button leads to the configuration screen, known as the Configure Screen. The Exit button, upon click, exits the game. Everything else on this screen is static; it was created with a 3D effect for the sole purpose of being visually appealing.

Figure 1: The Main Screen

3.1.2 Configure Screen

The Configure Screen is where the user configures the game to track a specific object. When the user enters this screen, they are presented with three separate windows. Two of them are visual augmentations of the webcam capture in real time. The third window consists of six sliders.

The sliders are used to set the minimum and maximum hue, saturation, and value (HSV) values that define the threshold used to identify the colour of the tracking object. By adjusting these values through trial and error, the user arrives at a single, cleanly segmented colour being tracked in the third window, which is an augmentation of the live webcam capture. The HSV values selected with the sliders are filtered from the live webcam capture and converted to a binary image, shown in the second window. Pixels coloured white are those that fall within the defined HSV threshold, and pixels coloured black are everything outside of it. This binary frame is then processed using the cvblob library to display the tracking object on the live webcam capture. The tracking object is outlined in pink in the third window to indicate that it is being located. An example user configuration would be as follows:

1. The user holds up an object of a colour not prevalent in skin tones (something that stands out).
2. Using the sliders, the user adjusts the HSV values until the binary image in the second window shows the object in white (and everything else in black).
3. When the user has done this with minimal error, they can verify that the object is being tracked in the third window (it should be outlined in pink).

Figure 2: The Configure Screen

3.1.3 Play Screen

The Play Screen is the game itself. The tracking object is mapped to a tracker asset on screen. The user is prompted to adjust the location of the tracker, and hence the tracking object, to a specified location. When this initial configuration has been done, the user is instructed to pronounce the word on screen. The word is displayed directly under the initial location of the tracker. The objective is to pronounce the word with minimal jaw sliding. The result is the tracker colliding with the word and the word exploding (as a prompt showing that the motion was correct). Jaw sliding causes the tracker to miss the word, and the user is prompted to pronounce the word again.

Figure 3: The Play Screen

4. IMPLEMENTATION

4.1 Overview

The implementation details discussed in this section relate only to the Play Screen, since this is where the majority of the dynamic and interactive code comes into play. As mentioned in the previous section, the Play Screen contains an object that is tracked according to the movement received via webcam input. Below is a table of the assets and their corresponding scripts used in the Play Screen:

Asset      Script              Description
Tracking   BlobScript          Renders frame input and locates the tracking object.
Tracker    SphereTrack, Hit    Maps the tracking object to co-ordinates on screen (thereby
                               moving the Tracker asset) and detects collisions with the
                               Word asset.
Word       (no script)         Serves as the target for collision.

The Tracking asset is positioned in a location that theoretically corresponds to where the user is viewing the screen. Tracking is not a physical asset, but rather a conceptual one, as discussed earlier. It represents the object being controlled by the input and allows a script to be attached. The script attached to Tracking is BlobScript. BlobScript detects the tracking object used to control the game and stores its x and y coordinates relative to the screen.

The Tracker asset is the object that is moved according to the controller. Visually, it is represented by a red sphere. Attached to the Tracker are two scripts, SphereTrack and Hit. SphereTrack retrieves the Tracking asset's x and y coordinates and synchronously moves the Tracker to the appropriate position on screen. The Hit script instantiates an explosion when the Tracker and another object (the Word asset) collide.

The Word asset is randomized and represents the target object being aimed for. When the Tracker asset collides with the Word asset (thereby invoking the Hit script), the Word asset disappears and an explosion appears on screen.
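To make the table above concrete, here is a hypothetical sketch of how the Play Screen objects could be assembled from code. In the actual project the assets are created and the scripts attached in the Unity editor; the class name PlaySceneSetup, the primitive shapes, and the in-code wiring below are illustrative only.

using UnityEngine;

// Hypothetical sketch only: mirrors the asset/script table above.
public class PlaySceneSetup : MonoBehaviour {

    void Start() {
        // Tracking: conceptual asset whose BlobScript reads webcam frames
        // and publishes the tracked object's x/y screen coordinates.
        GameObject tracking = new GameObject("Tracking");
        tracking.AddComponent<BlobScript>();

        // Tracker: the red sphere moved by the user's jaw. SphereTrack reads
        // the coordinates from the Tracking asset's BlobScript.
        GameObject tracker = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        SphereTrack sphereTrack = tracker.AddComponent<SphereTrack>();
        sphereTrack.blobObject = tracking;
        // The Hit script (see the appendix) is also attached to the Tracker in
        // the editor, so collisions with the Word asset trigger the explosion.

        // Word: the collision target. Hit checks for the "Target" tag,
        // which must already exist in the project's Tag Manager.
        GameObject word = GameObject.CreatePrimitive(PrimitiveType.Cube);
        word.tag = "Target";
    }
}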

4.2 Details

4.2.1 BlobScript

The BlobScript script is the most important of our created classes. Its function is to track a specified "object" on screen and retrieve its coordinates relative to our screen size. At its core, BlobScript draws the majority of its functions from two external libraries, OpenCV and cvblob (both accessed through the wrapper library OpenCVSharp). Both libraries provide image processing techniques and functions within their respective fields. OpenCV provides the majority of the colour manipulation and image display functions, while cvblob provides the blob tracking. We will not explain the entire class, but we will explain the main calls that enable tracking of the object:

Cv.CvtColor(frame, hsvframe, OpenCvSharp.ColorConversion.BgrToHsv);
Cv.InRangeS(hsvframe, new OpenCvSharp.CvScalar(hMin, sMin, vMin),
            new OpenCvSharp.CvScalar(hMax, sMax, vMax), threshy);
Cv.Smooth(threshy, threshy, OpenCvSharp.SmoothType.Median);
uint result = CvBlobLib.Label(threshy, labelImg, blobs);
CvBlobLib.RenderBlobs(labelImg, blobs, frame, frame);
CvBlobLib.FilterByArea(blobs, 60, 500);

The first three calls are from the OpenCV library. They provide standard image processing operations that are then combined with functions from the cvblob library. First, the input is retrieved from the webcam on a frame-by-frame basis. Once a frame is retrieved, the first operation converts the frame from the BGR (normal) colour space to the HSV colour space, using the function CvtColor(). The HSV frame is then filtered according to the specified HSV threshold values with a call to InRangeS(), which takes the minimum and maximum threshold values as input. The resulting frame is smoothed to reduce noise with a call to Smooth(). The result of these operations is a frame filtered according to the minimum and maximum HSV values we provide. If a tracking object with a colour not prevalent in its background is used, a blob will appear in this frame.

The next three calls are from the cvblob library. The Label() method retrieves the blobs from the frame. We note that there can be more than one (depending on the threshold); for our project we limit ourselves to one, the tracking object. The RenderBlobs() method renders the detected blobs back onto the frame. Finally, the FilterByArea() method discards blobs whose area falls outside the specified bounds.
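Once the blobs have been labelled and filtered, BlobScript converts the remaining blob into screen coordinates using its image moments. The helper below is a hypothetical sketch of that arithmetic (the names BlobMath and ToScreen are illustrative); the same calculation appears in BlobScript.Update() in the appendix.

using OpenCvSharp.Blob;

// Sketch: convert a detected blob into game-screen coordinates,
// as BlobScript.Update() does in the appendix source code.
static class BlobMath {

    public static void ToScreen(CvBlob b, int captureW, int captureH,
                                int screenW, int screenH,
                                out int screenX, out int screenY) {
        // Centroid of the blob in capture-image pixels:
        // first-order moments divided by the blob area.
        int cx = (int)(b.M10 / b.Area);
        int cy = (int)(b.M01 / b.Area);

        // Scale the centroid from the capture resolution (e.g. 640x400)
        // to the game screen resolution (e.g. 1600x900).
        screenX = cx * screenW / captureW;
        screenY = cy * screenH / captureH;
    }
}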

4.2.2 SphereTrack

The SphereTrack script maps the tracking object to the Tracker asset shown on screen. It does this through an instance of BlobScript attached to the Tracking asset, referenced here by the public field blobObject. The reference is obtained within the script's Start() function:

public GameObject blobObject;
public BlobScript blobScript;

void Start () {
    blobScript = blobObject.GetComponent<BlobScript>();
}

After obtaining the BlobScript reference, SphereTrack reads the tracked coordinates each frame and stores them in a vector of type Vector3:

Vector3 pos = new Vector3();

Finally, these coordinates are mapped and scaled to an appropriate location on the screen, creating a boundary within which our Tracker asset can move. This happens by storing the scaled coordinates in the Vector3 created earlier and assigning it to the transform:

float halfW = (float)(Screen.width / 2f);
pos.x = blobScript.xCoordinate - (halfW * 0.55f);

float halfH = (float)(Screen.height / 2f);
pos.y = (halfH * 1.5f) - blobScript.yCoordinate;

transform.position = pos;

4.2.3 Hit

The script titled Hit has only one function. When the asset tagged Target collides with the asset to which this script is attached, some code is executed. In our case, the Word asset is tagged Target.

if (collision.gameObject.tag == "Target") {
    Instantiate(prefab, transform.position, Quaternion.identity);
    Destroy(collision.gameObject);
}

Inside the if statement there are only two function calls. The first, Instantiate, creates an instance of an explosion prefab (assigned in advance) at the exact location at which the collision occurred. The second, Destroy, removes the collided object from the scene, so the word appears to explode.
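The appendix lists Hit as a UnityScript (JavaScript) file. For readers following the C# code, a hypothetical C# equivalent of the same logic would look as follows; this is a sketch, not the project's original file.

using UnityEngine;

// C# sketch equivalent to the Hit.js script in the appendix.
public class Hit : MonoBehaviour {

    // Explosion prefab, assigned in the Unity inspector.
    public GameObject prefab;

    void OnCollisionEnter(Collision collision) {
        // If the Tracker collides with the Word asset (tagged "Target")...
        if (collision.gameObject.tag == "Target") {
            // ...spawn the explosion where the collision happened and remove the word.
            Instantiate(prefab, transform.position, Quaternion.identity);
            Destroy(collision.gameObject);
        }
    }
}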

5. FUTURE WORK

Although Rock Climber performs its basic functions, it does not encompass the rich functionality defined in the Requirements Analysis Document. The first step towards improvement would be to develop the game on a tablet platform, specifically the iPad. As discussed, a tablet will be more effective with regard to mobility when used by children. The libraries we have used to provide tracking are also available for iOS, which allows for a smooth transition and minimal changes in implementation.

Aside from the change of target platform, several changes must be made to the algorithms used to track and provide control. In this prototype, we use an external object (the tracking object) placed on the patient's jaw to provide control. Although this method works, we need a more user-friendly way of providing input. One way to achieve this is to track facial feature points directly, specifically in the jaw region. By tracking only the jaw, the configuration steps are reduced and gameplay becomes more adaptable. The Active Shape Model (ASM) and Active Appearance Model (AAM) provide such enhanced techniques. Both ASM and AAM are computer vision algorithms that fit statistical models of mean shape and shape variation and can be targeted to human facial features. By tracking points specific to the jaw region, there would no longer be a need for a tracking object.

APPENDIX: SOURCE CODE

Main Screen

MenuObject.cs: Used to detect on-click events.

using UnityEngine;
using System.Collections;

public class MenuObject : MonoBehaviour {

    public bool isQuit = false;
    public bool isConfigure = false;
    public bool isBack = false;

    // Highlight the button while the mouse hovers over it.
    void OnMouseEnter() {
        renderer.material.color = Color.red;
    }

    void OnMouseExit() {
        renderer.material.color = Color.white;
    }

    // Load the scene corresponding to the clicked button.
    void OnMouseDown() {
        if (isQuit)
            Application.Quit();
        else if (isConfigure)
            Application.LoadLevel(2);   // Configure Screen
        else if (isBack)
            Application.LoadLevel(0);   // Main Screen
        else
            Application.LoadLevel(1);   // Play Screen
    }
}

Configure Screen

opencvtest.cs: Opens three windows, two of which are augmented webcam captures; the third is the sliders window.

using UnityEngine;
using System.Collections;
using OpenCvSharp;
using OpenCvSharp.MachineLearning;

using OpenCvSharp.Blob;
using System.Runtime.InteropServices;

public class opencvtest : MonoBehaviour {

    int H_MIN = 0;
    int H_MAX = 256;
    int S_MIN = 0;
    int S_MAX = 256;
    int V_MIN = 0;
    int V_MAX = 256;

    const int CAPTURE_WIDTH = 640;
    const int CAPTURE_HEIGHT = 400;

    CvCapture capture;
    CvBlobs blobs = new CvBlobs();

    private IplImage frame;
    private IplImage hsvframe;
    private IplImage labelImg;
    private IplImage threshy;
    private IplImage binary;

    void track_function_h_min(int x) {
        H_MIN = x;
        BlobScript.hMin = x;
    }

    void track_function_h_max(int x) {
        H_MAX = x;
        BlobScript.hMax = x;
    }

    void track_function_s_min(int x) {
        S_MIN = x;
        BlobScript.sMin = x;
    }

    void track_function_s_max(int x) {
        S_MAX = x;
        BlobScript.sMax = x;
    }

    void track_function_v_min(int x) {
        V_MIN = x;
        BlobScript.vMin = x;
    }

    void track_function_v_max(int x) {
        V_MAX = x;
        BlobScript.vMax = x;
    }

    void createTrackbars() {
        Cv.NamedWindow("Trackbars");
        Cv.CreateTrackbar("H_MIN", "Trackbars", H_MIN, H_MAX, track_function_h_min);
        Cv.CreateTrackbar("H_MAX", "Trackbars", H_MAX, H_MAX, track_function_h_max);
        Cv.CreateTrackbar("S_MIN", "Trackbars", S_MIN, S_MAX, track_function_s_min);
        Cv.CreateTrackbar("S_MAX", "Trackbars", S_MAX, S_MAX, track_function_s_max);
        Cv.CreateTrackbar("V_MIN", "Trackbars", V_MIN, V_MAX, track_function_v_min);
        Cv.CreateTrackbar("V_MAX", "Trackbars", V_MAX, V_MAX, track_function_v_max);
    }

    void Start() {
        Cv.NamedWindow("Calibrate");
        capture = Cv.CreateCameraCapture(0);

        frame = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 3);     // Original image
        hsvframe = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 3);  // Image in HSV colour space
        labelImg = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), CvBlobLib.DepthLabel, 1);     // Label image for blobs
        threshy = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 1);   // Thresholded (binary) image
        binary = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 1);

        createTrackbars();
    }

    void Update() {
        IplImage fram = Cv.QueryFrame(capture);
        if (fram == null) {
            Destroy();
            return;
        }

        Cv.Resize(fram, frame, OpenCvSharp.Interpolation.Linear);
        Cv.Flip(fram, frame, OpenCvSharp.FlipMode.Y);

        Cv.CvtColor(frame, hsvframe, OpenCvSharp.ColorConversion.BgrToHsv);
        Cv.InRangeS(hsvframe, new OpenCvSharp.CvScalar(H_MIN, S_MIN, V_MIN),
                    new OpenCvSharp.CvScalar(H_MAX, S_MAX, V_MAX), threshy);
        Cv.Copy(threshy, binary);

        Cv.Smooth(threshy, threshy, OpenCvSharp.SmoothType.Median);
        uint result = CvBlobLib.Label(threshy, labelImg, blobs);
        CvBlobLib.RenderBlobs(labelImg, blobs, frame, frame);
        CvBlobLib.FilterByArea(blobs, 60, 500);

        Cv.ShowImage("Binary", binary);
        Cv.ShowImage("Calibrate", frame);

        char c = (char)Cv.WaitKey(33);
        if (c == 27)
            Destroy();
    }

    void Destroy() {
        Cv.ReleaseCapture(capture);
        Cv.DestroyAllWindows();
    }
}

Play Screen

BlobScript.cs: Used to detect blobs on screen.

using UnityEngine;
using System.Collections;
using OpenCvSharp;
using OpenCvSharp.MachineLearning;
using OpenCvSharp.Blob;
using System.Runtime.InteropServices;

public class BlobScript : MonoBehaviour {

    // HSV minimum threshold values (set by the Configure Screen sliders)
    static public int hMin = 140;
    static public int sMin = 25;
    static public int vMin = 140;

    // HSV maximum threshold values
    static public int hMax = 186;
    static public int sMax = 256;
    static public int vMax = 256;

    // Tracked blob centroid, scaled to screen coordinates
    public int xCoordinate = 0;
    public int yCoordinate = 0;

    const int test = 22;

    const int CAPTURE_WIDTH = 640;
    const int CAPTURE_HEIGHT = 400;

    CvCapture capture;
    CvBlobs blobs = new CvBlobs();

    private IplImage frame;
    private IplImage hsvframe;
    private IplImage labelImg;
    private IplImage threshy;

    int screenX;
    int screenY;

    void Start() {
        capture = Cv.CreateCameraCapture(0);
        Cv.NamedWindow("Live");

        frame = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 3);     // Original image
        hsvframe = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 3);  // Image in HSV colour space
        labelImg = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), CvBlobLib.DepthLabel, 1);     // Label image for blobs
        threshy = Cv.CreateImage(Cv.Size(CAPTURE_WIDTH, CAPTURE_HEIGHT), OpenCvSharp.BitDepth.U8, 1);   // Thresholded (binary) image

        screenX = 1600;
        screenY = 900;
    }

    void Update() {
        IplImage fram = Cv.QueryFrame(capture);
        if (fram == null) {
            Destroy();
            return;
        }

        Cv.Resize(fram, frame, OpenCvSharp.Interpolation.Linear);
        Cv.Flip(fram, frame, OpenCvSharp.FlipMode.Y);

        Cv.CvtColor(frame, hsvframe, OpenCvSharp.ColorConversion.BgrToHsv);
        Cv.InRangeS(hsvframe, new OpenCvSharp.CvScalar(hMin, sMin, vMin),
                    new OpenCvSharp.CvScalar(hMax, sMax, vMax), threshy);
        Cv.Smooth(threshy, threshy, OpenCvSharp.SmoothType.Median);
        uint result = CvBlobLib.Label(threshy, labelImg, blobs);

        CvBlobLib.RenderBlobs(labelImg, blobs, frame, frame);
        CvBlobLib.FilterByArea(blobs, 60, 500);

        foreach (CvBlob b in blobs.Values) {
            double moment10 = b.M10;
            double moment01 = b.M01;
            double area = b.Area;
            int x1 = (int)(moment10 / area);
            int y1 = (int)(moment01 / area);
            xCoordinate = (int)(x1 * screenX / CAPTURE_WIDTH);
            yCoordinate = (int)(y1 * screenY / CAPTURE_HEIGHT);
        }

        Cv.ShowImage("Live", frame);

        char c = (char)Cv.WaitKey(33);
        if (c == 27)
            Destroy();
    }

    void Destroy() {
        Cv.ReleaseCapture(capture);
        Cv.DestroyAllWindows();
    }
}

Hit.js: Used to detect collisions.

var prefab : GameObject;

function OnCollisionEnter(collision : Collision) {
    // If Tracker collides with Target...
    if (collision.gameObject.tag == "Target") {
        // Instantiate "Explode" prefab and destroy Target
        Instantiate(prefab, transform.position, Quaternion.identity);
        Destroy(collision.gameObject);
    }
}

SphereTrack.cs: Used to map the tracking object to screen coordinates.

using UnityEngine;
using System.Collections;

public class SphereTrack : MonoBehaviour {

    public GameObject blobObject;
    public BlobScript blobScript;

    // Use this for initialization
    void Start () {
        blobScript = blobObject.GetComponent<BlobScript>();
    }

    // Update is called once per frame
    void Update () {
        Vector3 pos = new Vector3();

        float halfW = (float)(Screen.width / 2f);
        pos.x = blobScript.xCoordinate - (halfW * 0.55f);

        float halfH = (float)(Screen.height / 2f);
        pos.y = (halfH * 1.5f) - blobScript.yCoordinate;

        transform.position = pos;
    }
}
