MSc. Electronics: Computer Graphics.
Abstract.

This report gives the reader a detailed account of a project undertaken for the award of an MSc in Electronics. The project is concerned with computer graphics and, more specifically, image processing. The report begins by considering digital images and how they are generated and represented within a computer. Image processing is essentially the manipulation of the individual picture elements (pixels) in an image; the nature of this manipulation determines the processing effect. Image processing is generally used in the fields of image and three-dimensional object recognition, where images are examined to determine the components that make them up. These techniques are mostly used in medical imaging and robot/computer visualisation. This project, however, explores the use of standard image processing methods to produce enhanced images for computer art and computer animation. Some background information on 3-D computer graphics and 3-D computer animation is provided to show the significance of the project and to aid in understanding the applications of image processing with respect to computer animation. The existing application that the image processing code was developed for is Soft F/X Pro, a 3-D modelling and animation package designed to run in the Microsoft Windows NT environment. This report includes details of how a Windows program is constructed and how it differs from a standard console-based program. It also details the importance of dynamic linking in accommodating the highly modular structure of Windows. A Windows NT dynamic link library of code was produced for each of three image processing modules, which create the animated atmospheric effects of falling snow and rain as well as implementing a realistic simulation of firework rockets and explosions. The development of the Windows software is detailed herein and example output from each of the effect generators is included.

Contents.

1. Project Aims & Objectives.
   1.1. Introduction.
   1.2. Specifications.
   1.3. Project Planning.

2. 3-D Computer Graphics & Digital Image Processing.
   2.1. Introduction.
   2.2. Digital image formation & representation.
   2.3. Static computer graphics and computer animation.
   2.4. Elementary digital image processing techniques & theory.

3. The Development Environment.
   3.1. Introduction.
   3.2. Soft F/X.
        The Modeller.
        The Animator.
        The Renderer.
   3.3. Windows.
        Structured programming using C.
        The Windows API functions and common controls.
        Dynamic Link Libraries.

4. Image Processing Algorithms & How they are coded into a Soft F/X DLL.
   4.1. Introduction.
   4.2. The C functions involved.
   4.3. The C structures involved.
   4.4. The Filter example.

5. Snow F/X.
   5.1. Introduction.
   5.2. Algorithm design.
   5.3. Results.
   5.4. Debugging.

6. Rain F/X.
   6.1. Introduction.
   6.2. Algorithm design.
   6.3. Results.

7. Pyro F/X.
   7.1. Introduction.
   7.2. Algorithm design.
   7.3. Results.
   7.4. Further discussion of the user colour choosing.

8. Image Processor Tutorials.
   8.1. Introduction.
   8.2. Snow F/X tutorial.
   8.3. Rain F/X tutorial.
   8.4. Pyro F/X tutorial.

9. Future work.

10. Conclusions.

Appendix A.
   A.1. Snow F/X DLL code listing.
   A.2. Snow F/X header file.
   A.3. Snow F/X resource script.

Appendix B.
   B.1. Rain F/X DLL code listing.
   B.2. Rain F/X header file.
   B.3. Rain F/X resource script.

Appendix C.
   C.1. Pyro F/X DLL code listing.
   C.2. Pyro F/X header file.
   C.3. Pyro F/X resource script.

Appendix D.
   D.1. Template F/X DLL code listing.
   D.2. Template F/X header file.
   D.3. Template F/X resource script.

Appendix E.
   E.1. References.

Chapter 1. Project Aims & Objectives.

1.1. Introduction.

This chapter deals with the nature of the project and how it was managed. It first details the specifications and provides an introduction to the topics that are involved. Secondly, a plan for the conduct of the work and a time scale for sections to be completed is included. This plan also provided the structure for this report, as a separate chapter is dedicated to each stage of the project.

1.2. Specifications.

The main objective of this project can be defined as follows:-

To develop an add-on or 'plug-in' image processing module for the Soft F/X graphics package that creates the illusion of atmospheric effects such as rain and snow on the images produced by this 3-D computer visualisation, modelling and animation software.

This is a long sentence and requires some explanation. Soft F/X is a software package that is designed to run in the Windows environment and, like most Windows applications, is written in modules of code. This method of modular programming allows for the integration of new modules even after the application has been built and released. This is a good way to provide a certain amount of 'upgradeability' and to allow optional extras. The modules developed for this project would be termed extra code, as they are not essential for the operation of the application, but they afford the user a new way to enhance the images and animations they produce. This is analogous to central locking for a car: it is not required, but it is a useful device for the driver. Further explanation of Soft F/X and image processing can be found in chapters 3 and 4.

Achieving the main objective involved the following:-

As Soft F/X is a Microsoft Windows based application, some study of the internal workings of the Windows operating systems (Windows 3.x/NT/95) is involved so that the appropriate programs can be written to run on these systems.

Some study of the way Soft F/X works is also needed so that information passing and sharing can be achieved between the main application and the plug-in module.

Image processing algorithms must be designed and coded up to provide the atmospheric effects.

1.3. Project planning.

The table below shows the way in which the main project was broken up into sub-tasks, each with an estimated period of work between January and September. Each X marks a month in which the task is to be undertaken.

Project sub-task (Jan to Sept):
- Initial research on Windows programming and image processing. (X X)
- Organising the PC hardware, and the Soft F/X and Microsoft Visual C++ software. (X)
- Gaining full familiarity with the Soft F/X software and the Visual C/C++ compiler. (X X)
- Advanced research on (i) Windows NT programming, (ii) Soft F/X external module developer templates, (iii) image processing algorithms. (X X X X X X X X)
- Windows NT test applications programming. (X X X X)
- Windows NT test DLL programming. (X X X)
- Image processing algorithm designs. (X X)
- Algorithm implementations as Windows NT DLLs. (X X X)
- Complete project and submit final report. (X X)

Fig 1.3.A. Project plan and time scale. NB: X signifies when a task is to be undertaken.

Chapter 2. 3-D Computer Graphics & Digital Image Processing.

2.1. Introduction.

The Chinese proverb, "One picture is worth a thousand words" [PIT93], describes the amount of information provided by a single image. Most media (e.g. newspapers, television, cinema) use pictures, whether still or moving, as information carriers. It is the ease of understanding that comes with a picture which has prompted the development of graphical operating systems for computers. Most people with a limited knowledge of computers can still operate a personal computer loaded with a picture or icon based operating system, whereas a console based operating system is complex and dull for the user. This move to high levels of graphics on computers extended the limits of the machine from simply a number cruncher to a tool which nearly everyone can utilise. The next section of this chapter explains how an image is represented within a computer and how this digital representation of a picture can be formed or captured from the real world. It then goes on to discuss the uses and background of 3-D computer graphics and animation.

2.2. Digital image formation & representation.

One definition of an image is "an optical representation of an object illuminated by a radiating source" [PIT93]. This definition holds true for an image which is generated using, for example, a camera, but a truly computer generated image has to be drawn using a simulation of light (a radiating source). These two methods of digital image formation are considered below.

Figure 2.2.A. below shows a model of a digital image formation system which would be characteristic of a digital still or video camera. This system also applies to images which are digitised (using a scanner) from printed images produced from standard cameras.

[Block diagram: the signal f enters the Optical System, which outputs b to the Sensor, which outputs i to the Digitiser, which produces the digital image g.]
Fig. 2.2.A. Model of a digital image formation system.

The optical sub-system usually consists of an arrangement of optical lenses which take the signal f (the reflected light from the object being viewed) and produce the slightly different signal b, which may be blurred or unfocused. The optical sensor generates an electrical signal i from the light signal b by means of a photoelectronic sensor. Several types of sensor exist, such as standard Vidicon tubes, charge injection devices (CIDs) and charge coupled devices (CCDs). The mathematical relationship between the two signals i and b is highly non-linear.

Figure 2.2.B. shows a characteristic curve of current against illumination for a typical Vidicon tube. At the saturation level, any increase in light intensity produces no more current from the sensing device. Also note that there is always some current present in the detector even when there is no illumination. This is known as dark current and is a characteristic of the sensing device.

[Graph: log(i) plotted against log(b), showing the saturation level and the dark current.]
Fig. 2.2.B. Current - illumination curve of a photoelectric sensor.

The output of the sensor is still a two-dimensional analogue signal. It must be sampled and digitised before it can be processed by a computer. This sampling and digitisation is carried out by an analogue to digital (A/D) converter, the last element in the system above, the digitiser. For colour images sampling is performed on each of the red, green and blue channels to produce three values R, G and B. The value range for each channel depends on the quantisation of the digitised image. This is usually 1 byte (8 bits) for each of the red, green and blue, and describes digitally the illumination at any point in the picture. How many points in the picture should be given R, G and B values is determined by the sampling rate. The sampling rate is chosen for the particular digital resolution that is required. The higher the resolution, the closer to the original image the digital representation will be. However, the higher the resolution, the more RGB values must be obtained and stored. Each time an RGB value is obtained a pixel is created. A pixel is a picture element, and it can be seen that a two-dimensional array of pixels can be used to represent an image digitally.

        g(1,1)  g(1,2)  ...  g(1,M)
        g(2,1)  g(2,2)  ...  g(2,M)
   g =    :       :            :              (2.2.1.)
        g(N,1)  g(N,2)  ...  g(N,M)

The size of the matrix (array) is the resolution of the image and is denoted as N x M. Each matrix element is a representation of a pixel in the image, i.e. the smallest area of the picture which can have a discrete intensity or colour value. For 8-bit monochrome images each pixel is represented by a single integer in the range [0,...,255], but for 24-bit images, or so called true colour images, 3 bytes are needed to store each pixel, so each element is a reference to an RGB pixel value.
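To make the memory requirement of this representation concrete, the short C sketch below allocates an N x M array of 3-byte RGB pixels and reports how much storage an uncompressed true colour image needs. This fragment is purely illustrative and is not part of the Soft F/X code; the pixel type name and the image size are assumptions chosen for the example.

#include <stdio.h>
#include <stdlib.h>

/* One 24-bit pixel: a red, a green and a blue component, each 0-255. */
typedef struct { unsigned char r, g, b; } rgb_pixel;

int main(void)
{
    const long N = 600, M = 800;                     /* rows x columns (resolution)   */
    rgb_pixel *image = malloc(N * M * sizeof *image);
    if (image == NULL) return 1;                     /* allocation failed             */

    /* Pixel (x,y) maps to element y*M + x of the flat array. */
    image[100 * M + 50].r = 255;                     /* set one pixel to pure red     */
    image[100 * M + 50].g = 0;
    image[100 * M + 50].b = 0;

    printf("Uncompressed size: %ld bytes\n", N * M * (long)sizeof(rgb_pixel));
    free(image);
    return 0;
}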

It should be noted here that most digital images are not permanently stored in this format; rather, they are compressed in some form before disk storage. This compression can take different forms, for example Huffman encoding or run length encoding [PIT93]. Several file formats with different compression methods exist, such as TIFF, GIF, PCX and TGA, but as the only image format that will be manipulated in the image processing of this project is the uncompressed data as described above, the description of these formats is better left to books on the subject [LIN91], [RIM90].

[Photograph: (a) part of an image; (b) the same region of (a) zoomed in to show the individual pixels.]
Fig. 2.2.C. (a) part of an image. (b) zoomed in part of (a) to show pixels. [CAC97]

Figure 2.2.C. above shows how a digitised photograph is made up of pixels. A comparison between computer images and painting by numbers can be made, as each pixel is represented by a certain number in the computer's memory and each number corresponds to a particular colour.

Another way to generate a digital image is to draw it within the computer. This can be done in a drawing package such as Microsoft Paintbrush or, to get more photorealistic images, in a 3-D computer visualisation package such as 3D Studio Max. A short discussion of 3-D graphics follows in the next section.

2.3. Static computer graphics and computer animation.

Three dimensional computer graphics now have a wide range of applications, from the entertainment and advertising industries to more practical applications such as Computer Aided Design or simulation, which may be used by engineers or scientists. Static computer graphics are effectively used in all these fields, but 3-D computer animation, which gives a deeper realism to 3-D visualisation, is mainly used to produce television advertisements and film special effects. It is a much underused tool as far as scientific modelling and simulation is concerned. The attraction of computers as animation tools mostly comes from the speed of visualisation. Computers today are capable of producing wireframe (see figure 2.3.A.) animation sequences interactively, which provides the animator with immediate results that can be edited simply and quickly.

[Wireframe frames (a) to (d).]
Fig. 2.3.A. Four wireframe frames in a cut-away piston animation.

Also, by employing a keyframing system, a reduction in the workload and time spent creating a finished sequence can be achieved. This is where the animator specifies several key frames (such as the frames in figure 2.3.A.) and a motion specification, whilst the computer fills in the blanks, or interpolates, between the key frames to produce the undescribed frames. Once the animator is happy with the wireframe preview of the sequence, each frame can be rendered to produce a sequence of images like the one in figure 2.3.B.

Fig. 2.3.B. The fully rendered image of the cut-away piston.

The motion specification is the largest barrier to scientists wishing to simulate real world phenomena, as it is difficult to express real or highly accurate motions in mathematical terms that the computer can interpret. This is probably the main reason for the low usage of computer animation in scientific or engineering fields. Computer graphics can be generated so that they become photo-realistic or have a super-realism that identifies the images as computer generated. This idea of super-realism has been utilised mostly in advertising to add to the appeal of a product. When producing either photo-realistic or super-real animations there are a number of techniques which can be employed to improve the look of the generated images. These include texture mapping, ray tracing of lights to produce reflections and shadows or to reproduce the effect of refraction, and post-render image processing. It is this image processing that this project is concerned with, so the next section introduces digital image processing, its fundamentals and the reasons for its use.

2.4. Elementary digital image processing techniques & theory.

As detailed before, an image is made up of a two-dimensional array of pixels, so mathematically an image can be described as:

   i[n][m]          (2.4.1.)

where N x M is the size of the image, or its resolution. It can therefore be seen that any pixel (x,y) within the image can be referenced by:

   i[x][y]          (2.4.2.)

It must always be remembered that each pixel is represented by a red, a green and a blue component, all of which can be changed. This is the crucial point in digital image processing: it is essentially the manipulation of the value of each pixel (and hence its RGB components) in the image.

Note: in the following equations and expressions i and j are integer variables ranging over [0,...,N] and [0,...,M] respectively.

Elementary digital image processing involves basic arithmetic operations, e.g. image addition and subtraction,

   c[i][j] = a[i][j] ± b[i][j]          (2.4.3.)

and multiplication by a constant,

   b[i][j] = c.a[i][j]          (2.4.4.)

Point non-linear operations of the form b[i][j] = h(a[i][j]), where h(x) is a transform function, can be used in a number of applications, e.g. gamma correction and image sharpening. A simple pointwise operation is clipping, where the transform function is reduced to a look-up table as shown below:

             cmax      if a[i][j] > cmax
   b[i][j] = a[i][j]   if cmin <= a[i][j] <= cmax          (2.4.5.)
             cmin      if a[i][j] < cmin

Elementary binary operations (AND, OR, XOR) can be easily performed:

   c[i][j] = a[i][j] & b[i][j]     AND          (2.4.6.)
   c[i][j] = a[i][j] | b[i][j]     OR           (2.4.7.)
   c[i][j] = a[i][j] ^ b[i][j]     XOR          (2.4.8.)

An image negative can also be formed by finding the 1's complement of each pixel:

   b[i][j] = ~a[i][j]          (2.4.9.)

Note that in all the above processing techniques each pixel reference is a reference to an RGB value and that each component can be acted on independently of the other two. For example, in the multiplication by a constant (2.4.4.) any or all of the RGB components could be scaled, and for clipping (2.4.5.) any or all of the colours could be clipped.
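To show how such pointwise operations translate into code, the C sketch below applies the clipping of equation (2.4.5.) and the negative of equation (2.4.9.) to every pixel of an N x M image. It is written for the same illustrative rgb_pixel type used in the earlier sketch and is not part of the Soft F/X code; the function names are assumptions.

/* Same illustrative pixel type as in the earlier sketch. */
typedef struct { unsigned char r, g, b; } rgb_pixel;

/* Clip every colour component of an N x M image into [cmin, cmax] (eq. 2.4.5.). */
void clip_image(rgb_pixel *img, long n, long m,
                unsigned char cmin, unsigned char cmax)
{
    for (long k = 0; k < n * m; k++) {
        rgb_pixel *p = &img[k];
        p->r = p->r > cmax ? cmax : (p->r < cmin ? cmin : p->r);
        p->g = p->g > cmax ? cmax : (p->g < cmin ? cmin : p->g);
        p->b = p->b > cmax ? cmax : (p->b < cmin ? cmin : p->b);
    }
}

/* Form the image negative by taking the 1's complement of each component (eq. 2.4.9.). */
void negate_image(rgb_pixel *img, long n, long m)
{
    for (long k = 0; k < n * m; k++) {
        img[k].r = (unsigned char)~img[k].r;
        img[k].g = (unsigned char)~img[k].g;
        img[k].b = (unsigned char)~img[k].b;
    }
}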

The digital image formation system described earlier is not without fault. Each sub-system introduces a deformation or degradation to the digital image (e.g. geometrical distortion, noise, non-linear transformations). The mathematical modelling of the formation system is therefore very important, so that image processing algorithms can be developed to combat the degradations. This information is needed for digital image restoration and digital image enhancement, which involve contrast enhancement, digital image sharpening and noise reduction.

As noted before, images take up a lot of memory or disk space for storage, so image processing is also used to code and compress images by taking advantage of the redundancy existing in the image. For example, a run of 40 red pixels (as in a large area of red in an image) would normally take 120 (40 * 3) bytes to store, but it could be stored in just four bytes by using three to store the colour and one to store the number of pixels with that colour. These compression techniques are becoming very important for use in image databases, digital image transmission, facsimile and high definition television, as real time decompression techniques are now possible.

Other uses for digital image processing are image and object recognition. The simplest form of this type of image processing is boundary detection using line and edge detection techniques. These techniques are used in computer visualisation, which involves creating object models from object pictures.

Over the last three decades image processing has been growing tremendously and, more recently, it has been used to enhance digital images by adding information (objects or effects) that was not in the original image or by altering the image to create an artistic effect. These enhancements include effects such as adding a lens flare simulation, colour filtering and stretching components in an image. Image processing algorithms with a time dependence can be used to create animated effects such as moving particles in a smoke simulation. This project involves the development of time-dependent image processing effects.

Chapter 3. The Development Environment.

3.1. Introduction.

This chapter discusses the hardware and software that were needed to complete the project. The hardware is a Pentium based PC running Windows NT 4.0. The reason for using this machine and this operating system is due to the 3-D software chosen for the project. The reasons for using this software are detailed below in section 3.2. Finally, section 3.3. gives an introduction to Windows programming and highlights some of the important issues that are involved with the project.

3.2. Soft F/X.

Soft F/X is one of many 3-D modelling and animation packages available for the PC. It includes separate modules for object modelling, animation and rendering. Figure 3.2.A. shows a screen shot of the animator with its top, side and front views of the scene and also a rendered perspective view at the top right. More information on the animator is given below.

Fig. 3.2.A. Soft F/X running under Windows NT 4.0.

Soft F/X was chosen for this project for a number of reasons:-

It has an easy to understand and quick to learn interface.

It gives a good opportunity to learn how to program for the Microsoft Windows environment.

It is much cheaper than other 3-D software of its type (see figure 3.2.B.).

Package.          Platform.                 Price.
3D Studio Rel 4   DOS                       2,250
Amapi Studio 3    Dual (NT / PowerMac)      349
Cinema 4D         Dual (NT / PowerMac)      599
Extreme 3D2       Dual (NT / PowerMac)      399
Soft F/X Pro.     Windows NT/               (educational)

Fig. 3.2.B. Cost comparisons between popular 3D graphics packages. [CAI97]

The Modeller.

The modeller looks much like the animator shown in figure 3.2.A., with the standard quad view windows, and it is used to 'construct' three dimensional models of real or even imaginary objects. This is done by defining a number of 3-D co-ordinates (vertices) which are linked together (with edges) to form surfaces. It takes three points to make up one triangular face, or facet, and all models are made up of an arrangement of these facets. The vertices and edges are made using one of the many construction tools, like the 3-D pen, or by using one of the build primitive object tools.

The Animator.

The animator can be used to set up a scene which may include a ground, a sky and any number of models which have been created with the modeller. A keyframe system like the one described in chapter 2 is used to give any model a motion specification, which may be translation, rotation or even scaling. Other time based attributes can be assigned to each model in the animation scene, such as modulation or deforming effects. A simple animation would be one in which a number of models are placed in a scene and only the camera is given any keyframed 'instructions'. This type of animation is known as a fly-by, as usually all the models are intended to be stationary and the camera flies by or around the models to show what they look like from different angles. The animator is the component of Soft F/X that is used the most during this project, as it is this module that all the image processing code is developed for. The modeller is only used to produce a few models to make the final rendered scenes a little more interesting.

The Renderer.

The renderer can be called from either the modeller or the animator and is used to draw the model or scene respectively. The renderer creates smooth and realistic appearances for models or animation frames by adding texture mapping and shadows and by shading curved surfaces. It is after a frame has been rendered from the animator that the image processing code is run. More details about the way external image processors are incorporated into Soft F/X are given in the next chapter.

3.3. Windows.

Windows is a graphical, multitasking operating system. Today there are many versions of Windows which plot the development and improvements of the desktop metaphor. Windows 1 was launched towards the end of 1985 and the latest personal computer version (Windows 95) was released in 1995. In the corporate sector, where large business software is needed, operating system development has been dominated by companies such as IBM, but now with the release of Windows NT 4.0 (the OS used in this project) Microsoft has become competitive. NT stands for New Technology, and the system was developed to take advantage of a 32-bit system architecture. Another advantage of Windows NT is its support for non-Intel compatible machines: early versions of NT could run on the Intel, Alpha, MIPS and PowerPC platforms. The next versions of Windows due for release are Windows 98 and NT 5.0. Windows 98 is being developed to give easier access to the internet by including Internet Explorer 4 as part of the operating system rather than as a separate application. Windows NT 5.0 will support 64-bit data, in line with Intel's future Merced range of processors, and naturally Alpha based PCs with their 64-bit architecture will automatically be able to use this new feature [CAI97].

Microsoft developed the Windows operating systems for two reasons. The first was to produce an easy to learn and intuitive interface so that users and developers could quickly use and develop standardised applications without having to refer to a manual every couple of minutes. The second reason was to get around the idea of modal operation [MSN97], [PET92]. DOS is a mainly modal operating system in that a user must cancel what they are doing before they can do something else; that is, only one program can be running at any one time. The way Microsoft tackled this problem was to implement a message or event driven system. With this type of system user actions result in an event being picked up by the system. The system passes the event to an appropriate application for processing. Here it is the user that is in control and not the application. Instead of the user code calling OS functions, Windows calls a window procedure in the user code which must react to an event detected by the operating system. This event is messaged to the Windows program and the code deals with it accordingly. An example of a message is WM_PAINT, which tells the program that its window must be redisplayed, for instance when the window is resized or revealed as another window is minimised. Here the program must control how the window is redisplayed. Other messages include ones that notify of button presses, scrolling activity or program termination [MUR92], [PET92].

Structured programming using C.

For this project all programs are written in C and are compiled (built) using the Microsoft Visual C++ developer. Figure 3.3.A. shows what this development environment looks like.

Fig. 3.3.A. The MS VC++ developer screen.

The screen shot above shows that the main part of the developer screen is essentially a glorified text editor. The blue and green coloured text indicates C keywords and comments respectively. This is where the programs are written and edited. The programs are saved as ASCII text files which can be compiled using any Windows-ready C compiler. MSVC++ has its own compiler which can be used from the menu bar of the application or run from the command line in a DOS box. For this project the command line option was chosen, as a customised make file is provided with the Soft F/X developer's kit [SFX97]. There is also a resource editor in MSVC++ which allows the graphical design of dialog boxes, icons and cursors. This is a much faster way to develop good looking interfaces for programs than describing them by hand using resource control code.

All Windows applications written in C have two main functions. The first, WinMain, is the only fixed-name function there is in an application's code (excluding the Windows API functions, which are described in the next section). This function is analogous to main in normal C programming: it defines the entry point called by the Windows startup code. An example of a WinMain function is given below along with an explanation of the code and why the WinMain function is so important.

int PASCAL WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                   LPSTR lpszCmdLine, int nCmdShow)
{
    /* variable definitions */
    HWND hWnd;
    MSG  msg;
    char *pszClassName = "Example";

    /* if this is the first instance of the code then define the window class and register it */
    if (!hPrevInstance){                       /* first instance? */
        WNDCLASS wc;
        wc.style         = CS_HREDRAW | CS_VREDRAW;
        wc.lpfnWndProc   = WndProc;
        wc.cbClsExtra    = 0;
        wc.cbWndExtra    = 0;
        wc.hInstance     = hInstance;
        wc.hIcon         = NULL;
        wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
        wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
        wc.lpszMenuName  = NULL;
        wc.lpszClassName = pszClassName;
        RegisterClass(&wc);                    /* register this window class with Windows */
    }

    hWnd = CreateWindow(pszClassName, pszClassName, WS_OVERLAPPEDWINDOW,
                        CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT,
                        NULL, NULL, hInstance, NULL);

    ShowWindow(hWnd, nCmdShow);                /* show window */
    UpdateWindow(hWnd);

    /* message loop */
    while (GetMessage(&msg, NULL, 0, 0)){
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return msg.wParam;
}

Fig 3.3.B. Example of the WinMain function. [MOR93]

As can be seen from figure 3.3.B., there are two parts to the WinMain function:-

The set up and registration of the window that the application will run in.

The message loop.

The set up of the window is used to define its style, or look, and the initial position on the screen at which it will be displayed. In the example above the initial position is determined at runtime by the operating system, but the programmer could explicitly specify screen co-ordinates for its display. The message loop is an infinite loop that continually polls the message queue for events that the application wishes to respond to. This loop terminates only at the termination of the application, when the WM_QUIT message (posted by the window procedure in response to WM_DESTROY) is retrieved.

The second main function in any Windows program is WndProc, the window procedure. This function forms the basis of the application by processing the messages dispatched by WinMain. It runs particular code depending on the message received, or it calls a separate function to accomplish this. This is known as message mapping. An example of the WndProc function is shown in figure 3.3.C.

LONG FAR PASCAL WndProc(HWND hWnd, UINT nMessage, WPARAM wParam, LPARAM lParam)
{
    PAINTSTRUCT ps;
    char *pszText = "Hello World";
    RECT rc;

    switch (nMessage){
    case WM_DESTROY:
        PostQuitMessage(0);
        break;
    case WM_PAINT:
        GetClientRect(hWnd, &rc);
        BeginPaint(hWnd, &ps);
        DrawText(ps.hdc, pszText, -1, &rc,
                 DT_SINGLELINE | DT_CENTER | DT_VCENTER);
        EndPaint(hWnd, &ps);
        break;
    default:
        return DefWindowProc(hWnd, nMessage, wParam, lParam);
    }
    return 0;
}

Fig. 3.3.C. Example of the WndProc function. [MOR93]

The function above explicitly processes two Windows messages:

WM_DESTROY, in which the application is terminated.

WM_PAINT, in which the client area of the window is repainted to display the text "Hello World" in the centre of the window.

This window procedure will no doubt receive other system messages that it does not wish to process explicitly, so these messages are passed to the Windows function DefWindowProc, which processes them with the default Windows behaviour [HEL92].

The Windows API functions and common controls.

The application programmer's interface (API) is a set of functions defined in the Windows header file (windows.h) which allow the application developer to interact with the operating system. These functions form the core of all Windows programs, as they provide access to the common Windows controls and all the functionality that labels a program as a Windows one. For example the CreateWindow, ShowWindow and UpdateWindow functions used in the WinMain function in the previous section are all API functions. The PostQuitMessage function used to terminate a Windows application is also an API function. All these functions are used to create, control and destroy programs that are specifically designed to run in the Windows environment.

Take the case where the programmer wishes to include an edit box in his/her client area or dialog box. The control will have its own identifier by which it can be referenced, and the following code could be used to retrieve information from it.

    if(GetDlgItemText(hWnd, DLG_EDIT1, str, 12) != 0)
        sscanf(str, "%d", &variable);

[SFX97]

If variable is defined as an integer then this piece of code uses the API function GetDlgItemText to get the information from the edit box DLG_EDIT1 and store it as a string of text in the str variable. The sscanf function then scans the string str and converts it to an integer value, which is then stored in the integer variable.

As this report is not intended to be a reference for all API functions, or indeed a programmer's guide to Windows, further discussion of the API is restricted to the functions and structures which were useful in this project. Further interest in the API should be pursued with a text on the subject, such as one of those listed in appendix E [SCH95], [SCH96]. The following controls were used in the development of the image processing modules for this project:

LTEXT (label)
edit box
push button
radio button
bitmap

Consider each control in turn.

LTEXT. This control is one of the simplest offered by Windows. The following resource code will define a label, or text string, of up to 255 characters. It has its own identifier, and the co-ordinates of its placement are also defined.

    LTEXT "Text string to be displayed",IDC_STATIC,154,70,50,10

The control can be given further attributes, which are defined after the positioning information. For example the next definition will produce a label with a sunken border.

    LTEXT "<none>",IDC_SNAME,197,136,47,11,SS_SUNKEN

This control, like any other, can generate Windows messages, such as WM_LBUTTONDOWN, which is generated if the control is clicked on with the left mouse button. It is up to the developer to decide if this message should be processed, i.e. should anything happen if this piece of text is clicked.

EDITBOX. This is the control discussed earlier and is one of the most frequently used controls. It is mainly used for data acquisition, i.e. to receive user input. Below is an example of the resource code used to set up this type of control.

    EDITTEXT DLG_FRAME,212,36,31,12,ES_AUTOHSCROLL

The ES_AUTOHSCROLL attribute allows the control to scroll horizontally automatically whenever the user types something that would not normally fit in the edit box. Some of the API functions usually associated with an edit box are:

GetDlgItem - gets the handle for the control.
GetDlgItemText - gets the text from the control and stores it in a character string.
GetDlgItemInt - returns the integer value of the edit box contents.

The reverse operations can be carried out using SetDlgItemText and SetDlgItemInt. Note that these functions are not restricted to edit box use and can be used with other controls. The reason for this is that one of the parameters passed to these functions is the unique identifier for the control.

PUSHBUTTON. This is also one of the most widely used controls, as every dialog box should have at least one of them. This is the type of control that is used to create the OK and CANCEL buttons which are common to most Windows applications.

    PUSHBUTTON "&Ok",IDOK,300,65,40,14

These buttons are used to provide the user with the ability to control when things happen. For example, the print button or the open file button found on the toolbar of most word processors are of this type.

RADIOBUTTON. Here is an example of a control which was not one of the original Windows standard controls, although it is widely used today. This is the reason why it has to be defined as a CONTROL in the resource script rather than having a keyword of its own.

    CONTROL "Fixed",DLG_RADFIXED,"Button",BS_AUTORADIOBUTTON | WS_GROUP,153,114,29,11
    CONTROL "Named",DLG_RADNAMED,"Button",BS_AUTORADIOBUTTON,153,135,40,12

These two controls are a pair of radio buttons which have been grouped together, so that when one button is clicked on the other is automatically switched off. It can be seen from the definition of these controls that they are derived from the standard button type and can act just as buttons do. The difference is that a radio button has an on or off state, whereas a push button always returns to the off state once it has been released.

BITMAP. This control is used to define a picture element in a window. An example of its definition is shown below.

    CONTROL 109,IDC_STATIC,"Static",SS_BITMAP | SS_CENTERIMAGE | SS_SUNKEN | WS_BORDER,3,46,138,107,WS_EX_STATICEDGE

Again this control is derived from one of the standard controls, in this case STATIC, meaning that it does not require any special processing other than what is needed for the control from which it is derived [MOR93], [HEL92].
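To illustrate how such controls are typically serviced, the fragment below sketches a dialog procedure that preloads an edit box, reads its value back when the OK button is pressed, and sets a default radio button. It is only an illustrative sketch: the control identifiers DLG_FRAME and DLG_RADFIXED are taken from the resource examples above, but the procedure name and variables are assumptions and the fragment is not part of the Soft F/X code.

BOOL CALLBACK ExampleDlgProc(HWND hDlg, UINT msg, WPARAM wParam, LPARAM lParam)
{
    static int frame = 0;                               /* value typed into the edit box    */
    BOOL ok;

    switch (msg){
    case WM_INITDIALOG:
        SetDlgItemInt(hDlg, DLG_FRAME, frame, FALSE);   /* preload the edit box             */
        CheckDlgButton(hDlg, DLG_RADFIXED, BST_CHECKED);/* default radio button choice      */
        return TRUE;
    case WM_COMMAND:
        switch (LOWORD(wParam)){
        case IDOK:                                      /* user pressed the Ok push button  */
            frame = GetDlgItemInt(hDlg, DLG_FRAME, &ok, FALSE);
            if (ok)
                EndDialog(hDlg, TRUE);                  /* accept and close the dialog      */
            return TRUE;
        case IDCANCEL:
            EndDialog(hDlg, FALSE);                     /* abandon any changes              */
            return TRUE;
        }
        break;
    }
    return FALSE;                                       /* let Windows handle anything else */
}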

Dynamic Link Libraries.

A dynamic link library (DLL) is a set of functions that can be used by an application like any other library of code. The main difference between DLL and LIB (static library) code is that DLL code can be loaded and discarded by an application at run time, whereas a static library is linked to the application at compile time and becomes part of its executable code. Dynamic link libraries also provide the following advantages.

Sharing of code. A DLL usually contains code that may be useful to a number of applications. Consider the situation where two Windows programs are running at the same time and part of both programs needs to run code that displays an animation on screen. Without DLLs each program would have to have its own separate copy of this animation code loaded into memory. Having this code in a single DLL, however, means that both programs can use the same copy of the code.

Easy upgrade of code. Using the example above about the animation code, imagine if, three months after a product is released, a bug is found in its code for displaying animations. If it were not for dynamic link libraries the whole application would have to be rewritten, recompiled and shipped to the numerous existing customers. Using the DLL means that only the DLL code must be changed and recompiled. An example where DLLs are effectively used for easy upgrades is the Visual BASIC runtime libraries. Although each new version of the libraries is not produced for the purpose of fixing bugs, the fact that more and more new functionality is being provided for VB developers means that new DLLs must be released to support the growing functionality.

Functionality when it is needed. Take the example of a spelling checker for a word processor. This is a function provided by many word processors, but it is code that is not used all the time. If this code were loaded at the time the word processor is started and then not used for an hour, or even not used at all, it would be wasting memory space. Providing these functions in a DLL means that the code can be loaded at run time, i.e. when it is needed. Once all the words have been checked and the code is no longer needed, the DLL code is discarded to free up those much needed Windows resources [HEL92], [MOR93].

Implementing code in a DLL is done in a very similar manner to writing a standard Windows application, except that the WinMain function is not used and instead a DllMain function is used as the entry point. This function takes care of the loading and discarding of the code and usually checks to see if there is enough system memory for it to be loaded. An example of the DllMain function can be seen in figure 3.3.D. below.

BOOL WINAPI DllMain(HANDLE hDLL, DWORD dwReason, LPVOID lpReserved)
{
    switch (dwReason){
    case DLL_PROCESS_ATTACH:
        MessageBox(NULL, "Attaching", NULL, MB_OK);
        hDLLInstance = hDLL;             /* handle to DLL file */
        break;

    case DLL_PROCESS_DETACH:
        MessageBox(NULL, "Detaching", NULL, MB_OK);
        break;
    }
    return TRUE;
}

Fig. 3.3.D. Example of the DllMain function. [MOR93]

The function above displays two message boxes; a message box is a standard Windows dialog box and is displayed using the API function MessageBox. The messages indicate when the DLL is loaded (attached) and when it is discarded (detached). The rest of the dynamic link library can be made up of useful functions, such as those to ensure all words are spelt correctly in a document. It is this type of dynamically linked code that is used for the implementation of Soft F/X external image processors, external shaders and modellers. This makes sense because obviously not all users will want to include the atmospheric effects developed for this project in all their animations, so why should the code be loaded every time and not only when it is needed?
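For completeness, the sketch below shows how an application can load such a library on demand using the standard Win32 calls LoadLibrary, GetProcAddress and FreeLibrary. The DLL name, the exported function name and its signature are illustrative assumptions only; Soft F/X performs the equivalent loading internally when an external image processor is selected.

#include <windows.h>
#include <stdio.h>

/* Assumed signature of an exported spell-check routine (illustration only). */
typedef int (WINAPI *CHECKWORDPROC)(const char *word);

int main(void)
{
    HINSTANCE hLib = LoadLibrary("SPELL.DLL");          /* load the DLL at run time    */
    if (hLib == NULL){
        printf("Could not load the library.\n");
        return 1;
    }

    CHECKWORDPROC CheckWord =
        (CHECKWORDPROC)GetProcAddress(hLib, "CheckWord"); /* look up the exported code */
    if (CheckWord != NULL)
        printf("'colour' %s\n", CheckWord("colour") ? "is spelt correctly"
                                                    : "is not recognised");

    FreeLibrary(hLib);                                  /* discard the code when done  */
    return 0;
}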

Chapter 4. External Image Processors & How they are coded into a Soft F/X DLL.

4.1. Introduction.

Building external image processing modules for Soft F/X is an example of writing extra code for an application which has already been compiled to an executable program. The design of this application, however, has taken into consideration the fact that additional modules of code can be written and compiled separately for seamless incorporation into the main program. This is accomplished via a dynamic link library, as described in chapter 3 of this report. Users of Soft F/X can access the additional code by using one of the menu commands which has been programmed to look for a named DLL file and call one of the standard functions within it. It is these functions that have to be customised to provide the desired image processing effect. The next section details the C functions which must be included in the DLL code to allow user interaction and execution of all post processing effects. The remainder of the chapter deals with the essential Soft F/X structures that must be manipulated and referenced. A short example of some code to produce a colour filtering effect is also included, to explain how the image manipulation is carried out.

4.2. The C functions involved.

Each external image processor designed and coded for Soft F/X has two distinct parts:-

The setup, which displays a dialog box prompting for user parameters.

The code that performs the effect.

The two parts are implemented with three functions. The prototypes for these functions are provided in figure 4.2.A.

char * _SetExternalParameters(
    char *Op,                    /* string for the parameters                  */
    HWND hwnd,                   /* parent window                              */
    long ruler,                  /* ruler scale value to facilitate scaling    */
    char *name,                  /* name of DLL file with the effect           */
    EVI_MEMORY_MANAGER *lpEVI    /* pointer to structure with memory functions */
);

BOOL CALLBACK DlgProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam);

long _RenderImageProcess(char *PrmList, XIMAGE *lpXimage);

Fig. 4.2.A. The three functions used by all external image processors. [SFX97]

The first function, _SetExternalParameters, is used to call the dialog callback procedure (the second function) and handles the storage of the external user parameters in the animation file of the current animation in Soft F/X.

The second function, DlgProc, is used to display a dialog box so that the user can choose values for the variables that the image processor will use. It is this function that contains all the code that corresponds to Windows programming; it uses the API functions defined in windows.h to program standard Windows controls such as edit boxes and radio buttons. An example of a dialog box allowing user interaction with an image processor can be seen in figure 4.2.B. This Glow effect is one of the image processors shipped with the full Soft F/X product, but it is still implemented as a dynamic link library.

The third function, _RenderImageProcess, is somewhat independent of the other two in that there is no direct interaction between them. That is, this function is called from the renderer after a frame has been drawn but before it has been stored, whilst the other two are called from the animator whenever the image processor is first selected or editing of the user parameters is required.

Fig. 4.2.B. The user interface for the Glow image processor.

The connection, however, is the set of parameters stored in the animation file by the _SetExternalParameters function. These are retrieved by the third function and acted upon to execute the actual processing. It can be seen that this function has only two function parameters: the first is the identifier for the user parameters in the animation file, and the second is an identifier for a structure of parameters which describe different animation attributes. This structure, and some of the others that the image processors use, are described in the next section.

Figure 4.2.C. shows the connection between the different pieces of code and data that are used in all image processing for Soft F/X.

Fig 4.2.C. Code and data connections in image processing.

There is a fourth function, DllMain, which is called to allow the attachment and detachment of the DLL code as described in chapter 3, but for the purpose of coding image processors this function does not need to be altered from the template description. A copy of the template code for building image processors is included as appendix D.

4.3. The C structures involved.

As we now know, a pixel is made up of a red, a green and a blue component, each of which needs 8 bits or 1 byte to store its value of 0 to 255. The C defined type for a variable with this range is an unsigned char. There is a fourth standard component to the pixel known as the alpha channel, which refers to an optional transparency value. The alpha channel has not yet been made available to Soft F/X and has not been used in this project. To make it easy to reference an individual component of a pixel, a user defined structure, the screen buffer, is used. This means that to alter the red component, for example, the programmer does not have to do complex mental arithmetic on a 32-bit integer. Figure 4.3.A. shows the definition of the SCREENBUFFER structure.

typedef struct SCREENBUFFER {
    unsigned char A,    /* Alpha channel (not available yet) */
                  R,    /* Red value, range 0-255            */
                  G,    /* Green value, range 0-255          */
                  B;    /* Blue value, range 0-255           */
} fullscreenbuffer;

Fig. 4.3.A. Pixel structure for Soft F/X. [SFX97]

It can be seen that an image could be represented by an M x N array of these structures.
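As a small illustration of working with this structure, the fragment below converts one pixel of such an array to a grey level by averaging its three colour components; the image width is passed in as a parameter. This is an illustrative fragment only and is not taken from the Soft F/X examples; the function name is an assumption.

/* Convert pixel (x,y) of a screen buffer that is Xmax pixels wide to its grey level. */
void grey_pixel(fullscreenbuffer *S, long Xmax, long x, long y)
{
    fullscreenbuffer *p = S + (y * Xmax) + x;      /* row-major indexing into the array */
    unsigned char grey = (unsigned char)(((int)p->R + p->G + p->B) / 3);
    p->R = grey;                                   /* equal R, G and B values           */
    p->G = grey;                                   /* give a shade of grey              */
    p->B = grey;
}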

The most important C structure to aid in external image processing is the XIMAGE structure (defined in XIMAGE.h). This structure is shown below for easy reference:

typedef struct tagximage {
    double ViewTransform[4][4];    /* viewing transformation matrix               */
    double Xscale,Yscale;          /* scaling values for camera field of view     */
    long Frame;                    /* frame being rendered                        */
    long Xmax,Ymax;                /* dimensions of the image                     */
    long Nlights;                  /* number of lights present                    */
    light *Lights;                 /* pointer to array of light structures        */
    long Ntargets;                 /* not used yet                                */
    vector *Targets;               /* not used yet                                */
    fullscreenbuffer *Screen;      /* pointer to screen buffer                    */
    double *Zbuffer;               /* pointer to Z depth buffer, NULL if absent   */
    fullscreenbuffer *Blur;        /* pointer to motion blur buffer               */
    unsigned char *ObjectBuffer;   /* pointer to objects buffer (if present)      */
    long *AnimatorIdList;          /* pointer to list of ID's in object buffer    */
    long Pad[16];                  /* for future expansion                        */
    long Morph;                    /* 0 = No morph, 1 = Morph                     */
    double MorphRatio;             /* ratio of morphing 0.0 -> 1.00               */
    long first_frame;              /* first frame in channel or 1                 */
    long this_frame;               /* frame in anim being rendered                */
    long last_frame;               /* last frame in channel or Nframes            */
    char *aparameters;             /* parameter list for current effect           */
    char *mparameters;             /* morph parameters (morphing from)            */
    long version;                  /* version LITE / PRO / DEMO                   */
} XIMAGE;

Fig. 4.3.B. The image structure for external image processing in Soft F/X. [SFX97]

This structure gives the image process developer all the information about the current frame of an animation that needs to be processed. It can be seen that the image size (Xmax, Ymax) and a pointer to an array of the previously described screen buffer structures are all that is needed to reference any pixel in the image. A piece of code such as the one in Fig. 4.3.C. could be adapted to perform any manipulation on the value of each pixel.

for (j=0; j<lpXimage->Ymax; j++)
    for (i=0; i<lpXimage->Xmax; i++)
    {
        S->R = ???;    /* set red to new value   */
        S->G = ???;    /* set green to new value */
        S->B = ???;    /* set blue to new value  */
        S++;           /* point to next pixel    */
    }

Fig 4.3.C. How to reference each pixel in an image and modify its RGB values. [SFX97]

Note, in Fig 4.3.C.:-

S is a pointer to an array of SCREENBUFFER structures and is defined and initialised by the code:

    fullscreenbuffer *S;        /* pointer definition                              */
    S = lpXimage->Screen;       /* equated to the info from the lpXimage structure */

This array is of dimensions Xmax * Ymax.

4.4. The Filter example.

The following piece of code is an example of the image process function.

long _RenderImageProcess(char *PrmList, XIMAGE *lpXimage)
{
    int i,j;
    double rs,gs,bs;
    char dummy[255];
    fullscreenbuffer *S;

    /* read the parameters from the parameter list */
    sscanf(PrmList,"%s %lf %lf %lf",dummy,&rs,&gs,&bs);

    /* set local pointer to start of screen buffer */
    S = lpXimage->Screen;

    for(j=0; j<lpXimage->Ymax; j++)          /* do rows in screen buffer             */
        for(i=0; i<lpXimage->Xmax; i++)      /* do all pixels in row j               */
        {
            /* scale the R,G,B values for pixel i,j */
            S->R = (unsigned char)min(255.0,(double)S->R * rs);
            S->G = (unsigned char)min(255.0,(double)S->G * gs);
            S->B = (unsigned char)min(255.0,(double)S->B * bs);
            S++;                             /* point to next pixel in screen buffer */
        }
    return 1;                                /* all done OK */
}

Fig. 4.4.A. _RenderImageProcess for constant RGB scaling. [SFX97]

This particular example reads three floating point (real) numbers from the animation information and scales the red, green and blue colours of every pixel in the rendered image according to the user parameters rs, gs and bs. One thing to note about the code in figure 4.4.A. is that, once scaled by a constant, each of the RGB values may have a new value outside the range [0,...,255], so it must be clipped at a maximum value of 255. Hence the use of the min macro.
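As a further illustration of how the same skeleton can be adapted, the sketch below (not one of the shipped Soft F/X examples) applies the image negative of equation (2.4.9.) to every rendered frame. It takes no user parameters, so the parameter list is ignored.

long _RenderImageProcess(char *PrmList, XIMAGE *lpXimage)
{
    int i,j;
    fullscreenbuffer *S;

    /* PrmList is not used by this effect */
    S = lpXimage->Screen;                    /* start of the screen buffer        */

    for(j=0; j<lpXimage->Ymax; j++)          /* do rows in screen buffer          */
        for(i=0; i<lpXimage->Xmax; i++)      /* do all pixels in row j            */
        {
            S->R = (unsigned char)~S->R;     /* 1's complement of each component  */
            S->G = (unsigned char)~S->G;     /* gives the image negative          */
            S->B = (unsigned char)~S->B;
            S++;                             /* point to next pixel               */
        }
    return 1;                                /* all done OK */
}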

Chapter 5. Snow F/X.

5.1. Introduction.

This chapter explains the design and operation of an image processor which produces the illusion of falling snow in an animated scene. The algorithm that was designed for this purpose is explained fully and an example of the output is included. As this was the first image processor developed for Soft F/X during this project, it presented the most challenges, and a section discussing code debugging details some of the techniques that can be used to aid Windows and image processing code development.

5.2. Algorithm Design.

The basic idea behind the coding of the snow effect is to randomly determine starting positions for a number of snow particles (flakes) and progressively move them all in a generally downwards direction. Using this basic idea, and considering some user parameters so that each snow scene can be customised to suit the developer, the following algorithmic structure was developed.

1. Retrieve user parameters.
2. Determine the number of snow particles according to a user defined heaviness value.
3. Allocate random starting positions for a third of the particles.
4. Move each particle from its original position according to two user defined values, the speed of fall and the direction of fall, and the frame number in the animation. (Consider flakes that fall off the bottom of the image and the generation of new flakes at the top of it.)
5. Draw the flakes, with either one pixel for small flakes or a number of pixels for larger flakes.
6. Repeat 3, 4 & 5 for a second third of the particles, but use slightly different speed and direction values.
7. Repeat 3, 4 & 5 for the final third of the snow flakes, again with slightly differing values for speed and direction. The point of this repetition is to produce a 3-D look of parallax.

To explain this algorithm, each step is discussed below, and reference is made to the code listing in appendix A.

1. Retrieve user parameters. This is the first part of any image processing that uses one or more user defined values to control the operation of the effect. Firstly, a dialog box was designed with the resource editor in Microsoft's Visual C++ developer. This is shown in figure 5.2.A. below. As detailed in the previous chapter, the management of this dialog box is achieved via the DlgProc callback function. Once the user parameters are stored in the animation this way, they need to be retrieved. The parameters ... [a short section of the original text is missing here.]

3. Allocate random starting positions for a third of the snow particles. A for loop is set up to run the following code a number of times according to the np value calculated earlier.

for (inp=0; inp<np/3; inp++){
    i = ((lpXimage->Xmax-1)*(rand()))/RAND_MAX + 1;
    j = ((lpXimage->Ymax-1)*(rand()))/RAND_MAX + 1;

The rand function returns a random number in the range [0,...,RAND_MAX] (RAND_MAX is a constant value defined in stdlib.h), so to find a random number between 0 and 1 the rand result must be divided by the RAND_MAX value.

4. Move each particle. Because any frame in an animation can be rendered independently of all the others, there is no way of tracking the position of each flake from frame to frame. However, if some initial reference point could be determined time and time again, then the position of any flake at any time could be determined using this reference point displaced by a frame dependent amount. As it happens, if the seed for the random number generator, rand, is kept the same from frame to frame, the same random numbers will be selected by the computer each time. The default seed for the random number generator is 1, but it can be changed using the srand function. This means that, because the rand function is only pseudo-random, an identical set of random numbers can be generated each frame. So the new position for a flake does not depend on where it was last frame, but rather on the i and j position determined in part 3. Hence the new positions can be determined by considering the speed and direct values as below.

    j = j + ((lpXimage->this_frame)*speed);
    i = i + ((lpXimage->this_frame)*direct);

This formula works up to the point when either i or j goes beyond the limits of the image, so consideration must be given to the particles that fall off the bottom of the image or are blown off the edge of it.

The next piece of code takes care of this problem and, rather conveniently, also takes care of producing new snow flakes at the top of the image. It does this by wrapping the flakes that fall off the bottom back up to the top of the image.

while (j >= lpXimage->Ymax){
    j = j - lpXimage->Ymax;
}
while (i >= lpXimage->Xmax){
    i = i - lpXimage->Xmax;
}
while (i < 0){
    i = i + lpXimage->Xmax;
}

5. Draw the flakes. To draw the flakes on the image, the colour value of the pixel(s) the flake covers must be changed to show the white of the snow. The following line of code allows the programmer to access any pixel (i,j) in the image and change the red, green or blue value of it:

    (S+(j*(lpXimage->Xmax))+i)->? = ***;

where S is a screen buffer pointer (described in chapter 4) and points to the array of pixels in the image, the ? symbol denotes either R for red, G for green or B for blue, and the *** denotes a new value for the colour in the range [0,...,255]. The code below shows how the middle of the snow flake is drawn in a greyish colour and the surrounding pixels are drawn white. Note that the neighbouring pixels are only drawn if the centre pixel is not at an image edge, hence the use of the if condition. The small bit of grey helps to give the flake a better shape, in a similar manner to anti-aliasing lines or edges in an image.

(S+(j*(lpXimage->Xmax))+i)->R = 200;
(S+(j*(lpXimage->Xmax))+i)->G = 200;
(S+(j*(lpXimage->Xmax))+i)->B = 200;
if (i>0 && i<lpXimage->Xmax-1 && j>0 && j<lpXimage->Ymax-1){
    (S+(j*(lpXimage->Xmax))+i+1)->R = 255;
    (S+(j*(lpXimage->Xmax))+i+1)->G = 255;
    (S+(j*(lpXimage->Xmax))+i+1)->B = 255;
    (S+(j*(lpXimage->Xmax))+i-1)->R = 255;
    (S+(j*(lpXimage->Xmax))+i-1)->G = 255;
    (S+(j*(lpXimage->Xmax))+i-1)->B = 255;
    (S+((j+1)*(lpXimage->Xmax))+i)->R = 255;
    (S+((j+1)*(lpXimage->Xmax))+i)->G = 255;
    (S+((j+1)*(lpXimage->Xmax))+i)->B = 255;
    (S+((j-1)*(lpXimage->Xmax))+i)->R = 255;
    (S+((j-1)*(lpXimage->Xmax))+i)->G = 255;
    (S+((j-1)*(lpXimage->Xmax))+i)->B = 255;
}

32 6 & 7. Repeating. (S+(0-1)*(IpXimage->Xmax))+i)->R=255; (S+(0-1)*(IpXimage->Xmax))+i)->G=255; (S+(0-1)*(IpXimage->Xmax))+i)->B=255; (Scanner failed to resolve a few paragraphs): use of letting the developer see if the program actually got to this point before crashing. A simple message box that displays the message "No problem so far" can be coded as follows. MessageBox(NULL,"No problem so far","message titie",mb_ok); Note: This function is another AP] function. A development of this idea is to display some useful information in the message box, such as the value of some variable(s). The following code can be used to display the integer value of the variable test, where the str variable is a character string. sprintf(str,"%d",test); MessageBox(NULL,test,"Message titie",mb_ok); Here the text in the message box is the value of the variable test. This method can be used to monitor any variables in the code at any point in its execution simply by inserting the message box code where it is needed. Message Beep. This is much the same as the message box except that no useful information can be displayed on the screen. It is a simple way of notifying the developer of actions taken by the program. For example if the program has an if expression to find out which course is taken by including a message beep in one part of the conditional code and not the other. If the system beeps then it is known which way through the code was taken.

Another use for the message beep is to monitor the number of iterations in a variable-controlled for loop. This, however, becomes impractical for large numbers of iterations as it involves counting the number of times the system beeps.

#if 0 ... #endif. These are C preprocessor directives that can be used to conditionally exclude large blocks of code, which is very useful for tracking down the very line of code that is causing a problem. Any code between the directives is not compiled and so does not feature in the final program that is run. So if the program now runs without crashing, the offending code must lie inside the excluded block; if it still crashes, the bug lies elsewhere. The range of code between the conditional inclusions can then be moved or reduced to narrow the search down to the offending line of code.

One of the most common Windows program bugs is the invalid pointer or access violation error. This occurs when a program tries to access a memory location that it has no right to. For example, a program that has a pointer to an array of ten integers can change the value of any of its elements by referencing the pointer plus an offset. If the pointer is given an offset of ten or more, the program will try to access an element beyond the range of the array and it will generate an access violation error or a general protection fault (GPF). This is a risk that the programmer takes when using C pointers, because a pointer can be manipulated like any other variable and so can be given a value that makes it point to something it shouldn't. This type of error cannot be debugged using the message box, because displaying the value of the pointer would be meaningless; instead, whatever is controlling the pointer variable must be checked. There are several debugging tools that can be used, such as the integrated debugging environment within Microsoft Visual C++ or Borland's Turbo Debugger for Windows (TDW), but it is suggested that these tools are used as a last resort. Remember that debugging requires visualising what a program should do, looking at what it actually does and analysing the two to try and make a match. This is best done by the developer, as he/she is the only one who can actually visualise what the program should do [HEL92], [MUR92].
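A minimal, self-contained illustration of the exclusion technique (not taken from the project listings) is shown below; the excluded call does not even have to exist, because the preprocessor removes it before compilation.

    #include <stdio.h>

    int main(void)
    {
        int total = 0;

    /* The suspect block is excluded from compilation while hunting for a bug. */
    #if 0
        total = suspect_calculation();   /* not compiled, so a bug here cannot crash this build */
    #endif

        printf("total = %d\n", total);
        return 0;
    }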

Chapter 6. Rain F/X.

6.1. Introduction.

Producing a rain effect is similar to producing a snow effect: they share a number of the same controlling parameters, and the tracking of each rain droplet from frame to frame can be done in the same way. The next section shows the algorithm followed to code the rain effect. The similarities can be seen from this algorithm, but the differences and extra considerations are also highlighted.

6.2. Algorithm Design.

Considering the fact that rain drops are semi-transparent, in that objects behind the rain are not obscured, and that the rain must also be seen to exist and fall, the following algorithm was developed.

1. Retrieve user parameters.
2. Determine the number of rain drops according to a user defined heaviness value.
3. Determine the orientation of the rain drop from the user defined direction.
4. Allocate random starting positions for half of the drops.
5. Move each drop from its original position according to two user defined values, speed of fall and direction of fall, and the frame in the animation. (Consider drops that fall off the bottom of the screen and the generation of new drops at the top of it.)
6. Determine the colour of each pixel in the rain drop from the colour of the pixel that the rain drop is replacing. (This is to produce the transparency.)
7. Repeat 3, 4, 5 & 6 for the second half of the particles but use slightly different speed and direction values.

To explain this algorithm each step is discussed below and reference is made to the code listing in appendix B.

1. Retrieve user parameters.

Again, a dialog box was designed with the resource editor in Microsoft's Visual C++ developer. This is shown in figure 6.2.A. below. As can be seen, this dialog box is similar to the Snow F/X interface and it accepts the same parameters. The code to read the user parameters from the character string into the appropriate temporary variables, using the sscanf function, is also the same.
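The exact format of the parameter string is defined in the appendix B listing; the fragment below is only a sketch of the kind of call involved, with prob, speed and direct used as assumed names for the heaviness, speed-of-fall and direction parameters.

    #include <stdio.h>

    /* Sketch only: param_string stands in for the character string handed to
       the image processor, and the three parameter names are assumptions. */
    void read_rain_parameters(const char *param_string)
    {
        long prob = 0, speed = 0, direct = 0;

        sscanf(param_string, "%ld %ld %ld", &prob, &speed, &direct);
        printf("prob=%ld speed=%ld direct=%ld\n", prob, speed, direct);
    }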

2. Determine the number of rain drops.

Again, the method for determining the amount of rain covering the image is the same as for the snow effect.

    np=(long)(((double)prob/500.0)*((double)(lpXimage->Xmax)*
        (double)(lpXimage->Ymax)));

The variable np is then used as the for loop counter when finding the positions of the rain drops and drawing them into the frame buffer.

Fig. 6.2.A. The user interface for the Rain F/X image processor.

3. Determine the orientation of the drop from the user defined direction.

As the rain drops are to be represented by a short line, the direction the rain is falling in defines which pixels need to be drawn. See figure 6.2.B. for the differing orientations of the rain drops.

Fig. 6.2.B. The different orientations of the rain drops: (a) direct < 0, (b) direct == 0, (c) direct > 0.

Figure 6.2.B. also shows that, depending on what colour the rain drop is falling past, its appearance will differ. There is more about this transparency characteristic in part six of this algorithm, where the drops are drawn.
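The pixel choices for each orientation are made in the appendix B listing; the helper below is only a sketch of how the sign of the direction value could select the offsets of a four-pixel streak starting at (i,j).

    /* Sketch only: one possible mapping from the direction value to the pixel
       offsets of a short rain streak; not the code from appendix B. */
    static void rain_streak_offsets(long direct, long di[4], long dj[4])
    {
        long k;

        for (k = 0; k < 4; k++) {
            dj[k] = k;                              /* the drop runs downwards       */
            if (direct < 0)      di[k] = -(k / 2);  /* slants to the left  (fig. a)  */
            else if (direct > 0) di[k] =  (k / 2);  /* slants to the right (fig. c)  */
            else                 di[k] = 0;         /* falls straight down (fig. b)  */
        }
    }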

4. Allocate random starting positions for half of the drops.

The position of one pixel in the rain drop (the one second from the top of the drop) is determined in the same way as the centre pixel of the snow flakes in the previous image processor.

    i=((lpXimage->Xmax-1)*(rand()))/RAND_MAX+1;
    j=((lpXimage->Ymax-1)*(rand()))/RAND_MAX+1;

5. Move each drop from its original position.

Here again a linear formula is used to determine the position of the rain drops at any frame in the animation. This allows any frame (with rain) to be rendered independently.

    j=j+((lpXimage->this_frame)*speed);
    i=i+((lpXimage->this_frame)*direct);

    while (j>=lpXimage->Ymax){
        j=j-lpXimage->Ymax;
    }
    while (i>=lpXimage->Xmax){
        i=i-lpXimage->Xmax;
    }
    while (i<0){
        i=i+lpXimage->Xmax;
    }

As before, the above code remedies the problem of rain falling off the bottom of the image by wrapping it round to the top.

6. Determine the colour of each pixel in the rain drop and draw it.

Whereas snow is generally white, rain is semi-transparent, so consideration must be given to the original colour of the pixel that the rain is to overwrite. This means that the pixel must be read, the new colours computed and the pixel rewritten. With C all of this can be accomplished in a single line of code for each colour, since the assignment operator = allows a variable to be read and rewritten in a single statement. The pseudo code for determining the colour of the pixels is:

    pixel colour = ((pixel colour) * (factor)) + rain colour;

Here it can be seen that the original colour is decreased slightly by a factor and a little grey (the rain colour) is added to produce the final colour of the pixel. This allows the original colour to show through the colour of the rain. Refer to figure 6.2.B. to see the results of using this colour choosing code; as that is a zoomed-in part of a picture, refer to figure 6.3.A. for a full picture incorporating rain. The full C code for drawing a rain pixel is:

    (S+(j*(lpXimage->Xmax))+i)->R=min(((S+(j*(lpXimage->Xmax))+i)->R*FACTOR)+RCOL,255);
    (S+(j*(lpXimage->Xmax))+i)->G=min(((S+(j*(lpXimage->Xmax))+i)->G*FACTOR)+RCOL,255);
    (S+(j*(lpXimage->Xmax))+i)->B=min(((S+(j*(lpXimage->Xmax))+i)->B*FACTOR)+RCOL,255);

Note that the min macro must be used to cater for the case where the red, green or blue value, once altered, might become greater than 255. As much of this colour altering code is repeated throughout the rain image processor, the FACTOR and RCOL values have been defined at the top of the code with the #define preprocessor directive. This allows fast experimenting with different values for the factor and rain colour without having to change every single pixel manipulation line of code.
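The values actually used are in the appendix B listing; the two lines below are only an illustration of the kind of definitions meant, with assumed figures.

    /* Illustrative values only; the figures used in the project are in the
       appendix B listing.  FACTOR slightly darkens the original pixel and
       RCOL adds a little grey for the rain. */
    #define FACTOR 0.9
    #define RCOL   20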

7. Repetition.

The rain generation is repeated for the other half of the droplets with the following differences: the speed and direction values are changed to create a parallax illusion, and the rain drops are drawn with only three pixels instead of four.

6.3. Results.

Figure 6.3.A. shows a single frame from an animation which has been defined to include the Rain F/X image processor. It can clearly be seen that there are drops of different colours falling past the red, green and blue parts of the scene. Unfortunately, as with the snow, it would take several frames to show the full effect, which cannot be appreciated fully on paper.

Fig. 6.3.A. A single frame from a rainy scene animation.

Chapter 7. Pyro F/X.

7.1. Introduction.

This image processor presents more of a challenge than the previous two, as there is much more Windows programming involved and it considers the full 3-D scene that is rendered by Soft F/X, rather than just drawing on top of the image. The motion of the rockets and of the sparks in the explosions is based on Newton's laws of motion. This three dimensional projectile trajectory is computed at each frame and the position of the rocket or spark is transformed into two dimensions depending on the camera parameters (optics).

This piece of code also considers the Z-buffer, which is a matrix of real values of the same size as the image matrix, i.e. there is one Z-buffer entry for each pixel. The value stored in each element of this buffer is the distance from the viewpoint (camera) of the object that the pixel covers. The Z-buffer allows each new pixel to be drawn to be tested against the pixel it is to replace, to see which is closer to the camera and hence should be drawn. For example, if the pixel at (i,j) has a Z value of a and the computed distance of a firework spark from the camera is b, then if a is greater than b the spark should be drawn; otherwise it is obscured by whatever is in the scene and should not be drawn. This and all the other attributes of the pyrotechnics effect are explained below.
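As an illustration of this test only (the actual test appears in the code under step 5.c below), a helper of the following kind captures the idea; the names are chosen to match the discussion rather than the appendix C listing.

    /* Illustrative sketch of the Z-buffer visibility test described above.
       z_buffer holds one depth value per pixel and b is the computed distance
       of the spark from the camera. */
    static int spark_is_visible(float *z_buffer, long Xmax, long i, long j, float b)
    {
        float a = *(z_buffer + (j * Xmax) + i);  /* depth of whatever is already at (i,j) */

        if (a > b) {                             /* the spark is nearer than the scene    */
            *(z_buffer + (j * Xmax) + i) = b;    /* record the new, closer depth          */
            return 1;                            /* the caller should draw the spark      */
        }
        return 0;                                /* the spark is hidden behind the scene  */
    }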

7.2. Algorithm Design.

The following algorithm was designed to implement the fireworks effect. Following the algorithm is a detailed description of each step and how it was coded in C.

1. Retrieve user parameters.
2. Determine the RGB components of the colour selected by the user.
3. Find the 3-D co-ordinates of the starting and explosion positions.
4. Determine the initial upward velocity of the rocket, the time that the current frame represents and the time of explosion of the rocket.
5. If the time is between the launch time and the explosion time, find the 3-D co-ordinates of the rocket at the current time.
5.a. Scale the x, y and z co-ordinates to transformable co-ordinates.
5.b. Transform the co-ordinates so that they are referred to an origin at the camera.
5.c. Draw the rocket at the corresponding image location.
6. If the time is after the explosion time, start the explosion and redefine the time reference to start at the time of explosion.
6.a. For each spark find random angles to describe the initial direction of the spark, and hence find the initial velocity of the spark.
6.b. Find the 3-D position of the spark and all its trails, then scale and transform them. Consider ground detection.
6.c. Draw the spark and trails at the corresponding image location.
7. If the frame is one of the five directly following the start of the explosion, add a little fading white to all the pixels in the image to produce a flash effect.

1. Retrieve user parameters.

As with all image processors, a dialog box is used to allow the user to customise the effect. The interface for this effect can be seen in figure 7.2.A. below. As can be seen, there are many more parameters to take account of in this effect compared to either of the previous two. Many of the same dialog controls are present, but there are a few more which must be explained; this is covered at the end of this chapter.

Fig. 7.2.A. The user interface for the Pyro F/X image processor.

Again the sscanf function is used to scan the user parameters from the parameter string in the animation file. A list of all these parameters is given here with a short explanation of the purpose of each.

seed: used to seed the random number generator so that each firework does not explode in exactly the same manner.
launchframe: indicates the frame at which the rocket is to launch.
expmax: the maximum height that a spark should rise from the explosion position (a convenient way to describe the size or force of the explosion).
no_sparks: the number of sparks in the explosion.

colour: the 32-bit value of the firework's colour.
Xstart: the x co-ordinate of the launch position.
Ystart: the y co-ordinate of the launch position.
Zstart: the z co-ordinate of the launch position.
Xexp: the x co-ordinate of the explosion position.
Yexp: the y co-ordinate of the explosion position.
Zexp: the z co-ordinate of the explosion position.
ground: the height (z-value) of the ground in the animation scene.
id1: identifier for a dummy light at the named launch position.
id2: identifier for a dummy light at the named explosion position.
source: true or false value indicating whether the launch position is named or fixed.
target: true or false value indicating whether the explosion position is named or fixed.
type: the type of firework (normal, twinkle or fountain).
ruler: used for scaling user co-ordinates.

2. Determine the RGB components of the colour.

A section at the end of this chapter details how the colour of the firework is chosen from the dialog box, but for now we will assume that we are passed a 32-bit value which describes the red, green and blue components of a colour. The following piece of code decodes the colour value into its R, G and B parts.

    redval=GetRValue(colour);
    greenval=GetGValue(colour);
    blueval=GetBValue(colour);

redval, greenval and blueval are three char type variables and are used later to draw the colours in the explosion of the firework. The three names used in this piece of code are all provided by the Windows API header files.
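For completeness, the same headers also provide the RGB macro, which performs the inverse operation of packing three components back into a 32-bit colour value. The fragment below is an illustration only, not code from the project.

    #include <windows.h>

    /* Illustration only: pack a colour and take it apart again with the
       standard Windows macros. */
    void colour_roundtrip_example(void)
    {
        COLORREF colour   = RGB(255, 128, 0);   /* pack an orange colour */
        BYTE     redval   = GetRValue(colour);  /* 255 */
        BYTE     greenval = GetGValue(colour);  /* 128 */
        BYTE     blueval  = GetBValue(colour);  /* 0   */

        (void)redval; (void)greenval; (void)blueval;
    }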

3. Find the 3-D co-ordinates of the launch and explosion positions.

This is done in one of two ways, depending on whether the user has selected named or fixed launch and explosion points. If fixed positions are used then the x, y and z co-ordinates of the launch and explosion points are already loaded into the Xstart, Ystart, Zstart, Xexp, Yexp and Zexp variables. If, however, named dummy lights are used, then the vector positions of the lights must be obtained from the lpXimage structure with the following code.

    /* Is source selected rather than defined? if so find 3-D position */
    if(source==FALSE){
        for(lightcount=0;lightcount<lpXimage->nlights;lightcount++){
            if(lightptr->animatorid==id1){
                Xstart=(lightptr->pin[0])/(double)ruler;
                Ystart=(lightptr->pin[1])/(double)ruler;
                Zstart=(lightptr->pin[2])/(double)ruler;
            }
            lightptr++;
        }
    }

    /* Is target selected rather than defined? if so find 3-D position */
    if(target==FALSE){
        for(lightcount=0;lightcount<lpXimage->nlights;lightcount++){
            if(lightptr->animatorid==id2){
                Xexp=(lightptr->pin[0])/(double)ruler;
                Yexp=(lightptr->pin[1])/(double)ruler;
                Zexp=(lightptr->pin[2])/(double)ruler;
            }
            lightptr++;
        }
    }

The lightptr variable is a pointer to an array of light structures and is set up by:

    lightptr=lpXimage->lights;

This code tests each light included in the animation against the identification tag (id1 or id2 for the launch and explosion positions respectively) retrieved from the dialog box at setup time.

4. Initial velocity, current time and time of explosion.

The following formulae assume that the gravitational constant g is equal to 10. This will be true as long as the user has scaled the components in his/her animation so that one user unit equals one metre; as it stands there is no automatic way to ensure this.

The determination of the position of the rocket at any time depends upon its initial velocity. To compute an initial vertical velocity for the rocket we must know how high we want it to explode. This vertical distance is the value of Zexp-Zstart (the explosion height minus the launch height). If we use the standard notation of Newton's laws of motion, this displacement is denoted as s; also in this notation the acceleration a is -10 (that is, -g). If v is the velocity of the rocket at any time and u is the initial velocity, then the expression:

    v² = u² + 2as     (7.2.1.)

can be used to describe the velocity of the rocket at any time. Since we want to know the initial velocity, we can insert the values corresponding to the apex of the trajectory (i.e. the position of the explosion) into equation 7.2.1. At this point the vertical velocity v will be zero, the vertical displacement s will be Zexp-Zstart and a will be -10.

So we have:

    0 = u² + 2(-10)(Zexp-Zstart)     (7.2.2.)

Rearranging this expression we can calculate a value for u:

    u² = 20(Zexp-Zstart)     (7.2.3.)

So:

    u = √(20(Zexp-Zstart))     (7.2.4.)

This equation can be rewritten as:

    u = 10√((Zexp-Zstart)/5)     [QUA72]

Translating directly into C code this becomes:

    velocity=(10.0*(sqrt((Zexp-Zstart)/5.0)));

This is only the vertical component of the rocket's initial velocity, but as the displacement of the rocket is linear in both the X and Y directions, the initial velocity in these directions is not needed once the explosion time has been determined. See part 5 for the determination of the X and Y displacements.

We now need a time reference. The frame number of the animation is an obvious starting point; however, this alone does not prove to be an acceptable scale to use. Experimentation showed that dividing the frame number by four provided a good time reference. So that time is set to zero at the launch of the firework, the launchframe parameter must be subtracted from the current frame to derive the correct time value. This is accomplished in C as follows.

    time=(((double)(lpXimage->this_frame))-((double)(launchframe)))/4.0;

Now we must find the time at which the firework explodes. We know that at the time of explosion v is zero, and from Newton's equations we have:

    v = u + at     (7.2.5.)

So rearranging and substituting:

    t = u/10     (7.2.6.)

As we have already calculated u (velocity) we can use it again. For the rest of the processing we want the explosion time expressed as the exact frame at which the firework explodes, so we must account for the launch frame and the divide-by-four factor in the time reference. With this we have:

    t_of_explode=((long)((velocity/10.0)*(4.0))+(double)(launchframe));

Now that we know the explosion time, the rest of the processor can be divided into two parts: the first is plotting the trajectory of the rocket and the second is the actual explosion. A simple test of the frame against the explosion time decides whether to execute part 5 or part 6 of the algorithm.
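The test itself can be pictured as a small helper of the following kind. This is an illustrative sketch rather than the appendix C code; frame, launchframe and t_of_explode stand in for the values used in the listing.

    /* Sketch only: decide which part of the algorithm applies to this frame. */
    static int firework_phase(long frame, long launchframe, long t_of_explode)
    {
        if (frame >= launchframe && frame < t_of_explode)
            return 5;      /* part 5: plot the rocket on its way up  */
        if (frame >= t_of_explode)
            return 6;      /* part 6: draw the explosion as it fades */
        return 0;          /* before launch: nothing to draw         */
    }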

5. Rocket position at current time.

This part of the algorithm is executed if the current frame is between the launch frame and the time of explosion. The X and Y positions are computed by a simple linear interpolation between the launch frame and the explosion frame, whereas the Z position is calculated with another of Newton's laws:

    s = ut + ½at²

The three lines of code to determine the X, Y and Z positions of the rocket are:

    Xuser=Xstart+(time*((Xexp-Xstart)/((double)(t_of_explode-launchframe)/4.0)));
    Yuser=Ystart+(time*((Yexp-Ystart)/((double)(t_of_explode-launchframe)/4.0)));
    Zuser=Zstart+(((velocity)*(time))-((5.0)*(time)*(time)));

5.a. Scale co-ordinates.

Once the X, Y and Z co-ordinates are found they must be scaled with the ruler value from user co-ordinates to world co-ordinates, so that they can be transformed to be referenced with respect to the position of the camera.

    Xpos=Xuser*(double)ruler;
    Ypos=Yuser*(double)ruler;
    Zpos=Zuser*(double)ruler;

5.b. Transform co-ordinates.

Now the co-ordinates are transformed with a function provided by Soft F/X so that they can be translated into 2-D co-ordinates for the image. Once the x and y image co-ordinates are found, the position of the rocket can be drawn in the frame buffer.

    TransformIntoView(lpXimage->ViewTransform, Xpos,Ypos,Zpos,
                      &Xview,&Yview,&Zview);

5.c. Draw the rocket in the frame buffer.

To draw the position of the rocket in the frame buffer we must determine whether it can actually be seen or not, that is, whether it is in front of the camera and within its field of view. This is accomplished with the following code:

    /*Draw in buffer the position of the rocket*/
    if (Yview>=1.000){ /*all points in front of viewer */
        /*scale to appear in buffer (image size is Xmax by Ymax) */
        xx=(double)lpXimage->Xmax/2.0 + (lpXimage->Xscale*Xview/Yview);
        yy=(double)lpXimage->Ymax/2.0 - (lpXimage->Yscale*Zview/Yview);
        if ((long)xx>=0 && (long)xx<lpXimage->Xmax && (long)yy>=0 &&
            (long)yy<lpXimage->Ymax){
            i=(long)xx;
            j=(long)yy;
            /*check the Z-buffer to see if anything is in the way*/
            if((float)Yview<*(Z+(j*(lpXimage->Xmax))+i)){
                (S+(j*(lpXimage->Xmax))+i)->R=255;
                (S+(j*(lpXimage->Xmax))+i)->G=255;
                (S+(j*(lpXimage->Xmax))+i)->B=255;
                /*update the Z-buffer with the new depth value*/
                *(Z+(j*(lpXimage->Xmax))+i)=Yview;
            }
        }
    }

As can be seen here, the Z-buffer is tested to see if anything is in front of the rocket's position and the rocket is only drawn if it is unobscured. If it is drawn, the Z-buffer must be updated so that any other firework, or indeed any other effect using the Z-buffer, that needs to test this pixel afterwards sees the new depth.

6. Start the explosion.

This part of the algorithm is executed after the explosion time and becomes unnoticeable once the colour of the firework has faded to the background colours of the image. A new time reference is needed so that Newton's equations can be used again; time needs to be reset to zero at the time of the explosion.

    time=(((double)(lpXimage->this_frame))-((double)(t_of_explode)))/4.0;

Now for each spark in the explosion a 3-D position must be calculated.

6.a. Find random angles for the spark's initial direction.

Before the position of each spark can be found, a random direction and the initial force (velocity) of the explosion must be defined. The direction is defined by two angles, θ and φ. These angles are determined randomly using the same rand function that was used in both the rain and snow effects.

    theta=(((double)(rand())*(2.0*PI))/(double)(RAND_MAX+1));
    phi=(((double)(rand())*(2.0*PI))/(double)(RAND_MAX+1));

The explosion velocity is found in much the same way as the rocket's initial velocity, except that the expmax value, which comes from the user specified size parameter, takes the place of the height difference.

    expvel=(10.0*(sqrt(expmax/5.0)));

6.b. Find the spark and trail positions.

The X, Y and Z co-ordinates of each spark and each of its trail points are computed using Newton's equation:

    s = ut + ½at²

which has been developed for use in three dimensions using the sine and cosine of the two angles.

    for(trails=0;trails<10;trails++){
        Xuser=Xexp+((expvel)*(cos(phi))*(cos(theta))*(time+((double)(trails)/30.0)));
        Yuser=Yexp+((expvel)*(cos(phi))*(sin(theta))*(time+((double)(trails)/30.0)));
        Zuser=Zexp+(((expvel)*(sin(phi))*(time+((double)(trails)/30.0)))
              -((1.8)*(time+((double)(trails)/30.0))*(time+((double)(trails)/30.0))));
        if (Zuser<ground){ /*ground detection*/
            Zuser=ground;
        }
        ...
    }

It can be seen here that consideration of the ground level has been included to ensure that sparks which fall to the ground do not fall through it, but rather land on it and flow across it.

6.c. Draw the spark and trails in the frame buffer.

Drawing the explosion of the fireworks is implemented in much the same way as drawing the snow, the rain and the rocket position, but the clever part is the colour choosing code. This code determines the colour of the sparks depending on the user specified colour and the length of time since the start of the explosion. This means that the colour of the sparks can slowly be reduced to that of the background to accomplish a fading effect. As can be imagined, the cooling of the sparks within an exploding firework is not uniform, so here the colour change has a random factor which allows some sparks to cool faster than others. The code to draw just one of the sparks is shown below.

    (S+(j*(lpXimage->Xmax))+i)->R=max((S+(j*(lpXimage->Xmax))+i)->R,redval-(factor*time));
    (S+(j*(lpXimage->Xmax))+i)->G=max((S+(j*(lpXimage->Xmax))+i)->G,greenval-(factor*time));
    (S+(j*(lpXimage->Xmax))+i)->B=max((S+(j*(lpXimage->Xmax))+i)->B,blueval-(factor*time));
    *(Z+(j*(lpXimage->Xmax))+i)=Yview;

The factor variable in the above code is pre-calculated with the following line of code:

    factor=((20*rand())/RAND_MAX+1)+15;

which randomly chooses an integer value of roughly 15 to 35. Hence the colour of each spark pixel is either the colour of the original pixel (the background colour) or a factor of the colour chosen by the user. This produces a fade to the background, which is essentially what happens in a real firework explosion.

7. Implement the flash.

At the explosion time a little white is added to each of the pixels in the image to create a flash effect for the explosion. The added white fades away in nearly the same way as the colour of the sparks does, but in a more linear fashion.

    /*code that produces the initial flash of the firework*/
    if(lpXimage->this_frame>t_of_explode+1 &&
       lpXimage->this_frame<t_of_explode+6){
        for(j=0;j<lpXimage->Ymax-1;j++){ /* do rows in screen buffer */
            for(i=0;i<lpXimage->Xmax;i++){ /* do all pixels in row j */
                /* add a little white to pixel i,j to give the illusion of a flash*/
                S->R = min(255,(unsigned char)((double)S->R +(5.0/time)));
                S->G = min(255,(unsigned char)((double)S->G +(5.0/time)));
                S->B = min(255,(unsigned char)((double)S->B +(5.0/time)));
                S++; /* point to the next pixel in the screen buffer */
            }
        }
    }

This piece of code is much like the filter example given in chapter 4, as it processes each pixel in the image. The min macro is used again so as not to create an overflow in any of the R, G or B channels, but this causes the unwanted side effect of distorting the colour of a pixel if only one or two of its channels have to be clipped. This side effect is present for at most one frame and, as there are 30 frames in every second of an animation, it goes unnoticed.

7.3. Results.

The picture in figure 7.4.A., although quite dark (because it is supposed to be at night), shows the explosions of four fireworks, which were added to the animation by using the fireworks effect once for each firework. The image also shows where the green firework is falling behind the bridge and so is obscured in some parts.

Fig. 7.4.A. Example output from the Pyro F/X image processor.

7.4. Further discussion of the user colour choosing code.

Instead of just allowing the user to type red, green and blue values for the colour of the firework into edit boxes, the following pieces of code allow the user to click on the colour box and call the standard Windows colour choosing dialog box. The following case statement is part of the dialog's processing of the WM_COMMAND message and corresponds to the actions taken when the user clicks the colour box (identified by DLG_COLOUR).

    case DLG_COLOUR:
        cs.lStructSize=sizeof(CHOOSECOLOR);
        cs.hwndOwner=hwnd;
        cs.lpCustColors=(LPDWORD)acrCustClr;
        cs.Flags=CC_FULLOPEN;
        ChooseColor(&cs);
        colour=cs.rgbResult;
        FillColourRect(hwnd,colour);
        return (TRUE);

The first part of the code fills in some important values in a colour structure, cs, which is one of the standard structures provided by the Windows API [PET92]. It then calls the ChooseColor function, which displays the colour dialog box and allows the user to choose any 24-bit true colour. The colour the user chooses is stored in the rgbResult field of the cs structure, from which it is retrieved to be used later in the processing code as explained above. The next step is to call the FillColourRect function, which is not an API function but was written especially for this project. This function is detailed below.

    FillColourRect(HWND hwnd, DWORD colour){
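Only the opening line of that listing is reproduced above. As an illustration only, a function with this signature could fill the colour box with plain GDI calls along the following lines; the value given to DLG_COLOUR here, like the drawing approach itself, is an assumption rather than the project's own code (which is in appendix C).

    #include <windows.h>

    #define DLG_COLOUR 1001   /* assumed value; the real one is in the resource header */

    /* Assumed sketch only, not the project's FillColourRect listing: paint the
       DLG_COLOUR control in the dialog with the chosen colour. */
    void FillColourRect(HWND hwnd, DWORD colour)
    {
        HWND   hColourBox = GetDlgItem(hwnd, DLG_COLOUR); /* the colour box control  */
        HDC    hdc        = GetDC(hColourBox);            /* device context to paint */
        HBRUSH hBrush     = CreateSolidBrush((COLORREF)colour);
        RECT   rect;

        GetClientRect(hColourBox, &rect);                 /* area of the control     */
        FillRect(hdc, &rect, hBrush);                     /* fill it with the colour */

        DeleteObject(hBrush);
        ReleaseDC(hColourBox, hdc);
    }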


More information

Microsoft Office. Microsoft Office

Microsoft Office. Microsoft Office is an office suite of interrelated desktop applications, servers and services for the Microsoft Windows. It is a horizontal market software that is used in a wide range of industries. was introduced by

More information

Excel 2013 Intermediate

Excel 2013 Intermediate Instructor s Excel 2013 Tutorial 2 - Charts Excel 2013 Intermediate 103-124 Unit 2 - Charts Quick Links Chart Concepts Page EX197 EX199 EX200 Selecting Source Data Pages EX198 EX234 EX237 Creating a Chart

More information

This work is about a new method for generating diffusion curve style images. Although this topic is dealing with non-photorealistic rendering, as you

This work is about a new method for generating diffusion curve style images. Although this topic is dealing with non-photorealistic rendering, as you This work is about a new method for generating diffusion curve style images. Although this topic is dealing with non-photorealistic rendering, as you will see our underlying solution is based on two-dimensional

More information

An Overview of the BLITZ System

An Overview of the BLITZ System An Overview of the BLITZ System Harry H. Porter III Department of Computer Science Portland State University Introduction The BLITZ System is a collection of software designed to support a university-level

More information

Logger Pro 3. Quick Reference

Logger Pro 3. Quick Reference Logger Pro 3 Quick Reference Getting Started Logger Pro Requirements To use Logger Pro, you must have the following equipment: Windows 98, 2000, ME, NT, or XP on a Pentium processor or equivalent, 133

More information

1.6 Graphics Packages

1.6 Graphics Packages 1.6 Graphics Packages Graphics Graphics refers to any computer device or program that makes a computer capable of displaying and manipulating pictures. The term also refers to the images themselves. A

More information

Technical Overview of the Picasso User Interface Management System

Technical Overview of the Picasso User Interface Management System Technical Overview of the Picasso User Interface Management System Picasso is developed by the OECD Halden Reactor Project at the Institute for Energy Technology, Halden, Norway Picasso is presented on

More information

IMAGING. Images are stored by capturing the binary data using some electronic devices (SENSORS)

IMAGING. Images are stored by capturing the binary data using some electronic devices (SENSORS) IMAGING Film photography Digital photography Images are stored by capturing the binary data using some electronic devices (SENSORS) Sensors: Charge Coupled Device (CCD) Photo multiplier tube (PMT) The

More information

Course 3D_OpenGL: 3D-Graphics with C++ and OpenGL Chapter 1: Moving Triangles

Course 3D_OpenGL: 3D-Graphics with C++ and OpenGL Chapter 1: Moving Triangles 1 Course 3D_OpenGL: 3D-Graphics with C++ and OpenGL Chapter 1: Moving Triangles Project triangle1 Animation Three Triangles Hundred Triangles Copyright by V Miszalok, last update: 2011-03-20 This project

More information

WORD XP/2002 USER GUIDE. Task- Formatting a Document in Word 2002

WORD XP/2002 USER GUIDE. Task- Formatting a Document in Word 2002 University of Arizona Information Commons Training Page 1 of 21 WORD XP/2002 USER GUIDE Task- Formatting a Document in Word 2002 OBJECTIVES: At the end of this course students will have a basic understanding

More information

EZware Quick Start Guide

EZware Quick Start Guide EZware Quick Start Guide Your Industrial Control Solutions Source www.maplesystems.com For use as the following: Evaluation Tool for Prospective Users Introductory Guide for New Customers Maple Systems,

More information

Turn your movie file into the homework folder on the server called Lights, Camera, Action.

Turn your movie file into the homework folder on the server called Lights, Camera, Action. CS32 W11 Homework 3: Due MONDAY, APRIL 18 Now let s put the ball in a world of your making and have some fun. Create a simple AND WE MEAN SIMPLE environment for one of your ball bounces. You will assign

More information

3 Data Storage 3.1. Foundations of Computer Science Cengage Learning

3 Data Storage 3.1. Foundations of Computer Science Cengage Learning 3 Data Storage 3.1 Foundations of Computer Science Cengage Learning Objectives After studying this chapter, the student should be able to: List five different data types used in a computer. Describe how

More information

CS410 Visual Programming Solved Online Quiz No. 01, 02, 03 and 04. For Final Term Exam Preparation by Virtualians Social Network

CS410 Visual Programming Solved Online Quiz No. 01, 02, 03 and 04. For Final Term Exam Preparation by Virtualians Social Network CS410 Visual Programming Solved Online Quiz No. 01, 02, 03 and 04 For Final Term Exam Preparation by Virtualians Social Network 1. Ptr -> age is equivalent to *ptr.age ptr.age (ptr).age (*ptr).age 2. is

More information

Solo 4.6 Release Notes

Solo 4.6 Release Notes June9, 2017 (Updated to include Solo 4.6.4 changes) Solo 4.6 Release Notes This release contains a number of new features, as well as enhancements to the user interface and overall performance. Together

More information