ANIMS - Phase 2
IntuiLab / Intactile Design / Eurocontrol
CARE INO II programme

Introduction

The ANIMS project is carried out in collaboration between the Eurocontrol agency, user interface researchers from IntuiLab, and visual and sound designers from Intactile Design. It aims to assess how the use of animation and sound in the user interface of air traffic control tools can improve their efficiency and safety of operation. Conducted during year 2004, the first phase of the project provided a state of the art of the use of animation and sound in user interfaces across all application domains. Based on that knowledge, work with ATC experts led to the selection of five ATC scenarios that were likely to benefit from the most documented advantages of sound and animation: distinguishing STCA and ACAS-RA alerts; accessing more information in line 0 of a flight label; presenting STCA type/urgency/importance; identifying the calling flight; animations and sound in a sequence manager. Finally, the design and prototyping phase produced demonstrators for all these scenarios. The work performed in the first year of the project provided ground for a second phase, conducted during years 2005 and 2006. The two selected research axes for ANIMS Phase 2 are:
- assessing more formally the benefits of animation and sound in user interfaces;
- proposing a specification method and development tools to integrate sound and animation into user interfaces.
During year 2005, the work on the first axis began with the proposal of experiments to assess the benefits of animation and sound. Work on animation design through animated themes produced new types of animation to be tested during the experiments. On the sound design side, recordings were made in ATC rooms, providing sound ambiances that will support the design and evaluation of new sounds relevant in an ATC context.
Concerning the second research axis, a specification method for animation and sound was proposed. We now detail the activities and results of this first year of Phase 2.

Assessing human factors benefits of animation

Expected human factors benefits of using animation were identified in the first phase of the ANIMS project. Animation is believed to improve situation awareness, remote awareness,
mutual working, user confidence and trust in the system, user efficiency and ease of use, and user acceptability and satisfaction, and to reduce cognitive load. In the second phase, a state of the art review of these benefits made it possible to define each term more precisely. A parallel study focused on the techniques used to assess human factors benefits in general. Both studies led to the organisation of a workshop at the Eurocontrol Experimental Centre on the theme "How to assess animation benefits?". HMI and human factors experts from the ATC/ATM field were asked to give their points of view on which benefits should be assessed in the ANIMS project and which types of experiment could be conducted, and to provide operational scenarios to support the experimentation. During the workshop, eight experiments were devised for assessing situation awareness, remote user awareness or user performance, each based on ATC/ATM scenarios. Of those, two experiments were chosen according to criteria such as feasibility, demonstrability, and priority (given to situation awareness).

The first experiment tests whether animation makes it easier to detect and interpret a modification in a list perceived in peripheral vision. Modifications of the list, which can be a list of mobiles on the airport ground, may be made under different conditions (different types of animation, with or without sound). Measurements and analysis will be performed on detection time and number of correct detections, in order to confirm or invalidate the main hypotheses: the type of modification is better detected with continuous animation, and the type of mobile affected by the modification is better identified with animated themes (cf. next section).

Figure 1: Experiment 1

The second experiment tests whether animation improves the understanding of a conflict situation.
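Both experiments follow the same analysis pattern: compare correctness and timing measures across experimental conditions. The sketch below illustrates that analysis on hypothetical Experiment 1 trial records; the condition names, field names and values are assumptions for illustration, not data from the project.

```python
# Minimal sketch of a per-condition analysis: hit rate and mean detection
# time per experimental condition. All trial records are hypothetical.
from statistics import mean

# Each trial: experimental condition, whether the list modification was
# correctly detected, and the detection time in seconds (None if missed).
trials = [
    {"condition": "continuous_animation", "detected": True,  "time_s": 1.0},
    {"condition": "continuous_animation", "detected": True,  "time_s": 1.5},
    {"condition": "no_animation",         "detected": True,  "time_s": 2.4},
    {"condition": "no_animation",         "detected": False, "time_s": None},
]

def summarise(trials, condition):
    """Hit rate and mean detection time for one condition."""
    subset = [t for t in trials if t["condition"] == condition]
    hits = [t for t in subset if t["detected"]]
    hit_rate = len(hits) / len(subset)
    mean_time = mean(t["time_s"] for t in hits) if hits else None
    return hit_rate, mean_time

print(summarise(trials, "continuous_animation"))  # (1.0, 1.25)
print(summarise(trials, "no_animation"))          # (0.5, 2.4)
```

Comparing these summaries between conditions is what would confirm or invalidate the hypotheses stated above.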
It is based on the work done on STCA alerts illustrated in scenario 3 of ANIMS Phase 1. Two conditions will be tested: only the route of the flight is displayed; future positions of the flight are animated along the route with triangles and atmospheric perspective. Measurements and analysis will be performed on the number of correct answers to the questions asked (is there a conflict? how is the conflict qualified?) and on the relative errors between conditions 1 and 2, in order to confirm or invalidate the main hypotheses: the relative position of flights is better foreseen with animation, and animation is more efficient when both flights are moving.

Figure 2: Experiment 2

These two experiments will make it possible to assess the impact of these animations on the situation awareness of an air traffic controller in list management and in radar control.

Animated themes, or how to give a personality to graphical objects?
The work on animated themes was carried out with the goal of giving objects an animated personality so as to better structure the perception of multiple events on complex displays. The aim was to provide a set of animations intended to improve the understanding of the nature of an object (what/who is this object?), to improve object affordance (what can I do on/with this object?), to improve the understanding of the object's interaction with other elements in the interface, and/or to group objects with similar personalities. The results of the first phase of the ANIMS project led us to think that this represents a major opportunity for improving efficiency and situation awareness. First of all, a set of isolated animations was designed considering the life cycle of an object in the interface, whatever the context of use. These animations cover different themes: entrance of graphical objects into the scene, expression of the state of the object, automatic movements of the object, interaction with other objects, and evolution of an object in the scene. As a design guideline, to facilitate the interpretation of an animation and its associated meaning in the object life cycle, we chose to design animations using natural mappings: the physical world, with notions of gravity, attraction and atmospheric perspective; human attitudes such as politeness, refusal and suggestion; objects such as tickets, pistons and cars. Three demonstrations built on ATC/ATM scenarios were then designed to illustrate the use and benefits of such animations in the ATC/ATM operational context, using isolated animations picked from the set established previously. The first demonstration illustrates how animations can support the clearance request/answer process between pilots and controllers by presenting the different states of the process.
The second demonstration illustrates the use of animated themes in a flight management tool to discriminate the status of a new flight entering the interface within the currently displayed situation. Finally, the third demonstration presents the use of animated themes on a list of mobiles moving on the ground: vehicles, buses and aircraft. Animations are used to show modifications of the sort order of the list. This demonstration corresponds to one of the conditions to be tested during Experiment 1, presented in the previous section.

Figure 3: Polite entrance

Figure 4: Demonstration 3, inversion of an aircraft and a vehicle

The proposed set of animations can be used by HMI designers to encode a graphical object's intrinsic data, giving it a personality that helps users understand the object itself, its role and/or its
status in the interface. The demonstrations illustrate a real gain in the use of such objects, which encourages continuing this study by completing the set of animations, finding new applications for the animated themes, and providing guidelines for their use.

Ambiance sound library

The aim of this work was to make recordings in control rooms to obtain sound samples for recreating a realistic air traffic control room atmosphere, and ultimately to extract a panel of sounds typical of air traffic control. Two French control rooms were selected for the recording: the Roissy Approach and Bordeaux En-Route control rooms. Both rooms were equipped with five cardioid microphones; seven hours were recorded in each control room.

Figure 5: Roissy Approach control room

Figure 6: Bordeaux En-Route control room

Each recorded sound track was then reworked by the sound engineer. First, the sampling phase required listening to all the recorded tracks, track by track, to determine which extracts were most representative of the objectives pursued. Then, filtering and equalising purified the recorded signal by attenuating or accentuating some frequencies and correcting amplified sounds or resonance effects. The mixing phase produced a consistent and controlled mix of the different sound sources in order to reproduce a realistic, spatialised atmosphere. Samples were anonymised to protect controller privacy. The final mix reduced the superimposed sound tracks to a single stereo track. This work resulted in the production of eleven ambiance sound scenarios, covering high/low and increasing/decreasing sound activity. The scenarios can therefore be chained to recreate the varying sound atmosphere of an entire day.
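The final-mix step described above, reducing several recorded tracks to a single spatialised stereo track, can be sketched as a gain-and-pan downmix. This is a minimal illustration only: the track contents, gains and pan positions below are assumptions, not the engineer's actual settings.

```python
# Sketch of a final mix: several mono tracks, each with a gain and a pan
# position, are summed into one stereo (left, right) track. Linear panning
# is assumed here for simplicity.

def mix_to_stereo(tracks):
    """tracks: list of (samples, gain, pan), pan in [-1, 1]
    (-1 = full left, +1 = full right). Returns (left, right) samples."""
    n = max(len(samples) for samples, _, _ in tracks)
    left = [0.0] * n
    right = [0.0] * n
    for samples, gain, pan in tracks:
        l_gain = gain * (1.0 - pan) / 2.0
        r_gain = gain * (1.0 + pan) / 2.0
        for i, s in enumerate(samples):
            left[i] += s * l_gain
            right[i] += s * r_gain
    return left, right

# Two toy "microphone" tracks, panned hard left and hard right.
left, right = mix_to_stereo([
    ([1.0, 0.5], 1.0, -1.0),  # fully left
    ([0.2, 0.4], 1.0, +1.0),  # fully right
])
print(left, right)  # [1.0, 0.5] [0.2, 0.4]
```

A real mix would add per-track filtering and equalisation before this summing stage, as described above.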
In addition to the original goal of recreating control room ambiances to design and evaluate the effectiveness of new sounds in ATC user interfaces, other uses are envisaged: on-site learning (creation of realistic simulation soundtracks for various ATC contexts), simulation (testing new concepts in which sound activity may represent a disturbance or a particular parameter to be taken into account), and demonstration (recreating the realistic atmosphere of a control room).

Specification method for animation and sound

The aim was to propose methods and techniques to share animation and sound specifications between designers and developers. A first method was elaborated for animation
only. This method was then presented to and used by a motion designer, and finally enriched by integrating sounds into the specification of animations. The proposed steps to specify animations and sounds are the following.
- Global description. Textual description of the animation, its main steps and how sounds are linked to it.
- Inventory of varying objects. List of the objects involved in the animation and of the sounds to be played.
- Sequencing. Decomposition of the animation into sequences, and of the sequences into primitive transformations (translation, colour variation, etc.). For sounds, the only transformation is playback (turn the volume on at the start and off at the end).
- Inventory of key-points. List of the key-points of the animation and of sound playback. These key-points can be the beginning of a transformation or the end of a sound.
- Storyboarding. Illustration of the visual state of the animation at each key-point. Each drawing corresponding to a key-point that starts or ends one or more sounds is completed with specific sound symbols.

Figure 7: Storyboarding animations and sound at eligible key-points

- Conditions at eligible key-points. List of the conditions that trigger the start or end of the sequences or primitive transformations identified in the sequencing step. These can be temporal conditions (3 seconds after this sequence), user events (space key pressed) or system events (when a plane enters the sector).
- Trajectories. Description of the variation of the values of graphical attributes as a function of time or of a progression index, for each primitive transformation in the animation. It can be a path for a translation or a range of opacity for a colour.
- Paces. Definition of transformation functions that can be applied to time during a transformation. Some paces are predefined, such as slow-in, slow-out, slow-in-slow-out and linear.
- Repetitions.
List of the sequences and/or primitive transformations that are repeated during an animation, and the way they are repeated (backward, forward).
A proof of concept of the specification method was performed. Motion designers applied it to some examples of the animated themes (Figure 4), which were then coded by developers; the resulting animations were those the designers expected. To integrate the designed animations properly into user interfaces, we now have to consider interactive software architectures and development tools; this is one aim of year 2006.

Conclusion and perspectives

Considering the two research axes of ANIMS Phase 2 (assessing more formally the benefits of animation and sound in user interfaces, and proposing a specification method and development tools to integrate them), a first step was taken. Year 2005 provided all the elements needed to conduct experiments assessing the benefits of sound and animation on situation awareness: the experimental protocol for two experiments, the
design of the animations to be tested, and an audio environment in which to design sounds relevant in an ATC context. The work will continue in 2006 with the development of prototypes supporting the two experiments. For the "with sound" experimental conditions, specific sounds will be designed taking the ambiance sound library into account. The experiments will then be conducted, and the analysis of the results will confirm or invalidate the hypotheses on the benefits of sounds and animations in the context of these two experiments. The work on the specification of animations and sounds in user interfaces conducted in 2005 gave a better understanding of the elements needed to describe them. The next step will consist in supporting the introduction of animation into development tools in accordance with the proposed specification method. A study will also be conducted on the software architecture required to make this introduction possible. The results of these studies will be integrated into IntuiKit, IntuiLab's development tool, and tested through the development of the experiment prototypes. Five documents detail the approach and results of the studies conducted during year 2005:
- Assessing human factors benefits of animation
- Experimental protocol
- Animated themes or How to give a personality to graphical objects?
- ATC Sound Samples Library
- Specification Method for Animation and Sound
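As a closing illustration, the pace functions named in the specification method (slow-in, slow-out, slow-in-slow-out and linear) can be sketched as functions that remap normalised time before it drives a transformation. The exact curves used in the project are not given here, so the quadratic and smoothstep forms below are common assumptions, not the project's definitions.

```python
# Sketch of "paces": functions remapping normalised time t in [0, 1]
# before it drives a primitive transformation. The quadratic and
# smoothstep curves below are assumed, not taken from the project.

def linear(t):
    return t

def slow_in(t):           # starts slowly, ends fast
    return t * t

def slow_out(t):          # starts fast, ends slowly
    return 1.0 - (1.0 - t) ** 2

def slow_in_slow_out(t):  # smoothstep: slow at both ends
    return t * t * (3.0 - 2.0 * t)

def interpolate(start, end, t, pace=linear):
    """Trajectory step: value of a graphical attribute at time t."""
    return start + (end - start) * pace(t)

# Translating x from 0 to 100 pixels, halfway through the animation:
print(interpolate(0.0, 100.0, 0.5, slow_in_slow_out))  # 50.0
```

A trajectory (the path or attribute range) combined with a pace like these fully determines one primitive transformation, which is what the specification method asks designers and developers to agree on.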