Adaptive Interfaces to Adjustable Control Autonomy

From: AAAI Technical Report SS-00-01. Compilation copyright 2000, AAAI (www.aaai.org). All rights reserved.

Debra Keirn-Schreckenghost
TRACLabs/Metrica, 1012 Hercules, Houston, TX 77058
(281) 244-6134, ghost@hypercon.com

Carroll Thronesbery
TRACLabs/SKE, 1016 Hercules, Houston, TX 77058
(281) 244-5602, c.thronesbery@jsc.nasa.gov

Abstract

We have used adaptive interface techniques when developing and operating autonomous control software for manned space systems such as life support and robotics. We adapt the interface using both a model of control tasks and a model of user roles. The purpose of this adaptation is to support the changing user roles that result from adjusting the level of autonomy of the control software. We describe progress to date on applying adaptive interface techniques to manned space operations.

Introduction

We are developing user interface software that supports the development and use of autonomous control software for manned space support systems (e.g., life support, robotics). An important aspect of these interfaces is assisting users in interacting with the autonomous control system. Our approach is to support different levels of control autonomy, ranging from fully autonomous operation to manual selection and execution of tasks. We use adaptive interface technology in developing user interfaces for adjustable control autonomy (Kortenkamp et al., 2000). These interfaces are adapted for the following:

- changes in task allocation corresponding to a change in control autonomy
- changes in the control situation, including failures, that require an altered level of human awareness or interaction
- changes in the user's roles

We use models of both the task and the user to adapt the interface. We describe our approach in this paper.

Task Models for Adaptation

The Three-Tier (3T) software architecture (Bonasso et al., 1997) that we use for autonomous control includes task-level control. As such, control tasks are represented explicitly in the application software. When the application executes, control tasks and the results of these tasks are monitored at the user interface. Information about which tasks were selected, executed, completed successfully, or failed can be used to define control events of interest to the user. We use these control events to reconfigure the display and logging of information at the user interface.

Control events are defined in pairs: an initiating event and a terminating event. We define this event pair, together with the associated specification of display changes, as an event trigger. The occurrence of the initiating control event in the autonomous system changes the display configuration per the specification in the event trigger. This change persists until the terminating event occurs. Figure 1 shows an example of defining an event trigger. Specific ways in which the user interface can be adapted include the following:

- show or hide a display window
- initiate or terminate logging of pre-specified data groups
- initiate or terminate verbose reporting of tasks executed
- execute a custom user interface function

We plan to add the capability to initiate or terminate highly salient alerts (audio, color changes) at the occurrence of a control event. This is particularly important for eyes-off, hands-occupied operations.
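The paper does not publish the trigger notation itself (Figure 1 is a screenshot), but the behavior described above can be sketched as follows. This is a minimal Python sketch, not the authors' implementation; the names EventTrigger, TriggerManager, and DisplayAction are assumptions.

```python
# Hypothetical sketch of an event trigger: an initiating/terminating event pair
# plus the display changes to apply while the trigger is active.
from dataclasses import dataclass, field
from typing import Callable, List

# A display action is any reconfiguration the interface can perform, e.g.
# showing or hiding a window, starting or stopping a data log, or toggling
# verbose reporting of executed tasks.
DisplayAction = Callable[[], None]

@dataclass
class EventTrigger:
    """Pairs an initiating and a terminating control event with display changes."""
    initiating_event: str                      # e.g. "fault_management_task_started"
    terminating_event: str                     # e.g. "fault_management_task_completed"
    on_initiate: List[DisplayAction] = field(default_factory=list)
    on_terminate: List[DisplayAction] = field(default_factory=list)
    active: bool = False

class TriggerManager:
    """Watches the stream of control events and reconfigures the interface."""

    def __init__(self) -> None:
        self._triggers: List[EventTrigger] = []

    def register(self, trigger: EventTrigger) -> None:
        # Triggers can be added or replaced while the application is running.
        self._triggers.append(trigger)

    def handle_event(self, event: str) -> None:
        # Apply the display changes when the initiating event occurs; the new
        # configuration persists until the terminating event is seen.
        for trig in self._triggers:
            if not trig.active and event == trig.initiating_event:
                trig.active = True
                for action in trig.on_initiate:
                    action()
            elif trig.active and event == trig.terminating_event:
                trig.active = False
                for action in trig.on_terminate:
                    action()
```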
These techniques for adapting the user interface can be used in a variety of ways. They can notify the user of interesting control situations; for example, we can pop up a window showing a history of recently executed control tasks when a fault management task runs. A second use of this adaptive capability is to notify the user that information or a manual action is needed; for example, we can use an audio signal to alert a human working away from the workstation to a pending manual task. A third use of adaptation is to collect new information for later inspection by a user; for example, if an operation of interest occurs while the user is working elsewhere, we can log data values not normally stored that give insight into the effect of this operation on the environment. Since event triggers can be defined or changed during software execution, the user can adjust them to the changing needs of a developing situation. For example, after executing a failure repair operation, the user may define an event trigger to be notified when the repaired system is next started up.
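Continuing the hypothetical sketch above (and reusing its assumed EventTrigger and TriggerManager classes), a trigger defined at run time for that repair scenario might look like the following; the event names and alert behavior are invented for illustration.

```python
# After a repair, the user registers a trigger so the interface alerts them when
# the repaired system next starts up and closes the extra log once startup ends.
manager = TriggerManager()

manager.register(EventTrigger(
    initiating_event="water_processor_startup_started",     # assumed event name
    terminating_event="water_processor_startup_completed",  # assumed event name
    on_initiate=[
        lambda: print("ALERT: repaired water processor is starting up"),
        lambda: print("logging: extra startup sensors at a higher rate"),
    ],
    on_terminate=[
        lambda: print("logging: startup log closed"),
    ],
))

# Events arriving from the autonomous controller drive the reconfiguration.
manager.handle_event("water_processor_startup_started")
manager.handle_event("water_processor_startup_completed")
```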

Figure 1. Specifying an Event Trigger

The 3T system is a layered control architecture. Each layer of control represents a different control abstraction (low-level, task-level, plan-level). Under normal circumstances, the user intervenes at the plan level or the top of the task level (e.g., to initiate a startup sequence). Under exceptional circumstances (anomalies, novel situations), we adapt the user interface to provide the capability to issue low-level commands to the system controllers. This adaptation can be triggered by events indicating that upper layers of control are not operating correctly (e.g., a control process stops communicating, or an autonomous anomaly response fails to correct a problem). Both the user and the autonomous system can determine when these exceptional conditions have occurred (this escalation is sketched below).

The user interface to 3T is both adaptable and adaptive. It is adaptable because it can be reconfigured manually by the user to support different task objectives. The user can change the display configuration using the capabilities listed in Table 1. Examples of the adaptable interface include selecting a set of data values to monitor in a new window and displaying an interface to execute a manual procedure.

Table 1. User Adaptation of 3T
- View of task agenda
- Inspect changes to memory
- Review message log
- Execute a manual task (single step)
- Execute an operating procedure (multi-step)

The user interface to 3T is also adaptive because it can be reconfigured automatically based on changes in the control tasks. Using control events from the autonomous controller to trigger reconfiguration, the interface can adapt to support a variety of user tasks, including the following:

- Human Supervision of Autonomous Control
- Human-assisted Anomaly Management
- Joint Human-Computer Task Performance

We give an example below of how the 3T user interface adapts to support each of these tasks.
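Before those task-specific examples, here is a minimal sketch of the escalation described above, in which low-level commanding becomes available only when upper control layers misbehave. The heartbeat check, timeout value, and enum names are assumptions for illustration, not part of the 3T implementation.

```python
# Hypothetical sketch: decide which command levels to expose to the user based
# on the health of the upper control layers.
import time
from enum import Enum, auto

class CommandLevel(Enum):
    PLAN = auto()        # deliberative layer: edit or approve plans
    TASK = auto()        # sequencing layer: start or stop task sequences
    LOW_LEVEL = auto()   # skill layer: direct commands to hardware controllers

HEARTBEAT_TIMEOUT_S = 10.0  # assumed threshold for "stopped communicating"

def available_command_levels(last_heartbeat: dict,
                             anomaly_response_failed: bool,
                             now: float) -> set:
    """Expose low-level commanding only when upper layers are not operating correctly."""
    levels = {CommandLevel.PLAN, CommandLevel.TASK}
    planner_silent = now - last_heartbeat.get("planner", 0.0) > HEARTBEAT_TIMEOUT_S
    sequencer_silent = now - last_heartbeat.get("sequencer", 0.0) > HEARTBEAT_TIMEOUT_S
    if planner_silent or sequencer_silent or anomaly_response_failed:
        levels.add(CommandLevel.LOW_LEVEL)
    return levels

# Example: the sequencer has stopped sending heartbeats, so the interface adds
# low-level commanding to the normally available plan- and task-level commands.
now = time.time()
print(available_command_levels(
    last_heartbeat={"planner": now - 2.0, "sequencer": now - 30.0},
    anomaly_response_failed=False,
    now=now,
))
```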

Human Supervision of Autonomous Control

We can use the control event triggers described previously to adapt the user interface by initiating or terminating data logging when significant control events occur. For example, the initiation of a startup or shutdown sequence for life support hardware represents a period where the hardware configuration is very dynamic. To get an accurate history of control during this transient period, the human supervisor may need to monitor additional sensors as well as increase the rate at which data are sampled. An event trigger defines which control tasks should initiate more detailed logging and which control tasks should terminate such logging. The user interface then monitors incoming data for these control events and changes data logging based on which events are received. Figure 1 shows how to define a special data log using triggers.

A related capability we are developing is displays that inform the user of new log files created when specific events occur (including failure events). A summary timeline of control tasks will be updated with a new archive icon whenever a special data log is created. A mechanism for viewing these files (plotter, message list) can be launched by clicking on the archive icon on the timeline. Figure 2 illustrates this design concept (Schreckenghost and Thronesbery, 1998).

Figure 2. Accessing Data Logs Created Automatically

Human-assisted Anomaly Management

During autonomous operation, the human supervisor occasionally monitors high-level information summaries. When an anomaly occurs in system operation, the supervisor needs more detailed (and often different) information. We also use control event triggers to implement adaptive reconfiguration of displays in response to an anomaly. System anomalies can be indicated by the execution of a failure handling method. The 3T user interface software monitors for the execution of these methods and selects which display to show based on which method is executed. For example, when an anomaly occurs the user interface can pop up either a detailed history of recent control events that is hidden during nominal operations or a manual anomaly recovery procedure (see Figure 3). This helps the supervisor understand what autonomous control actions have been taken in response to the anomaly as well as what manual actions are needed. Because the autonomous control software that we are using detects when control actions fail to have the intended effect, we can use triggers to associate custom display configurations with the failure of a software control action. Control action failures can result from a failure of the hardware controllers, network failures affecting distributed controllers, or novel control situations not handled by the control software.

Figure 3. Manual Anomaly Recovery Procedure
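A minimal sketch of this anomaly-driven display reconfiguration, assuming a simple mapping from failure-handling method names to displays; all method and display names below are invented for illustration.

```python
# Hypothetical sketch: choose which displays to pop up when a given
# failure-handling method executes in the autonomous controller.
from typing import Dict, List

ANOMALY_DISPLAYS: Dict[str, List[str]] = {
    "recover_from_pump_stall": [
        "recent_control_event_history",
        "pump_recovery_procedure",
    ],
    "recover_from_sensor_dropout": [
        "recent_control_event_history",
        "sensor_checkout_display",
    ],
}

def on_task_executed(task_name: str, show_display) -> None:
    """Called for every task-execution event reported by the autonomous controller."""
    for display in ANOMALY_DISPLAYS.get(task_name, []):
        # Displays hidden during nominal operations are shown only when the
        # corresponding failure-handling method runs.
        show_display(display)

# Example: a pump-stall recovery method starts executing.
on_task_executed("recover_from_pump_stall", show_display=lambda d: print("show:", d))
```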

Joint Human-Computer Task Performance

We developed a checklist display (Schreckenghost, 1999) for use when performing joint human-computer tasks. This display keeps a history of all low-level activities performed to accomplish the joint high-level task objective. The checklist display is adaptive because it reconfigures based on information about the control tasks executed during joint task performance. The autonomous control software dynamically assigns tasks to either human or software agents. These assignments can be altered during task execution by either the human or the autonomous system. The 3T checklist interface uses information about which agent is assigned to a task to reconfigure the checklist display. As shown in Figure 4, manual tasks (Figure 4a) have a different colored background (blue) than autonomous tasks (white; Figure 4b). Manual tasks are also annunciated using sound. Thus, the same task information is displayed differently depending upon which agent was assigned to perform the task.

The checklist display also adapts automatically to changes in the control situation by selectively providing the user with access to custom control displays, based on which tasks have executed. These custom controls can be popped up by selecting the button next to a manual task in the checklist (see Figure 4a). The custom controls are available to the user only when the situated control task context specifies that a specific manual task should be executed.

Figure 4a. Checklist for Manual Task
Figure 4b. Checklist for Autonomous Task
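A minimal sketch of this agent-based reconfiguration of the checklist: the color and audio conventions follow the paper (blue background and audio annunciation for manual steps, white background for autonomous steps), while the data structures and field names are assumptions.

```python
# Hypothetical sketch: derive the presentation of a checklist step from which
# agent is currently assigned to perform it.
from dataclasses import dataclass

@dataclass
class ChecklistStep:
    description: str
    assigned_to: str             # "human" or "software"; assignments may change at run time
    has_custom_controls: bool = False

def render_step(step: ChecklistStep) -> dict:
    """Return presentation attributes for one checklist step."""
    manual = step.assigned_to == "human"
    return {
        "text": step.description,
        "background": "blue" if manual else "white",
        "annunciate_audio": manual,
        # Custom control panels are offered only for manual steps, and only in
        # the task context that calls for that step.
        "show_controls_button": manual and step.has_custom_controls,
    }

# Example: the same step is presented differently depending on who is assigned.
print(render_step(ChecklistStep("Open supply valve", assigned_to="human",
                                has_custom_controls=True)))
print(render_step(ChecklistStep("Open supply valve", assigned_to="software")))
```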

User Models for Adaptation

We have identified a variety of user roles for autonomous control software. These roles break down into two categories: system development and system operation. During system development, the interface should support the following user roles:

- Control software engineering: provide manual commanding and data feedback at each level of control abstraction (planning, task, closed-loop), access to software internals, and hardware emulation for testing
- User interface development: provide software to customize displays for the application (e.g., build schematics, define procedures or event triggers)
- System integration and testing: issue commands to each tier of the architecture, permit manual replacement of a control tier, and review environmental feedback on the effects and timing of control activities at each control level

During system operation, the interface should support the following user roles:

- Operations supervision: access standard operating procedures and review integrated views of system operation and software
- Fault management and opportunity response: access anomaly response procedures, review activity and data histories, and issue commands at all levels of control
- Joint task performance: support task handover between manual and autonomous operations, assist computer-based manual tasks, and provide insight into autonomous activities
- System repair and maintenance: provide procedures for repair and maintenance, including selective disabling of sensors and controllers

The user's ability to issue commands to the control software should vary with user role. System development roles require unrestricted access to system commands, including the ability to execute arbitrary functions in the control process (see Figure 5 for an example of a developer interface for system commands). Commanding available for system operation roles is typically constrained to standard operating procedures and anomaly response procedures (see Figure 6 for an example of an operations interface for system commands). User changes to commands are limited to parameter changes (such as a control setpoint). Only the more catastrophic operational situations require unrestricted access to commands.

The user's access to information about the system also should be adapted based on the role of the user. For example, we have developed software for building schematic views of control data. System developers using this display can change the components on the schematic, while system operators cannot. Conversely, system operators have easy access to data displays (real-time plots, data tables) that are of little interest while constructing the schematic view of system components. User tasks related to checkout and maintenance of control hardware require views of low-level sensors not typically needed for normal operations.

Adaptation does not necessarily restrict access to information or capability. In some cases it changes the emphasis that the display places on the information or capability. For example, when executing tasks jointly with a robot (what we call traded control), we provide displays for tracking manual and autonomous activities to help in coordinating these activities. These displays highlight information about manual tasks using high-saliency encoding such as color and sound.

Although we have developed different interfaces to support changes in user role, we have not yet implemented adaptation of the control interface based on user role. We expect that the role will be specified by either the user or the autonomous system based on the allocation of tasks. Since it is possible that a user will fulfill multiple roles simultaneously, the user interface should support as many roles as needed and should adjust as user roles change.

Figure 5. Developer Interface for Shutdown Command
Figure 6. Operations Interface for Shutdown Command

References

Bonasso, R. P., Firby, R. J., Gat, E., Kortenkamp, D., Miller, D., and Slack, M. Experiences with an Architecture for Intelligent, Reactive Agents. Journal of Experimental and Theoretical AI, 9(2), 1997.

Kortenkamp, D., Schreckenghost, D., and Bonasso, R. P. Adjustable Control Autonomy for Manned Space Flight. IEEE Aerospace Conference, March 2000.

Schreckenghost, D. and Thronesbery, C. Integrated Display for Supervisory Control of Space Operations. Human Factors and Ergonomics Society 42nd Annual Meeting, October 1998.

Schreckenghost, D. Checklists for Human-Robot Collaboration during Space Operations. Human Factors and Ergonomics Society 43rd Annual Meeting, September 1999.