CHAPTER 5  SYSTEM IMPLEMENTATION AND TESTING

5.1 Introduction

This chapter describes the implementation and evaluation of the e-BSC system. For implementation, the development environment, tools, development platform, database and levels of system users are discussed. For testing, the testing procedure used, the participants and the analysis of the results are explained.

5.2 Development Environment

A suitable development environment has to be established to ensure that the implementation process runs smoothly. The following subsections describe the hardware and software requirements for the development process.

5.2.1 Hardware Requirements

Table 5.1 describes the hardware requirements of the e-BSC system.

Table 5.1 Hardware Requirements

  Hardware          Description
  Processor         Intel Centrino 1.6 GHz or higher, or an equivalent processor
  Memory            At least 512 MB (1 GB or more recommended)
  Hard disk space   At least 50 MB
  Others            Internet access

5.2.2 Software Requirements

Table 5.2 describes the software requirements for the e-BSC system.

Table 5.2 Software Requirements

  Software                                Description
  Operating system                        Microsoft Windows XP or higher
  Web server                              Apache 2.2.6 or higher
  Relational database management system   MySQL 5.0.45 or higher
  Internet browser                        Microsoft Internet Explorer
  Programming languages                   PHP, JavaScript, HTML
  Web design and development tool         Adobe Dreamweaver MX 2004 (formerly Macromedia Dreamweaver MX 2004)
  Image editor                            Adobe Photoshop

5.3 Development Tools

5.3.1 PHP

PHP (a recursive acronym for PHP: Hypertext Preprocessor) was selected as the server-side scripting language for the proposed e-BSC system. PHP was first developed by Rasmus Lerdorf in 1995 to track online access to his personal resume (Kent et al., 2004). Lerdorf then extended the language with additional features, such as database support, for web application development, and later released the source code so that other programmers could contribute to its improvement. Many did, rectifying errors and adding new functionality to the language. PHP is now managed by the PHP Group and is widely used around the world because of its strengths. Being open source, it is highly extensible: developers anywhere can write new extensions to further improve the functionality of the language. Its syntax also supports object-oriented programming, which makes it attractive to developers who prefer that paradigm. In addition, PHP can be easily embedded into HTML to create interactive and dynamic web pages; the PHP code is executed on the web server, which generates the requested page, making the language especially appropriate for web development. PHP is also highly portable: it runs on most web servers and operating systems and works with the majority of databases. A sample of the program code for the e-BSC system is attached in Appendix I.

5.3.2 JavaScript

JavaScript is the client-side scripting language used during development. In certain situations JavaScript had to be used instead of PHP, for example to display popup windows that alert users to errors found during input validation, or to warn of possible data loss before a delete operation is executed. Because the language runs in the browser's run-time environment, JavaScript can provide services that a server-side language cannot. However, JavaScript is not used to pass sensitive data such as passwords, because the code executes on the client side, where it is more exposed to malicious attacks.

5.3.3 HTML

HTML (HyperText Markup Language) is a markup language used to describe the structure and formatting of text in a document. It allows text to be structured according to its purpose, for example as a heading or a paragraph, by wrapping the text in tags that tell the web browser how it should be displayed. Scripting languages such as PHP and JavaScript can be embedded in HTML to extend its functionality.
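To illustrate how these three technologies work together, the following is a minimal sketch (not taken from the e-BSC source in Appendix I) of a PHP page that embeds dynamic values into HTML and uses a JavaScript confirmation dialog before a delete operation. The file, table and field names (delete_warning.php, do_delete.php, the name and id parameters) are hypothetical.

<?php
// delete_warning.php - hypothetical sketch, not the actual e-BSC code.
// PHP runs on the server and emits HTML; the confirm() call runs in the
// visitor's browser (JavaScript) before the delete request is sent.
$staffName = isset($_GET['name']) ? htmlspecialchars($_GET['name'], ENT_QUOTES) : 'this staff record';
$staffId   = isset($_GET['id']) ? (int) $_GET['id'] : 0;
?>
<html>
<head><title>Delete Staff</title></head>
<body>
  <h1>Delete <?php echo $staffName; ?></h1>
  <p>Deleting a staff record cannot be undone.</p>
  <!-- The JavaScript popup warns the user of possible data loss -->
  <a href="do_delete.php?id=<?php echo $staffId; ?>"
     onclick="return confirm('Delete <?php echo $staffName; ?>? This cannot be undone.');">
     Delete
  </a>
</body>
</html>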

5.3.4 Adobe Dreamweaver MX 2004

Adobe Dreamweaver MX 2004 was selected as the web design and development tool because it provides an easy way to build a site: graphics and web page components can be dragged and dropped into place while Dreamweaver automatically generates the corresponding code, which the programmer can then edit further. Although HTML alone is enough to build a website, the scripting languages and database connectivity supported in Dreamweaver make the site more functional and dynamic. The Dreamweaver development environment offers a simple way of combining the strengths of PHP, JavaScript, HTML and MySQL.

5.4 Development Platform

The Windows XP operating system was selected as the platform for the development of the e-BSC system. The fact that the potential users already use and are familiar with the Windows XP environment played an important role in this selection. In addition, a simple observation of the computer labs in the university showed that the operating system is widely adopted throughout the institution. Since the operating system's inception, the developers of Windows XP have continually improved its stability while providing a user-friendly environment.

5.5 Database

MySQL is a relational database management system (also known as an SQL database server) that is widely used around the globe because it is open source. MySQL combines the reliability offered by most SQL servers with ease of use. It is also largely platform independent, running on most operating systems, including Windows and Linux. When PHP is used to develop a website, Dreamweaver MX 2004 supports only a MySQL database connection, as PHP and MySQL are tightly integrated. Since PHP complements MySQL well and both technologies are widely used, they can be considered thoroughly tested.
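The following is a minimal sketch of how a PHP page on this stack connects to MySQL and retrieves records. The database, account and table names (ebsc, ebsc_user, kpi) are hypothetical, and the code is illustrative rather than the actual e-BSC code; Dreamweaver-generated pages of that era would typically use the older mysql_* functions instead of mysqli.

<?php
// list_kpi.php - hypothetical sketch, not the actual e-BSC code.
// Connects to the MySQL server and lists corporate-level KPIs as HTML rows.
$link = mysqli_connect('localhost', 'ebsc_user', 'secret', 'ebsc');
if (!$link) {
    die('Could not connect to MySQL: ' . mysqli_connect_error());
}

$result = mysqli_query($link, 'SELECT kpi_id, kpi_name, weight FROM kpi ORDER BY kpi_id');
if (!$result) {
    die('Query failed: ' . mysqli_error($link));
}

// Emit one HTML table row per KPI record.
echo "<table>\n";
while ($row = mysqli_fetch_assoc($result)) {
    echo '<tr><td>' . htmlspecialchars($row['kpi_name']) . '</td><td>'
       . (int) $row['weight'] . "</td></tr>\n";
}
echo "</table>\n";
mysqli_close($link);
?>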

5.6 Levels of Users

As identified in Chapter 4, there are four types of potential users of the system: the system administrators (SA), faculty deans, appraisers and the academic staff. This section discusses the responsibilities of each user in the admin and faculty modules and shows how their roles relate to the testing procedure described in Section 5.7.

5.6.1 System Administrators (SA)

SAs are responsible for system management and maintenance. The key role of the SA is adding, editing or removing details of:

- KPIs as set at the corporate level.
- Evaluation score range values (edit only, as these are permanently required by appraisers to evaluate staff).
- Faculties and their corresponding information, such as the departments attached to them.
- Academic staff.
- Appraisers (except insertion and deletion, which are the responsibility of the respective faculty dean).
- Other system administrators.
- Notices.

Since the editing and deleting operations require locating the correct record, whether a KPI, faculty, academic staff member and so on, search features have been implemented to ease this process. The testing procedure for this module also involves checking that these search features work as expected. The insertion, updating and deletion of data have to be implemented in a way that preserves referential integrity, so that the data remain consistent and redundancy is avoided. Furthermore, constraints have to be enforced during insertion and updating to prevent illegal data from being entered, as illustrated in the sketch below.
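The following minimal sketch shows one way such constraints could be enforced: user input is validated before insertion, and a foreign key (assumed in the comment) lets MySQL itself maintain referential integrity between faculties and departments. The table and column names are hypothetical and the snippet is not the actual e-BSC code.

<?php
// add_department.php - hypothetical sketch, not the actual e-BSC code.
//
// Referential integrity is delegated to MySQL: the department table is
// assumed to be created with a foreign key such as
//   FOREIGN KEY (faculty_id) REFERENCES faculty(faculty_id) ON DELETE RESTRICT
// so a faculty that still has departments attached cannot be removed.

$link = mysqli_connect('localhost', 'ebsc_user', 'secret', 'ebsc');
if (!$link) {
    die('Could not connect: ' . mysqli_connect_error());
}

$facultyId = isset($_POST['faculty_id']) ? (int) $_POST['faculty_id'] : 0;
$deptName  = isset($_POST['dept_name']) ? trim($_POST['dept_name']) : '';

// Constraint checks on insertion: reject illegal data before it reaches the database.
if ($facultyId <= 0 || $deptName === '' || strlen($deptName) > 100) {
    die('Illegal data: a valid faculty and a department name of at most 100 characters are required.');
}

// A prepared statement keeps user-supplied values out of the SQL text itself.
$stmt = mysqli_prepare($link, 'INSERT INTO department (faculty_id, dept_name) VALUES (?, ?)');
if (!$stmt) {
    die('Prepare failed: ' . mysqli_error($link));
}
mysqli_stmt_bind_param($stmt, 'is', $facultyId, $deptName);

if (!mysqli_stmt_execute($stmt)) {
    // Fails if, for example, the foreign key constraint is violated.
    die('Insert failed: ' . mysqli_stmt_error($stmt));
}
echo 'Department added.';
?>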

5.6.2 Faculty Dean

The faculty module provides the faculty dean with functions for faculty performance planning and management. As the head of the faculty, the dean is responsible for ensuring that the whole unit works in alignment with corporate strategies and meets the expectations of management, thereby contributing to the advancement of the university. This can be accomplished if corporate-level goals are well understood and management-level KPIs remain aligned with the faculty's KPIs. To support this, the faculty module provides features with which the dean can select corporate KPIs to be cascaded into the faculty scorecard, together with the weight each carries, the expected targets and the measures. The testing procedure for this module therefore requires that this feature be checked for errors, including constraints such as ensuring that the total weight on the faculty scorecard equals 100 and that changes cannot be made to the scorecard once the contracting period has started.
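As an illustration of the kind of check exercised during testing, the following minimal sketch validates that the cascaded KPI weights on a faculty scorecard sum to exactly 100 before the scorecard is saved. The function and field names are hypothetical and the snippet is not the actual e-BSC code.

<?php
// Hypothetical server-side check for the faculty scorecard, not the actual e-BSC code.
// $weights is an array of the weights entered for each cascaded KPI,
// e.g. array(40, 30, 20, 10) submitted from the scorecard form.
function scorecardWeightsValid(array $weights)
{
    $total = 0;
    foreach ($weights as $w) {
        if (!is_numeric($w) || $w < 0) {
            return false;           // illegal weight value
        }
        $total += $w;
    }
    return $total == 100;           // total weight must equal exactly 100
}

$weights = isset($_POST['weights']) ? $_POST['weights'] : array();
if (!scorecardWeightsValid($weights)) {
    die('The total weight on the faculty scorecard must equal 100.');
}
// ... otherwise the scorecard can be saved, provided the contracting period has not started.
?>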

The dean is also responsible for assigning the first appraiser for each department attached to the faculty. Staff must first be added by the SA in the admin module before this assignment can take place. The testing procedure for this feature has to confirm that each department can have only one first appraiser, so the assignment function is disabled once an appraiser has been assigned. In addition, as the head of the faculty, the dean has the authority to reassign an appraiser to a different department. Lastly, the dean should be able to view the performance of an individual staff member, a department or the entire faculty. To view the performance of an academic staff member, a search is performed first, after which the performance of the respective individual can be viewed. The dean's role as the second appraiser who evaluates the performance of all academic staff in the faculty is covered in the appraiser module.

5.7 System Testing

Before beta testing was initiated, an alpha test was conducted on the integration of all four modules to ensure that the complete functionality of the e-BSC system could be presented and tested. Subsequently, a user acceptance test (beta testing) was conducted in the month of July 2008 in FCSIT and the Chancellery / Administration building with potential end users of the system, namely lecturers, the dean of FCSIT, members of the human resource department, a representative from SPU, personnel from the IT centre and a BSC expert.

5.7.1 Testing Procedure

Beta testing was carried out to demonstrate the functionality of each module and how the responsibilities of the target users can be accomplished using the system. A further purpose of the procedure was to gather feedback on the system's usability and fitness for purpose through a system evaluation questionnaire distributed to the participants.

Before the test, the research objectives were briefly explained, together with descriptions of the system modules and the roles and responsibilities of each type of user. A system demonstration was then conducted to show the functions and features of the system. The participants were subsequently presented with a system evaluation questionnaire (appended in Appendix J) to record their feedback on the developed prototype. The results of the user acceptance test are used to substantiate the research in terms of improving PM and planning for lecturers, as well as performance management and planning at the faculty level.

5.7.2 System Evaluation Questionnaire Format

Participants in the user acceptance test were required to complete a system evaluation questionnaire that assesses the usability of the prototype and verifies that the system delivers the business functions required of it. The results of the evaluation determine whether the e-BSC can indeed improve the performance measurement of lecturers in UM. The questionnaire is divided into two sections:

i. Section 1 is divided into four parts, one for each of the four system modules. Each part consists of 5 to 8 questions that require the participants to evaluate specific features of the system on a 5-level Likert scale, with the values 1 for Strongly Disagree, 2 for Disagree, 3 for Not Sure, 4 for Agree and 5 for Strongly Agree. The charts in Figures 5.1 to 5.3 and Tables 5.4 to 5.9 are based on the same 5-level Likert scale.

ii. Section 2 covers the overall evaluation of the e-BSC system, specifically the system's level of complexity and its effectiveness in measuring the performance of the academic staff. In addition, the participants were also required to rate the improvements, if any, of the system compared with the staff PM system currently used.

5.7.3 Participants of the System Evaluation Test

Table 5.3 lists the participants of the system evaluation.

Table 5.3 List of participants in the e-BSC system evaluation

  No.  Name                      Role                                Date Tested
  1    Respondent 1              Academic Staff                      14 July 2007
  2    Respondent 2              Academic Staff                      21 July 2007
  3    Respondent 3              Academic Staff                      14 July 2007
  4    Respondent 4              Academic Staff                      18 July 2007
  5    Respondent 5              Academic Staff / Appraiser          17 July 2007
  6    Respondent 6              Academic Staff / Appraiser          17 July 2007
  7    Respondent 7              Academic Staff / Appraiser          16 July 2007
  8    Respondent 8              Academic Staff / Appraiser          17 July 2007
  9    Respondent 9              Academic Staff / Appraiser / Dean   24 July 2007
  10   HR Representative         Human Resource (top management)     25 July 2007
  11   HR Representative         Human Resource (top management)     25 July 2007
  12   HR Representative         Human Resource (top management)     25 July 2007
  13   HR Representative         Human Resource (top management)     25 July 2007
  14   IT centre representative  SA                                  25 July 2007
  15   IT centre representative  SA                                  25 July 2007
  16   SPU representative        SPU (top management)                16 July 2007
  17   Respondent 10             BSC expert                          25 July 2007

Although the participants held different roles, all of them were presented with the same questionnaire and gave their personal opinions on each system module and on the overall effectiveness of the e-BSC in measuring the performance of the academic staff in UM.

5.7.4 Test Data Analysis

The results obtained from the e-BSC system evaluation were analysed to determine the users' feedback on the effectiveness and usability of the system. As explained earlier, the system evaluation questionnaire is divided into two sections: the first contains four distinct parts with questions on the respective system modules, while the second contains questions on the overall performance of the system. Since this portion of the study focuses on the faculty and system administration modules, only the results for these sections and for the overall evaluation of the e-BSC are discussed.

5.7.4.1 Results of the evaluation on the System Administration module

The chart in Figure 5.1 shows the results obtained from the evaluation of the system administration module.

[Figure 5.1: Results for the Evaluation on the System Administration Module. Stacked-bar chart of the percentages of Strongly Agree, Agree and Not Sure responses for Questions 1 to 5; the underlying frequencies are given in Table 5.4.]

Based on the results, positive responses were evident for Questions 1, 3 and 4, which measured agreement that the system is straightforward, gives sufficient warnings before drastic actions and enforces constraints that prevent users from performing illegal operations. Table 5.4 shows the corresponding frequency distribution for the chart in Figure 5.1.

Table 5.4 Results for the Evaluation on the System Administration Module Frequency Distribution

  Response         Q1   Q2   Q3   Q4   Q5
  Strongly Agree    5    4    5    5    3
  Agree            10    5    9   10    8
  Not Sure          2    8    3    2    6
  Total            17   17   17   17   17

Based on Table 5.4, more than half of the participants either agreed or strongly agreed with Questions 1, 3 and 4. This is further supported by the mean responses, which were skewed towards agreement, as shown in Table 5.5.

Table 5.5 System Administration Module Descriptive Statistics

  Question  Minimum  Maximum  Mean  Std. Deviation
  Q1        3        5        4.18  0.636
  Q2        3        5        3.76  0.831
  Q3        3        5        4.12  0.697
  Q4        3        5        4.18  0.636
  Q5        3        5        3.82  0.728
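As a check on how the descriptive statistics relate to the frequency distribution, the following minimal sketch recomputes the mean and standard deviation for Question 1 from the Table 5.4 frequencies. It assumes a sample (n-1) standard deviation, which matches the reported values of 4.18 and 0.636; the code is illustrative and not part of the e-BSC system.

<?php
// Recompute the Q1 statistics in Table 5.5 from the Table 5.4 frequencies.
// Likert values: 5 = Strongly Agree, 4 = Agree, 3 = Not Sure.
$frequencies = array(5 => 5, 4 => 10, 3 => 2);    // value => number of respondents

$n = array_sum($frequencies);                      // 17 participants
$sum = 0;
foreach ($frequencies as $value => $count) {
    $sum += $value * $count;
}
$mean = $sum / $n;                                 // (5*5 + 4*10 + 3*2) / 17 = 4.18

$sumSq = 0;
foreach ($frequencies as $value => $count) {
    $sumSq += $count * pow($value - $mean, 2);
}
$stdDev = sqrt($sumSq / ($n - 1));                 // sample standard deviation = 0.636

printf("Q1: mean = %.2f, std. deviation = %.3f\n", $mean, $stdDev);
?>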

However, almost half of the participants were unclear about the options for the search functions, and about a third were unclear about the sufficiency of features for system management and maintenance, with eight and six respondents respectively answering Not Sure on these points (Questions 2 and 5). One possible reason for this response is that these features were not highlighted during the system demonstration, which emphasised the value of the system in improving performance management and the PM of academic staff. Another point to consider is the standard deviation, which shows the dispersion of the data around the mean. For Questions 2 and 5, the standard deviations in Table 5.5 are higher than for the other questions, which calls the consistency of the responses into question. This inconsistency will not be used as an excuse to avoid further refinement of the module; instead, all responses obtained will be taken into account to investigate possible weaknesses of the module and to propose future enhancements. Nevertheless, no negative responses disagreeing that the system delivers its intended purposes were observed.

5.7.4.2 Results of the evaluation on the Faculty module

Figure 5.2 and Table 5.6 show the results of the evaluation of the faculty module of the e-BSC system.

[Figure 5.2: Results for the Evaluation on the Faculty Module. Stacked-bar chart of the percentages of Strongly Agree, Agree, Not Sure and Disagree responses for Questions 1 to 7; the underlying frequencies are given in Table 5.6.]

Table 5.6 Results for the Evaluation on the Faculty Module Frequency Distribution

  Response         Q1   Q2   Q3   Q4   Q5   Q6   Q7
  Strongly Agree    4    5    4    5    4    5    4
  Agree            11   10   11   10   10   10   10
  Not Sure          1    2    2    2    3    2    3
  Disagree          1    0    0    0    0    0    0
  Total            17   17   17   17   17   17   17

Based on these results, it is clear that most of the participants agree or strongly agree that the methods used in the module are sufficient for faculty performance management.

These results are further substantiated by the descriptive statistics shown in Table 5.7.

Table 5.7 Faculty Module Descriptive Statistics

  Question  Minimum  Maximum  Mean  Std. Deviation
  Q1        2        5        4.06  0.748
  Q2        3        5        4.18  0.636
  Q3        3        5        4.12  0.600
  Q4        3        5        4.18  0.636
  Q5        3        5        4.06  0.659
  Q6        3        5        4.18  0.636
  Q7        3        5        4.06  0.659

The mean responses for the questions range from 4.06 to 4.18, showing a strong inclination towards agreement that the module fits its purpose. Meanwhile, only a small proportion of the responses showed uncertainty about the features provided in this module, as illustrated in the chart for Question 1.

5.7.4.3 Results of the evaluation on the overall e-BSC system

Figure 5.3 shows the evaluation of the effectiveness of the overall e-BSC system.

[Figure 5.3: Results for the Evaluation of the overall e-BSC system. Stacked-bar chart of the percentages of Strongly Agree, Agree, Not Sure and Disagree responses for Questions 1 to 8; the underlying frequencies are given in Table 5.8.]

Table 5.8 displays the corresponding frequency distribution for the chart in Figure 5.3.

Table 5.8 Results for the Evaluation of the overall e-BSC system Frequency Distribution

  Response         Q1   Q2   Q3   Q4   Q5   Q6   Q7   Q8
  Strongly Agree    5    2    4    5    6    3    3    5
  Agree            10   10   10    7    7    6    4    9
  Not Sure          1    5    3    5    3    8   10    2
  Disagree          1    0    0    0    1    0    0    1
  Total            17   17   17   17   17   17   17   17

Most of the responses agreed that the e-BSC system delivers its purpose. However, uncertainty is evident in the responses to Questions 6 and 7, where eight and ten respondents respectively answered Not Sure when evaluating the effectiveness of the system in aligning strategies at the university level with those of the faculties, and in supporting staff to develop positive work ethics. As recorded in Table 5.9, the mean values for these two questions also reflect this indecision.

Table 5.9 e-BSC system Descriptive Statistics

  Question  Minimum  Maximum  Mean  Std. Deviation
  Q1        2        5        4.12  0.781
  Q2        3        5        3.82  0.636
  Q3        3        5        4.06  0.659
  Q4        3        5        4.00  0.791
  Q5        1        5        4.00  1.061
  Q6        3        5        3.71  0.772
  Q7        3        5        3.59  0.795
  Q8        2        5        4.06  0.827

It is important to note that the result for Question 6 does not concur with the result for Question 2 of the faculty module evaluation shown in Figure 5.2: the former records as much as 47 percent uncertainty, while the latter shows 88 percent (59% + 29%) agreement that the e-BSC is effective in organisation-wide alignment of strategies.

Nevertheless, this result will be used to guide future enhancements that make these features more obvious and more easily understood by end users.

Figure 5.4 illustrates the results of the evaluation of the complexity of the e-BSC system.

[Figure 5.4: Complexity (Learnability) of the e-BSC System, broken down by participant category.]

From the results, it is clear that 100% of the responses from the target users, namely the staff, rated the system as average or moderately easy to use. Similar results are observed for the responses from users other than the human resource and IT centre personnel. The dean rated the system as easy to use. Only a minimal percentage of responses, from the human resource and IT centre representatives, rated the system as complex. Overall, the system has proven to be easy to learn, as can be seen in the summary result for the complexity/learnability of the e-BSC shown in Figure 5.5.

Figure 5.5 and Table 5.10 present the summary result for the complexity of the e-BSC without separating the ratings by participant category.

[Figure 5.5: Summary results for the Complexity (Learnability) of the e-BSC System. Bar chart: Easy to use 6%, Moderate/Average 82%, Complex 12%, Very Complex 0%.]

Table 5.10 Summary results Complexity (Learnability) of e-BSC System Frequency Distribution

  Complexity          Frequency  Percent (%)
  Easy to learn/use    1           6
  Moderate/Average    14          82
  Complex              2          12
  Very Complex         0           0
  Total               17         100

Compared with Figure 4.15 in Chapter 4, which illustrates the respondents' evaluation of the PM system currently used in UM, none of the participants in the system evaluation felt that the e-BSC is very complex to learn, unlike the current system, which recorded 5% in that category. In addition, if the positive ratings, in particular easy to use and moderate/average, are compared with those for the current system, the e-BSC responses total 88% (6% easy to use + 82% moderate/average), a slight increase in satisfaction over the current PM system, which recorded a total of 85% (20% easy to use + 65% moderate/average).

Figure 5.6 displays the ratings from every category of participants on the effectiveness and quality of the e-BSC system. Most of the responses were positive.

[Figure 5.6: Quality (Effectiveness) of the e-BSC System, broken down by participant category.]

Figure 5.7 summarises these responses. When contrasted with Figure 4.16 in Chapter 4, which shows the evaluation of the quality of the current PM system, the results for the e-BSC show significant improvements. Unlike the current system, which as many as 20% of respondents rated as not suitable, the evaluation of the e-BSC did not record any negative feedback. Instead, some 12% (two participants out of seventeen) even rated the e-BSC as very effective in delivering its purpose.

[Figure 5.7: Summary results for the Quality (Effectiveness) of the e-BSC System.]

Figure 5.8 shows whether the e-BSC displayed any improvement in comparison with the current PM system.

[Figure 5.8: Comparison of the e-BSC System with the Current PM system, broken down by participant category.]

From the results, most participants considered the e-BSC to show some enhancement and improvement in effectiveness compared with the current system. Only 25 percent of the staff and human resource categories respectively (one person from each category) expressed uncertainty in their responses.

Meanwhile, Figure 5.9 summarises the ratings of the improvements in the e-BSC compared with the current PM system.

[Figure 5.9: Summary Comparison of the e-BSC System with the Current PM system (pie chart).]

As the pie chart shows, 47 percent of the feedback, from eight of the participants, rated the system as delivering a significant improvement over the current system. A further seven participants, making up 41 percent, rated the e-BSC as a slight improvement over the current system. No more than 12 percent (two participants) recorded uncertainty on that question.

5.7.4.4 Comments from participants

During the e-BSC system demonstration, several comments from the participants were also recorded as recommendations for future enhancements. Most of the comments related to the appraiser, faculty and system administrator modules, where enhanced features are needed to ensure that the system truly aligns all strategies while remaining user friendly.

Meanwhile, some comments focused on how to further ease data entry. Please refer to Appendix K for the detailed comments and their contributors. At the same time, the written comments on the system evaluation forms were also tabulated for easy comprehension of participants' expectations of and feedback on the e-BSC system. Table 5.11 summarises these comments.

Table 5.11 Summary of Written Comments from System Evaluation

  Participant Category  Comment(s)
  Staff                 - My response is based only on what was demonstrated during the meeting; I have no prior knowledge of or experience with the system.
                        - The system is adequate (referring to Question 10) but still has room for improvement.
                        - Good effort and good system. Predefine the ISI journal titles and match them with staff records.
                        - Very good.
  Appraiser             - A good effort. Well done.
                        - Particularly like the interface! More variables and lookup tables are needed to facilitate data entry.
                        - Well done.
  Dean                  - Well structured and very intuitive.
  IT Centre             - Needs to integrate with existing information systems and with the corporate scorecard so that results are cascaded upwards.
  SPU                   - e-BSC must have features that enable modifications to accommodate feedback on the needs and requirements of users. System rigidity will kill it! Ease of use is of utmost importance.

5.8 Summary

In this chapter, the system development environment and the programming tools used were discussed in detail, and the system evaluation procedure conducted on the finished product was explained. From the results obtained, the e-BSC is effective in meeting its objectives, although there is still room to further enhance the features of the system. Satisfaction with the e-BSC was also generally evident in the participants' reactions during the system demonstration, which showed their contentment with its features. Compared with the current PM system used in UM, the e-BSC has been shown to be suitable for measuring and managing the performance of academic staff.