Setting and Sharing Adaptive Assessment Assets

Similar documents
WHITE PAPER. SCORM 2.0: Assessment Framework. Business Area Data Representation and Interfaces

Issues raised developing

Managing Learning Objects in Large Scale Courseware Authoring Studio 1

GENGHIS, AN AUTHORING TOOL FOR KHAN ACADEMY EXERCISES

Authoring Tool of Sharable Question Items Based on QTI Specification for E-learning Assessment

SDMX self-learning package No. 3 Student book. SDMX-ML Messages

Tennessee. Trade & Industrial Course Web Page Design II - Site Designer Standards. A Guide to Web Development Using Adobe Dreamweaver CS3 2009

Chapter 13 XML: Extensible Markup Language

Authoring Environment for E-learning Production Based on Independent XML Formats

The RAMLET project Use cases

Project Deliverable Report. D4.3 Final System

1D CIW: Web Design Specialist. Course Outline. CIW: Web Design Specialist Apr 2018

Assessment Delivery Engine for QTIv2 Tests

Reusability and Adaptability of Interactive Resources in Web-Based Educational Systems. 01/06/2003

An Introduction to. Freedom to Innovate! Freedom to Migrate! Freedom to Interoperate!

For many years, the creation and dissemination

Integrating IEC & IEEE 1815 (DNP3)

JBI Components: Part 1 (Theory)

ZEND: Survey on the Examination System

An XML-based Question Bank using Microsoft Office

Ministry of Higher Education and Scientific Research

An Overview of the IMS Question & Test Interoperability Specification

LearningMate Solutions - Creating Content Using SkillsCommons

ASPECT Adopting Standards and Specifications for. Final Project Presentation By David Massart, EUN April 2011

Interoperability for Digital Libraries

Response to the. ESMA Consultation Paper:

Databases and Information Retrieval Integration TIETS42. Kostas Stefanidis Autumn 2016

A Tool for Managing Domain Metadata in a Webbased Intelligent Tutoring System

PExIL: Programming Exercises Interoperability Language

Semantic Web: vision and reality

Feeding the beast: Managing your collections of problems

The XML Metalanguage

IMS Learning Design XML Binding

Modularization and Structured Markup for Learning Content in an Academic Environment

CIW: Web Design Specialist. Course Outline. CIW: Web Design Specialist. ( Add-On ) 16 Sep 2018

Three Phase Self-Reviewing System for Algorithm and Programming Learners

Burrokeet, an Application for Creating and Publishing Content Packages with support for Multiple Input and Output Formats

Introduction. Who wants to study databases?

Stylus Studio Case Study: FIXML Working with Complex Message Sets Defined Using XML Schema

Midterm 1 Review Sheet CSS 305 Sp 06

Conceptualising Item Banks

SQL Based Paperless Examination System

An UML-XML-RDB Model Mapping Solution for Facilitating Information Standardization and Sharing in Construction Industry

Higher National Unit Specification. General information for centres. Unit code: DH3J 34

Copyright 2007 Ramez Elmasri and Shamkant B. Navathe. Slide 1-1

Xyleme Studio Data Sheet

Research on standards supporting Adaptation and Accessibility for ALL in Higher Education

Creating a National Federation of Archives using OAI-PMH

A Mapping of Common Information Model: A Case Study of Higher Education Institution

A Digital Reference Room for E-learning

LECTURE1: PRINCIPLES OF DATABASES

Course Outline. MCSA/MCSE - Querying Microsoft SQL Server 2012 (Course & Lab) ( Add-On )

Introduction. Example Databases

SIMON. Creating and Assessing Assessment Tasks. Creating an Assessment Task. Step 1

To complete this tutorial you will need to install the following software and files:

EDUPUB Forging an Open, Agile, Global, Interoperable Educational Ecosystem for the 21st Century

Robin Wilson Director. Digital Identifiers Metadata Services

Automatic Test Markup Language <ATML/> Sept 28, 2004

Learner-centred Accessibility for Interoperable Web-based Educational Systems

CTI Short Learning Programme in Internet Development Specialist

Introduction to Databases Fall-Winter 2009/10. Syllabus

CMPSCI 645 Database Design & Implementation

EUROPASS DIPLOMA SUPPLEMENT TO HIGHER TECHNICAL VOCATIONAL TRAINING

Microsoft. [MS20762]: Developing SQL Databases

The Semantic Web Revisited. Nigel Shadbolt Tim Berners-Lee Wendy Hall

DESIGN AND IMPLEMENTATION OF TOOL FOR CONVERTING A RELATIONAL DATABASE INTO AN XML DOCUMENT: A REVIEW

Development of Educational Software

Metadata and Encoding Standards for Digital Initiatives: An Introduction

Notes. Submit homework on Blackboard The first homework deadline is the end of Sunday, Feb 11 th. Final slides have 'Spring 2018' in chapter title

Databases and Database Systems

Tutorial 1 Getting Started with HTML5. HTML, CSS, and Dynamic HTML 5 TH EDITION

Chapter 126 TEKS for Technology Applications

CIW: JavaScript Specialist. Course Outline. CIW: JavaScript Specialist. 30 Dec

Extending the AAT Tool with a User-Friendly and Powerful Mechanism to Retrieve Complex Information from Educational Log Data *

Web Design Course Syllabus and Course Outline

Market Information Management in Agent-Based System: Subsystem of Information Agents

IMS Learning Object Discovery & Exchange

SOFTWARE ENGINEERING. To discuss several different ways to implement software reuse. To describe the development of software product lines.

Introduction: Databases and. Database Users

Web-Based Learning Environment using Adapted Sequences of Programming Exercises

Why Consider Implementation-Level Decisions in Software Architectures?

An Archiving System for Managing Evolution in the Data Web

A Semantic Lifecycle Approach to Learning Object Repositories

COURSE OUTLINE. School of Engineering Technology and Applied Science

Warfare and business applications

Transforming UML Collaborating Statecharts for Verification and Simulation

Hospitality Industry Technology Integration Standards Glossary of Terminology

ICOPER - Interoperable Content for Performance in a Competency-driven Society

On the Semantics of Aggregation and Generalization in Learning Object Contracts

20762B: DEVELOPING SQL DATABASES

A Digital Library Framework for Reusing e-learning Video Documents

THE NEW NATIONAL ATLAS OF SPAIN ON INTERNET

TITLE OF COURSE SYLLABUS, SEMESTER, YEAR

HTML5-CSS3 - Beginning HTML5 and CSS3. Course Outline. Beginning HTML5 and CSS Apr 2018

SCORM Content Aggregation Model [ CAM ] 4 th Edition

Guidance on the appropriateness of the information technology solution

Ontological Library Generator for Hypermedia-Based E-Learning System

CTI Higher Certificate in Information Systems (Internet Development)

DISCOVERY EDUCATION streaming GETTING STARTED. Search Tools

Copyright 2016 Ramez Elmasri and Shamkant B. Navathe


Setting and Sharing Adaptive Assessment Assets

Héctor Barbosa 1, Francisco García 1
1 Universidad de Salamanca, 37008 Salamanca, Spain
{barbosah@usal.es, fgarcia@usal.es}

Abstract. In this article we present a model for constructing educational assessment assets using accepted standards and specifications, such as the IMS Question and Test Interoperability specification [1], in order to ensure that these objects are shareable and compatible with several standardized Learning Management Systems. The model comprises four sections, following the steps from the definition of the assessment items to their use in a test inside a Learning Management System.

Keywords: On-line Assessment, IMS QTI, assessment assets.

1 Introduction

In recent years, educational institutions have been incorporating information and communication technologies into learning and teaching processes in order to increase the quality, efficiency, and dissemination of education. Since these projects address the needs of individuals, their success and impact can be increased by adapting to each user so that the learning experience is enhanced. To ensure that all of these efforts do not become groups of isolated islands, most of these projects aim to comply with accepted standards, so that they are interoperable, compatible, and shareable. One accepted standards body is the IMS (Instructional Management Systems) Global Learning Consortium [2], which develops and promotes the adoption of open technical specifications for interoperable learning technologies; these specifications have become standards for delivering learning products worldwide. Among these efforts, we want to emphasise the role of the assessment activity within the e-learning process.
We concentrate on this task and examine how it can help improve the e-learning process for all participants: students, teachers, and content designers. The rest of this paper is structured as follows. In Section 2 we discuss the importance of the assessment activity in the e-learning process. In Section 3 we describe our model for the creation of assessment assets, focusing on the first two phases: authoring and item management. In Section 4 we describe the adaptation process for assessment items in more detail. Finally, in Section 5 we give our conclusions.

2 Importance and Characteristics of an Assessment Tool

Traditionally, assessment has been seen as a task set apart from the e-learning process, and "there is a danger in focusing research on assessment specifically" [3], as this tends to isolate the assessment process from teaching and learning in general. For us, the assessment activity that follows an educational lesson can be seen as the element that closes and completes a circular activity, making assessment an integral part of the learning process as well. In addition, assessment within the e-learning process can be used to adapt the system: setting a new user knowledge level, evaluating and setting new learning profiles, assigning user grades and, in consequence, re-adapting content for the user [4]. According to the Australian Flexible Learning Framework [5], assessment, especially when it is embedded within a real learning task or exercise, can be an essential part of the learning experience, giving the entire Web site the ability to adapt itself to the needs of its users.

Nowadays it is necessary to produce educational Internet-based systems that support the dissemination of education and cover the needs of diverse learner group profiles. To achieve this, it is desirable that such systems adapt automatically to each user, decoupling content from presentation by using a semantic rather than a purely syntactic approach, in the spirit of a meaningful Web. Learning systems must therefore be flexible and efficient, and one way to accomplish that is to build them as open, standardized systems. When we design a system using open standards, we ensure interoperability, manageability, reusability, and durability [6].

3 Model Overview

The model has four main sections (Figure 1); some of them have three levels of conceptualization, with activity definitions ranging from the abstract level (lower layers) to a more concrete level (upper layers).
We structured the model starting with the definition of the basic core elements, so that we can evolve them toward a more concrete definition while identifying their requirements and mutual interactions.

3.1 Model Structure

Levels, from an abstract view to a concrete view: in the first level (dark color), we identify the core elements (learning objects, management, test construction, and LMS interaction). In the second level (gray color), we describe the main sections, identifying four phases: authoring, repository management, visualization, and interaction with the LMS. In the third level (white color), we categorize each activity into a subsystem (authoring, item management, publishing, and interaction with the LMS).

Figure 1. Model for the construction of exam items.

Sections, from creation to interaction with the LMS:

Section one: Learning Object Definition (ASI: Assessment, Section, and Item). We focused our research on technologies and specifications for the definition of this section; here we use the authoring use cases of the IMS QTI specification [1]. In addition, we want to use the IMS AccessForAll specification [7] to define a resource profile for our assessment objects.

Section two: Item Management. We define this section with the aim of giving the developer a tool to organize, manage, and import ASIs from other authoring tools. We propose a native XML database management system to manage the items.

Section three: Test Construction. In this section we consider the activities carried out by the assessor to view the items by rendering them in a visualization system; we suggest using software such as Flash [8], Shockwave [8], or XHTML [9]. Here the assessor selects the ASIs to construct the test, which will be delivered to the LMS in XML [10] format. In addition, we want to integrate data fields so that the assessor can construct adaptive tests by defining trees of related questions that depend on the responses and on the feedback presented to the user. We also propose a user interface for the students (candidates), so they can access and answer the exam, sending the results to the LMS to create an assessment record.

Section four: Test Delivery. In this phase the candidate or student activates the test by accessing it through an LMS, from which we obtain the learning-style definition used to drive the adaptation.
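The tree of related questions mentioned in Section three can be pictured as a simple branching structure. The following is a minimal sketch under our own assumptions: the node layout, field names, and response keys are illustrative only and are not defined by IMS QTI.

```python
# Minimal sketch of an adaptive question tree: each node holds a
# question and maps each possible response to a follow-up node.
# Field names and layout are illustrative assumptions, not QTI.

tree = {
    "q_root": {
        "prompt": "Is XML case-sensitive?",
        "next": {"yes": "q_easy", "no": "q_remedial"},
    },
    "q_easy": {"prompt": "Name an XML query language.", "next": {}},
    "q_remedial": {"prompt": "Review: what does XML stand for?", "next": {}},
}

def next_question(tree, node_id, response):
    """Return the id of the follow-up question for a response,
    or None when the branch ends."""
    return tree[node_id]["next"].get(response)

# Simulate a wrong answer at the root: the learner is routed to
# the remedial follow-up question.
path = ["q_root"]
node = next_question(tree, "q_root", "no")
if node:
    path.append(node)
```

In this sketch the assessor's "tree processing" reduces to a lookup per response; the real system would also consult the user profile before choosing which rendering of the follow-up item to present.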

3.2 Level One: An Abstract Definition (dark color)

3.2.1 ASI authoring subsystem (learning object definition). A learning object is a reusable unit of instruction for learning activities, typically in an e-learning environment. The aim of the authoring system is to provide an interface for developing learning objects that assess and grade the students' learning and that can be aggregated into learning activities in the LMS. In order to be interoperable and interchangeable, the objects are defined at their smallest functional size: single questions that can be correctly identified and categorized, so that we can aggregate them in the publishing phase.

3.2.2 ASI DBMS subsystem (learning object management). The item database is known as the ASI repository in the IMS QTI specification [1]. This DBMS could be a relational database such as the open-source MySQL [11], but nowadays we can also find XML databases that store the items in their native format and run queries over the stored data using the XQuery language [12]. With a traditional relational DBMS, we would have to use a middleware layer to map the XML data, transforming the elements into tables and the attributes into columns, in order to store the XML file.

3.2.3 Publishing subsystem (test construction). In this step the assessor selects ASIs to visualize and evaluate them before publishing. The assessor also selects the ASIs to construct the final exam, which will be delivered to the LMS or directly to the student. This process generates an output file in XML format.

3.2.4 LMS platform. The last step in our model is the delivery of the exam to the LMS, so that users can interact with it and answer the questions.

3.3 Level Two: Definition of the Technologies to Be Used (gray color)

3.3.1 ASI authoring subsystem (learning object definition). We want to use:

1. IMS QTI [1] to describe the assessment items. QTI describes an information model and the associated representation used to create and share assessment items.
This element contains a group of interactions or questions plus supporting material such as external file links and rules.

2. IMS AccessForAll [7]. To define a first level of adaptability, we propose to attach metadata to the item describing alternative resources used to perform user adaptation.

3. LOM, Learning Object Metadata (IEEE) [13]. This standard outlines the syntax and semantics of learning object metadata, describing a small set of attributes that allow the objects to be managed, located, and evaluated.

4. Assessment item categorization. IMS QTI and LOM allow the author to categorize the items so they can be correctly managed and located; in this process we can also define a uniform resource identifier (URI) for each item.

5. File import. These can be text files generated by word-processing programs. Such files must follow some minimal structural rules, such as special labels or keywords (question header, question text, question options, and extra data to categorize and evaluate the item) so that a parser can identify each part.
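As an illustration of the file import in point 5, a minimal parser might look like the following sketch. The label names QUESTION:, OPTION:, and ANSWER: are our own assumptions for illustration; the model only requires that some such labels exist.

```python
# Minimal sketch of a labeled-text item parser. The label names are
# hypothetical; any consistent labeling scheme would work equally well.

def parse_items(text):
    """Split a plain-text export into question dictionaries."""
    items, current = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("QUESTION:"):
            current = {"text": line[len("QUESTION:"):].strip(),
                       "options": [], "answer": None}
            items.append(current)
        elif line.startswith("OPTION:") and current is not None:
            current["options"].append(line[len("OPTION:"):].strip())
        elif line.startswith("ANSWER:") and current is not None:
            current["answer"] = line[len("ANSWER:"):].strip()
    return items

sample = """
QUESTION: Which specification defines assessment items?
OPTION: IMS QTI
OPTION: SCORM CAM
ANSWER: IMS QTI
"""
parsed = parse_items(sample)
```

A real importer would then transform each parsed dictionary into an XML-QTI item before storing it in the repository.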

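Categorized items, as in point 4 above, can later be retrieved from the item bank by their metadata. As a rough sketch of such lookups, the following uses Python's standard-library ElementTree with XPath as a stand-in for a native XML DBMS with XQuery; the element and attribute names are our own, not the real QTI vocabulary.

```python
import xml.etree.ElementTree as ET

# A tiny in-memory "item bank". Element and attribute names are
# illustrative assumptions, not taken from the QTI schema.
bank = ET.fromstring("""
<itemBank>
  <item identifier="q1" topic="xml"><prompt>What does XML stand for?</prompt></item>
  <item identifier="q2" topic="sql"><prompt>What does DBMS stand for?</prompt></item>
  <item identifier="q3" topic="xml"><prompt>Name one XML query language.</prompt></item>
</itemBank>
""")

# Select all items on a given topic, in the spirit of an XQuery filter.
xml_items = bank.findall(".//item[@topic='xml']")
ids = [item.get("identifier") for item in xml_items]
```

A native XML database would express the same filter declaratively in XQuery and avoid loading the whole collection into memory.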
3.3.2 ASI DBMS subsystem (item bank description). We propose a user interface for the database administrator to access, manage, and query the ASIs, and also to import new elements in XML-QTI format into the database. For the management of the ASIs we suggest implementing a repository of items in XML format on a native XML DBMS, using XQuery [12] for queries; this DBMS could be the dbxml [14] system. The second process is the import of ASIs (created in other authoring systems) into the repository without any modification or adaptation of the data.

3.4 Level Three: Implementation (white color)

3.4.1 ASI authoring subsystem (ASI creation). This is a user interface for the author of the ASIs, used to select the kind of assessment item and to enter the item text, the responses with their values, the feedback, and the links to external supporting media. We suggest using an open DBMS (such as MySQL), defining a table for each kind of question, to save the items before transforming them into the XML-QTI format.

3.4.2 ASI DBMS subsystem (required technologies). The DBMS for XML data manages the items in collections with a tree structure, storing the elements in their native XML format or even in binary format (DBMS records).

3.4.3 Publishing subsystem (test construction). Test construction is a process defined in the use cases of the IMS QTI specification [1]. The aim of this phase is to define an item-publishing subsystem so that the teacher or assessor can visualize and evaluate each ASI, select items, and construct the exam, packaging the questions into an XML file that will be accessed by a Learning Management System. In this process it should be possible to render each item to preview its final presentation to the user. This phase could also define a final-user (student) interface, so that students can visualize and answer the test, sending the results to the LMS.

3.4.4 LMS platform.
The LMS controls access control, response time, response-processing invocation, and feedback support.

4 Adaptive Assessment Items

In this section we focus on the adaptation processes that we propose for the assessment items. We divide them into two levels.

4.1 First level of adaptability. This level is defined in the authoring stage using the IMS AccessForAll specifications, in this case ACCMD [15], with which the author can select alternative resources (supplementary or equivalent) to the primary ones presented to the user by default. These alternative media are adapted to the final user by matching the resource profile (defined in the authoring stage) against the profile the user defined when recording his or her preferences (here we use the ACCLIP [16] specification).

4.2 Second level of adaptability. In the authoring stage, the assessment author can define a tree of additional nested questions, which are presented to the user based on his or her response to the current item. In the test construction stage, the assessor can activate this tree processing, so that the user is presented with feedback options and with additional questions based on the current answers; taking into account the user profile defined with ACCLIP [16], the items are presented using the primary or the equivalent resources.

Figure 2: AssessmentItem package.

Figure 2 shows the package of a single assessment item, composed of:

1. Objective: the description of the educational objective to be accomplished by the assessment item.
2. Properties: the definition of the global variables that receive the values of the student's learning style from the LMS. They also contain the categorization of the item as a root or a leaf (defined by the value of the hasparent variable).

3. AssessmentItem element: contains three files in QTI format, describing the question for each learning style.
4. Method: a file in ACCMD format that contains the adaptation rules to be applied according to the values of the global variables.

5 Conclusions

Online assessment is an important step in the e-learning process because it gives convenient feedback to all participants, helping to improve the learning and teaching experience. Looking at new developments in the area of e-learning, we can see that most of them aim to comply with accepted standards such as IMS QTI. This allows those developments to be interoperable and adaptable across different platforms. Accordingly, the assessment activity must be interoperable as well, because it is one element of the e-learning process and plays an important role within that experience.

Adaptability is another key factor in assessment. Given that assessment is an important element of the e-learning process, and that this process aims to be interoperable, the assessment tool could be used by different educational content administrators with different conceptualizations and ways of designing and applying tests for their students. To address this situation it is necessary to develop an assessment tool that offers several ways to design a test, with different types of resources, kinds of assessment, groups of students, kinds of questions, schedule management, and so on. Under this conceptualization, we want to create an Adaptive Assessment Tool (AAT) that takes into account the specific characteristics of open educational systems. This application will allow the professor or instructional designer to integrate the test questions with the supporting media and to generate an output file in XML format, following the IMS specifications reviewed here.
After that, the test is integrated with other learning activities inside a Learning Management System.

6 Acknowledgements

We thank the KEOPS group (ref. TSI2005-00960) of the University of Salamanca for their contributions and ideas in the development of this work. We also thank the Education and Science Ministry of Spain, National Program in Technologies and Services for the Information Society, ref. TSI2005-00960. Héctor Barbosa thanks the National System of Technological Institutes (SNIT, Mexico) for its financial support.

7 References

1. IMS QTI, Question and Test Interoperability. http://www.imsglobal.org/question/index.html
2. IMS GLC, Instructional Management Systems Global Learning Consortium. http://www.imsglobal.org
3. Booth, R., Berwyn, C.: The Development of Quality Online Assessment in Vocational Education and Training. Australian Flexible Learning Framework, Vol. 1, 2003, p. 17.
4. Barbosa, H., García, F.: Importance of the Online Assessment in the E-learning Process. 6th International Conference on Information Technology Based Higher Education and Training (ITHET), Santo Domingo, Dominican Republic, 2005. CD version.
5. Backroad Connections Pty Ltd.: Assessment and Online Teaching. Australian Flexible Learning Framework Quick Guides series, Australian National Training Authority, Version 1.00, 2003, p. 3.
6. García, F., Berlanga, A., Moreno, M. N., García, J., Carabias, J.: HyCo: An Authoring Tool to Create Semantic Learning Objects for Web-Based E-learning Systems. In: Web Engineering, 4th International Conference, ICWE 2004, Proceedings. LNCS 3140, pp. 344-348.
7. IMS AccessForAll Specification. http://www.imsglobal.org/accessibility/index.html
8. Adobe Systems. http://www.adobe.com
9. XHTML, Extensible Hypertext Markup Language, 2nd edition. http://www.w3.org/markup/
10. XML, Extensible Markup Language. http://www.w3.org/xml/
11. MySQL, Open Source Database. http://www.mysql.com/
12. XML Query (XQuery). http://www.w3.org/xml/query/
13. IEEE Learning Object Metadata. http://ltsc.ieee.org/wg12/
14. dbxml Native XML Database. http://www.dbxmlgroup.com/
15. IMS ACCMD, Accessibility Metadata Specification. http://www.imsglobal.org/accessibility/accmdv1p0/imsaccmd_oviewv1p0.html
16. IMS ACCLIP, Accessibility for Learner Information Package Specification. http://www.imsglobal.org/accessibility/accmdv1p0/imsaccmd_oviewv1p0.html#1632695