E-Assessment standards


E-assessment

 


 

Assessment is a mature domain in education compared with areas such as learning activity theory. Research on assessment has stabilised in recent years, with a growing emphasis on delivery using technology. It could be argued, however, that the quest for automated marking has placed undue emphasis on simple multiple-choice assessment. These assessments have largely used software to create drag-and-drop, question-and-answer and simple fill-in-the-blank activities, which often do little to support higher-order skills. Such formats have been criticised for their lack of suitability to certain question types, meaning that questions need to be rewritten to accommodate the software (Sangwin, 2002). There is a consensus that these forms of ‘objective’ test assess only surface or strategic skills in the learner. Automatic marking in its present form is still in its infancy.

 

Recent research in e-learning has concentrated on interoperability and learning technology standards, while pedagogical research has explored the possibilities of constructivist learning, learning objects and, more recently, learning activities. Assessment is increasingly seen as an integral part of learning activities rather than a discrete service in learning. The opportunities and issues highlighted below should also apply to assessment, with new technologies offering scope for new and innovative types of assessment and a move away from the emphasis on automated marking of lower-level skills.

 

E-learning standards relating to assessment

 

Standards developed in association with assessment in recent years have mostly related to sequencing and sharing content rather than to pedagogic innovation in assessment. They include:

 

• IMS Question and Test Interoperability

• IMS Simple Sequencing

• IMS Learning Design

• MathML

• MATHQTI

• TOIA-COLA Metadata Application Profile

 

IMS QTI – Question and Test Interoperability

 

IMS QTI is a global e-learning standard, expressed in XML, that has developed in recent years for assessment in e-learning. It is designed to be used either as a self-contained standard or in conjunction with other IMS specifications such as Learning Design, Simple Sequencing, Content Packaging and learning resource metadata; a minimal example item is sketched after the list of aims below.

 

IMS QTI aims to:

• Provide a well-documented specification for storing and exchanging items independent of the authoring tool used to create them.

• Support the deployment of item banks across a wide range of learning and assessment delivery systems.

• Provide a well-documented content format for storing and exchanging tests independent of the test construction tool used to create them.

• Support the deployment of items, item banks and tests from diverse sources in a single learning or assessment delivery system.

• Provide systems with the ability to report test results in a consistent manner.

http://www.imsglobal.org/question/qtiv2p1pd/imsqti_oviewv2p1pd.html
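
By way of illustration, the fragment below is a minimal sketch of a single multiple-choice item in QTI 2.1 markup. The identifier, title, question text and choices are invented for the example, and a real item would normally carry further attributes such as schema locations.

<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="item001" title="Example choice item"
    adaptive="false" timeDependent="false">
  <!-- Illustrative content only: declares the correct choice for RESPONSE -->
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse>
      <value>ChoiceA</value>
    </correctResponse>
  </responseDeclaration>
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which city is the capital of France?</prompt>
      <simpleChoice identifier="ChoiceA">Paris</simpleChoice>
      <simpleChoice identifier="ChoiceB">Lyon</simpleChoice>
      <simpleChoice identifier="ChoiceC">Marseille</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <!-- Standard response processing template: full marks for matching the correct response -->
  <responseProcessing template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>

Because such an item is self-describing, it can be stored in an item bank and delivered by any QTI-aware system, which is the interoperability aim listed above.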

 

IMS Simple Sequencing

 

Based on an XML Schema, IMS SS describes rules for sequencing learning activities: depending on specified conditions, an activity may be presented, skipped or selected. Simple Sequencing is mostly focussed on the single learner, with collaborative learning covered by Learning Design; a minimal sequencing rule is sketched below.

http://www.icodeon.com/pdf/ss2brief.pdf
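
As an illustration of the kind of rule IMS SS expresses, the fragment below attaches sequencing behaviour to an activity so that it is skipped once it has already been satisfied. The condition and action values are drawn from the Simple Sequencing vocabulary, but the surrounding manifest and activity tree are omitted, and the rule shown is an invented example rather than a recommended pattern.

<imsss:sequencing xmlns:imsss="http://www.imsglobal.org/xsd/imsss">
  <!-- The learner may choose activities freely and flow through them in order -->
  <imsss:controlMode choice="true" flow="true"/>
  <imsss:sequencingRules>
    <!-- Before delivery: if this activity is already satisfied, skip it -->
    <imsss:preConditionRule>
      <imsss:ruleConditions>
        <imsss:ruleCondition condition="satisfied"/>
      </imsss:ruleConditions>
      <imsss:ruleAction action="skip"/>
    </imsss:preConditionRule>
  </imsss:sequencingRules>
</imsss:sequencing>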

 

IMS Learning Design

 

Another IMS XML specification that deals with the sequencing of learning activities. IMS LD describes learning scenarios that can be presented to learners online and shared between systems. LD allows for a wide range of pedagogical models, though it places an emphasis on group work and collaborative learning; a skeletal design is sketched below.
http://www.cetis.ac.uk/lib/media/WhatIsLD2_web.pdf
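
As a sketch only, a skeletal Level A learning design might look like the fragment below. The role, activity and resource identifiers are invented, and the content package that would normally wrap the design (and supply the referenced resource) is omitted.

<imsld:learning-design xmlns:imsld="http://www.imsglobal.org/xsd/imsld_v1p0"
    identifier="LD-example" uri="http://example.org/ld-example" level="A">
  <imsld:title>Example scenario</imsld:title>
  <imsld:components>
    <imsld:roles>
      <imsld:learner identifier="R-learner"/>
    </imsld:roles>
    <imsld:activities>
      <imsld:learning-activity identifier="LA-discussion">
        <imsld:title>Group discussion</imsld:title>
        <imsld:activity-description>
          <imsld:item identifierref="RES-discussion-brief"/>
        </imsld:activity-description>
      </imsld:learning-activity>
    </imsld:activities>
  </imsld:components>
  <imsld:method>
    <imsld:play>
      <imsld:act>
        <!-- Who does what: learners carry out the discussion activity -->
        <imsld:role-part>
          <imsld:role-ref ref="R-learner"/>
          <imsld:learning-activity-ref ref="LA-discussion"/>
        </imsld:role-part>
      </imsld:act>
    </imsld:play>
  </imsld:method>
</imsld:learning-design>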

 

MATHQTI

 

The Mathematical Questions & Test Interoperability specification is an extension of the information model of IMS Question & Test Interoperability (QTI). MATHQTI is used by Active Maths as a standard file format.
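
MATHQTI's own extensions are not reproduced here, but the general pattern it builds on, mathematical content carried inside a QTI item, can be sketched with standard QTI 2.1 and MathML markup as below. The expression and interaction are invented for the example, and the enclosing assessmentItem and response declaration are omitted.

<itemBody>
  <p>Differentiate the following expression with respect to
    <math xmlns="http://www.w3.org/1998/Math/MathML"><mi>x</mi></math>:
  </p>
  <p>
    <math xmlns="http://www.w3.org/1998/Math/MathML">
      <mrow>
        <msup><mi>x</mi><mn>2</mn></msup>
        <mo>+</mo>
        <mn>3</mn>
        <mi>x</mi>
      </mrow>
    </math>
  </p>
  <!-- Free-text response; a maths-aware delivery system could assess the entered expression -->
  <extendedTextInteraction responseIdentifier="RESPONSE" expectedLines="1"/>
</itemBody>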

 

TOIA-COLA Metadata Application Profile

Source: http://frema.ecs.soton.ac.uk

 

This metadata profile was developed for the COLA project. It is an application profile derived from the UK LOM Core. Its aims are:

 

• to specify the metadata elements in the COLA project templates that question authors need to complete

• to specify those elements which will be generated automatically by the templates

• to show how these map onto the IEEE Learning Object Metadata Standard

• to provide notes on how the metadata fields can best be completed

 

This application profile is in two parts: Part 1 covers items (i.e. questions with their associated data), and Part 2 covers assessments (i.e. groups of items).
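
For orientation, a cut-down metadata record for a single item might look like the fragment below. The element structure follows the IEEE LOM XML binding that the profile maps onto; the particular fields and values are invented for illustration and are not taken from the COLA templates themselves.

<lom xmlns="http://ltsc.ieee.org/xsd/LOM">
  <general>
    <title>
      <string language="en">Quadratic equations, item 3</string>
    </title>
    <keyword>
      <string language="en">algebra</string>
    </keyword>
  </general>
  <educational>
    <!-- Vocabulary value drawn from the LOMv1.0 learningResourceType list -->
    <learningResourceType>
      <source>LOMv1.0</source>
      <value>exercise</value>
    </learningResourceType>
  </educational>
</lom>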

 

Types of Assessment Authoring Software