Extending LMS to support IRT-based assessment test calibration

Fotaris, Panagiotis ORCID: https://orcid.org/0000-0001-7757-7746, Mastoras, Theodoros, Mavridis, Ioannis and Manitsaris, Athanasios (2010) Extending LMS to support IRT-based assessment test calibration. In: Technology Enhanced Learning. Quality of Teaching and Educational Reform. Communications in Computer and Information Science, 73. Springer, Heidelberg, Germany, pp. 534-543. ISBN 9783642131653

Full text not available from this repository.

Abstract

Developing unambiguous and challenging assessment material for measuring educational attainment is a time-consuming, labor-intensive process. As a result, Computer Aided Assessment (CAA) tools are becoming widely adopted in academic environments in an effort to improve assessment quality and deliver reliable measures of examinee performance. This paper introduces a methodological and architectural framework that embeds a CAA tool in a Learning Management System (LMS) to assist test developers in refining the items that constitute assessment tests. An Item Response Theory (IRT) based analysis is applied to a dynamic assessment profile provided by the LMS. Test developers define a set of validity rules for the statistical indices produced by the IRT analysis. By applying those rules, the LMS can detect items with various discrepancies, which are then flagged for content review. Repeatedly executing this procedure can improve the overall efficiency of the testing process.
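The calibration loop summarized above (run an IRT analysis, check each item's statistical indices against reviewer-defined validity rules, and flag violating items for content review) can be sketched roughly as follows. The index names (`a` for discrimination, `b` for difficulty, as in the common 2PL model), the threshold values, and the function name are illustrative assumptions, not details taken from the paper:

```python
# Sketch of rule-based flagging of calibrated items, assuming 2PL
# IRT indices: 'a' (discrimination) and 'b' (difficulty).
# Thresholds below are illustrative, not from the paper.

def flag_items(items, rules):
    """Return the ids of items whose indices fall outside any rule's range.

    items: list of dicts, each with an 'id' plus one key per IRT index
    rules: dict mapping an index name to an (accepted_min, accepted_max) pair
    """
    flagged = []
    for item in items:
        for index, (lo, hi) in rules.items():
            if not (lo <= item[index] <= hi):
                flagged.append(item["id"])
                break  # one violated rule is enough to flag the item
    return flagged

items = [
    {"id": "Q1", "a": 1.2, "b": 0.3},   # within all accepted ranges
    {"id": "Q2", "a": 0.2, "b": 1.1},   # discrimination too low
    {"id": "Q3", "a": 0.9, "b": 3.8},   # difficulty out of range
]
rules = {"a": (0.5, 2.5), "b": (-3.0, 3.0)}

print(flag_items(items, rules))  # -> ['Q2', 'Q3']
```

In the framework described, flagged items would then be returned to test developers for content revision, and the analysis repeated on the refined item pool.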

Item Type: Book Section
Identifier: 10.1007/978-3-642-13166-0_75
Keywords: e-learning, Assessment Test Calibration, Computer Aided Assessment, Item Analysis, Item Response Theory, Learning Management System
Subjects: Computing
Depositing User: Vani Aul
Date Deposited: 07 Dec 2013 14:37
Last Modified: 28 Aug 2021 07:16
URI: https://repository.uwl.ac.uk/id/eprint/447
