| Field | Value | Language |
| --- | --- | --- |
| dc.contributor.author | De Villiers, M.R. (Ruth) | |
| dc.date.accessioned | 2014-02-06T16:29:26Z | |
| dc.date.available | 2014-02-06T16:29:26Z | |
| dc.date.issued | 2004 | |
| dc.identifier.citation | SAICSIT '04: Proceedings of the 2004 annual research conference of the South African Institute of Computer Scientists and Information Technologists on IT research in developing countries, pages 284–291 | en |
| dc.identifier.uri | http://hdl.handle.net/10500/13152 | |
| dc.description.abstract | Computing systems require rigorous evaluation of both functionality and usability. Evaluation of software within the growing e-learning sector is currently receiving attention. Relevant aspects are evaluation paradigms, techniques, and the issue of who does the evaluating. Squires and Preece developed a set of ‘learning with software’ heuristics to be used by experts/educators in predictive evaluation prior to adopting a system. These were adapted for post-production end-user evaluation of an operational e-learning tutorial lesson, Relations, used in Theoretical Computer Science. Findings are given from a questionnaire survey among learners. This process evaluated the artifact, and also reflectively confirmed the utility of the evaluation technique and criteria. Lessons have also been learned for the future development of educational software. | en |
| dc.language.iso | en | en |
| dc.subject | Cognition | en |
| dc.subject | e-learning | en |
| dc.subject | human-computer interaction | en |
| dc.subject | reflection | en |
| dc.subject | software evaluation | en |
| dc.subject | Usability Evaluation | en |
| dc.subject | E-Learning Tutorial | en |
| dc.title | Usability Evaluation Of an E-Learning Tutorial: Criteria, Questions and Case Study | en |
| dc.type | Article | en |
| dc.description.department | Computing | en |