dc.description.abstract |
Higher education institutions are increasingly designing and implementing teacher-facing learning analytics (LA) dashboards in the hope that instructors can extract deep insights about student learning and make informed decisions to improve their teaching. While much attention has been paid to developing teacher-facing dashboards, less is known about how they are designed, implemented and evaluated. This paper presents a systematic literature review of existing studies reporting on teacher-facing LA dashboards. Of the 1968 articles retrieved from several databases, 50 were included in the final analysis. Guided by several frameworks, the articles were coded along the following dimensions: purpose, theoretical grounding, stakeholder involvement, ethics and privacy, design, implementation, and evaluation criteria. The findings show that most dashboards are designed to increase teachers’ awareness but offer limited actionable insights to support intervention. Moreover, while teachers are involved in the design process, their involvement is mainly confined to the exploratory/problem-definition stage, with little input beyond that point. Most dashboards were prescriptive, offered limited customisation, and left the theoretical constructs behind their designs implicit. In addition, dashboards are typically deployed at the prototype or pilot stage, and evaluation is dominated by self-reports and users’ reactions, with limited attention to changes in teaching and learning. Furthermore, only one study considered privacy as a design requirement. Based on the findings of the study and a synthesis of the existing literature, we propose a four-dimensional checklist for planning, designing, implementing and evaluating LA dashboards. |
|