Evaluation of educational innovations at scale


Context & Scale

This pattern describes a structured approach to evaluating the implementation of educational innovations in large units of study.

Large units require a structured approach to evaluation because, with so many students enrolled, even small issues can have a big impact.

Research on the evaluation of educational projects indicates that academics often find evaluation overwhelming, so there is a need for more targeted evaluation support mechanisms that are flexible, adaptable, and timely (Huber & Harvey, 2016). Evaluation can take many forms, depending on the context and the approach taken. If done systematically, it can result in useful learning. According to Preskill and Torres (2000), “when individuals participate in an evaluation process that is collaborative and guided by dialogue and reflection, learning occurs not only at the individual level but also at the team and organization levels” (p. 26).

Problem

Beyond the standard end-of-unit student survey, evaluation of educational development activities in units of study is not common practice. Introducing an educational innovation in a large unit of study requires a more systematic approach, which includes determining measures of success. The purpose of evaluation in our Connected Learning at Scale (CLaS) project is to validate the effectiveness of developments made through the co-design process in supporting student learning and addressing the issues of teaching at scale.

Solution

Develop a structured evaluation plan to assess the implementation of educational innovations in a unit of study. Because multiple stakeholders are involved in large units of study, it is important to follow a clear and transparent process: one that lays out the main evaluation criteria, determines measures of success, identifies sources of data and collection points, identifies inter-dependencies, and remains responsive to the specific needs of each unit.

Implementation

  • Begin discussing evaluation at the outset of a development phase.

  • Decide who will lead the evaluation.

  • Co-design an evaluation plan together with unit of study stakeholders (which may include the unit coordinator/s, lead tutor/s, educational developers, learning designers, and students).

  • Decide on the audience for the evaluation outcomes reports.

  • Determine the evaluation criteria.

  • Identify sources of data and collection points throughout the development stage.

  • Identify inter-dependencies, such as one source of data feeding into another.

  • Clearly outline the plan (representing it visually aids clarity; see Figure 1 in the first example below).

  • Follow the steps in the plan (collect and analyse data).

  • Write a short evaluation report (there may be more than one, depending on the identified audiences).

  • Use the results in discussion with stakeholders to refine aspects of the unit for further development in subsequent cycles/iterations.
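
The checklist above can also be captured as a lightweight data structure, which makes a plan easy to share, version, and compare across units. The following is a minimal sketch in Python; the class and field names (`EvaluationPlan`, `DataSource`, `feeds_into`, and so on) are our own illustration, not an artefact of the CLaS project.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: all names and fields are illustrative choices,
# not part of the CLaS project itself.

@dataclass
class DataSource:
    name: str                       # e.g. "student survey"
    collection_week: int            # week of semester when collection starts
    window_weeks: int = 1           # flexibility: collection may run over several weeks
    feeds_into: list[str] = field(default_factory=list)  # inter-dependencies

@dataclass
class EvaluationPlan:
    unit_code: str                  # unit of study being evaluated
    lead: str                       # who leads the evaluation
    stakeholders: list[str]         # co-designers of the plan
    audiences: list[str]            # who receives the evaluation report(s)
    criteria: list[str]             # evaluation criteria / measures of success
    sources: list[DataSource]       # sources of data and collection points

    def collection_schedule(self) -> list[tuple[int, str]]:
        """Return (week, source name) pairs sorted by collection point."""
        return sorted((s.collection_week, s.name) for s in self.sources)
```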

Examples of pattern in use

Connected Learning at Scale (CLaS) evaluation framework details

The specific problem we aim to target is that evaluation is often not discussed or planned at the outset; evaluation of educational development often comes in only at the end. Furthermore, evaluation needs to be a collaborative and dynamic process that incorporates feedback from different stakeholders (Huber, 2017). The details of an evaluation plan will depend on the types of educational development innovations implemented in the unit of study. In general, the broader evaluation plan should follow a set structure so that units of study within the same large project, such as CLaS, can be compared longitudinally. However, the plan should also have enough flexibility built in that it can be tailored to the specific contents of different units of study.

All CLaS units have their own evaluation plans. This means we have examples of a variety of evaluation plans with different contexts and scales, and from different disciplines.

The proposed evaluation plan has been successfully implemented in all units of study in the Connected Learning at Scale (CLaS) project. It has been instrumental in engaging stakeholders in the process, producing useful data for determining the success of educational innovations, and guiding further developments and refinements in subsequent iterations.

The following diagram shows the current CLaS evaluation framework, illustrating the different sources of data and the collection points across a typical semester. Flexibility is built into this process; for example, data can be collected from students over a window of several weeks.

Figure 1 – CLaS evaluation plan visualised
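
The flexibility just mentioned can also be expressed against the sketch above: a collection point becomes the start of a window rather than a fixed week. The helper below is a hypothetical illustration of the idea, not part of the framework.

```python
# Hypothetical helper building on the earlier sketch: list the data sources
# whose collection window is open in a given teaching week.

def open_sources(plan: EvaluationPlan, week: int) -> list[str]:
    return [
        s.name
        for s in plan.sources
        if s.collection_week <= week < s.collection_week + s.window_weeks
    ]
```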
Example use of the framework in an undergraduate accounting unit

This specific example was implemented in Semester 1 2021 in a large core first-year undergraduate accounting unit of study at the Business School. There were approximately 1,500 students enrolled in this unit. There was an even mix of local and international students with around 60% of students studying remotely during this semester.

This pattern was developed and implemented in BUSS1030. We would like to acknowledge the unit coordinators and lecturers closely involved from the Accounting discipline, including Olga Gouveros and Janine Coupe. 

Implementation

01 > Discussion of the evaluation started in Semester 2 2020, during the development phase.

02 > The evaluation plan was co-designed with the development team, which consisted of two unit coordinators, an educational developer, a learning designer, and a research and evaluation (R&E) associate.

03 > The R&E associate led the evaluation, which included developing the plan in consultation with the development team, developing the evaluation instruments, conducting the surveys, focus groups, and interviews, and collecting learning analytics data.

04 > The audiences for the evaluation reports were identified as the unit of study coordinators, the Business Co-design team coordinator, the Head of Discipline, and the Associate Dean (Education).

05 > The evaluands included:

  • A new textbook
  • The introduction of online modules, which replaced two-hour recorded lectures
  • The development of short concept videos, including pencast videos explaining core concepts
  • Interactive elements incorporated in the learning management system Canvas, including Genially, H5P, Padlets, Opinion Polls, and others
  • The introduction of reflection activities on Canvas
  • Personalised feedback through the Student Relationship Engagement System (SRES)
  • The co-design process used for the development project

06 > Sources of data and collection points:

  • Student survey in Week 4 of the semester
  • Two student focus groups in Week 8 of the semester
  • Interviews with unit coordinators
  • Learning analytics data collected from Canvas
  • Unit of Study Survey (USS)

07 > Inter-dependencies: The survey responses from Week 4 contributed to the formulation of the focus group questions. The focus groups provided an opportunity to ask students about issues discussed in the survey for which we needed more detail.

08 > A visual representation of the evaluation plan is shown in Figure 2 below.

Figure 2 – BUSS1030 evaluation plan visualised
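
To tie the walkthrough together, here is how the BUSS1030 plan could be expressed using the sketch introduced earlier. The Week 4 survey and Week 8 focus group timings come from the plan above; the timing of the interviews and the USS, and all names and values, are illustrative assumptions only.

```python
# Illustrative instantiation of the earlier sketch for BUSS1030.
# Survey (Week 4) and focus group (Week 8) timings come from the plan above;
# the other collection weeks are assumptions made for illustration.

buss1030_plan = EvaluationPlan(
    unit_code="BUSS1030",
    lead="R&E associate",
    stakeholders=["unit coordinators", "educational developer",
                  "learning designer", "R&E associate"],
    audiences=["unit of study coordinators", "Business Co-design team coordinator",
               "Head of Discipline", "Associate Dean (Education)"],
    criteria=["effectiveness of each evaluand in supporting student learning"],
    sources=[
        # The Week 4 survey fed into the Week 8 focus group questions.
        DataSource("student survey", collection_week=4,
                   feeds_into=["student focus groups"]),
        DataSource("student focus groups", collection_week=8),
        DataSource("unit coordinator interviews", collection_week=13),  # timing assumed
        DataSource("Canvas learning analytics", collection_week=1, window_weeks=13),
        DataSource("Unit of Study Survey (USS)", collection_week=13),   # timing assumed
    ],
)

# Ordered collection schedule, e.g. [(1, 'Canvas learning analytics'), (4, 'student survey'), ...]
print(buss1030_plan.collection_schedule())
```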

About the Author

Dewa Wardak

Lecturer in Educational Development at the University of Sydney Business School – Fellow of the Higher Education Academy (FHEA) – Learning Scientist
