The CILASS Research and Evaluation Programme focused on two main areas of inquiry:
- Conceptions, experiences, practices and outcomes of inquiry-based learning; and
- The facilitation of educational change and development at institutional level.
Research and evaluation activities
CILASS pursued the following over-arching questions through a number of evaluation and research projects:
- How do students conceptualise and experience inquiry and inquiry-based learning? What is the value of inquiry pedagogies for students? What are the challenges?
- What is the role of digital technologies, information literacy development and new learning spaces in students’ experiences of inquiry and inquiry-based learning?
- How do academic staff conceptualise, design and facilitate inquiry-based learning? What approaches are found to be effective, and what are the design and facilitation challenges?
- How can ‘design for inquiry-based learning’ be supported effectively at the level of individuals (academic staff, students) and at the level of institutions?
- Why and how is CILASS achieving impact as an educational change programme and what are the challenges for, and constraints on, achieving impact?
Evaluation and research projects exploring aspects of these questions were carried out by CILASS core team staff and others, including Academic Fellows, curriculum development project leaders, Student Ambassadors and student researchers. Projects included:
- Longitudinal, qualitative studies of undergraduates’ conceptions and experiences of inquiry and inquiry-based learning, and graduating students’ experiences of inquiry-based learning;
- A study of students’ experiences and use of digital technologies in inquiry;
- Impact evaluations and ‘meta-analyses’ of CILASS-funded development projects;
- Impact evaluation of CILASS programme-level change strategies and activities;
- Personal research projects undertaken by students and CILASS Academic Fellows.
The 'Theory of Change' approach
CILASS used a 'Theory of Change' (ToC) methodology for evaluation research, both at programme level and for individual development projects. The CILASS approach was aligned with a new approach to evaluating learning and teaching development, developed and applied more generally at the University of Sheffield: an adaptation of Theories of Change programme evaluation combined with the use of EPO (Enabling, Process and Outcome) Performance Indicators.
Through backward mapping, a causal narrative or 'theory' was established, identifying evaluation indicators and becoming the basis for an evaluation plan. For example: 'to achieve the desired impact on student learning experiences, the outcomes of the initiative need to be x, y and z; in order to achieve these outcomes, the processes or activities a, b and c need to happen; in order to carry out a, b and c, the enabling factors and resources d, e and f are required'. The narrative thus identified three different types of evaluation indicator: enabling indicators, concerned with the structures, support and resources required; process indicators, concerned with the activities that needed to happen; and outcome indicators, concerned with the intermediate outcomes of an initiative and tied to broader and longer-term impact goals. The ToC approach distributed weight between outcomes, processes and enabling factors, identifying all of them as valid indicators of impact. Underlying the 'theory of change' narrative were various assumptions, beliefs and values relating to the change initiative, its context and its purposes. Exploring these in the course of impact evaluation afforded insight into why and how impact occurs.
The CILASS core team provided a range of information resources and evaluation guidance to support leaders of development projects, both within the University and at a number of other HE institutions, in engaging with Theory of Change, using the following process:
The purpose of the CILASS Theory of Change evaluation process was not to audit projects but to extend understanding of inquiry-based learning and to share experiences of implementing IBL pedagogies with the HE learning and teaching community. The key question driving this approach to evaluation was: "What have you learned about IBL from doing your project?"