Centre for Advancement in Realist Evaluation and Synthesis (CARES)

Video: Introduction to Realist Evaluation and Synthesis

Dr. Justin Jagosh introduces foundational concepts in realist methodology in an interactive seminar hosted by Engage Liverpool, January 25th, 2017.

Welcome to the website for the Centre for Advancement in Realist Evaluation and Synthesis (CARES), operated by Dr. Justin Jagosh. Realist methodology (in the tradition of Pawson and Tilley 1997; Pawson 2006, 2013) is a theory-driven approach to understanding 'what works, for whom, under what circumstances and how' for programmes, services, interventions and policies. The methodology is increasingly used for assessment across many sectors, including health and social services, crime prevention, social policy, international development and environmental studies. For an introduction to the methodology, please feel free to view the video above.

References:

Pawson R., & Tilley N. (1997). Realistic Evaluation. Thousand Oaks, CA: Sage.

Pawson R. (2006). Evidence-Based Policy: A Realist Perspective. Thousand Oaks, CA: Sage.

Pawson R. (2013). The Science of Evaluation: A Realist Manifesto. Thousand Oaks, CA: Sage.

 

Realist Evaluation and an Architectural Understanding of Social Policy: Illuminating Blind Spots

engage@liverpool
With Justin Jagosh, Ph.D.
Director, Centre for Advancement in Realist Evaluation and Synthesis (CARES)

Thursday, February 7th, 2019
1:00pm – 4:00pm
Foresight Centre
University of Liverpool
1 Brownlow St, Liverpool

Cost: free

Realist Evaluation is a methodology that addresses the questions: 'what works, for whom, under what circumstances, and how?' Building on the work of Pawson and Tilley (1997), this seminar will discuss the metaphor of "programme architecture" and how it can support a realist evaluation of complex outcomes. Clarifying the architecture of a programme or policy is a building block toward understanding how initiatives are meant to work, based on the resources introduced into contexts and on how people respond to those resources.

To do justice to complexity theorizing, it is necessary to have an open-minded, creative, emergent, theoretically sensitive appreciation of programme mechanisms and their activation in contexts. Even with this kind of appreciation, our conceptualizations highlight certain features of policy design while inevitably creating blind spots in other areas. The realist endeavour is not only to cumulate knowledge on complex outcomes, but also to illuminate the blind spots that preclude innovation in policy and programme implementation.

Pawson (2013) has noted that there are discernible classes of interventions: for example, carrots (incentives), sermons (educational and cautionary messaging), and sticks (punitive and legislative measures). Building on this line of thinking, this seminar will discuss how an architectural understanding of social policy can be used to produce realist data and uncover areas that remain persistently undertheorized.


Evidencing Generative Causal Pathways: Future Directions in Realist Inquiry
Delivered for: Realist Approaches Workshop - Founding Principles and Contemporary Developments, Northumbria University (UK), November 14th, 2018

This presentation examines how realist methodology can be applied to the task of linking distal outcomes to programme mechanisms via generative causal chains. Many programmes have lofty outcome goals for addressing entrenched and complex problems; in such cases, the way in which generative causal pathways (and rival pathways) are envisioned will shape ideas about evaluation design. Conversely, many programmes are implemented in the hope of immediately resolving acute problems. In these instances, a lack of vision of the potential longer-term ripple effects and of the direction of the causal pathway in emerging contexts can result in blindness to harms and complications attributable to such programmes.


The Importance of Understanding Context
Delivered for the School of Public Health, University of Queensland, Australia on February 13th, 2018.

Realist inquiry uses programme and middle-range theory and context-mechanism-outcome configurations to address the questions, 'what works, for whom, under what circumstances and how?' This approach to research and evaluation can be adapted in many ways – including in the design of large-scale studies, literature-based knowledge synthesis, policy appraisal, implementation scale-up, and day-to-day decision-making. In this introductory webinar, ideas from Realist Evaluation will be presented to clarify why it is important to understand context.


Three webinars on realist methodology, given in partnership with the Centre for the Evaluation of Complexity Across the Nexus (CECAN):

Realist Methodology for Complexity-Mindedness in the Food, Energy, Water and Climate Sectors: 

March 13th, 2017

Many approaches to addressing complexity in evaluation design are emerging. In part, this is a response to the inadequacy of 'complexity-thin' approaches for supporting solutions to entrenched and evolving problems. A second factor is the emergence of new complex problems alongside radical changes in society and its sectors.

In this context, the realist paradigm serves as a guide to expanding complexity-mindedness, which presumably translates into more useful and relevant knowledge products and into a community of realist evaluation specialists dedicated to developing a lens on 'context-mechanism' interactions.

This webinar will introduce realist methodological principles and explain how realism can increase our complexity-mindedness and capacity to gain causal insight.


"Context + Mechanism" Association: A key heuristic in Realist Evaluation for innovating  programmes:

June 26th, 2017

Undertaking inquiry using the realist approach involves analysing complexity in terms of context-mechanism-outcome configurations. Confusion often arises in determining whether data should be categorized as 'context' or as 'mechanism' in the process of configuring.

This webinar will offer a simple set of definitions for context, mechanism and outcome and will introduce examples of CMO configuring to demonstrate the context-mechanism interaction.

The goal is to stimulate ideas about how to define concepts, theorize programmes and configure data in realist analysis, with the ultimate ambition of increasing capacity for using realist evaluation to innovate and transform programmes.

 

Dosage, Timelines, and Other Considerations for Conducting a Realist Evaluation:

September 6th, 2017

The functioning of a programme can be determined from the realist theories that sketch out its basic architecture. This becomes increasingly evident in the process of theory testing using context-mechanism-outcome configurations, in which we try to understand how resources, when placed in environments, trigger responses to produce outcomes.

In understanding how programmes work, we may arrive at questions about dosage (e.g., how many times should we expect the resource to be delivered before the desired response is triggered?) and timelines of impact (e.g., what is a reasonable estimate of the time delay between the introduction of resources and the triggering of responses?).

Using examples, this webinar will demonstrate why dosage and timelines of impact are important considerations in the process of developing and testing realist theory.