Incorrigibly plural


“Blind monks examining an elephant” by Itcho Hanabusa.

“World is crazier and more of it than we think”[1]

The story goes that a group of blind monks encounter and feel different parts of an elephant, each describing it based on their partial experience. Evaluation can feel like this. Each data collection technique and theory gives you access only to a part of the ‘whole’.

This blog is about developing and evaluating complex interventions. In this first post, I draw out some themes, to be revisited later, by looking at how we conducted a recently published study[2] (view a film about this project).

In design terms[3, 4], this was a mixed-method feasibility study: a pilot trial[5] and a qualitative case study[6]. We worked with physiotherapists to articulate an ‘intervention theory’ linking ingredients, physiological mechanisms and targets[7, 8] (Figure 1).


Figure 1. An intervention theory[2] based on rehabilitation taxonomy[7] and the ICF-CY[8].

We also developed a broader ‘programme theory’[9, 10], illustrated through a logic model[10, 11] (Figure 2), to explain how inputs and activities might lead to outcomes. This identified – and enabled us to monitor – potential sources of implementation failure[12], a key function of process evaluation[12–14].


Figure 2. A logic model to illustrate a programme theory[2].

Using different data collection methods, with formal comparisons of their findings, helps address different aspects of a research question[15]. Our logic model constructs provided the basis for a joint display table[16] (Figure 3) and a succinct summary (Figure 4) to help those involved grasp how the intervention had been implemented.


Figure 3. Extract from joint display table – rows based on logic model constructs[2].


Figure 4. Succinct summary based on logic model and joint display table[2].

To understand why implementation unfolded as it did, we used an explanatory middle-range theory[17] (Figure 5).


Figure 5. Intervention failure[2] explained through Normalisation Process Theory[17].

In the story, there really is one ‘whole’ elephant to be perceived. But the object of our inquiry is never really a single closed system[18] and always remains somewhat indeterminate. So we inform decision-makers by integrating multiple methods, perspectives and theories to produce knowledge we acknowledge as provisional and tentative[19].


This post describes work funded by the National Institute for Health Research Health Technology Assessment Programme (project number 12/144/04). The views and opinions expressed therein are those of the authors and do not necessarily reflect those of the Health Technology Assessment Programme, NIHR, NHS or the Department of Health.


  1. MacNeice L. Snow. In: Longley M, editor. Selected Poems. London: Faber and Faber; 1988:23.
  2. Hind D, Parkin J, Whitworth V, Rex S, Young T, Hampson L, et al. Aquatic therapy for children with Duchenne muscular dystrophy: a pilot feasibility randomised controlled trial and mixed-methods process evaluation. Health Technol Assess. 2017;21(27):1–120.
  3. Creswell JW. Research Design (Fourth Edition). London: Sage; 2014.
  4. Carter SM, Little M. Taking action: epistemologies, methodologies, and methods in qualitative research. Qual Health Res. 2007;17(10):1316–28.
  5. Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, et al. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol. 2010;10(1):1.
  6. Yin RK. Case Study Research: Design and Methods. London: Sage; 2014.
  7. Dijkers MP, Hart T, Whyte J, Zanca JM, Packel A, Tsaousides T. Rehabilitation treatment taxonomy: implications and continuations. Arch Phys Med Rehabil. 2014;95(1 Suppl):S45–S54.e2.
  8. Zakirova-Engstrand R, Granlund M. The International Classification of Functioning, Disability and Health – Children and Youth (ICF-CY): testing its utility in classifying information from eco-cultural family interviews with ethnically diverse families with children with disabilities in Kyrgyzstan. Disabil Rehabil. 2009;31(12):1018–1030.
  9. Leeuw FL, Donaldson SI. Theory in evaluation: Reducing confusion and encouraging debate. Evaluation. 2015;21(4):467–480.
  10. Funnell SC, Rogers PJ. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. Jossey-Bass; 2011.
  11. McLaughlin JA, Jordan GB. Logic models: a tool for telling your programs performance story. Eval Program Plann. 1999;22(1):65–72.
  12. Stufflebeam DS. The use and abuse of evaluation in Title III. Theory Pract. 1967;6(3):126–133.
  13. Linnan L, Steckler A. Process evaluation for public health interventions and research: an overview. In: Linnan L, Steckler A, editors. Process evaluation for public health interventions and research. 1st edition. San Francisco: Jossey-Bass; 2002:1–23.
  14. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.
  15. O’Cathain A, Murphy E, Nicholl J. Why, and how, mixed methods research is undertaken in health services research in England: a mixed methods study. BMC Health Serv Res. 2007;7:85.
  16. Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Ann Fam Med. 2015;13(6):554–561.
  17. May C, Finch T. Implementing, Embedding, and Integrating Practices: An Outline of Normalization Process Theory. Sociology. 2009;43(3):535–554.
  18. Richards DA. The Complex Intervention Framework. In: Richards DA, Hallberg IR, editors. Complex Interventions in Health. Oxford: Routledge; 2015:1–15.
  19. Dewey J. Logic: The Theory of Inquiry. New York: Henry Holt and Company; 1938.