2023-24 Project Catalogue

### Learning from policy ‘messes’: a systems analysis of evidence use in practice

Project ID: 2228bd1006 (You will need this ID for your application)

Research Theme: Engineering

UCL Lead department: Science, Technology, Engineering and Public Policy (STEAPP)

Lead Supervisor: Ine Steenmans

Project Summary:

This project aims to improve our understanding of ‘better’ evidence use in public policy. Despite two decades of action towards more evidence-based policy making, a critical capability gap persists (see UK Government 2019 Science Capability Review), constraining the contributions of science and engineering expertise to achieving more sustainable futures.

Several recent studies identify a need for more ‘systems-based’ perspectives to bridge the disciplinary silos that shape current evidence use in policy (e.g. GO-Science 2022 Systems thinking for civil servants). More recent efforts call for approaches that make sense of the multi-dimensional, nested-systems nature of evidence capabilities in public administrations (e.g. Nesta 2022 Engaging with Evidence Toolkit). They identify a research gap: investigating how individual, group, institutional and contextual factors combine in more, or less, successful use of policy evidence. In the words of the scientists Russell Ackoff and Ian Mitroff, we need to approach these evidence capability gaps as “wicked messes”: evidence use in policy presents a complex system of interrelated problems (e.g. cognitive biases, group capabilities and organisational practices) that must be considered as a whole rather than as separate parts.

Researchers have called for such integrated analysis: decision analysts for the development of a new “policy analytics” field (Tsoukias et al. 2013), and operational researchers for “behavioural” research on modelling practices (Lane & Rouwette 2022). However, we still lack a demonstration of how this can work in practice.

This project responds by developing a holistic diagnostic tool for learning from ‘better’ and ‘worse’ cases of evidence use in policy. The data for developing and testing the tool will come from active collaborations with two policy teams.

The researcher needs to be passionate about interdisciplinary analysis and practical, participatory tool development. They will be trained in public policy and co-production methodologies.