Review article
Author: Sandy Oliver (Professor of Public Policy, IOE, UCL’s Faculty of Education and Society, London, UK)
Abstract: Ann Oakley, pioneering social researcher for nearly 60 years, is Professor of Sociology and Social Policy at IOE (Institute of Education), UCL’s Faculty of Education and Society (University College London, UK). This article explores the innovation and influence of her work and the work of her close colleagues at the Social Science Research Unit (SSRU) and its Evidence for Policy and Practice Information and Coordinating Centre (EPPI-Centre). It describes advances in research and knowledge that have their roots in listening to what women have to say about their lives. The resulting novel research methods have straddled academic boundaries – between qualitative and quantitative methodologies, between disciplines, and between academia and wider society – to enhance understanding of complex social issues and approaches to addressing them within the public sector. The impact of this work is seen in terms of influencing science, knowledge management, policy decisions, professional practice and the general public. These achievements come from approaches that are outward looking and straddle academic disciplines to produce evidence that is relevant to policymaking and to practice, with the ultimate aim being to improve day-to-day life.
Keywords: transdisciplinary research, evidence-informed policy, patient involvement, public involvement, engaged research, participatory democracy, attentive listening, collaborative learning
How to Cite: Oliver, S. (2023) “Ann Oakley: new learning and global influence from working across conventional boundaries”, London Review of Education. 21(1). doi: https://doi.org/10.14324/LRE.21.1.11
Ann Oakley, pioneering social researcher for nearly 60 years, is Professor of Sociology and Social Policy at IOE (Institute of Education), UCL’s Faculty of Education and Society (University College London, UK). She founded the Social Science Research Unit (SSRU) there in 1990, where she attracted colleagues who shared her predilection for upending conventional ways of thinking, both in professional and personal lives and in how they conducted research, focusing largely on women and children as marginalised populations. Three years later, she embarked on an ambitious challenge of reviewing social science systematically. This became a steady stream of SSRU work that evolved into the Evidence for Policy and Practice Information and Coordinating Centre (EPPI-Centre). In 2005, Oakley stepped down as director of SSRU to focus on the historical development of social science, its methods, aims and institutionalisation. Three clear themes are threaded throughout her career: gender, methodology and research that aims to actually improve people’s lives. This work inspired a growing portfolio of research, much of it led by her SSRU colleagues.
Oakley was first known for her qualitative research about women’s lives. Her emphasis on our gendered world began by taking time-use research from workplaces to family homes to produce her Sociology of Housework (Oakley, 1974). It was this early work that years later, in 2015, the Economic and Social Research Council (ESRC, n.d.: n.p.) chose to highlight as ‘a landmark work in feminist research and in how we understand the social world’ to celebrate its own 50th anniversary.
Her second wave of studies evaluated social support and education, sometimes applying the experimental design of randomised controlled trials, more commonly seen in medical research, to assess the effects of intervening in daily life. This did not replace her interest in qualitative research. Rather, the two were integrated to investigate simultaneously whether interventions had an effect, how that effect was achieved, or not, and how interventions and their effects were perceived. When Oakley established the SSRU in 1990, her mixing of experimental designs with qualitative methods was already under way. A series of such studies evaluated one-to-one support in homes, group support in clinical settings, and education programmes in schools.
Then came systematic reviews of social research. These reviews answered policy-relevant questions by identifying, appraising and synthesising the findings of existing research. Again, they had been more commonly seen addressing clinical care, but Oakley and her colleagues now applied them to social interventions, and again incorporated a broader range of methods to investigate the acceptability, feasibility and effects of interventions. Their innovative methods were adopted and adapted by research teams around the world, and their findings influenced policy decisions for improving health, education, social welfare and more.
As SSRU grew in size and influence, Oakley left the evaluations and systematic reviews of social interventions to her colleagues, and turned her attention to the history of social science through the biographies of other women who wielded influence through engaging with evidence and policy.
Throughout this work, her attention to gender and methodology, separately and together, has advanced understanding about our world by challenging both social norms and scientific norms, in particular by examining how research methods are understood and applied by different people or in different contexts.
This article explores the innovation and influence of her work and the work of her close colleagues at SSRU and its EPPI-Centre. It draws on the author’s familiarity with SSRU and the EPPI-Centre as an insider, on publications by and about SSRU and the EPPI-Centre, and on research impact case studies submitted to the Research Excellence Framework in 2014 and 2021. Rather than being comprehensive – an impossible task when writing about such prolific authors – this article traces the role and value of listening well, from 1981, when Oakley challenged textbook guidance for research interviews, to the present day, when listening well has changed research designs, policies and daily lives. Part One describes advances in research and knowledge that have their roots in listening to what women have to say about their lives. The resulting novel research methods have straddled academic boundaries – between qualitative and quantitative methodologies, between disciplines, and between academia and wider society – to enhance understanding of complex social issues and approaches to addressing them within the public sector. Part Two traces the impact of this work, in terms of how the findings have influenced science, policy decisions, professional practice and personal lives. Part Three explores the factors that influenced these achievements.
Oakley’s research focused on complex social issues that feature strongly in women’s lives, yet had been overlooked by sociologists, such as how their days were shaped by their roles in the home, or how their health was influenced by medicine and healthcare. This involved listening to women themselves. The result was not only learning more about how our world is gendered; it also led to transformations in how social research is designed and carried out. This section documents the learning gained from listening to women talk, gathering studies where young people were heard, and listening to policymakers and service providers unsure about their decisions before embarking on new studies. It also explains the concomitant advances in methodology.
Oakley’s (1981) most cited work is a book chapter that reflects on a decade of interviewing hundreds of women about housework and becoming a mother. From this experience, she saw interviewing less as an instrument for collecting data to serve the researchers’ interests, and more as a conversation with warmth, developing rapport and offering ‘non-directive comments’ and open questions to encourage interviewees to talk freely about their experiences. Listening attentively reappears in the story of SSRU as contributing to randomised controlled trials with integral process evaluations (Sawtell and Jones, 2002), when systematically reviewing studies about young people (Harden and Oliver, 2001), and engaging policymakers in shaping systematic reviews to meet their needs for research evidence (Oliver et al., 2018a).
It was through listening to women and analysing their stories that Oakley distinguished gender, the social expectations of men and women, from the biological differences of sex (Oakley, 1972), and explored the practical implications for women’s lives in their homes and in wider society (Oakley, 1974). Her focus on women’s lives, and her egalitarian approach to listening to what they had to say, earned her a reputation as a ‘feminist’ social scientist (Crow, 2020; Platt, 2007). This less hierarchical approach, being more conversational, sharing and inviting intimacy, now features prominently in feminist research (McHugh, 2014). Conversely, she interprets conventional social science as adopting a ‘masculine’ paradigm (Oakley, 1981: 32).
Her interest in childbirth drew on her experience across personal, academic and advocacy spaces, and led her to write a history of the medical care of pregnant women (Oakley, 2016). From listening to the professionals (predominantly men) who provided, indeed designed, clinical care for pregnant women, her history, The Captured Womb, documented the medicalisation of childbirth, the dominance of male obstetricians, and the inadequacy of existing research to justify the procedures they recommended (Oakley, 1984).
While writing this history, she joined a World Health Organization (WHO) Perinatal Study Group, which had 15 core members from 10 countries and several disciplines. Thinking differently from the epidemiologists and obstetricians, she brought a sociological lens to focus on ‘the “socials” – the sociological perspective, social factors and social support’ leading to the proposition that reducing mothers’ stress through social forms of care may have measurable medical benefit (Oakley, 2016: 697). At the time, all the research reports available found no ill effects of social support, but measurable desired benefits remained elusive (Elbourne et al., 1989).
Alongside her interest in childbirth, Oakley applied a gender lens to violence. She drew attention to policy and mass media reports of violence that all but ignored that the perpetrators of violence are overwhelmingly men, and the victims are overwhelmingly women (Cockburn and Oakley, 2013a, 2014). In characteristically thorough fashion, she analysed the painful personal responses, the deeply hidden data on gender, the costs in terms of lives and economics, and the roots of violence in the structures of our patriarchal society (Oakley, 2002a; Cockburn and Oakley, 2013b; Scott-Samuel et al., 2015).
Women’s experiences of childbirth and violence in the home are both topics that SSRU has subsequently addressed, with several evaluations of efforts to reduce ill health, injury and death.
After seeing the discrepancies between women’s narratives about childbirth and the sparse research about their care at that time, Oakley set about evaluating the effects of social interventions, which required designing studies differently. Just as SSRU was established, Oakley published an innovative study with a design taken from medical science and applied to social science, adding qualitative methods to the conventional quantitative ones. It was a randomised controlled trial, not of a medical treatment, but of social support from midwives for pregnant women with a history of babies with low birth weight (Oakley et al., 1990). The conversations between mother and midwife were both the social support intervention and the qualitative interview for data collection. The midwife was both service provider and researcher. The qualitative research integrated into a randomised controlled trial revealed that what women with a ‘high-risk’ pregnancy needed most was ‘more reassurance and information, recognition of the economic hardships that can be caused, and more attention to their feelings and opinions’ (Rajan and Oakley, 1990: 73), and that women offered a listening ear subsequently needed less healthcare and experienced better health, as did their babies (Oakley et al., 1990). A study spanning similar boundaries replicated the social support for disadvantaged women with new babies, with the midwives replaced by health visitors (Wiggins et al., 2004). Subsequent randomised controlled trials integrated both process and economic evaluations (Christie et al., 2014; Wiggins et al., 2020).
Against considerable resistance to this methodology (Oakley, 2006), SSRU employed it to evaluate educational interventions in schools (Oakley et al., 2004; Stephenson et al., 2004; Bonell et al., 2018; Jerrim et al., 2017). Throughout this portfolio of experimental evaluation, emphasis was on both randomly allocating participants to different conditions (Oakley et al., 2003) and evaluating the pathways through which change happened, or did not (Oakley et al., 2006).
Now, after more research, including from Oakley herself, we have evidence of social interventions offering measurable medical benefit. Programmes offering additional social support during pregnancy may help reduce antenatal hospital admissions and caesarean births, but not prevent babies being born too soon or too small (East et al., 2019).
The principle of seeking to understand and improve women’s lives by listening to them was also applied when funding was secured from the Department of Health to understand and improve young people’s lives. Rather than mounting new studies of young people, this programme of work was to synthesise the findings of existing studies.
Oakley’s first experience of systematically reviewing research literatures was of methods typically applied in the 1980s to aggregate the findings of studies addressing the effects of similar interventions – meta-analyses of randomised controlled trials (Elbourne et al., 1989) – the forerunner to the review cited above (East et al., 2019).
The following decade, she secured funding to take the methods that had been developed for synthesising research in clinical settings and adapt them for synthesising research conducted in community settings. She built a team whose work revealed a dearth of studies offering rigorous assessments of causal relationships (Peersman et al., 1999). Undeterred by literatures that could not support conclusions about effectiveness, the Department of Health encouraged the review team to ask other questions of the literature that could inform their policies: how interventions worked, or not (Harden et al., 2001), and what factors influenced health behaviours (Oliver et al., 2005). This began a journey that changed systematic reviews from looking for controlled trials that assess the effects of healthcare to looking for a broader range of studies to answer a broader range of questions using methods that are ‘fit for purpose’ (Gough et al., 2012b).
The EPPI-Centre team has adapted synthesis methods to suit policy problems in the fields of public health, social policy, global health systems, development studies, criminology, humanitarian aid and environmental science (Gough et al., 2017). Each time synthesis methods were introduced to a new field, starting with funding from the Medical Research Council to review behavioural interventions to prevent the spread of HIV/AIDS before medical treatments were available (with Deidre Fullerton and Janet Holland in 1993–4; Oakley et al., 1995), the literature inspired (and demanded) innovative methods to assess what could be justifiably learnt from the literature available. There was fierce debate as academics either resisted systematic reviews (Oakley, 2006; Thomas, in press) or accepted the principles and adapted the methods to suit their context of working (Hansen and Rieper, 2009; Langer and Stewart, 2014).
In SSRU, just as listening to research participants and practitioners was key to transforming randomised controlled trials with the integration of process evaluations, so it was with systematic reviews. Reviewing studies that included the views of children and young people involved adapting Oakley’s egalitarian methods of interviewing women. The result was an evolving tool for appraising how well researchers: involved children and young people in developing and evaluating health promotion programmes; invited informed consent; assured confidentiality; and placed importance on the perspectives of children themselves, for instance by asking open-ended questions and by attending to those perspectives during the analysis (Harden and Oliver, 2001; Rees et al., 2011).
In this way, this small stream of work within SSRU that started in 1993 with compiling controlled trials in education and social welfare (Oakley et al., 2005) grew to pioneer research synthesis methods that suited the task of making sense of different types of research literatures. Each systematic review involved some methodological development, whether this was to address differences in types of questions, types of research available, or breadth and depth of literature. Such methodological problem solving and engaging in academic debates with other research synthesis teams gradually evolved and clarified a range of synthesis designs and methods to suit different purposes and literatures available (Gough et al., 2012b). Oakley and her colleagues developed research methods to find studies of various traditions (Brunton et al., 2017), and to synthesise studies of people’s views (Harden et al., 2004), as well as the effects of social interventions.
The EPPI-Centre’s expectations of methodological rigour were considered unrealistic by health promotion practitioners, and insufficient by epidemiologists. A notable breakthrough came from interrogating the evidence from randomised controlled trials of children’s healthy eating with the findings of a qualitative synthesis of the children’s views. Each literature was reviewed according to the accepted standards of its own field. The findings indicated components of interventions likely to be beneficial (or harmful) that could be explained by qualitative understanding of children’s lives (Thomas et al., 2004). Consequently, the totality of evidence was greater than the sum of the two parts. The approach, when replicated with larger data sets, made it possible to explain two very different approaches for encouraging patients’ adherence to medicines – didactic or interactive (Candy et al., 2011, 2013). This method was qualitative comparative analysis, and when applied to the more complex challenges of understanding how engaging communities can reduce inequalities in health, it identified three distinct promising routes: involving communities in defining their health needs, in intervention design, and in intervention delivery (O’Mara-Eves et al., 2013).
Less visible in the social science literature, but of fundamental importance, was moving systematic reviewing from a paper-based activity into the digital age. Now every review, whether of qualitative or quantitative research, rests on a combination of information science and information technology. Synthesising studies that are scattered across academic disciplines, and published in journals or available through organisations, presented a challenge for managing thousands of reports (largely on paper in the twentieth century, and increasingly online in the twenty-first). It is information science that underpins the development of search strategies (Stansfield et al., 2012, 2014, 2016), and information technology that safeguards the studies identified and accelerates their analysis (Thomas et al., 2011; Stansfield et al., 2017). Regular conversations linked methodological challenges to the development of in-house software, EPPI-Reviewer. This resulted in exceptionally flexible bespoke software to support the whole review process, from identifying studies to appraising and synthesising their findings, across qualitative and quantitative research to accommodate reviews of effectiveness and reviews of people’s perspectives and experiences (Thomas et al., 2022).
All this work was part of two social movements: a move towards greater reliance on research evidence when making decisions in the public sector (Head, 2016), and a move towards greater involvement of the public in decisions about what research to conduct and how (Fransman, 2018). SSRU’s innovations in evaluation, synthesis and research use related to both the evidence movement and the public involvement movement (Oakley, 2002b; Oliver et al., 2015; Langer et al., 2016).
As evaluating or reviewing complex social situations became common, the need for different perspectives became greater. Specialist knowledge from different academic disciplines was essential, but not sufficient. Potential outcomes may be more or less important to policymakers than to those implementing a programme or to those offered it. Potential research participants may be ‘hard to reach’, at least by researchers unfamiliar with poorer neighbourhoods, unsettled populations or the languages they speak. Indeed, for one randomised controlled trial, listening to women required employing interpreters for 25 languages (Wiggins et al., 2004). Questionnaires may be hard to complete, or routine data hard to access. Interventions developed in one context may be poorly suited to another. All these challenges require knowledge from people with a stake in the research topic: the policymakers who face constraints when making decisions; managers and practitioners who face constraints when implementing decisions; and the disadvantaged populations who are most often the focus of research, as they struggle to survive on a daily basis. They all hold specialist knowledge about their context of living and working. This is why the egalitarian attitude adopted when interviewing women for data collection (Oakley, 1981) has been similarly adopted in conversations with community groups, advocacy groups, policymakers, managers and practitioners when designing and conducting research.
The principle of listening to a range of stakeholders has been applied to setting research agendas, designing evaluations and systematic reviews, recruiting research participants, collecting data, and interpreting findings. Patient advocates were invited to comment on ideas for health technology assessment and to discuss how best to invite their comments (Oliver et al., 2001a, 2001b), and ultimately to establish partnerships with practitioners for setting research agendas (Cowan and Oliver, 2021; Oliver et al., 2019). Evaluating social interventions now includes reaching women who speak different languages by training health advocates from various interpreting services (Wiggins et al., 2018). Patient advocates and practitioners joined advisory groups to guide procedures and interpret findings. Overall, SSRU’s randomised controlled trials have seen more relevant practitioners actively involved, in increasingly varied and influential study roles, and with greater capacity to contribute to the research process (Sawtell, 2019).
Similar trends can be seen in the systematic review work, beginning with the UK Department of Health, where rigorous research methods were required to produce replicable and updateable reviews that were also relevant and useful. The relevance and utility came from involving potential users in choosing the questions to be addressed, discussing how to find relevant studies, and interpreting the emerging findings (Rees and Oliver, 2017). Guided by these principles, the research methods themselves needed to be adapted to different questions and research literatures as review work expanded to inform many government departments, and particularly the Department for Education and the Department for International Development.
Maximising the relevance of reviews to policy problems and evidence needs came from review teams and potential review users engaging with each other to understand issues from different standpoints, challenging each other, and tackling technical and political constraints to shape review questions and the analysis, and then to interpret the findings (Oliver et al., 2018a). Addressing policymakers’ interests in complex systems and their complex pathways between intervention and outcome continues to drive methodological innovation (Hong et al., 2021; Kneale et al., 2018, 2020). This is particularly so for resource-poor settings, whose challenges prompted government commissioning of the Centre of Excellence for Development Impact and Learning (CEDIL); David Gough, as director of the EPPI-Centre, brought his experience of synthesis and of studying the use of research for policy decisions to his role on CEDIL’s steering group.
Methods were also developed for delivering evidence quickly to meet policy deadlines (Thomas et al., 2011, 2013; Wilson et al., 2021), including living maps of evidence to offer easy access to new studies (Shemilt et al., 2022) and information technology to accelerate review processes (Shemilt et al., 2014; O’Mara-Eves et al., 2015; Stansfield et al., 2017).
The purpose of this work was to create an impact, by influencing ways of thinking or building knowledge, by strengthening the capacity to get things done and by influencing how things are done in the wider world. Oakley can claim all these impacts.
Oakley changed ways of thinking, for instance, by making private lives a public concern, by considering children and women as minority populations, and by influencing the ethics of social science.
Taking the first of these, she expanded public debates to include private lives, starting with paying serious attention to housework as an occupation (Oakley, 1974). She drew on her personal and family histories to explore gender, patriarchy, feminism, methodology and much more (Oakley, 1996, 2014). This challenging of traditional boundaries was also expressed by her and her colleagues as they combined their academic knowledge with the networked experiential knowledge of community advocates to reshape social science studies designed to inform policy. In these ways, social science at SSRU was and remains a political activity that takes into account private lives (Hood et al., 1999).
Also political is seeing women and children as marginalised populations. While Oakley focused predominantly on women, she enjoyed fruitful discussions with her SSRU colleagues, Berry Mayall and Priscilla Alderson, who were leaders in the emerging field of sociology of childhood, and who pioneered postgraduate study of sociology of childhood and children’s rights (Moran-Ellis, 2010).
Making knowledge production more inclusive also had implications for the ethics of social science. In the early 1990s, a series of SSRU conferences took discussion of informed consent beyond medicine, law and philosophy to a wider range of social sciences and advocacy (Alderson, 1992). A continuing focus on ethics has led to the implications being captured in a practical guide (Alderson and Morrow, 2020), and to raising the focus on ethics when systematic reviewing (Harden and Oliver, 2001).
These activities demonstrate Oakley’s local impact of attracting colleagues from diverse backgrounds who were open to unfamiliar ideas, and of fostering an environment conducive to transdisciplinary research. With her growing research unit, she influenced how social research is done, and spread the skills for engaging with systematic reviews.
The methodological innovations for evaluating complex interventions and systematically reviewing diverse literatures to suit the evidence needs of decision-makers have not only stood the test of time, they have also influenced how research is done, how findings are shared and what decisions are made. When Oakley (1999) chose to adopt different research traditions, both successively for different studies and simultaneously within single studies, she drew criticism from both those who espoused qualitative ‘feminist’ methods and those who valued experimental design and precise measures of causal relationships. Just as she embraced different ways of knowing (Oakley, 2000) to study health promotion and education, under her leadership, SSRU continued to combine the two approaches, and over time the practice became more common with academics elsewhere. Advances with systematic review methods met resistance similar to that encountered with primary research. There were objections to synthesising social science using quantitative methods and, conversely, to synthesising qualitative research either at all or not using the ‘right’ method (Thomas, in press). Nevertheless, the EPPI-Centre’s book advocating a broad range of approaches to systematic reviewing (Gough, Oliver and Thomas, 2012; Gough et al., 2017) is widely cited, and listed as a set text in many accredited higher education programmes.
The mainstreaming of research methods advanced at SSRU is now apparent in national and international guidance. Aligned with Oakley’s egalitarian approach, which started with listening to women using NHS services, were SSRU’s advances in patient and public involvement in research. These were integrated into routine procedures for the NHS Health Technology Programme (Oliver et al., 2001a, 2001b), and then across the National Institute for Health and Care Research (NIHR). In particular, partnership methods for setting research agendas (Cowan and Oliver, 2021) were developed by the James Lind Alliance, and then taken in house by the NIHR to influence research agendas across the NHS.
Integrating deep listening into evaluations of complex interventions has also been widely taken up. The Medical Research Council (Moore et al., 2015) cites SSRU authors to guide process evaluations of complex interventions (Oakley et al., 2006). Broader Medical Research Council guidance on developing and evaluating complex interventions (Craig et al., 2008), and the updated version prepared with the NIHR (Skivington et al., 2021), similarly cite SSRU authors for integrating process evaluations in randomised controlled trials (Oakley et al., 2006), employing a prospective matched comparison design (Wiggins et al., 2009) and theorising the harmful consequences of public health interventions (Bonell et al., 2015).
Similarly, methods for reviewing people’s views and experiences, not only the effects of interventions, are now widely endorsed. Many EPPI-Centre publications appear in guidance for systematic reviewing published by the Centre for Reviews and Dissemination (2001). Particularly notable was Harden et al.’s (2001) work, which this handbook offered as the first example of reviewing process evaluations. The Cochrane Handbook, which offers guidance to systematic reviewers internationally, now has James Thomas at the EPPI-Centre as co-senior scientific editor (Higgins et al., 2022), and includes many EPPI-Centre advances, from planning a review to transferability of the findings. The United Nations Children's Fund (UNICEF), a United Nations agency that advocates the use of evidence, cites much of SSRU’s work in its guidance for staff on how to undertake, commission and manage evidence synthesis (Bakrania, 2020).
The manual for developing National Institute for Health and Care Excellence (NICE) guidelines cites EPPI-Centre authors for systematic review methods (Gough et al., 2012a, 2012b), in particular for identifying studies (Brunton et al., 2017) and accelerating this process with text mining (Stansfield et al., 2017), reviewing studies of people’s views (Harden et al., 2004), and implementing NICE guidance (Kneale et al., 2016a).
The WHO’s (2021) guide to evidence-informed decision-making draws on EPPI-Centre research for broad approaches to: engaging stakeholders in research and decision-making (Oliver et al., 2018c); systematically reviewing for different types of questions (Gough et al., 2012b, 2019); and engaging decision-makers with evidence through enhancing motivation, skills and access to evidence (Langer et al., 2016).
Guidance for research is not enough. Skills for conducting research and making good use of it are also required for making an impact. The EPPI-Centre has worked with policy organisations nationally and internationally to spread evidence skills through academic, policy and service organisations. When the Department of Health commissioned systematic reviews, they commissioned complementary workshops for practitioners to learn how to appraise research evidence relevant to their own work (Oliver et al., 1996). When interest in systematic reviews spread from health to other sectors, experienced systematic reviewers led workshops for policymakers and offered ongoing methodological support to novice systematic review teams. The EPPI-Centre supported 23 review groups to produce 40 reviews in education, with commentaries from decision-makers outside academia, and supported 66 review teams in international development. When the WHO invested in systematic review centres across the Global South, with EPPI-Centre support, new centres were required to engage with policymakers to identify priority questions (Alliance for Health Policy and Systems Research, 2009). Other organisations embarking on producing systematic reviews, or using them for decisions, requested discussions and support tailored to their specific interests, such as literacy and learning, food safety and agriculture. Investment in the next generation to engage with evidence came from building on the resources for continuing professional education, consolidating and developing them further for accredited teaching at postgraduate level. There is now a thriving master’s programme for social research and social policy, including an option to focus on systematic reviews.
While medicine has long been considered a research-based profession, and more so with the growth of systematic reviews, it is relatively recently that Best Evidence Medical Education (BEME) began collating research evidence about educating health professionals, with Mark Newman at SSRU being a founding member. As support and skills have grown in other areas, teaching, policing and health visiting are also maturing as research-based professions. The challenge to teaching came first in 1996, when David Hargreaves (1996), advising government ministers of education, imagined teaching being more effective and more satisfying if informed by research. Teachers’ appreciation of easy access to research, mentioned above, and the investment in this access, considered below, show that this is becoming a reality. Similar developments are happening in policing, and the EPPI-Centre has played important roles working with both professions, producing systematic reviews that address questions raised by the professions, and supporting novice teams to do the same.
In supporting the research base of professions allied to medicine, Oakley and colleagues have contributed randomised controlled trials (Wiggins et al., 2004; Christie et al., 2014) and systematic reviews (Schucan Bird et al., 2015; Caird et al., 2010). Co-authoring much of this work was Mary Sawtell, who combined her profession as a health visitor with her academic research, starting with the randomised controlled trial of health visiting support (Sawtell and Jones, 2002). In 2015, she was appointed Fellow of the Institute of Health Visiting, the national professional body for health visitors, and in 2017, she became their Research Champion, to develop a research-rich culture within the health-visiting workforce, and for the creation of opportunities for more research careers in the allied health professions.
However, individual or even team skills are insufficient for either producing evidence for decisions, or using it. Researchers and decision-makers need conducive environments where structures and networks offer support, access to evidence and procedures to routinise their evidence work. Oakley was a pioneer in this too, starting in the mid-1990s with her contributions to the Cochrane Collaboration, a growing global network committed to producing systematic reviews of healthcare and making them readily available. Her work here began with an informal network of academics interested in changing the behaviours of health professionals and the populations they served. These efforts later consolidated with groups producing systematic reviews addressing, initially, effective healthcare and organisation of practice and, considerably later, public health. These areas of interest lend themselves to crossing the boundaries of health research that were conventionally aligned with specific health conditions. Collaborations and progress here were hampered because promoting healthy behaviour was a poor fit for the fast emerging review groups who shared interests delineated by, for instance, cancer, heart disease or sexually transmitted disease. The core health work of the EPPI-Centre therefore continued to develop novel methods, synthesising qualitative and quantitative studies for tackling major health policy problems, rather than conforming to the interests of academic colleagues more interested in clinical care. Nevertheless, EPPI-Centre colleagues showed commitment to mutual sharing of knowledge and skills by taking on individual roles within Cochrane as authors, editors, methods group convenors and peer reviewers.
The story so far includes the SSRU working very closely with government departments and national research programmes to deliver research that met the needs of policymakers and the NHS. Complementing these contributions of policy-relevant research were efforts to develop evidence infrastructures, in the form of organisational intermediaries that support both demand and supply of research evidence for policy decisions.
Complementing the EPPI-Centre’s portfolio of systematic reviews by teams of largely UK academics in education was the development of a network of 35 education organisations in 23 countries across Europe. This was prompted by the European Commission’s call for projects ‘to develop knowledge brokerage mechanisms in the field of education and training … to strengthen the links between research, policy and practice … [and] to bring research to the attention of policy and decision-makers and practitioners’ (European Commission, 2009a, in Gough et al., 2011: 14). This work built on research and direct experience to understand the range of European activities linking research and policy in education, while training activities developed knowledge and skills for finding, using and interpreting research.
Other organisations took the same two-pronged approach to strengthen capacity in doing and using systematic reviews simultaneously. The Alliance for Health Policy and Systems Research developed a network of review centres across the Global South. Particular emphasis was placed on priority-setting exercises to ensure that systematic reviews addressed concerns considered important by policymakers. The Department for International Development funded academics in development studies to produce systematic reviews, in discussion with their policy teams. With policy teams initiating the questions and discussing the emerging findings, these reviews had an impressive impact on the department's work and elsewhere (Oliver et al., 2020). It was the commitment to choosing from a range of systematic review methods to suit the task in hand – the type of research question and the literature available – that prompted these organisations to invite EPPI-Centre support for their initiatives.
This maturing evidence movement was supported by knowledge infrastructures and procedures for setting research priorities, collating systematic review evidence and making that evidence available in formats accessible to decision-makers. These were all areas pioneered by SSRU.
SSRU’s innovative work of developing consensus methods for setting research agendas, described above, was originally undertaken as pilot studies within the NHS Health Technology Assessment programme and independently with the James Lind Alliance in the early 2000s. During the following decade, these ways of working were institutionalised when the NIHR took responsibility for recruiting and training advisers to guide agenda-setting partnerships.
The EPPI-Centre co-authored a Rapid Evidence Assessment Toolkit for civil servants with the Government Social Research Service (GSR and EPPI-Centre, n.d.). Its ways of working with evidence are embedded in procedures for developing health and social care guidance (NICE, 2014). Further afield, it informed the development of various Scandinavian centres, in Denmark, Sweden and Norway, and thinking about use of evidence by the Organisation for Economic Co-operation and Development (Gough, 2007). Beyond government, UNICEF has developed guidance for its own evidence-informed decision-making that draws heavily on the EPPI-Centre’s pluralistic approach to systematic reviewing (Bakrania, 2020).
While these developments suited policymakers who were commissioning evidence, it was a network of What Works Centres that developed toolkits to suit those looking for existing evidence.
Evidence toolkits have been summarising research evidence for a wide range of practitioners, such as police officers, teachers, managers and leaders. The EPPI-Centre has supported this development by contributing content and software.
The What Works Centre for Crime Reduction Toolkit offers evidence appraised and summarised by SSRU, including systematic reviews conducted at SSRU about mental health (Schucan Bird et al., 2018) and domestic violence (Vigurs et al., 2015, 2016). These systematic reviews all share the themes of complex interventions crossing sectoral boundaries, in this case, between health and justice. The review of motivational approaches to reduce domestic violence (Vigurs et al., 2015) also shares the theme of listening deeply to understand those receiving the services, rather than only those providing them.
Similarly, the findings of mixed methods SSRU research have been incorporated into toolkits for teachers and school leaders wishing to improve learning outcomes. For instance, teachers interested in hand-held devices for improving classroom feedback can learn from one of the toolkits (EEF, n.d.) that their peers were impressed by the speed of assessment and feedback, and perceived improvements in student engagement, although there were practical and organisational difficulties, and no improvement in student attainment – all of which was based on an SSRU randomised controlled trial with an integral process evaluation (Wiggins et al., 2017). Such research-based guidance is appreciated by schools (DfE, 2020), particularly during the Covid-19 pandemic (Achtaridou et al., 2022).
In time, the software developed to safeguard data and facilitate analysis in EPPI-Centre systematic reviews was introduced for review authors working with Cochrane and its sister organisation, the Campbell Collaboration, covering sectors beyond health. It also now underpins systematic reviews summarised and made available by What Works Centres for health and social care (NICE), crime reduction (College of Policing) and education (Education Endowment Foundation). The What Works Centres for youth offending and youth employment also use this and other EPPI-Centre software for visualising evidence maps.
Evidence toolkits, and the work of evidence intermediaries more generally, are works in progress, as illustrated by an analysis of What Works Centres (Gough et al., 2018). Intermediary organisations are becoming more explicit about the evidence underpinning their own aims and methods, a step that enhances confidence in claims made about the evidence available and the subsequent uptake of research evidence (Gough et al., 2022). A leading example was NICE, with its Research Support Unit, led by the EPPI-Centre, which conducted research about using real-world data, making decisions by committee and implementing guidance (Kneale et al., 2016a, 2016b; Oliver et al., 2018b).
The ultimate test of usefulness is whether systematic reviews inform policy decisions which then make a difference to personal and working lives.
SSRU’s impact on policy has been achieved less from engaging decision-makers with the findings of their research, and more from shaping the research to focus on the priorities expressed by decision-makers. Systematic reviews of smoking cessation were sharply criticised by health promotion practitioners for compiling biomedical evidence alone. After the research was redesigned to take into account health worker and service user concerns about how to help women quit smoking, and the social and emotional impacts (Oliver et al., 2001c), it informed national guidelines for maternity care in Australia, Brazil, South Africa and the UK. This mixed methods approach to systematic reviewing (Stead et al., 2013) was also influential in shaping national policy for plain tobacco packaging to discourage smoking (Chantler et al., 2014).
Further afield, reviewing qualitative studies (Magwood et al., 2018) influenced the WHO’s recommendation that women carry health records for themselves and their children (WHO, 2018), and uptake of this recommendation in Afghanistan (Saeedzai et al., 2019). Similarly, the findings of another qualitative synthesis (Eshun-Wilson et al., 2019) helped redesign a service for HIV patients struggling with their antiretroviral therapy (Arendse et al., 2021).
The next step, of implementing policy, also benefited from social scientists bringing direct experience of using or providing public services to the roll-out of national policies. SSRU’s experience of this came from policies supporting pregnant teenagers and teenage parents, and from policies for health screening of newborn babies.
Sure Start Plus was a national initiative to support pregnant young women and teenage parents. Innovative approaches were developed locally to explore different ways of delivering services, and the research team shared the findings of their evaluation at regular intervals to allow teams across the country to refine their services (Wiggins et al., 2005).
Policy implementation for screening newborn babies’ blood for rare but serious health conditions took a more centralised approach. As with earlier perinatal research, this work required the ability to navigate natural and social sciences simultaneously. The research team synthesised research about screening experiences, and updated the findings by interviewing parents and clinicians (Stewart et al., 2005; Hargreaves et al., 2005a, 2005b). This research was guided by parents, laboratory scientists, clinicians and data managers, who then guided the subsequent development of new standard practices and training for professionals, and information for parents, all based on research evidence (Stewart et al., 2011). It required SSRU staff to combine their research, facilitation, education and networking skills to work collaboratively across boundaries between professions, between service providers and users, and between the four nations of the UK.
The newborn screening research and development offered midwives training and resources to discuss screening with parents, made unbiased information available for the parents of the 810,000 babies screened each year and the nearly 1,500 who needed further diagnostic tests, and contributed to year-on-year improvements in the timeliness of screening (Institute of Education, 2014).
The rigour and relevance of SSRU’s portfolio of studies offered findings that have influenced public services across the world.
This track record of working across boundaries to create new knowledge (Part One) and that new knowledge creating an impact in the wider world (Part Two) raises questions about the critical factors that led to these two achievements.
The story in this article began in the early 1970s, when science was being recognised as a social activity, not merely a technical activity, and continued through the decades in which that social activity was increasingly open to academics working across multiple disciplines, and to people bringing personal or practical experience, rather than scientific qualifications (Collins and Evans, 2002).
A study for the Higher Education Funding Council for England (HEFCE) and Research Councils UK (RCUK) explored with researchers, funders and strategic leaders at higher education institutions the incentives and barriers to working across disciplines (Davé et al., 2016). Incentives included intellectual curiosity to answer complex research questions, a route to better quality research, broader recognition and impact, and increasingly available funding. Conversely, perceived barriers included the time and effort required to overcome disciplinary boundaries and to develop shared priorities and language, as well as disciplinary norms constraining ways of working, career pathways and evaluations of research outputs.
Since the 1970s, universities have also been going beyond encouraging public appreciation and understanding of science, and beyond supporting public debate about the value of scientific advances, and have been reaching out to invite wide discussion of, and influence over, what research should be conducted and how (Collins and Evans, 2002). Burchell et al. (2017) record the history and influence of policy interventions to encourage and assess such public engagement with research in the UK. Public engagement with research was increasingly encouraged by government funding for Beacons and Catalysts for public engagement from 2008 to 2017, by RCUK’s Concordat for Engaging the Public With Research in 2011 (Duncan and Manners, 2012), and by frameworks for Responsible Research and Innovation and a work programme of Science With and for Society from the European Commission in 2015 (European Commission Directorate–General for Research and Innovation, 2015). Public engagement and its impact were first formally assessed across UK universities in the 2014 Research Excellence Framework. Despite these policy interventions, ‘public engagement by researchers [remained] constrained by systems of funding and reward for research, teaching and other activities’ (Burchell et al., 2017: 198).
If working across and beyond academic disciplines has been so constrained, why, as early as the 1990s, was SSRU so enthusiastic in reaching out? Answers can be found in part in the composition and attributes of the research teams, and in part in their context of working.
Oakley did not wait for policy interventions or special funding opportunities before listening to, and acting on, voices from outside academia. From its inception, her research unit provided a setting conducive to engaging wider society in doing and using research. This ultimately led to innovations in national and international research programmes, to more inclusive ways of designing and implementing public programmes, and to contributions to the systems supporting both, in the UK, across Europe, in the Global South and worldwide.
The constraints that slowed progress in this direction barely applied to SSRU. Academics widely see public engagement as inevitably competing with research and teaching for time and attention (Burchell et al., 2017). In contrast, Oakley’s research unit had no opportunity to host accredited teaching for 25 years, and public engagement was considered an integral part of the unit’s research. Indeed, government contracts often required public or policy engagement in research. To a research unit sustained largely by ‘soft’ funding, often from charities or government departments, success was as much a matter of delivering useful research as of documenting the work in academic journals. A clear and shared purpose of influencing the wider world, and clear measurable goals of completing each contract, in turn align well with key attributes of interdisciplinary research teams (Lakhani et al., 2012). A more prosaic explanation is that salaries for most of Oakley’s immediate colleagues depended on keeping funders happy, with or without curricula vitae with long publication lists. This incentive prompted successive research proposals to seek funding, with sufficient success to maintain low staff turnover, retain talent and grow a cohesive team, which are other key attributes of interdisciplinary research teams (Lakhani et al., 2012).
Staff recruited from local government and the voluntary sector related well to research funders from these sectors. Thus, the research conducted by Oakley and her colleagues was largely ‘Mode 2’, conducted in the context where it was intended to be used, and often conducted in discussion with the professionals best placed to use it (Nowotny et al., 2003). Working with managers and practitioners has led to programme and practice impact, while working with policy professionals has led to policy impact. Long-term funding came from the Department of Health’s Policy Research Programme; that programme and the Department for International Development’s Knowledge to Action team helpfully supplied their own knowledge brokers from whom SSRU staff developed their own knowledge brokering skills with each successive contract.
Oakley and the colleagues who joined her at SSRU have made a specialism of creating new knowledge by working across boundaries – academic boundaries between social sciences and natural sciences, or between qualitative and quantitative methods, and boundaries between academia and wider society.
Oakley’s innovations in methodology took randomised controlled trials from clinical settings to community settings, qualitative enquiry from sociology to experimental social science, systematic reviews from healthcare to social care, and evidence-informed decision-making from the health sector to education. Her work attracted researchers who shared her enthusiasm for crossing disciplinary boundaries and upending conventional understandings. Their groundbreaking methods integrated qualitative research within randomised controlled trials and into systematic reviews. Both have led to more illuminating research findings, and have strengthened the role of practitioners in guiding and doing health and education research.
SSRU’s early advances in systematically reviewing process evaluations were attributed to working as a team (Harden, 2007). Teamwork facilitated critical debate, lateral thinking and protection against blind spots by drawing on multiple disciplines (in this case, biology, nursing, geography, psychology, sociology, social policy and education). The inability to rely on disciplinary-specific assumptions, and a shared commitment to the scientific method, including randomised controlled trials and qualitative research, inspired methodological developments for the task at hand, namely producing research tools and findings relevant to policy dilemmas that can be justified by methodological rigour. This extended to many colleagues committing personal time to doctoral studies delineated not by academic disciplines, but by a research agenda driven by policy priorities. Thesis by thesis, they contributed to a new methodology about learning from broad research literatures and different perspectives. By listening to people faced with personal, professional and policy decisions, they developed methods for: assessing the quality of qualitative research (Harden, 2007); sharing expertise within mixed teams (Stewart, 2007); sharing evidence-informed decision-making with children (Sutcliffe, 2010); involving marginalised groups in research synthesis (Liabo, 2013) to inform equitable public policy (Rees, 2017); designing search strategies for diverse literatures (Stansfield et al., 2017); evolving framework synthesis to analyse such literatures (Brunton, 2017); producing policy-relevant reviews with interpersonal interactions and institutional supports (Dickson, 2019); involving health and education practitioners in randomised controlled trials evaluating complex social interventions (Sawtell, 2019); involving children in developing paediatric medicines (Stokes, 2019); and combining research with large-scale data sets to analyse children’s outcomes spanning different policy domains (Simon, 2022).
Oakley’s motivation for her work is:
to produce evidence that is relevant to policy-making and to practice, so, the ultimate aim being to actually improve people’s lives. For me, the point of it all is not to theorize in an armchair kind of way, it’s about having some kind of practical impact, and sometimes you have that by opening a debate, by making people argue, and by highlighting an issue, like the treatment of women in childbirth, that was not regarded as an issue before.
Her social science was closely related to her activism in the field of childbirth (Oakley, 2016), and this motivation to make social and political change is shared by many of her colleagues.
In 2005, Oakley relinquished the managerial responsibilities that came with leading a research unit to focus on her (auto)biographical research and scholarship. These works make clear that the interplay between research and a strong motivation for change is nothing new (Oakley, 1996, 2007, 2011, 2014). While her colleagues continue to build on her social science methodology, her recent scholarship has revealed a long history of ‘evidence work’ advanced by women whose contributions have been ignored or undervalued, and it has drawn attention to the need for greater methodological rigour in historical and biographical work (Oakley, 2018, 2021).
Early contributions to social science and societal change were made by ‘settlement sociologists’ in the late nineteenth and early twentieth centuries (Oakley, 2017). These were mainly women who provided neighbourhood services to meet the expressed needs of people living in crowded, disadvantaged urban areas, and they simultaneously collected data about those needs to guide public policy. Sometimes living in the settlements, sociologists developed relationships organically with those they supported and researched. They followed neighbourhood agendas, and pioneered a mix of research methods for analysing data from ‘house-to-house surveys, in-depth interviews, questionnaires, personal budget-keeping, participant observation and the use of key informants [and documentary evidence], legislation, memoirs and diaries, wage and cost-of-living records, court and industrial accident reports, tax rolls and nursery rhymes’ (Oakley, 2017: 26). These projects were the forerunners of community-based participatory research.
There was another group of women combining activism and research about poverty in the early twentieth century. They were conducting empirical social science, innovating research methods to tackle sensitive issues, belonging to both reform and research organisations, and teaching and writing sociology, albeit without secure posts, at the London School of Economics (Oakley, 2020). They shared characteristics with the settlement sociologists that were echoed much later during the first 25 years of SSRU. They supported the cause of social reform, advanced empirical mixed research methods, valued the knowledge held by the communities they researched, and relied largely on charitable or ‘soft’ funds without the more secure structures of mainstream academia. They and SSRU changed social science and the wider world as a result of their work.
Academic research having an impact in the wider world depends on academic environments conducive to producing policy-relevant research, and policy environments conducive to assimilating academic findings.
With research salaries relying more on external funders than on core HEFCE budgets, the incentives in SSRU to work within disciplines were weaker than those to work across disciplines and beyond academia. The vision and skills for outreach, where SSRU already had a strong track record, gained greater legitimacy when HEFCE’s Research Excellence Framework (REF) placed greater emphasis on the impact of research in the wider world. The research unit submitted two research impact case studies to the REF in 2014, and another in 2021.
SSRU staff have long earned promotion not through their teaching, as opportunities for accredited teaching in a research unit were rare, but through their public and policy engagement. Later, UCL’s academic career framework also placed more emphasis on such external engagement, thereby providing a promotion route that suited academics who had been exceptionally outward looking.
Contexts that are welcoming or hostile to particular research findings may explain why SSRU research has met with more or less enthusiasm. A systematic review of peer-delivered health promotion in schools pioneered the integration of effectiveness and process evidence (Harden et al., 2001). Although this influenced social science, serving as a case study for a handbook of systematic review methods (Centre for Reviews and Dissemination, 2001), its findings about implementation challenges discouraged the Department of Health from adopting the policy for schools, making the Department’s consideration of this evidence invisible.
Conversely, evidence may be particularly welcome because it supports existing policy inclinations, for instance, when Médecins Sans Frontières cited a systematic review (Eshun-Wilson et al., 2019) that supported their plans for designing a service to welcome back patients with HIV who had dropped out of care. Similarly, a UKAid jobs programme announced by the UK’s prime minister when visiting India was justified by a systematic review produced with EPPI-Centre methodological guidance, training and IT (Hawkes and Ugur, 2012).
In her biographical research on Barbara Wootton (social scientist, policy activist and social reformer), Oakley (2012) draws similar conclusions about (un)conducive environments for evidence-informed policy change. Wootton led two commissions of enquiry for the government: one on cannabis in the 1960s, and the other on alternatives to prison in the 1970s. Her meticulous critique, which concluded that claims of significant harm from cannabis were unsupported by research, failed to override ideological political opinion and a dismissive mass media. In contrast, her recommendation of Community Service Orders in her second report, inspired by her professional experience as a magistrate rather than by any research, was warmly welcomed, possibly for its cost saving.
The conceptual and methodological innovations led by Oakley and her colleagues were inspired first by her commitment to listening to women talking about their lives. This attitude of deep listening transformed understandings of interviewing as a method for social science. It has since been developed as a form of social support for disadvantaged women at home, and evaluated for its potential to motivate change, improve health and avoid violence. Listening is at the core of collaborations to shape research that is relevant to policymakers, policy implementers and the communities they are meant to serve. It also led to the groundbreaking methods, described above, for including qualitative research alongside trials and in systematic reviews. Oakley’s original cottage industry of systematic reviewing was transformed by state-of-the-art information technology, which seeded evidence infrastructures now used by policy organisations across the UK, and by research organisations worldwide.
While Oakley’s colleagues continue to build on her earlier methodological work, her recent scholarship has revealed a long history of ‘evidence work’ that has often been advanced by women who, like her, have brought academic thinking closer to personal lives, and vice versa. She has been awarded honorary degrees by the University of Salford (1995), the University of Edinburgh (2012) and UCL (2018). The importance of her work was recognised by the Academy of Social Sciences when she became an Academician in 2009, and by the British Sociological Association with one of their first Lifetime Achievement Awards in 2011 for her ‘extraordinary contribution to the history of the development of sociology in Britain’. More recently, she was one of 12 exceptional women associated with UCL whose work was celebrated in Female Firsts, an art project to mark the 100th anniversary of women receiving the right to vote.
Oakley’s local legacy is SSRU, with its fieldwork and its studies of existing literature, rooted in strong motivations to collaborate across and beyond academia, with local authorities, government departments, community advocates, and practitioners in the public sector. All this work has an impressive reach across policy sectors (education, health, social welfare, transport, culture, environmental conservation, and charity and philanthropy), from producing evidence to studying decision-making with evidence, building systems for both, and improving professional practice and personal lives in the UK, across Europe and worldwide.
I am very grateful to SSRU colleagues who discussed and commented on earlier versions of this article – Meg Wiggins, Mary Sawtell and James Thomas – and to the anonymous peer reviewers who offered additional insights. I owe particular thanks to Ann Oakley for tolerating my story of her work and influence, with my emphases and omissions.
The author, as an academic in the Social Science Research Unit, offers these insider reflections on Ann Oakley’s work and impact through the research unit she founded. Every effort has been made to sufficiently anonymise the author during peer review of this article. The author declares no further conflicts of interest with this article.
Achtaridou, E; Mason, E; Behailu, A; Stiell, B; Willis, B; Coldwell, M. (2022). School Recovery Strategies: Year 1 findings. London: Department for Education. Accessed 5 January 2023. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1045471/School_Recovery_Strategies_year_1_findings.pdf .
Alderson, P. (1992). Consent to Health Treatment and Research: Differing perspectives. London: University of London, Institute of Education, Social Science Research Unit.
Alderson, P; Morrow, V. (2020). The Ethics of Research with Children and Young People: A practical handbook. 2nd ed. New York: SAGE.
Alliance for Health Policy and Systems Research. (2009). Systematic Reviews in Health Policy and Systems Research. Briefing Note 4. Geneva: World Health Organization. Accessed 5 January 2023. https://ahpsr.who.int/docs/librariesprovider11/briefing-notes/alliancehpsr_briefingnote4.pdf?sfvrsn=f624f623_1&download=true .
Arendse, K.D; Pfaff, C; Makeleni-Leteze, T; Dutyulwa, T; Keene, C.M; Mantangana, N; Malabi, N; Lebelo, K; Beneke, E; Euvrard, J; Dumile, N; Roberts, E; Hlalukana, B; Cassidy, T. (2021). ‘Addressing disengagement from HIV healthcare services in Khayelitsha, South Africa, through Médecins Sans Frontières’ Welcome Service approach: Comprehensive clinical and patient centered care’. Paper presented at the 11th IAS Conference on HIV Science, 18–21 July 2021. Accessed 10 January 2023. https://theprogramme.ias2021.org/Abstract/Abstract/1662 .
Bakrania, S. (2020). Methodological Briefs on Evidence Synthesis. Brief 1: Overview. Innocenti Research Briefs no. 2020-01. Florence: UNICEF Office of Research. Accessed 10 January 2023. https://www.unicef-irc.org/publications/1077-methodological-briefs-on-evidence-synthesis-brief-1-overview.html .
Bonell, C; Allen, E; Warren, E; McGowan, J; Bevilacqua, L; Jamal, F; Legood, R; Wiggins, M; Opondo, C; Mathiot, A; Sturgess, J; Fletcher, A; Sadique, Z; Elbourne, D; Christie, D; Bond, L; Scott, S; Viner, R.M. (2018). ‘Effects of the Learning Together intervention on bullying and aggression in English secondary schools (INCLUSIVE): A cluster randomised controlled trial’. The Lancet 392 (10163) : 2452–64, DOI: http://dx.doi.org/10.1016/S0140-6736(18)31782-3 [PubMed]
Bonell, C; Jamal, F; Melendez-Torres, G.J; Cummins, S. (2015). ‘“Dark logic”: Theorising the harmful consequences of public health interventions’. Journal of Epidemiology and Community Health 69 (1) : 95–8, DOI: http://dx.doi.org/10.1136/jech-2014-204671 [PubMed]
Brunton, G. (2017). ‘Innovation in Systematic Review Methods: Successive developments in framework synthesis’. PhD thesis. London, UK: UCL IOE.
Brunton, G; Stansfield, C; Thomas, J. (2017). ‘Finding relevant studies’. An Introduction to Systematic Reviews. 2nd ed. Gough, D; Oliver, S; Thomas, J (eds.), London: SAGE, pp. 93–132.
Burchell, K; Sheppard, C; Chambers, J. (2017). ‘A “work in progress”?: UK researchers and participation in public engagement’. Research for All 1 (1) : 198–224, DOI: http://dx.doi.org/10.18546/RFA.01.1.16
Caird, J; Rees, R; Kavanagh, J; Sutcliffe, K; Oliver, K; Dickson, K; Woodman, J; Barnett-Page, E; Thomas, J. (2010). The Socioeconomic Value of Nursing and Midwifery: A rapid systematic review of reviews. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.
Candy, B; King, M; Jones, L; Oliver, S. (2011). ‘Using qualitative synthesis to explore heterogeneity of complex interventions’. BMC Medical Research Methodology 11 (1) : 1–9, DOI: http://dx.doi.org/10.1186/1471-2288-11-124
Candy, B; King, M; Jones, L; Oliver, S. (2013). ‘Using qualitative evidence on patients' views to help understand variation in effectiveness of complex interventions: A qualitative comparative analysis’. Trials 14 (1) : 179. DOI: http://dx.doi.org/10.1186/1745-6215-14-179
Centre for Reviews and Dissemination. (2001). Undertaking Systematic Reviews of Research on Effectiveness: CRD guidance for those carrying out or commissioning reviews. York: Centre for Reviews and Dissemination, University of York.
Chantler, C; Jay, T; Cox, C; Wang, Y; Peacock, J; Pope, C; Collis, J; Ransford, M. (2014). Standardised Packaging of Tobacco: Report of the independent review undertaken by Sir Cyril Chantler. King’s College London. Accessed 10 January 2023. https://content.tobaccotactics.org/uploads/2021/07/C-Chantler_Standardised-packaging-of-tobacco-Report-of-the-independent-review-undertaken-by-Sir-Cyril-Chantler_April-2014.pdf .
Christie, D; Thompson, R; Sawtell, M; Allen, E; Cairns, J; Smith, F; Jamieson, E; Hargreaves, K; Ingold, A; Brooks, L; Wiggins, M; Oliver, S; Jones, R; Elbourne, D; Santos, A; Wong, I.C.K; O’Neill, S; Strange, V; Hindmarsh, P; Annan, F; Viner, R. (2014). ‘Structured, intensive education maximising engagement, motivation and long-term change for children and young people with diabetes: A cluster randomised controlled trial with integral process and economic evaluation – the CASCADE study’. Health Technology Assessment 18 (20) : 1. DOI: http://dx.doi.org/10.3310/hta18200
Cockburn, C; Oakley, A. (2013a). ‘Sexual exploitation in street gangs: Protecting girls or changing boys?’. OpenDemocracy, 9 December 2013. Accessed 10 January 2023. https://www.opendemocracy.net/en/5050/sexual-exploitation-in-street-gangs-protecting-girls-or-changing-bo/ .
Cockburn, C; Oakley, A. (2013b). ‘The cost of masculine crime’. OpenDemocracy, 8 March 2013. Accessed 10 January 2022. https://www.opendemocracy.net/en/5050/cost-of-masculine-crime/ .
Cockburn, C; Oakley, A. (2014). ‘Domestic violence must be about prevention as well as protection’ [letter]. The Guardian, 28 February 2014.
Collins, H.M; Evans, R. (2002). ‘The third wave of science studies: Studies of expertise and experience’. Social Studies of Science 32 (2) : 235–96, DOI: http://dx.doi.org/10.1177/0306312702032002003
Cowan, K; Oliver, S. (2021). The James Lind Alliance Guidebook. Oxford: James Lind Alliance. Version 10. Accessed 5 January 2023. https://www.jla.nihr.ac.uk/jla-guidebook/downloads/JLA-Guidebook-Version-10-March-2021.pdf .
Craig, P; Dieppe, P; Macintyre, S; Michie, S; Nazareth, I; Petticrew, M. (2008). ‘Developing and evaluating complex interventions: The new Medical Research Council guidance’. BMJ 337 : a1655. DOI: http://dx.doi.org/10.1136/bmj.a1655
Crow, G. (2020). ‘Oakley, Ann’. SAGE Research Methods Foundations. Atkinson, P; Delamont, S; Cernat, A; Sakshaug, J.W; Williams, R.A (eds.), London: SAGE, DOI: http://dx.doi.org/10.4135/9781526421036887377 . Accessed 5 January 2023.
Davé, A; Hopkins, M; Hutton, J; Krčál, A; Kolarz, P; Martin, B; Nielsen, K; Rafols, I; Rotolo, D; Simmonds, P; Stirling, A. (2016). Landscape Review of Interdisciplinary Research in the UK: Report to HEFCE and RCUK by Technopolis and the Science Policy Research Unit (SPRU), University of Sussex. Brighton: Technopolis. Accessed 10 January 2023. http://sro.sussex.ac.uk/id/eprint/65332/1/2016HEFCE_Landscape%20review%20of%20UK%20interdisciplinary%20research.pdf .
DfE (Department for Education). (2020). The School Snapshot Survey: Winter 2019. 2: Workforce. Accessed 5 January 2023. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/903642/2._Workforce__6104.01_Winter_2019_.pdf .
Dickson, K.E. (2019). ‘Systematic Reviews to Inform Policy: Institutional mechanisms and social interactions to support their production’. PhD thesis. London, UK: UCL.
Duncan, S; Manners, P. (2012). ‘Embedding public engagement within higher education: Lessons from the Beacons for Public Engagement in the United Kingdom’. Higher Education and Civic Engagement. McIlrath, L; Lyons, A; Munck, R (eds.), New York: Palgrave Macmillan, pp. 221–40.
East, C.E; Biro, M.A; Fredericks, S; Lau, R. (2019). ‘Support during pregnancy for women at increased risk of low birthweight babies’. Cochrane Database of Systematic Reviews (3) : CD000198. DOI: http://dx.doi.org/10.1002/14651858.CD000198.pub3
EEF (Education Endowment Foundation). (n.d.). ‘Testing the impact of hand-held devices on improving classroom feedback and pupil attainment’. Accessed 10 January 2023. https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/learner-response-system .
Elbourne, D; Oakley, A; Chalmers, I. (1989). ‘Social and psychological support during pregnancy’. Effective Care in Pregnancy and Childbirth. Chalmers, I; Enkin, M; Keirse, M.J.N.C (eds.), Oxford: Oxford University Press, pp. 221–36.
Eshun-Wilson, I; Rohwer, A; Hendricks, L; Oliver, S; Garner, P. (2019). ‘Being HIV positive and staying on antiretroviral therapy in Africa: A qualitative systematic review and theoretical model’. PLoS ONE 14 (1) : e0210408. DOI: http://dx.doi.org/10.1371/journal.pone.0210408
ESRC (Economic and Social Research Council). (n.d.). ‘Putting a price on housework’. Accessed 10 January 2023. https://webarchive.nationalarchives.gov.uk/ukgwa/20220208115311/https://esrc.ukri.org/about-us/50-years-of-esrc/50-achievements/putting-a-price-on-housework/ .
European Commission Directorate–General for Research and Innovation. (2015). Indicators for Promoting and Monitoring Responsible Research and Innovation: Report from the Expert Group on policy indicators for responsible research and innovation. Luxembourg: European Commission, DOI: http://dx.doi.org/10.2777/9742
Fransman, J. (2018). ‘Charting a course to an emerging field of “research engagement studies”: A conceptual meta-synthesis’. Research for All 2 (2) : 185–229, DOI: http://dx.doi.org/10.18546/RFA.02.2.02
Gough, D. (2007). ‘The Evidence for Policy and Practice Information and Co-ordinating (EPPI) Centre, United Kingdom’. In Evidence in Education: Linking research and policy. OECD–CERI. Paris: OECD Publishing, pp. 63–9, DOI: http://dx.doi.org/10.1787/9789264033672-5-en
Gough, D; Maidment, C; Sharples, J. (2018). UK What Works Centres: Aims, methods and contexts. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Gough, D; Maidment, C; Sharples, J. (2022). ‘Enabling knowledge brokerage intermediaries to be evidence-informed’. Evidence & Policy 18 (4) : 746–60, DOI: http://dx.doi.org/10.1332/174426421X16353477842207
Gough, D; Oliver, S; Thomas, J (eds.). (2012). An Introduction to Systematic Reviews. London: SAGE.
Gough, D; Oliver, S; Thomas, J (eds.). (2017). An Introduction to Systematic Reviews. 2nd ed. London: SAGE.
Gough, D; Thomas, J; Oliver, S. (2012). ‘Clarifying differences between review designs and methods’. Systematic Reviews 1 : 28. DOI: http://dx.doi.org/10.1186/2046-4053-1-28
Gough, D; Thomas, J; Oliver, S. (2019). ‘Clarifying differences between reviews within evidence ecosystems’. Systematic Reviews 8 (1) : 170. DOI: http://dx.doi.org/10.1186/s13643-019-1089-2
Gough, D; Tripney, J; Kenny, C; Buk-Berge, E. (2011). Evidence Informed Policy in Education in Europe: EIPEE final project report. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.
GSR and EPPI-Centre (Government Social Research Service and Evidence for Policy and Practice Information and Coordinating Centre). (n.d.). ‘Rapid Evidence Assessment Toolkit’. Accessed 10 January 2023. https://webarchive.nationalarchives.gov.uk/ukgwa/20140402164155/http:/www.civilservice.gov.uk/networks/gsr/resources-and-guidance/rapid-evidence-assessment .
Hansen, H.F; Rieper, O. (2009). ‘The evidence movement: The development and consequences of methodologies in review practices’. Evaluation 15 (2) : 141–63, DOI: http://dx.doi.org/10.1177/1356389008101968
Harden, A. (2007). ‘Qualitative Research, Systematic Reviews, and Evidence-Informed Policy and Practice’. PhD thesis. London, UK: UCL.
Harden, A; Garcia, J; Oliver, S; Rees, R; Shepherd, J; Brunton, G; Oakley, A. (2004). ‘Applying systematic review methods to studies of people's views: An example from public health research’. Journal of Epidemiology and Community Health 58 : 794–800, DOI: http://dx.doi.org/10.1136/jech.2003.014829
Harden, A; Oakley, A; Oliver, S. (2001). ‘Peer-delivered health promotion for young people: A systematic review of different study designs’. Health Education Journal 60 (4) : 339–53, DOI: http://dx.doi.org/10.1177/001789690106000406
Harden, A; Oliver, S. (2001). ‘Who’s listening? Systematically reviewing for ethics and empowerment’. Using Research for Effective Health Promotion. Oliver, S; Peersman, G (eds.), Buckingham: Open University Press, pp. 123–37.
Hargreaves, D. (1996). ‘Teaching as a research-based profession: Possibilities and prospects’. The Teacher Training Agency Annual Lecture. Accessed 10 January 2023. http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/TTA%20Hargreaves%20lecture.pdf .
Hargreaves, K; Stewart, R; Oliver, S. (2005a). ‘Informed choice and public health screening for children: The case of blood spot screening’. Health Expectations 8 (2) : 161–71, DOI: http://dx.doi.org/10.1111/j.1369-7625.2005.00324.x
Hargreaves, K; Stewart, R; Oliver, S. (2005b). ‘Newborn screening information supports public health more than informed choice’. Health Education Journal 64 (2) : 110–9, DOI: http://dx.doi.org/10.1177/001789690506400203
Hawkes, D; Ugur, M. (2012). Evidence on the Relationship Between Education, Skills and Economic Growth in Low-Income Countries: A systematic review. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.
Head, B.W. (2016). ‘Toward more “evidence-informed” policy making?’. Public Administration Review 76 : 472–84, DOI: http://dx.doi.org/10.1111/puar.12475
Higgins, J.P.T; Thomas, J; Chandler, J; Cumpston, M; Li, T; Page, M.J; Welch, V.A (eds.). (2022). Cochrane Handbook for Systematic Reviews of Interventions, Version 6.3 (updated February 2022). Cochrane. Accessed 5 January 2023. www.training.cochrane.org/handbook .
Hong, Q.N; Bangpan, M; Stansfield, C; Kneale, D; O’Mara-Eves, A; Van Grootel, L; Thomas, J. (2021). ‘Using systems perspectives in evidence synthesis: A methodological mapping review’. Research Synthesis Methods 13 : 667–80, DOI: http://dx.doi.org/10.1002/jrsm.1595
Hood, S; Mayall, B; Oliver, S. (1999). Critical Issues in Social Research: Power and prejudice. Buckingham: Open University Press.
Institute of Education. (2014). ‘Neonatal screening: Educating parents and health professionals to improve children’s health’. Impact case study (REF3b). Accessed 10 January 2023. https://impact.ref.ac.uk/casestudies/CaseStudy.aspx?Id=44329 .
Jerrim, J.P; Macmillan, L; Micklewright, J; Sawtell, M; Wiggins, M. (2017). ‘Does teaching children how to play cognitively demanding games improve their educational attainment? Evidence from a randomised controlled trial of chess instruction in England’. Journal of Human Resources 53 (4) : 993–1021, DOI: http://dx.doi.org/10.3368/jhr.53.4.0516.7952R
Kneale, D; Goldman, R; Thomas, J. (2016a). A Scoping Review Characterising the Activities and Landscape Around Implementing NICE Guidance. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Kneale, D; Khatwa, M; Thomas, J. (2016b). Identifying and Appraising Promising Sources of UK Clinical, Health and Social Care Data for Use by NICE. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Kneale, D; O’Mara-Eves, A; Rees, R; Thomas, J. (2020). ‘School closure in response to epidemic outbreaks: Systems-based logic model of downstream impacts [version 1; peer review: 2 approved]’. F1000Research 9 : 352. DOI: http://dx.doi.org/10.12688/f1000research.23631.1
Kneale, D; Thomas, J; Bangpan, M; Waddington, H; Gough, D. (2018). ‘Conceptualising causal pathways in systematic reviews of international development interventions through adopting a causal chain analysis approach’. Journal of Development Effectiveness 10 (4) : 422–37, DOI: http://dx.doi.org/10.1080/19439342.2018.1530278
Lakhani, J; Benzies, K; Hayden, A. (2012). ‘Attributes of interdisciplinary research teams: A comprehensive review of the literature’. Clinical and Investigative Medicine 35 (5) : E260–5, DOI: http://dx.doi.org/10.25011/cim.v35i5.18698
Langer, L; Stewart, R. (2014). ‘What have we learned from the application of systematic review methodology in international development? – A thematic overview’. Journal of Development Effectiveness 6 (3) : 236–48, DOI: http://dx.doi.org/10.1080/19439342.2014.919013
Langer, L; Tripney, J; Gough, D. (2016). The Science of Using Science: Researching the use of research evidence in decision-making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Liabo, K. (2013). ‘Service User Involvement in Research: Collaborating on a systematic review with young people who have experience of being in care’. PhD thesis. London, UK: UCL.
Magwood, O; Kpade, V; Afza, R; Oraka, C; McWhirter, J; Oliver, S; Pottie, K. (2018). ‘Understanding mothers’, caregivers’, and providers’ experiences with home-based records: A systematic review of qualitative studies’. PLoS ONE 13 (10) : e0204966. DOI: http://dx.doi.org/10.1371/journal.pone.0204966
McHugh, M.C. (2014). ‘Feminist qualitative research: Toward transformation of science and society’. The Oxford Handbook of Qualitative Research. Leavy, P (ed.), Oxford: Oxford University Press, pp. 137–64.
Moore, G.F; Audrey, S; Barker, M; Bond, L; Bonell, C; Hardeman, W; Moore, L; O’Cathain, A; Tinati, T; Wight, D; Baird, J. (2015). ‘Process evaluation of complex interventions: Medical Research Council guidance’. BMJ 350 : h1258. DOI: http://dx.doi.org/10.1136/bmj.h1258
Moran-Ellis, J. (2010). ‘Reflections on the sociology of childhood in the UK’. Current Sociology 58 (2) : 186–205, DOI: http://dx.doi.org/10.1177/0011392109354241
NICE (National Institute for Health and Care Excellence). (2014). ‘Developing NICE guidelines: The manual’. Accessed 5 January 2023. www.nice.org.uk/process/pmg20 .
Nowotny, H; Scott, P; Gibbons, M. (2003). ‘Introduction: “Mode 2” revisited: The new production of knowledge’. Minerva 41 (3) : 179–94, DOI: http://dx.doi.org/10.1023/A:1025505528250
Oakley, A. (1972). Sex, Gender and Society. London: Temple Smith.
Oakley, A. (1974). The Sociology of Housework. London: Martin Robertson.
Oakley, A. (1981). ‘Interviewing women: A contradiction in terms?’. Doing Feminist Research. Roberts, H (ed.), London: Routledge and Kegan Paul, pp. 30–61.
Oakley, A. (1984). The Captured Womb: A history of the medical care of pregnant women. Oxford: Blackwell.
Oakley, A. (1996). Man and Wife: Richard and Kay Titmuss, my parents' early years. London: HarperCollins.
Oakley, A. (1999). ‘Paradigm wars: Some thoughts on a personal and public trajectory’. International Journal of Social Research Methodology 2 (3) : 247–54, DOI: http://dx.doi.org/10.1080/136455799295041
Oakley, A. (2000). Experiments in Knowing: Gender and method in the social sciences. Cambridge: Polity Press.
Oakley, A. (2002a). Gender on Planet Earth. Cambridge: Polity Press.
Oakley, A. (2002b). ‘Social science and evidence-based everything: The case of education’. Educational Review 54 (3) : 277–86, DOI: http://dx.doi.org/10.1080/0013191022000016329
Oakley, A. (2006). ‘Resistances to “new” technologies of evaluation: Education research in the UK as a case study’. Evidence & Policy: A journal of research, debate and practice 2 (1) : 63–87, DOI: http://dx.doi.org/10.1332/174426406775249741
Oakley, A. (2007). Fracture: Adventures of a broken body. Bristol: Policy Press.
Oakley, A. (2011). A Critical Woman: Barbara Wootton, social science and public policy in the twentieth century. London: Bloomsbury Academic.
Oakley, A. (2012). ‘The strange case of the two Wootton Reports: What can we learn about the evidence–policy relationship?’. Evidence & Policy: A journal of research, debate and practice 8 (3) : 267–83, DOI: http://dx.doi.org/10.1332/174426412X654022
Oakley, A. (2014). Father and Daughter: Patriarchy, gender and social science. Bristol: Policy Press.
Oakley, A. (2016). ‘The sociology of childbirth: An autobiographical journey through four decades of research’. Sociology of Health & Illness 38 (5) : 689–705, DOI: http://dx.doi.org/10.1111/1467-9566.12400
Oakley, A. (2017). ‘The forgotten example of “settlement sociology”: Gender, research, communities, universities and policymaking in Britain and the USA, 1880–1920’. Research for All 1 (1) : 20–34, DOI: http://dx.doi.org/10.18546/RFA.01.1.03
Oakley, A. (2018). Women, Peace and Welfare: A suppressed history of social reform, 1880–1920. Bristol: Policy Press.
Oakley, A. (2020). ‘Women, the early development of sociological research methods in Britain and the London School of Economics: A (partially) retrieved history’. Sociology 54 (2) : 292–331, DOI: http://dx.doi.org/10.1177/0038038519868631
Oakley, A. (2021). Forgotten Wives: How women get written out of history. Bristol: Policy Press.
Oakley, A; Fullerton, D; Holland, J. (1995). ‘Behavioural interventions for HIV/AIDS prevention’. AIDS 9 : 479–86, DOI: http://dx.doi.org/10.1097/00002030-199509050-00010
Oakley, A; Gough, D; Oliver, S; Thomas, J. (2005). ‘The politics of evidence and methodology: Lessons from the EPPI-Centre’. Evidence and Policy 1 (1) : 5–31, DOI: http://dx.doi.org/10.1332/1744264052703168
Oakley, A; Rajan, L; Grant, A.M. (1990). ‘Social support and pregnancy outcome’. British Journal of Obstetrics and Gynaecology 97 : 155–62, DOI: http://dx.doi.org/10.1111/j.1471-0528.1990.tb01741.x
Oakley, A; Strange, V; Bonell, C; Allen, E; Stephenson, J. (2006). ‘Process evaluation in randomised controlled trials of complex interventions’. BMJ 332 : 413–6, DOI: http://dx.doi.org/10.1136/bmj.332.7538.413
Oakley, A; Strange, V; Stephenson, J; Forrest, S; Monteiro, H; the RIPPLE study team. (2004). ‘Evaluating processes: A case study of a randomized controlled trial of sex education’. Evaluation 10 (4) : 440–62, DOI: http://dx.doi.org/10.1177/1356389004050220
Oakley, A; Strange, V; Toroyan, T; Wiggins, M; Roberts, I; Stephenson, J. (2003). ‘Using random allocation to evaluate social interventions: Three recent UK examples’. Annals of the American Academy of Political and Social Science 589 : 170–89, DOI: http://dx.doi.org/10.1177/0002716203254765
Oliver, S; Anand, K; Bangpan, M. (2020). Investigating the Impact of Systematic Reviews Funded by DFID. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education.
Oliver, S; Bangpan, M; Dickson, K. (2018a). ‘Producing policy relevant systematic reviews: Navigating the policy–research interface’. Evidence and Policy 14 (2) : 197–220, DOI: http://dx.doi.org/10.1332/174426417X14987303892442
Oliver, S; Harden, A; Rees, R; Shepherd, J; Brunton, G; Garcia, J; Oakley, A. (2005). ‘An emerging framework for including different types of evidence in systematic reviews for public policy’. Evaluation 11 (4) : 428–46, DOI: http://dx.doi.org/10.1177/1356389005059383
Oliver, S; Hollingworth, K; Briner, B; Swann, C; Hinds, K; Roche, C. (2018b). ‘Effective and efficient committee work: A systematic overview of multidisciplinary literatures’. Evidence Base (2) : 1–21, DOI: http://dx.doi.org/10.21307/eb-2018-002
Oliver, S; Liabo, K; Stewart, R; Rees, R. (2015). ‘Public involvement in research: Making sense of the diversity’. Journal of Health Services Research & Policy 20 (1) : 45–51, DOI: http://dx.doi.org/10.1177/1355819614551848
Oliver, S; Milne, R; Bradburn, J; Buchanan, P; Kerridge, L; Walley, T; Gabbay, J. (2001a). ‘Involving consumers in a needs-led research programme: A pilot project’. Health Expectations 4 (1) : 18–28, DOI: http://dx.doi.org/10.1046/j.1369-6513.2001.00113.x
Oliver, S; Milne, R; Bradburn, J; Buchanan, P; Kerridge, L; Walley, T; Gabbay, J. (2001b). ‘Investigating consumer perspectives on evaluating health technologies’. Evaluation 7 (4) : 468–86, DOI: http://dx.doi.org/10.1177/13563890122209847
Oliver, S; Nicholas, A; Oakley, A. (1996). PHASE: Promoting Health After Sifting the Evidence: Workshop report. London: EPPI-Centre Report, Social Science Research Unit, Institute of Education.
Oliver, S; Oakley, L; Lumley, J; Waters, E. (2001c). ‘Smoking cessation programmes in pregnancy: Systematically addressing development, implementation, women’s concerns and effectiveness’. Health Education Journal 60 (4) : 362–70, DOI: http://dx.doi.org/10.1177/001789690106000408
Oliver, S; Roche, C; Stewart, R; Bangpan, M; Dickson, K; Pells, K; Cartwright, N; Hargreaves, J; Gough, D. (2018c). Stakeholder Engagement for Development Impact Evaluation and Evidence Synthesis. CEDIL Inception Paper 3. London: The Centre of Excellence for Development Impact and Learning (CEDIL).
Oliver, S; Uhm, S; Duley, L; Crowe, S; David, A.L; James, C.P; Chivers, Z; Gyte, G; Gale, C; Turner, M; Chambers, B; Dowling, I; McNeill, J; Alderdice, F; Shennan, A; Deshpande, S. (2019). ‘Top research priorities for preterm birth: Results of a prioritisation partnership between people affected by preterm birth and healthcare professionals’. BMC Pregnancy Childbirth 19 (1) : 528. DOI: http://dx.doi.org/10.1186/s12884-019-2654-3 [PubMed]
O’Mara-Eves, A; Brunton, G; McDaid, D; Oliver, S; Kavanagh, J; Jamal, F; Matosevic, T; Harden, A; Thomas, J. (2013). ‘Community engagement to reduce inequalities in health: A systematic review, meta-analysis and economic analysis’. Public Health Research 1 (4) DOI: http://dx.doi.org/10.3310/phr01040
O’Mara-Eves, A; Thomas, J; McNaught, J; Miwa, M; Ananiadou, S. (2015). ‘Using text mining for study identification in systematic reviews: A systematic review of current approaches’. Systematic Reviews 4 (1) : 1–22, DOI: http://dx.doi.org/10.1186/2046-4053-4-5
Peersman, G.V; Oakley, A.R; Oliver, S.R. (1999). ‘Evidence-based health promotion? Some methodological challenges’. International Journal of Health Promotion and Education 37 (2) : 59–64, DOI: http://dx.doi.org/10.1080/14635240.1999.10806096
Platt, J. (2007). ‘The women’s movement and British journal articles, 1950–2004’. Sociology 41 (5) : 961–75, DOI: http://dx.doi.org/10.1177/0038038507080448
Rajan, L; Oakley, A. (1990). ‘Low birth weight babies: The mother's point of view’. Midwifery 6 (2) : 73–85, DOI: http://dx.doi.org/10.1016/s0266-6138(05)80151-2
Rees, R. (2017). ‘Inclusive Approaches to the Synthesis of Qualitative Research: Harnessing perspectives and participation to inform equitable public health policy’. PhD thesis. London, UK: UCL.
Rees, R; Oliver, K; Woodman, J; Thomas, J. (2011). ‘The views of young children in the UK about obesity, body size, shape and weight: A systematic review’. BMC Public Health 11 : 188. DOI: http://dx.doi.org/10.1186/1471-2458-11-188
Rees, R; Oliver, S. (2017). ‘Stakeholder perspectives and participation in systematic reviews’. An Introduction to Systematic Reviews. Gough, D; Oliver, S; Thomas, J (eds.), London: SAGE, pp. 19–42.
Saeedzai, S.A; Sadaat, I; Anwari, Z; Hemat, S; Hadad, S; Osaki, K; Asaba, M; Ishiguro, Y; Mudassir, R; Machlin Burke, J; Higgins-Steele, A; Yousufi, K; Edmond, K.M. (2019). ‘Home-based records for poor mothers and children in Afghanistan, a cross sectional population based study’. BMC Public Health 19 : 766. DOI: http://dx.doi.org/10.1186/s12889-019-7076-7
Sawtell, M. (2019). ‘An Exploration of Practitioner-Researcher Collaboration on Randomised Controlled Trials of Complex Interventions’. PhD thesis. London, UK: UCL Institute of Education, University College London.
Sawtell, M; Jones, C. (2002). ‘Time to listen: An account of the role of support health visitors’. Community Practitioner 75 (2) : 461–3.
Schucan Bird, K; Newman, M; Hargreaves, K; Sawtell, M. (2015). Workplace-Based Learning for Undergraduate and Pre-registration Healthcare Professionals: A systematic map of the UK research literature 2003–2013. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Schucan Bird, K; Vigurs, C; Quy, K; Shemilt, I. (2018). Police Pre-Arrest Diversion of People with Mental Health Issues: A systematic review of the impacts on crime and mental health. London: UCL Institute of Education.
Scott-Samuel, A; Crawshaw, P; Oakley, A. (2015). ‘“Men behaving badly”: Patriarchy, public policy and health inequalities’. International Journal of Men’s Health 14 (3) : 250–8, DOI: http://dx.doi.org/10.3149/jmh.1403.250
Shemilt, I; Gough, D; Thomas, J; Stansfield, C; Bangpan, M; Brunton, J; Dickson, K; Graziosi, S; Hull, P; Kneale, D; Larsson, C; Mendizabal-Espinosa, R; Muraki, S; Ramadani, F; Vigurs, C. (2022). Living Map of Systematic Reviews of Social Sciences Research Evidence on COVID-19. London: EPPI-Centre, UCL Social Research Institute, University College London.
Shemilt, I; Simon, A; Hollands, G; Marteau, T.M; Ogilvie, D; O’Mara-Eves, A; Kelly, M.P; Thomas, J. (2014). ‘Pinpointing needles in giant haystacks: Use of text mining to reduce impractical screening workload in extremely large scoping reviews’. Research Synthesis Methods 5 (1) : 31–49, DOI: http://dx.doi.org/10.1002/jrsm.1093 [PubMed]
Simon, A. (2022). ‘Opportunities and Challenges of Using Secondary Analysis for Analysing Social Policy Questions in Early Childhood Education and Care and Children’s Food and Nutrition’. PhD thesis. London, UK: UCL IOE.
Skivington, K; Matthews, L; Simpson, S.A; Craig, P; Baird, J; Blazeby, J.M; Boyd, K.A; Craig, N; French, D.P; McIntosh, E; Petticrew, M; Rycroft-Malone, J; Moore, L. (2021). ‘A new framework for developing and evaluating complex interventions: Update of Medical Research Council guidance’. BMJ 374 : 2061. DOI: http://dx.doi.org/10.1136/bmj.n2061 [PubMed]
Stansfield, C; Brunton, G; Rees, R. (2014). ‘Search wide, dig deep: Literature searching for qualitative research. An analysis of the publication formats and information sources used for four systematic reviews in public health’. Research Synthesis Methods 5 : 142–51, DOI: http://dx.doi.org/10.1002/jrsm.1100 [PubMed]
Stansfield, C; Dickson, K; Bangpan, M. (2016). ‘Exploring issues in the conduct of website searching and other online sources for systematic reviews: How can we be systematic?’. Systematic Reviews 5 : 191. DOI: http://dx.doi.org/10.1186/s13643-016-0371-9
Stansfield, C; Kavanagh, J; Rees, R; Gomersall, A; Thomas, J. (2012). ‘The selection of search sources influences the findings of a systematic review of people's views: A case study in public health’. BMC Medical Research Methodology 12 : 55. DOI: http://dx.doi.org/10.1186/1471-2288-12-55 [PubMed]
Stansfield, C; O’Mara-Eves, A; Thomas, J. (2017). ‘Text mining for search term development in systematic reviewing: A discussion of some methods and challenges’. Research Synthesis Methods 8 (3) : 355–65, DOI: http://dx.doi.org/10.1002/jrsm.1250
Stead, M; Moodie, C; Angus, K; Bauld, L; McNeill, A; Thomas, J; Hastings, G; Hinds, K; O’Mara-Eves, A; Kwan, I; Purves, R.I; Bryce, S.L. (2013). ‘Is consumer response to plain/standardised tobacco packaging consistent with framework convention on tobacco control guidelines? A systematic review of quantitative studies’. PLoS ONE 8 (10) : e75919. DOI: http://dx.doi.org/10.1371/journal.pone.0075919
Stephenson, J.M; Strange, V; Forrest, S; Oakley, A; Copas, A; Allen, E; Black, S; Ali, M; Monteiro, H; Johnson, A.M; the RIPPLE study team. (2004). ‘Pupil-led sex education in England (RIPPLE study): Cluster-randomised intervention trial’. The Lancet 364 : 338–46, DOI: http://dx.doi.org/10.1016/s0140-6736(04)16722-6
Stewart, R. (2007). ‘Expertise and Multi-Disciplinary Training for Evidence-Informed Decision-Making’. PhD thesis. London, UK: University of London.
Stewart, R; Coppinger, C; Cavanagh, C; Oliver, S. (2011). ‘Participative research and policy’. International Public Health Journal 3 (2) : 145–9. Accessed 10 January 2023. https://www.semanticscholar.org/paper/Participative-Research-and-Policy-Stewart-Oliver/ead9ad5e3c1cbe3498d18455548b25863cf83ce9 .
Stewart, R; Hargreaves, K; Oliver, S. (2005). ‘Evidence informed policy making for health communication’. Health Education Journal 64 (2) : 120–8, DOI: http://dx.doi.org/10.1177/001789690506400204
Stokes, G. (2019). ‘“It’s a Bit of an Oxymoron”: A multimethod investigation into professionals’ views of children’s involvement in medicines research and development’. PhD thesis. London, UK: UCL.
Sutcliffe, K. (2010). ‘Shared Decision-Making: An evidence-based approach for supporting children, parents and practitioners to manage chronic conditions’. PhD thesis. London, UK: UCL.
Thomas, J. (in press). ‘Methods development in evidence synthesis: A dialogue between science and society’. The Handbook of Meta-Research. Oancea, A; Derrick, G; Xu, X; Nuseibeh, N (eds.), Cheltenham: Edward Elgar.
Thomas, J; Newman, M; Oliver, S. (2013). ‘Rapid evidence assessments of research to inform social policy: Taking stock and moving forward’. Evidence & Policy 9 (1) : 5–27, DOI: http://dx.doi.org/10.1332/174426413X662572
Thomas, J; Harden, A; Oakley, A; Oliver, S; Sutcliffe, K; Rees, R; Brunton, G; Kavanagh, J. (2004). ‘Integrating qualitative research with trials in systematic reviews: An example from public health’. BMJ 328 : 1010–2, DOI: http://dx.doi.org/10.1136/bmj.328.7446.1010
Thomas, J; McNaught, J; Ananiadou, S. (2011). ‘Applications of text mining within systematic reviews’. Research Synthesis Methods 2 (1) : 1–14, DOI: http://dx.doi.org/10.1002/jrsm.27 [PubMed]
Thomas, J; Graziosi, S; Brunton, J; Ghouze, Z; O’Driscoll, P; Bond, M; Koryakina, A. (2022). EPPI-Reviewer: Advanced software for systematic reviews, maps and evidence synthesis. London: EPPI-Centre, UCL Social Research Institute, University College London.
Vigurs, C; Schucan-Bird, K; Quy, K; Gough, D. (2015). A Systematic Review of Motivational Approaches as a Pre-treatment Intervention for Domestic Violence Perpetrator Programmes. College of Policing. Accessed 10 January 2023. https://library.college.police.uk/docs/college-of-policing/Motivational-intervewing-2016.pdf .
Vigurs, C; Wire, J; Myhill, A; Gough, D. (2016). Police Initial Responses to Domestic Abuse: A systematic review. Crook: College of Policing.
Warburton, N. (2013). ‘Ann Oakley on women’s experience of childbirth’. Social Science Space, 4 February 2013. Accessed 10 January 2023. https://www.socialsciencespace.com/2013/04/podcast-ann-oakley-on-womens-experience-of-childbirth/ .
WHO (World Health Organization). (2018). WHO Recommendations on Home-Based Records for Maternal, Newborn and Child Health. Geneva: World Health Organization.
WHO (World Health Organization). (2021). Evidence, Policy, Impact: WHO guide for evidence-informed decision-making. Geneva: World Health Organization.
Wiggins, M; Bonell, C; Sawtell, M; Austerberry, H; Burchett, H; Allen, E; Strange, V. (2009). ‘Health outcomes of youth development programme in England: Prospective matched comparison study’. BMJ 339 : b2534. DOI: http://dx.doi.org/10.1136/bmj.b2534 [PubMed]
Wiggins, M; Oakley, A; Roberts, I; Turner, H; Rajan, L; Austerberry, H; Mujica, R; Mugford, M. (2004). ‘The Social Support and Family Health Study: A randomised controlled trial and economic evaluation of two alternative forms of postnatal support for mothers living in disadvantaged inner-city areas’. Health Technology Assessment 8 (32) : iii, ix–x, 1–120, DOI: http://dx.doi.org/10.3310/hta8320
Wiggins, M; Rosato, M; Austerberry, H; Sawtell, M; Oliver, S. (2005). Sure Start Plus National Evaluation: Final report. London: Social Science Research Unit, Institute of Education, University of London.
Wiggins, M; Sawtell, M; Jerrim, J. (2017). Learner Response System Evaluation Report and Executive Summary. London: Education Endowment Foundation.
Wiggins, M; Sawtell, M; Wiseman, O; McCourt, C; Eldridge, S; Hunter, R; Bordea, E; Mustard, C; Hanafiah, A; Hatherall, B; Holmes, V; Mehay, A; Robinson, H; Salisbury, C; Sweeney, L; Mondeh, K; Harden, A. (2020). ‘Group antenatal care (Pregnancy Circles) for diverse and disadvantaged women: Study protocol for a randomised controlled trial with integral process and economic evaluations’. BMC Health Services Research 20 (1) : 1–14, DOI: http://dx.doi.org/10.1186/s12913-020-05751-z [PubMed]
Wiggins, M; Sawtell, M; Wiseman, O; McCourt, C; Greenberg, L; Hunter, R; Eldridge, S; Haora, P; Kaur, I; Harden, A. (2018). ‘Testing the effectiveness of REACH Pregnancy Circles group antenatal care: Protocol for a randomised controlled pilot trial’. Pilot and Feasibility Studies 4 : 169. DOI: http://dx.doi.org/10.1186/s40814-018-0361-x [PubMed]
Wilson, M.G; Oliver, S; Melendez-Torres, G.J; Lavis, J.N; Waddall, K; Dickson, K. (2021). ‘Paper 3: Selecting rapid review methods for complex questions related to health policy and system issues’. Systematic Reviews 10 : 286. DOI: http://dx.doi.org/10.1186/s13643-021-01834-y