Introduction
In 2020, policy engagement teams based at UCL and the University of Manchester, UK (from now on: ‘we’) created a Policy Boot Camp – an online summer school for undergraduate and master’s degree students to develop skills and experience in policymaking. This formed part of a wider HEFCE Catalyst-funded programme of policy internships and summer schools covering a variety of sub-regional, national and international policy areas, for students at UCL, the University of Manchester and Northumbria University. Due to the Covid-19 pandemic, the 2020 programme moved online, and the University of Manchester and UCL decided to co-deliver a joint programme open to students at both universities: the Policy Boot Camp.
We were not trained teachers. Our roles were in our universities’ central policy engagement units, working to connect researchers with the policy world through knowledge exchange mechanisms. Therefore, rather than designing a traditional policy module of the kind found on a university politics course, we conceived of the Boot Camp as a knowledge exchange project – and as a way of trialling how students could be involved in universities’ policy engagement agendas.
Knowledge exchange is an increasingly valued part of the higher education sector – it allows academic insights and ideas to reach non-academic audiences, and external perspectives to be brought into the world of academia (London School of Economics and Political Science, n.d.; UK Parliament, 2020). It has been encouraged nationally as a means of supporting the UK’s industrial strategy, business environment and policy sphere (Whitely et al., 2020; Tyler et al., 2020). At the same time, it is being driven by Research England through the Knowledge Exchange Framework (UK Research and Innovation, 2022). Amid this, there is growing recognition of the role students can play in universities’ knowledge exchange projects. McAlpine (2019: n.p.) suggests that knowledge exchange activities often ‘[cut] across research and teaching to encompass concepts such as skills development, practice informed teaching and co-location of industry, delivering benefits not just to the partner, but staff, students and graduates too’. But student involvement in knowledge exchange activity is, according to Basu-McGowan (2019), often formally ‘unrecognised’, and there is no consensus on best practice for this kind of activity. This has prompted the Office for Students (2020: n.p.) to fund a series of demonstrator projects to better ‘identify how students benefit from involvement in Knowledge Exchange, and address issues of equality, diversity and inclusion’ within this space.
Against this backdrop, we were interested in how students could be involved in our policy engagement work. The Boot Camp was therefore never intended to resemble a degree module; rather, it was conceived as a hands-on training course in which policy professionals, students and academics were brought together to work collaboratively on responses to policy problems, and to learn from one another in doing so. The hope was that this could provide students with effective training in evidence-informed policymaking.
When thinking about how students and policy professionals might learn through the course, we were influenced by theories of adult learning – in particular, the principles of andragogy, as theorised by Knowles (1984). The key tenet of andragogy is that children and adults learn differently, with different learning characteristics and tendencies (Knowles, 1980). These differences have implications for curriculum design and teaching methods when creating education programmes for adults. Knowles (1984) and Kearsley (2010) suggest that in andragogy:
- adults need to be involved in the planning and evaluation of their instruction
- experience (including mistakes) provides the basis for the learning activities
- adults are most interested in learning subjects that have immediate relevance and impact to their job or personal life
- adult learning is problem-centred rather than content-oriented.
Broadly, as McDonough (2013: 345) summarises, in traditional teaching styles, the student is dependent on the teacher for the dissemination of information, while in the adult learning process, ‘knowledge is disseminated in a circular and reciprocal way through a collaborative sharing of experiences, centered on real life situations, and learners are responsible for their own learning’. Some of Knowles’ (1984) assumptions about adult learners have been critiqued (Clardy, 2005; Merriam et al., 2007), and various theorists have proposed different principles for designing curricula that promote adult learning (Zemke and Zemke, 1995; Kolb and Kolb, 2005). But there is broad acceptance of the idea that adult learning is focused on helping adults to learn for themselves, rather than on instilling content in students (Birzer, 2004; Taylor and Kroth, 2009).
The use of adult learning techniques to promote evidence use in policymaking has been discussed in Langer et al.’s (2016a) literature review on promoting evidence-informed decision making. Several of the studies they reviewed pointed to the conclusion that the techniques of andragogy can promote both the motivation and the ability to use evidence in decision-making processes (Langer et al., 2016b). They found that customising learning to the needs of the learners was particularly important, and that a number of adult learning techniques could be employed to improve the design of capacity-building programmes (Langer et al., 2016b). However, they noted that although there is a large body of evidence on the design of adult learning interventions, programmes aiming to promote evidence-informed decision making differ in context and objective, as does the adult learning literature, making it hard to draw systematic conclusions or to provide universally applicable advice.
The Boot Camp was conceived as a policy engagement training exercise – one that would seek to improve the interface between research evidence and policy professionals. But it was also a programme based around the principles of andragogy, with learning designed to be self-guided, collaborative, reciprocal and problem-focused for both students and policymakers. Analysis of the outcomes of the Boot Camp is therefore an opportunity to test Langer et al.’s (2016b) conclusion that andragogy could successfully promote the use of evidence in policymaking. Insofar as the Boot Camp serves as a useful pilot, explicitly bringing these two currents of higher education activity together, our experience of facilitating the course may provide insights into how others could develop and improve this type of student knowledge exchange programme.
This paper will first outline the design of the Boot Camp, how we structured the course, and its links to adult learning principles. It will then summarise the course outcomes and our key insights as facilitators about where the course was successful and where it was unsuccessful. We will then discuss the implications for Langer et al.’s (2016b) framing review and the wider research base, and suggest avenues for further study.
Designing the Boot Camp
We began by devising three learning objectives to measure student outcomes:
- improve understanding of an evidence-informed policymaking process, and ability to engage with public policy
- develop transferable skills, such as teamwork, written communication and prioritisation
- increase awareness of policy careers and inspire students to consider working in the sector.
Because of the pandemic, the Boot Camp had to be delivered online. We used video calls for the ‘live’ elements, and the Slack app for written communication. Although an in-person course might have been easier to make interactive and collaborative, going online allowed us to run the programme jointly across two universities, with students able to join from anywhere in the world. It also allowed us to observe the effects of online working on knowledge exchange and adult learning programmes such as this. (See Nguyen et al. [2021] for a description of the promises and pitfalls of online knowledge exchange during Covid-19.)
We decided that the course should be structured around a series of ‘sandpits’ – multi-day workshops, led by policy professionals, in which students worked with experts to develop responses to real-world policy problems. Often, these were problems the professionals were themselves currently working on in their areas of policy. We go into further detail in the next section of the paper.
Boot Camp elements
Leading up to the sandpits were three other course elements, which sought to give students a brief theoretical background in policymaking, introduce them to the range of roles policy professionals hold, and outline some of the specialist and generalist techniques that policy professionals use in their work. We felt that engendering empathy was important for supporting effective knowledge exchange once students came to work with professionals on policy challenges, and so these elements helped students understand the mindset(s) of a policy professional.
To give the students a grounding in public policy, theory modules were developed specifically for the Boot Camp. Six theory modules covered: (1) what is policymaking?; (2) what are policy actors?; (3) what are policy windows?; (4) resources and constraints; (5) why are some policy problems so hard to solve?; and (6) devolution. Each module consisted of an online video by Andy Westwood, Professor of Public Policy at the University of Manchester, and a seminar led by course facilitators.
The theory modules were followed by ‘Pathways to Policy Talks’. These were a series of optional talks and panel discussions, hosted by senior academics, in which policy professionals from different tiers of government and parliament gave insight into their careers and the pathways they had taken to their current roles, and answered student questions. The line-up was purposely broad: students heard from an MP, a chief scientific adviser, a panel of local authority leaders, a panel of civil servants, a panel of professionals working in think tanks, and a panel of people working in parliament, among others.
Finally, we ran ‘Methodology Workshops’, ranging from 45 minutes to 2.5 hours. Policy professionals talked to students about a skill or method they used in their work, and then led an exercise, completed in groups, in which students put the knowledge they had acquired into practice. The policy professional then provided feedback and led a group discussion to ensure continuous learning loops. Topics included: systems thinking; using open data; rapidly assessing and using evidence; futures and horizon scanning; behavioural insights; polling; focus groups; utilising spheres of influence; and briefing legislators.
Once students had completed some of these initial sessions, they moved on to the more complex, longer and less structured workshops: the policy sandpits. Called sandpits in reference to the practice of playing with ideas, building and rebuilding, and being creative with resources, each centred on a policy challenge to which students had to respond.
Students were put into groups of five or six. A policy professional (the ‘challenge setter’) set out the policy problem and provided reading and resources to help students get started. This was followed by structured sessions to build an evidence base, deliberate potential policy responses and research prior solutions, in which groups worked independently, with the policy professional assisting when needed. To introduce divergent views, and thus simulate what policy professionals experience during the evidence-gathering phase, further expert talks and panel discussions were spaced throughout the sandpit timetable, showcasing competing and diverging views on an issue. Finally, students were tasked with creating a policy product and presenting it to the challenge setters, who provided feedback.
Policy sandpit topics included: developing proposals to address digital inequality among young people in the UK; drafting research briefs to aid the design of decarbonisation policies examined through the lens of household-level justice; designing a parliamentary inquiry into the UK’s COP26 preparations; developing policy to improve Britain’s industrial strategy; devising options to increase the uptake of welfare credits to boost living standards; creating a plan to help a northern town maximise value from its Towns Fund allocation; and assessing the benefit and practicality of ethnicity pay-gap reporting. All were live policy issues with implications for the future of the UK.
Policy products that the challenge setters asked students to produce included: briefing notes, research proposals, plans for inquiry evidence-gathering sessions, and a summary Town Investment Plan. These products were then presented back to the policy professional, who gave feedback on each and discussed the ideas. Students were encouraged to give each other constructive criticism and to think about the strengths and limitations of different policy responses.
All of the elements described above were co-designed with policy professionals, who brought ideas for policy problems and methodologies to cover; decided which activities and tasks would be useful for students to complete; and, within our rough structure and within the confines of the co-design process, shaped the sessions to home in on the themes, questions and tasks that were most relevant to their current policy work.
Links to andragogy
Partly consciously, and partly as a natural outcome of the co-design process, the programme we delivered was a good case study in andragogy.
Students were involved in the planning and evaluation of their own learning: during the sandpits, within the confines of the policy topic and the end product set, they had considerable freedom to explore different avenues of thought and to decide which techniques and processes to deploy. When they presented policy products, they were invited to reflect on their weaknesses, and on what they would have done with more time. Groups provided constructive criticism to each other, which elicited thinking around the benefits and limits of their different approaches.
Rather than being taught skills in the abstract or given content to memorise, in the sandpits students had to work out for themselves how to reach a final outcome. The experience of working out how to get to a policy response was the intended learning process, and this was why these workshops were structured so loosely. Learning was based on experience and on reflecting on setbacks.
Across the range of sandpits and methodology workshops on offer, students could choose those that most interested them and were most relevant to their desired future careers. For instance, some chose to focus on policy relating to the environment and net zero, while others focused on local growth and regeneration. Interestingly, the same was also true for policy professionals involved – rather than the sessions being separate and additional to their day job, the sessions they led were directly relevant to the work they were doing and the challenges they were facing.
As these sessions focused on solving real-world problems, the learning was a by-product of developing a response to a policy challenge: the problem was centre stage, and the learning came as students, through trial and error, explored different avenues of response. And because the policy challenges were ‘live’, some professionals took the students’ ideas back to their organisations to implement.
In this way, then, we can draw parallels between the adult learning principles Langer et al. (2016b) discuss, and the experience of policy professionals in the course. Instead of seeing policy professionals as ‘teachers’ who were there to impart their knowledge to students, in some of the most successful and interactive sandpits, policy professionals participated in workshops with and alongside students, as partners in evidence-informed policymaking processes, to collectively develop novel responses to policy challenges. The learning process for the professional – if there was one – was defined by the professionals themselves, as they co-designed the session’s milestones, chose the policy topic, resources and reading, and set the product to be created. Given this, the course satisfied many of the key principles of andragogy for the professional as well as for the student: the evaluations of the process and policy outcomes were led by the professional, the content was relevant to their line of work, and the experience was entirely based around a specific problem.
Course outcomes and findings
The student view
The Boot Camp registered 548 students, and around 300 actively participated. Attendance at all course sessions was not compulsory – students could join the elements that suited them – so levels of student participation varied across the programme. The sandpits, being the longest time commitment, attracted fewer students than the other elements.
We collected data on course outcomes in various ways: through student surveys after each course element, feedback seminars and workshop observations, and through a short email survey for policy professionals taken at the end of their involvement.
We asked students before and after the course to rate, out of 10, their confidence, ability and understanding in various areas (with scores out of 10 treated as percentages, so that, for example, a rise from 5/10 to 8.6/10 represents an increase of 36 percentage points). Following the course, 94 per cent reported an improved understanding of the policymaking process, with an average increase in confidence of 36 percentage points; 70 per cent reported an improved ability to work well online with others, with an average self-reported increase in ability of 18 percentage points; and 43 per cent reported an improved ability to prioritise and manage time effectively, with a small average increase of 7 percentage points. Interestingly, these self-reported figures were slightly but consistently higher for ethnic-minority students and for those receiving a bursary or financial support, suggesting that the course served students from these groups particularly well. As facilitators, we questioned why this was. We wondered whether it was: (1) because the problem-focused style of the course particularly suited these students; (2) because their confidence and self-perceived prior knowledge going into the course (as expressed in the pre-course survey) were lower than those of other participants; and/or (3) purely statistical chance. More research into the varying impact of programmes of this type on different groups of students is needed to properly capture outcomes related to equity, diversity and inclusion.
Qualitative feedback on the course as a whole varied. Comments generally fell into three thematic areas: (1) the development of professional and interpersonal skills; (2) the creativity and empathy required for policy development; and (3) the ability to understand and engage with policy processes. Students outlined how working outside narrow academic disciplines felt more like the ‘multidimensional’ real world, and said that the Boot Camp provided them with skills in navigating groups with different views and approaches, and in negotiating solutions. Students also said that the Boot Camp helped them to produce more creative and inclusive solutions, and reported that it had encouraged them to approach policy challenges in a more rounded, open-minded and interdisciplinary way, with a greater focus on evidence, prototyping and innovation in the future. At the same time, some students acknowledged that the Boot Camp had highlighted how hard policymaking is, and how much time and resource go into any policy initiative. Some reported that group working proved difficult, particularly if they thought members of their group were less committed to completing the group task, or less willing to work collaboratively. Some commented that this problem was exacerbated by the fact that group work was conducted entirely online.
Drilling down into the feedback, it was clear that the vast majority of students thought the sandpits were the most useful course element. Across all the sandpits, on average, 92 per cent of students reported in feedback surveys that they felt they had gained a greater understanding of the policymaking process; 86 per cent said they felt they had improved transferable skills, such as team working, presenting, assessing evidence, and writing for a policy audience; and 87 per cent reported increased knowledge of, and interest in, policy careers. Student feedback suggested that the sandpits provided a valuable learning experience that was different from the rest of their university learning, as this student commented: ‘I found it really useful to do the sandpits and get a practical experience of what you would be doing if you were a policymaker. The policy world definitely feels more accessible now. Uni[versity] is so theoretical so I’ve never had to write a brief or proposal before.’
This was borne out in facilitators’ observations of the sandpits. Facilitators observed individuals and groups of students developing, over time, improved confidence, organisational skills and the ability to think strategically about what needed to be done. By the later sandpits, students were self-organising well and planning effectively to accommodate different availabilities, policy interests and strengths within teams. Some students had developed advanced processes for collectively producing convincing policy proposals, including, for instance, independently creating spreadsheets to group, summarise and assess the quality of evidence sources, and to synthesise the analysis of sources by different group members. In many cases, students reported that much of the benefit of the course came from the fact that they were asked to engage collaboratively with policy professionals. One student noted: ‘We were working with [the policy professional] to get to a shared goal. They shared information with us to help, but weren’t marking our work.’ Facilitators observed that the sandpits in which the professionals treated students as partners in a collective, collaborative and live policy development process, and in which students were more involved in shaping the scope of workshops and avenues of investigation, generated the best outputs – in terms of the quality and innovativeness of policy products, the quality of self-organisation and interaction, and the learning reported.
The policy professional view
Policy professionals also responded positively to the course in written feedback, and overwhelmingly said they would be involved again. Qualitative feedback indicated that, in the more collaborative and interactive sessions, policy professionals found engaging with students and delivering sessions useful for their own work. For some, it was a ‘great opportunity to independently plan and run a workshop style session and present to a large audience, something I had little experience with previously’; others found it a ‘useful opportunity for reflection, particularly on why we use evidence in policymaking’.
Given the busy nature of policy work and its fast turnaround, reflection and innovation were recurring themes within the feedback. For example, one professional said they found it helpful to take part because ‘it was based on a project I was working on at the time [and it] enabled me to process some of our findings and hear feedback on the topic’. For another, it was because ‘students’ presentations and insights … gave me food-for-thought for potential future projects’. ‘Students,’ another professional said, ‘asked insightful questions and came at the problem with a fresh set of eyes’.
When co-design works well, then, and space is created for positive student–professional engagement, policy professionals can gain a lot from this type of interaction. Indeed, there were examples of policy professionals taking students’ ideas and drawing on them in their policy roles. For instance, in the sandpit on improving uptake of welfare credits to boost living standards, the professional said in follow-up feedback that they were taking forward a number of policy ideas that students proposed, such as offering information on how to apply for welfare credits on application forms for free bus passes.
Policy professionals also indicated that there was demand for this kind of training in the policy world. A large majority indicated that programmes like this would be useful to entry-level colleagues and, more broadly, that policy engagement training of this nature is useful for preparing staff already working in, or considering entering, the public policy sphere.
Critique of the course
We also faced challenges, and feedback raised problems with the approach. As highlighted above, some students reported struggling to work effectively with each other and to organise work between their team members. Furthermore, the quality of interaction between students and professionals also varied. For the methodology sessions, facilitators did not tend to have lengthy pre-engagement with policy professionals, and shaped the course design less – largely because we saw the purpose of these sessions as providing specialist knowledge that the professionals themselves were best equipped to deliver. As a result, the delivery of content and teaching was variable. Students picked up on this, with mixed feedback scores and reviews. These sessions also had higher disengagement rates (with students leaving part-way through), and students showed their dissatisfaction through long periods of silence and an unwillingness to contribute ideas. Analysing negative student feedback for these methodology sessions, comments can be grouped broadly into four areas: (1) content was too specific, and did not relate to the rest of the course; (2) sessions were not delivered well and were boring; (3) sessions were too difficult; and (4) sessions were light on content. These comments suggest that methodology sessions were least successful when students perceived them as irrelevant, or when policy professionals failed to solicit or interact with student input. This adds to the evidence supporting the importance of embodying the principles of andragogy in Boot Camp sessions.
Implications for the use of adult learning in promoting evidence-informed decision making
In sum, we observed that the best learning took place when the programme strayed from focusing solely on teaching and skills development, and instead incorporated student voice and contribution into a real-world policymaking process. Learning was at its most advanced when it was a by-product of an interactive policymaking process in which students and policy professionals worked together and responded to each other. And when this happened, it yielded significant benefits for professionals as well. This demonstrates that workshop-style interaction, and the incorporation of student voice into policymaking processes that use the principles of andragogy to investigate policy challenges, are important techniques for prompting learning among both policy professionals and students. This strongly supports Langer et al.’s (2016b) suggestion that andragogical techniques can successfully promote the motivation and ability to use evidence in policymaking.
However, our tentative conclusions also seem to support the need for a subtle shift in emphasis in the research on promoting evidence-informed policymaking, if it is to fully capture the benefit of programmes such as this. Although some theorists have discussed adult learning as a process in which ‘knowledge is disseminated in a circular and reciprocal way through a collaborative sharing of experiences’ (McDonough, 2013: 345), most research assessing the effectiveness and design of adult learning interventions still assumes a binary teacher–learner relationship (see, for instance, Dunst and Trivette, 2012, for a particularly extensive example). Many theories of andragogy, including Knowles’ (1984), while emphasising interactivity, student-led course design and real-world application, assume there is a single subject of learning (the ‘adult learner’) for whom interventions are designed.
Our experience suggests that existing research on adult learning places an overly restrictive focus on the ‘learner’ as an entity separate from the ‘teacher’, and so fails to capture the full benefit of programmes such as this – much of which stems from the complex, mutual interaction between professionals and students, and the creation of collaborative partnerships within real policymaking environments. We suggest that incorporating the theory of knowledge exchange into the adult learning framework would allow us to better describe the work of this course in promoting the goals of evidence-informed policymaking among students, and may allow us to more easily recognise the subtle and beneficial outcomes of this intervention for all involved.
Limitations and directions for future research
In presenting the Policy Boot Camp, this paper aims to deepen the picture of how knowledge exchange practices can crystallise within a dynamic adult learning programme, and to add to understanding of how knowledge exchange interventions, such as the one described here, can be planned, delivered and reflected on to support developments in sector practice.
The present study has several limitations, which might be addressed in future research. This is a small-scale case study presenting examples of strategies used to deliver an online course, designed within the specific contexts of knowledge exchange work at UCL and the University of Manchester. We therefore acknowledge that, while focusing on the modes of interaction in the Policy Boot Camp is a good starting point for understanding effective student knowledge exchange with policy professionals, without examining the wider institutional context and influencing factors – such as the values, motivations and institutional backgrounds of those involved – this case study can only ever offer a limited picture. Nonetheless, further research could develop, on a larger scale, the more forensic account that Cairney and Oliver (2018) have discussed, based on robust evaluation data tracking the relational effects between facilitators, students and policy professionals. In particular, it would be important to investigate how achievable the development of shared goals and close alignment between students and policy professionals is in this type of adult learning project, and which factors enable it. Further research could also examine the processes of curriculum co-design between facilitators and policy professionals, to better understand and assess the techniques that allow this to succeed or fail.
When collating evidence and recording our experiences for this paper, we faced a challenge common to knowledge exchange projects: outcomes are often complex and hard or expensive to quantify, and the subtle, personal interactions that form the base of all knowledge exchange projects are difficult to record objectively or to develop suitable metrics for. This makes it difficult to draw hard-and-fast conclusions with wide or universal application (CAPE, 2020; Nutley, 2016). We recognise, therefore, that due to the nature of this case study, the evidence presented here is largely qualitative, small scale and subjective. Furthermore, feedback came from only one programme, run by UCL and the University of Manchester. Therefore, while our small-scale feedback data and observations do provide evidence supporting the value of Policy Boot Camp-style programmes, our findings may not be generalisable to other contexts or to other policy professionals. Further verification studies should take place, not only online, but in person, and in conjunction with policy professionals from a range of organisations.
We also found that the course was more successful in delivering basic skills improvements for ethnic-minority students and for those on bursaries. We hypothesised that ethnic-minority students may have particularly benefited from the course design, or perhaps joined the course with lower self-perceived levels of skills and confidence. But we did not collect data that would allow us to understand in a systematic way why these students had better outcomes, and further study should take place to gather evidence on this so that courses can be designed in the future to help maintain this learning gain and generate positive outcomes for these groups.
To conclude, our paper adds to our understanding of the role of students in university knowledge exchange agendas. We examined how knowledge exchange programmes involving students can successfully use andragogical techniques to promote evidence use in policymaking. We tested how principles of andragogy can be used to allow students to become participants in a policymaking process, rather than recipients of knowledge, and found that when this occurs, outcomes can be very positive for students and policy professionals.
Acknowledgements
We are grateful to the project team for their contributions to the Policy Boot Camp, and to colleagues for comments on earlier versions of this paper, including Jay Amin, Sarah Chaytor, Professor Andy Westwood and Siobhan Morris. We would also like to thank all those who participated in the programme – from careers services, to policy, business and third-sector organisations, to policy, academic and professional services staff, to student facilitators – without whom this work would not have been possible. We are also grateful for the insightful comments offered by the anonymous peer reviewers and the journal staff. This case study originates from a policy-led collaborative summer schools project funded by an HEFCE Catalyst Fund Award (note that HEFCE was replaced in April 2018 by the Office for Students and by Research England).
Declarations and conflicts of interest
Research ethics statement
The authors declare that, in line with the policy set by the UCL Research Ethics Committee, this article was exempt from needing ethics approval.
Consent for publication statement
The authors declare that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.
Conflicts of interest statement
The authors declare no conflicts of interest with this work. All efforts to sufficiently anonymise the authors during peer review of this article have been made. The authors declare no further conflicts with this article.
References
Basu-McGowan, A. (2019). New direction for knowledge exchange funding announced. National Centre for Universities and Business blog, September 26 2019. Accessed 6 April 2021. https://www.ncub.co.uk/blog/new-direction-for-knowledge-exchange-funding-announced
Birzer, M. (2004). Andragogy: Student centered classrooms in criminal justice programs. Journal of Criminal Justice Education 15 (2) : 393–411, DOI: http://dx.doi.org/10.1080/10511250400086041
Cairney, P; Oliver, K. (2018). How should academics engage in policymaking to achieve impact?. Political Studies Review 18 (2) : 228–44, DOI: http://dx.doi.org/10.1177/1478929918807714
CAPE (Capabilities in Academic Policy Engagement) (2020). Delving under the hood of academic–policy engagement. Universities Policy Engagement Network blog. Accessed 15 May 2021. https://www.upen.ac.uk/blogs/?action=story&id=146
Clardy, A. (2005). Andragogy: Adult learning and education at its best? Accessed 15 May 2022. https://files.eric.ed.gov/fulltext/ED492132.pdf
Dunst, C; Trivette, C. (2012). Meta-analysis of implementation practice research. In: Kelly, B; Perkins, DF (eds.), Handbook of Implementation Science for Psychology in Education. Cambridge: Cambridge University Press, pp. 68–91.
Kearsley, G. (2010). Andragogy (M. Knowles). The Theory Into Practice Database. Accessed 15 May 2022. https://web.archive.org/web/20101216074332/http://tip.psychology.org/knowles.html
Knowles, M. (1980). The Modern Practice of Adult Education: From pedagogy to andragogy. New York: Cambridge Adult Education.
Knowles, M. (1984). The Adult Learner: A neglected species. 3rd ed. Houston: Gulf Publishing.
Kolb, A; Kolb, DA. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of Management Learning and Education 4 (2) : 193–212, DOI: http://dx.doi.org/10.5465/amle.2005.17268566
Langer, L; Tripney, J; Gough, D. (2016a). The Science of Using Science: Researching the use of research evidence in decision-making: Final report. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London. Accessed 5 May 2022. https://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/Science%202016%20Langer%20report.pdf?ver=2016-04-18-142701-867
Langer, L; Tripney, J; Gough, D. (2016b). The Science of Using Science: Researching the use of research evidence in decision-making: Technical report. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London. Accessed 5 May 2022. http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/Science%20Technical%20report%202016%20Langer.pdf?ver=2016-04-18-142648-770
London School of Economics and Political Science (n.d.). Introduction to knowledge exchange and public engagement. Accessed 6 April 2021. https://info.lse.ac.uk/staff/services/knowledge-exchange-and-impact/kei-guide/introduction-to-kei
McAlpine, H. (2019). What do businesses think about the KEF? II. National Centre for Universities and Business blog, January 14 2019. Accessed 6 April 2021. https://www.ncub.co.uk/blog/what-do-businesses-think-about-the-kef-ii
McDonough, D. (2013). Similarities and differences between adult and child learners as participants in the natural learning process. Psychology 4 (3A) : 345–8, DOI: http://dx.doi.org/10.4236/psych.2013.43A050
Merriam, S; Caffarella, R; Baumgartner, L. (2007). Learning in Adulthood: A comprehensive guide. 3rd ed Hoboken, NJ: John Wiley & Sons.
Nguyen, VM; Bell, C; Berseth, V; Cvitanovic, C; Darwent, R; Falconer, M; Hutchen, J; Kapoor, T; Klenk, N; Young, N. (2021). Promises and pitfalls of digital knowledge exchange resulting from the COVID-19 pandemic. Socio-Ecological Practice Research 3 : 427–39, DOI: http://dx.doi.org/10.1007/s42532-021-00097-0
Nutley, S. (2016). Using research to shape knowledge mobilisation practice. Knowledge Network for Applied Education Research (KNAER). Accessed 12 September 2022. https://web.archive.org/web/20170701215734/http://www.knaer-recrae.ca/blog-news-events/using-research-to-shape-knowledge-mobilisation-practice
Office for Students (2020). Funding boost for students to work with business and communities. OfS Press and media, April 20 2020. Accessed 6 April 2021. https://www.officeforstudents.org.uk/news-blog-and-events/press-and-media/funding-boost-for-students-to-work-with-business-and-communities/
Taylor, B; Kroth, M. (2009). Andragogy’s transition into the future: Meta-analysis of andragogy and its search for a measurable instrument. Journal of Adult Education 38 (1) : 1–11. Accessed 7 September 2022. https://files.eric.ed.gov/fulltext/EJ891073.pdf
Tyler, C; Beswick, D; Foxen, S; Geddes, M; Hobbs, A; Rose, D. (2020). How universities can improve parliamentary engagement: A 12 point plan. Transforming Evidence for Policy and Practice blog, June 1 2020. Accessed 7 April 2022. https://transforming-evidence.org/blog/heres-how-universities-can-improve-parliamentary-engagement
UK Parliament (2020). Knowledge Exchange and Legislators. Parliament.uk, February 2020. Accessed 7 April 2021. https://www.parliament.uk/globalassets/documents/post/final-ke-and-legislatures-web-updated.pdf
UK Research and Innovation (2022). Knowledge exchange framework. Ukri.org, February 25 2022. Accessed 7 April 2021. https://www.ukri.org/what-we-offer/supporting-collaboration/supporting-collaboration-research-england/knowledge-exchange-framework/
Whitely, A; Haigh, K; Wake, D. (2020). Universities and Colleges and the Industrial Strategy: Exploring data on knowledge exchange, research and skills. Industrial Strategy Council. https://www.universitiesuk.ac.uk/sites/default/files/field/downloads/2021-07/universities-and-colleges-the-industrial-strategy.pdf
Zemke, R; Zemke, S. (1995). Adult learning: What do we know for sure? Training 32 (6) : 31–4, 36, 38, 40. Accessed 12 September 2022. https://web.archive.org/web/20060617115352/http://www.msstate.edu/dept/ais/8523/Zemke1995.pdf