Research article

Rethinking assessment: the potential of ‘innovative’ or ‘creative’ assessments in history

Author
  • Sarah Holland (Associate Professor, Department of History, University of Nottingham, Nottingham, UK)

Abstract

Assessment forms an integral part of history education at all levels, throughout school, college and university. The process of assessing and being assessed occupies much of the time and attention of staff and students alike. Research demonstrates that diversifying assessment methods has the potential to enhance student engagement. However, assessment remains relatively traditional compared to innovation in other areas of history curricula. Opportunities to rethink assessment can feel limited, constrained by sector or institutional frameworks. This article uses practitioner research to advocate rethinking assessment in history, and asks how ‘creative’ or ‘innovative’ approaches to assessment can make this a more effective and meaningful process within a subject-specific context. The focus is history assessment in higher education in the UK, but the context, discussion and recommendations have wider reach, extending beyond disciplinary, educational sector and geographical boundaries. The article demonstrates the potential, practicalities and possible pitfalls of alternative forms of assessment, using a case study of creative approaches to assessing history.

Keywords: assessment, creativity, employability, history, higher education, student engagement, learner autonomy

How to Cite: Holland, S. (2024) ‘Rethinking assessment: the potential of “innovative” or “creative” assessments in history’. History Education Research Journal, 21 (1), 9. DOI: https://doi.org/10.14324/HERJ.21.1.09.

Rights: 2024, Sarah Holland.

Published on 06 Aug 2024
Peer Reviewed

Assessment forms an integral part of history education at all levels, throughout school, college and university. The process of assessing and being assessed occupies much of the time and attention of staff and students alike. Educators have reimagined assessment not merely as an end point (assessment of learning) but also as a process (assessment for learning and as learning), ideally embedded in curricula and underpinned by theories of constructive alignment or backward design (Biggs, 1996; Wiggins and McTighe, 2015). Research demonstrates that diversifying assessment methods has the potential to enhance student engagement and develop learner autonomy (Elkington and Evans, 2017; Sambell, 2016). However, assessment remains relatively traditional compared to innovation in other areas of history curricula (Booth, 2003; Richardson, 2019). Opportunities to rethink assessment can feel limited, constrained by sector or institutional frameworks. This article uses practitioner research to advocate rethinking assessment in history, and asks how ‘creative’ or ‘innovative’ approaches to assessment can make this a more effective and meaningful process within a subject-specific context, as part of a varied assessment diet. Student perceptions are at the heart of this research, in order to determine how students understand, engage with and respond to these assessments within a history degree. The focus is history assessment in higher education in the UK, but the context, discussion and recommendations have wider reach, extending beyond disciplinary, sector and geographical boundaries.

Why rethink assessment?

Bryan and Clegg (2006: 3) argue that as society changes, assessment practices need to adapt, stating that ‘Modern society is demanding and complex yet many of our assessments are magnificently basic in their nature’. As society continues to change, arguably there is an ongoing need for assessment practices to adapt accordingly. It is particularly timely to rethink assessment, including disciplinary approaches, in the context of recent and ongoing pedagogical change and social upheaval. Both the Covid-19 pandemic and concerns about the use of AI have disrupted established assessment practices (Anderson, 2023; Bearman and Luckin, 2020). Anderson (2023: n.p.) argues that higher education providers are ‘at a crossroads where they can either simply return to the status quo or take the opportunity to reconceptualise assessment and feedback practices through a process of inclusive, future-orientated design’. Other recurring drivers of change affecting assessment include the Quality Assurance Agency for Higher Education (QAA) Subject Benchmark Statements, students’ experiences and the National Student Survey, and internal curriculum reviews (Medland, 2016). Moreover, Elkington and Evans (2017) state that assessment itself should drive curriculum change, equipping students to negotiate twenty-first century life, both within and beyond higher education.

Rethinking anything challenges accepted ways of doing things, encouraging us to think about what we do and why we do it, to consider alternatives, and to discuss and debate with others. Rethinking assessment practices is not necessarily about enacting change, but about creating space and opportunity to determine whether change is either required or advantageous. This should be founded upon knowledge and awareness of alternative assessment methods. Alternative assessment is contextual, relating to disciplinary and institutional norms. Research highlights a wide range of diversification in practice, including more creative or innovative assessments, and emphasises that ‘alternative’ can include anything that is different or less familiar, increases the range of assessments, provides students with variety or choice, and enables them to meet the learning outcomes (Bloxham and Boyd, 2007; O’Neill and Padden, 2021; Sambell, 2016). It can be alternative both in terms of how something is assessed and in terms of what is being assessed. In other words, an alternative assessment does not simply take on another form for the sake of it, but provides different ways to demonstrate the same knowledge, understanding and skills as other assessment formats, or to assess different skills that other assessments cannot, meeting the required learning outcomes and assessment criteria.

The value of rethinking and diversifying assessment is underpinned by pedagogical research. First, and very significantly, diversifying assessment has the potential to be more inclusive, by providing ‘multiple means of action and expression’, aligned with the principles of Universal Design of Learning and Assessment (Anderson, 2023: n.p.; Novak and Bracken, 2019). Learners can experience barriers to learning relating to ethnicity and race, disability, learning differences, socio-economic background, breaks in education, and work and caring responsibilities (Burnell, 2019; Holland et al., 2023). Greater diversity in student cohorts, and increasing awareness of disability and neurodivergence in education, have contributed to concerns that assessment practices have not evolved accordingly (Elkington, 2020; O’Neill and Padden, 2021). Rather than relying on one or two assessment types, which may privilege a minority of students, developing a more varied range of assessment types has the potential to enable everyone (Bloxham and Boyd, 2007; McConlogue, 2020). As McConlogue (2020: 137) argues ‘Inclusive assessment does not compromise academic or professional standards but improves the opportunities for all students to demonstrate their acquisition of the learning outcomes’. Choice in assessment is different to variety in the curriculum, enabling students to meet the learning outcomes and assessment criteria using different methods and means; it requires comparable levels of complexity, and time and effort from students in order to be equitable (Burnell, 2019; McConlogue, 2020). Students may, however, have to undertake assessment types that they do not like or feel less confident with, and there can be good reasons why assessments have to take certain forms depending on what is being assessed, but this should not simply be by default. Variety in assessment can also provide students with opportunities ‘to enhance their strengths and challenge their less-developed learning and skills’, acquiring transferable skills with wide applicability (Elkington and Chesterton, 2023: 6; Burnell, 2019). Students can also develop confidence which can be applied to other assessment types, especially if they receive support with relevant skills, such as academic writing, and thus can be ‘better equipped and more confident’ when these skills are crucial (Burnell, 2019: 166).

Second, diversifying assessment can provide opportunities to demonstrate the practical application and value of subject-specific knowledge and skills. This is connected to the redefining of authentic assessments encompassing not only the application of knowledge and skills to ‘real world contexts’ and problem solving (Bloxham and Boyd, 2007: 193; McConlogue, 2020), but also the potential of this to add value or transform society (Ajjawi et al., 2023; McArthur, 2022). More diverse and innovative methods have the potential to make assessment more effective and meaningful, reflecting and addressing societal changes. As well as practical application and relevance, varied assessment types and choice of assessment, including more innovative and creative assessments, can foster deeper learning, promote learner autonomy and student engagement and develop assessment literacy (Bryan and Clegg, 2006; Bulaitis, 2009; Elkington and Evans, 2017; Medland, 2016; Pereira et al., 2016; Race, 2020; Sambell, 2016).

Nevertheless, assessment remains relatively traditional, especially in some subject areas. Resistance and caution remain a hindrance or deterrent to innovation (Booth, 2003; Bryan and Clegg, 2006; Bulaitis, 2009; Elkington, 2020; Gibbs, 2006; Murphy, 2006; Sambell, 2016). As Sambell (2016: 3) asserts, alternative approaches ‘demand a radical rethink of the purposes and methods of assessment’. This is time consuming, complex and not risk free (Bloxham and Boyd, 2007; Medland, 2016). Concerns raised have focused on consistency in marking, unfamiliarity with different methods, and a lack of motivation or time to devote to developing such work (Bloxham and Boyd, 2007). Evidence suggests that it is often more challenging and time consuming to secure approval for changes to assessment than for other aspects of the curriculum, and staff are therefore more willing to innovate in teaching than in assessment methods (Bloxham and Boyd, 2007; Gibbs, 2006). Innovation in assessment can therefore be largely reflective of individual staff initiatives, rather than of institution-wide or even department-wide approaches (Medland, 2016).

However, this underscores the importance of providing opportunities to rethink assessment: first, for colleagues to engage in constructive dialogues about assessment, providing support and encouraging collaboration, and, second, to challenge ‘entrenched attitudes’ (Medland, 2016: 90). To facilitate this, staff and students both need to be active participants. There needs to be a shared understanding of the validity of different assessments, ensuring not only academic rigour, but also recognition and appreciation of validity, value and opportunity (McConlogue, 2020; Sambell, 2016). A willingness to try ‘new things’ should be encouraged. The assessment literacy of both staff and students should be developed and expanded. This can promote student-centred learning, putting students at the forefront of decisions about the assessment, empowering them to become more autonomous, self-confident and reflective (Medland, 2016; Pereira et al., 2016; Sambell, 2016). Moreover, it is crucial to understand how students perceive and respond to assessment processes and methods, and to recognise the emotional dimension. Wake et al. (2023) demonstrate how ‘authentic’ or ‘innovative’ assessments often leave students feeling more empowered, inspired and motivated, but also more anxious, uncertain and concerned due to lack of familiarity with the assessment type. Accordingly, they emphasise the importance of balancing ‘pedagogical innovation with psychological safety’, thus providing support that addresses concern or anxiety surrounding unfamiliar assessment types (Wake et al., 2023). After all, assessment shapes learning experiences (Medland, 2016; Sambell, 2016). Variety and innovation should be fostered, but not to the extent that they become bewildering or are adopted for their own sake, and only with adequate opportunities for staff and students to become familiar with new assessment types (Gibbs, 2006; McConlogue, 2020; Medland, 2016). An assessment diet should be varied but wholesome and coherent, with a clear rationale underpinning it.

History-specific perspectives on rethinking assessment

The notion of ‘innovative’, ‘creative’ or ‘alternative’ assessment can be both subject- and institution-specific. Context matters when rethinking assessment. History as a discipline is traditionally characterised by written assessments, principally the essay and the exam. The pre-eminent role of the long-form written assignment as inherent to the discipline should not be undermined. It enables students to develop and demonstrate a range of skills. However, students can become over-familiar with one format, encouraging surface learning, whereas greater variety can facilitate deeper approaches (Bulaitis, 2009). Moreover, historical skills can and should be assessed using varied means. There are undoubtedly numerous examples of innovative, creative and alternative assessment (Bulaitis, 2009; Matthews-Jones, 2019; Richardson, 2019; Zemits, 2017). And yet, tradition, caution and resistance have resulted in assessment remaining relatively traditional compared to innovation in other areas of history curricula. The QAA Subject Benchmark Statement for history (QAA, 2022: 13) states that ‘diversity in assessment is vital’ in order to assess the full range of abilities and to accommodate increasingly diverse student cohorts. It acknowledges that a range of assessment types can demonstrate historical knowledge and understanding, and the ability to communicate effectively. It also asserts that history can be assessed in an ‘increasingly flexible way’, and highlights the value of embedding employability skills with ‘assessment methods focusing on real world tasks’ (QAA, 2022: 6, 10). ‘Authentic’ assessments not only facilitate the development and application of a range of skills and demonstrate the value of the discipline; they can also support students to become historians (Corker and Holland, 2016; Norton, 2009; Richardson, 2019).

Moreover, among a range of capabilities, skills and competencies, the Subject Benchmark Statement includes the ability to ‘exhibit imaginative insight and creativity’ (QAA, 2022: 3). The Creative History in the Classroom initiative, co-led by Lucie Matthews-Jones and Cath Feely, showcased the potential of creativity within the discipline, whether teaching or assessing (Matthews-Jones, 2022). Creativity is a process that encourages different ways of thinking, knowing and communicating. Jackson (2005: 15) argues that creativity is integral to being a historian (or indeed to any other profession or disciplinary field) in order to ‘survive and prosper in a complex, ever changing, unpredictable world’. According to Zemits (2017: 184–85), ‘Representing knowledge with creativity … extends graduate attributes to prepare a student to communicate with others during and after their studies’. Creative approaches and responses can facilitate problem solving. Bryan and Clegg (2006) advocate for assessments that foster creativity to enable students to deal with the unknown. Evidence also suggests that alternative, creative assessment opportunities can motivate and engage students, leading to calls for creativity to be an intrinsic assessment criterion in the humanities (Watson, 2014; Zemits, 2017).

Academic historians increasingly communicate their research using varied modes, including auditory and visual presentations, digital outputs, exhibitions and performance, with an emphasis on co-production, knowledge exchange and impact. The potential and significance of this are not simply measured in terms of extrinsic drivers such as the Research Excellence Framework (REF), but also recognised for their intrinsic or inherent value to the discipline. Moreover, the ways people access information about the past continue to change, with social media and online platforms more prevalent, and often ‘unmediated by the history teacher or professional historian’ (Chapman and Haydn, 2020: 173). After all, as Nicol et al. (2018: 177) note, ‘The aims and purposes of history education are conceived very differently by different constituencies and stakeholders in a society and, like history, change as societies and the world change’. This gives rise to questions about who controls history education, and the purpose of learning history, highlighting the importance of different perspectives, critical thinking, debate and ‘a recognition that often there is no single right answer’ (Nicol et al., 2018: 177). They advocate discussion about relationships between family histories, school history, and heritage and media history, and argue that it is important that students ‘actively learn how to construct accounts of the past and … understand why interpretations may differ but be equally valid’ (Nicol et al., 2018: 178). Thinking critically about the construction of historical knowledge and understanding in the public domain, and evaluating public history outputs, can empower students, and can create a firm foundation for public history-based assessments.

Debates about the future of history assessment in schools are ongoing, coalescing around the purpose of assessment, institutional and sector constraints, and the use of creativity to stimulate motivation and engagement. Tensions between the potential of assessment for historical learning that is meaningful for everyone involved and rigid curriculum targets or levels have been a recurring theme (Brown and Burnham, 2014; Burnham and Brown, 2004; Butler, 2004; Cottingham, 2004; Facey, 2011; Harrison, 2004; Luff, 2016; Wrenn, 2004). Counsell et al. (2008: 2) note that ‘Assessment will always be a vexed question because it embodies so many different purposes’, but they argue that it is ‘all the more remarkable and heartening that in the midst of these constraints, the creativity and determination of history teachers seems undimmed’. More recently, Christine Counsell (2023: 11, 10) has encouraged teachers to not only rethink but also to rescue assessment from what she calls ‘knowledge-rich gone wrong’. She argues that poor assessment is not only ineffective but also ‘warps pupils’ experience of the subject’. Of particular concern are mere retrieval practices that decontextualise knowledge; hence the importance of assessments that embed historical training such as evidential thinking, causal reasoning and historical argumentation (Counsell, 2023).

Articles in Teaching History have demonstrated the range of innovative assessments being used by schoolteachers, and the role they play in the teaching, learning and assessment of history more broadly. Stanford (2008) emphasises the extent to which history is assessed using the written word, or increasingly the spoken word, and the importance of diversifying ways in which to assess historical understanding. He argues that ‘completely different mediums’ may sometimes be more appropriate and still as rigorous. His study focused on ‘Redrawing the Renaissance’, where students applied their historical knowledge and understanding to a pictorial assessment in Year 7 (Stanford, 2008: 4). Stanford (2008) demonstrates both the important interrelationship between how history is taught and assessed, and the powerful impact of using non-written outputs to assess historical understanding. Processes of selection and rejection resulted in images that represented objects, ideas and artistic techniques from the Renaissance. As Stanford (2008: 10) asks, ‘Why should all history assessments favour one group that have a particular skill-set?’, when we can diversify the ways in which we assess historical understanding. Knight (2008) has explored the relationship between assessment and learning in Year 8 history. The central premise of ‘create something interesting to show you have learnt something’ was adopted with assessment as enquiry, resulting in greater learner autonomy and motivation (Knight, 2008: 21). Students were actively engaged in devising assessment criteria, worked in small groups, and designed a wide range of outputs from a chat show format interview to a historical movie to demonstrate understanding in response to the enquiry question. Conway (2011: 54), referring to Knight’s work, again explored the role of student-led enquiry and the use of different forms of assessment that would ‘allow for creativity and independence but be sufficiently well structured to ensure that they were working towards similar outcomes’. In this instance, students designed and planned a memorial to the Holocaust for Key Stage 3. Luff (2003: 26), arguing that the ‘demands of external assessment and an essentially creative teaching approach need not be mutually exclusive’, advocated for creativity in coursework beyond Key Stage 3. The recurring theme has been an emphasis on assessing knowledge and understanding in ways which encourage exploration and avoid being unnecessarily prescriptive. Assessment practices can and should foster curiosity, the love of history and being a good historian.

Creative and public history assessment in higher education: a case study

The pedagogical arguments for innovative and diverse assessments, and for greater creativity in history, underpin my own work rethinking assessment. Over a period of ten years, this concept of rethinking assessment, and understanding of the impact of assessment on students, have been refined in response to practitioner reflection and student feedback, and have become the basis for wider rethinking of assessment in history and for stimulating sector-wide conversations. The focus here is on the use of creative assessment in two undergraduate history modules at the University of Nottingham, UK. Against a backdrop of traditional forms of assessment such as essays and exams, with some presentations, posters and student-led seminars, the introduction of these creative assessments in 2016 represented a significant curriculum innovation. This built on my work to redesign a first-year history module, Making History, at Sheffield Hallam University in 2013. Students (in small groups) undertook a research project on an aspect of Sheffield history and produced a poster for display at a public history exhibition, as well as a research essay and reflective pieces. Evaluation of this demonstrated how public engagement, from field visits and working in the archives or local studies library to producing a piece of public history for an external audience, could enhance student engagement (Corker and Holland, 2016). Moreover, this work was heralded as demonstrating how ‘authentic, experiential and reflective components can be put into practice’ to engage students in active and collaborative learning experiences (Weller, 2019: 88–89).

Aims, research questions and methodology

The aim of this article is to investigate the use of ‘alternative’ assessment in history degrees, with a focus on creative and public history outputs, student perceptions and experiences of these assessments, and the wider implications and applications for rethinking assessment. Three underpinning research questions are addressed:

  • Can ‘alternative’ or ‘creative’ approaches make assessment a more effective and meaningful process within a subject-specific context?

  • What do students perceive is being assessed or developed through these assessments?

  • How do students react to the inclusion of these assessments in history modules?

Qualitative and quantitative research has been undertaken to understand the impact of introducing creative and public history assessments in history modules at degree level. An initial period of evaluation, including discussions with students and colleagues, observation of practice and practitioner reflection, was followed by detailed questionnaires spanning two modules (two cohorts on each) between 2020 and 2022. The questions fell into two categories: perceptions of skills development and the value of creative and public history assessments, and student responses or reactions to this type of assessment. All students on the modules were invited to participate, and between a quarter and a half completed the final survey. Due to the relatively small number of respondents, the data sets have predominantly been combined, encompassing both modules across two academic years. There was no discernible difference between cohorts or modules in most instances, and thus any anomalies would be artificial. However, there would be scope with a larger data set to consider differences between modules and cohorts.

It is important to acknowledge positionality in practitioner-research that investigates one’s own practice. As outlined earlier, creating space to rethink assessment, and introducing different assessments where appropriate, has become an important part of my practice. Students also knew how passionate I am about this type of assessment and, in turn, often became advocates of more diverse and creative forms of assessment. As such, respondents to the survey (and certainly participants in discussions) were sometimes self-selecting advocates, and therefore their responses may at times have been akin to providing testimonials in support of this type of assessment. Nevertheless, objectivity has been prioritised throughout the research process. Positive, neutral and negative responses have been included in the findings. Moreover, practitioner-research can add important insights to the research process based on experience, and can shape research-informed practice, which is significant for educational research and practice. It is also important to note that students who undertook this assessment during or in the immediate aftermath of Covid-19 often encountered fewer assessment types than usual (for example, exams were replaced with additional coursework and student-led seminars could not always take place), and responses may reflect this. The findings and subsequent conclusions and recommendations are based on a small sample size. It cannot be presumed that the responses are representative of all students who undertook these specific assessments, or indeed those who undertake similar assessments in different contexts. They are, however, still very revealing. They can tell us not only about how the students felt about these assessments, but also about how this feedback shapes practice going forward and the need for all practitioners to rethink assessment – however ‘traditional’ or ‘innovative’ it might be considered – with a view to understanding the potential opportunities and challenges posed, and how students respond to them.

Context

The creative and public history assessment discussed in this article takes place on two undergraduate history modules at the University of Nottingham. It was first introduced in 2016 on a final year (40 credit) module called ‘Rural Life in Victorian England’, the forerunner of HIST 3091: ‘A Green and (Un)Pleasant Land? Society, Culture and the Evolution of the British Countryside, 1800–1918’. The module initially only had a relatively small cohort of students (about ten), which was ideal for introducing this assessment and working with students to refine it and ensure it worked. This assessment was then embedded into the redesign of this module and the design of a new second year (20 credit) module, HIST 2046: ‘Poverty, Disease and Disability, Britain, 1790–1930’ (both in 2019/20). The assessment weightings varied between the modules, with the creative or public history output being worth 50 per cent of the second-year module alongside a source analysis essay, and 30 per cent of the final year module alongside a primary source essay (also 30 per cent) and a synoptic essay (40 per cent).

The assessment brief requires students to ‘create a piece of public history or a creative output that engages with a theme from the module’, and student work can ‘take any feasible format’. The brief is intentionally very broad. Students can produce any output that effectively communicates their research to a non-academic or non-specialist audience and meets the assessment criteria. As Elkington (2020: 8) argues, ‘incorporating choice and flexibility into assessment design can empower students to take responsibility for their learning’. As outlined earlier, it is also aligned with the principles of Universal Design of Learning and Assessment, providing students with different means of ‘action and expression’ to communicate their knowledge and understanding, and to meet the learning objectives and marking criteria. It also extends the range of skills being assessed, and demonstrates the value, contemporary relevance and application of historical knowledge and understanding via public history and creative interpretation. The outputs produced by students in response to this assessment brief have been highly creative and innovative, and remarkably diverse. They have taken the form of museum exhibits, board games, story books, documentaries, podcasts, educational resources, artworks, music, dance, drama, historical fiction and much more. The output is submitted as a portfolio that includes a detailed bibliography reflecting the underpinning research, a short rationale and an annotated or footnoted version of the output. Optional documentation can include ‘behind the scenes’ or ‘director’s notes’, giving students the opportunity to develop ideas or explain aspects of the output for the intended audience.

The assignment assesses the construction, application and communication of knowledge and understanding. Marking criteria focus on five key areas: evidence of original research or interpretation, accuracy of historical content, creativity and innovation, presentation of research and engagement with audience. As such, it provides students with an opportunity to demonstrate their knowledge and understanding in a different format, and enables the tutor to assess many of the same skills as a standard essay could. At the same time, it also provides an opportunity to assess additional skills, including communicating with non-academic audiences. The assessment brief also enables problem solving, creativity and innovation to be foregrounded. The short rationale is intended to encourage reflective practice in students, and to provide the tutor with additional insights into decision making and circumstances that shaped the final output. It gives students a further opportunity to express themselves, and it recognises that assessment does not operate in a vacuum. It provides space for students to elaborate on what they were aiming to achieve, to reflect on their experience, and to demonstrate understanding of different modes of communication and audience, and greater awareness of the importance and application of historical research. As such, the process is as important as the content, and the short rationale represents a very small proportion of the final grade awarded (slightly more on the third-year module).

It was crucial that this form of assessment was as rigorous and robust as any other, and that staff and students had a clear understanding not only of the marking criteria, but also of the relationship to other forms of assessment, what this meant in practice and the rationale behind it. Detailed assessment criteria were designed and aligned to existing rubrics, with additional supporting documentation. It was important not to overwhelm students with too much written documentation, which ultimately may not be meaningful or helpful if it is not digestible (Adams and McNab, 2012). It was also important to have varied ways to communicate what was required, and for students to gain confidence in producing what was often an unfamiliar piece of assessment. In addition to the written rationale and marking criteria, support gradually broadened to include workshops, showcases of previous submissions, an FAQ document and an audiovisual resource. In part, this responded to issues raised by students in previous years, and to an awareness that if this assessment was intended to be more inclusive, then the support mechanisms also needed to be more diverse and inclusive. This gave students opportunities to develop their assessment literacy, and to build their confidence before submitting their final work.

Students were asked to think carefully about what aspects of the module they wanted to engage with and research further, what type of output they wanted to create, the intended audience and their key messages. They were also encouraged to envisage their output as having a real audience, and thus a life beyond the assignment, and some cohorts have had opportunities to engage with external audiences with their outputs via the University of Nottingham History Festival. By critically reflecting on the audience, students began to view their output as a process of multi-layered engagement. Students were constantly reminded not to oversimplify ideas for the sake of it, and to retain the nuances of the topic and research. After all, the assignment assesses the ability to distil detailed historical research into succinct, relevant, interesting and accessible content for a specific audience, and to take complex historical research and present it to a non-academic or non-specialist audience in an accessible way. Public history and popular representations are key components of the modules, which meant that this form of assessment engaged with debates and issues discussed in the classroom about public history, as well as about the subject matter itself.

Findings

The data collected have been analysed in two parts relating to the overarching research questions to address student perceptions of what is being assessed or developed, and student responses to the inclusion of these assessments in history modules.

Student perceptions of what is being assessed or developed

The assessment brief is intended to assess historical knowledge and understanding, and its communication to a non-academic audience. Although this is clearly explained using varied methods, student perceptions of what is being assessed or developed are still variable. Students were asked whether the assessment had enabled them to demonstrate or develop a range of skills and competencies (shown in Table 1). These were determined both by the assessment criteria and by the rationale for introducing this assessment type.

Table 1

Student perceptions of what is being assessed or developed 

Strongly agree Agree Neutral Disagree Strongly disagree
Demonstrate originality 27 8
Develops new skills 21 11 2 1
Use skills that other forms of assessment do not 30 5
Increase my confidence 11 16 6 1 1
Be more motivated 17 11 6 1
Be engaged with the module 27 4 3 1
Develop learner autonomy 15 15 5
Understand the module content better 13 17 5
Demonstrate what I knew and understood more effectively 16 10 8 1
Be more creative or innovative than other forms of assessment 29 5 1
Communicate ideas effectively 14 19 2
Develop research skills 12 21 1 1
Be analytical 6 17 11 1
Engage with the historiography 10 18 5 2
Engage with primary sources 25 9 1
Engage in ‘deeper’ learning 16 15 3 1
Enhance problem solving 14 12 8 1
Develop time management and organisational skills 13 15 5 2
Feel empowered 15 13 4 2 1
Connect with your personal interests 21 10 3 1

Most students, perhaps not surprisingly, strongly agreed or agreed that this assessment enabled them to develop new skills, use skills that other forms of assessment did not, and be more creative or innovative. Perceptions of ‘creativity’ varied, with some students immediately identifying ‘creative’ potential and others stating that they are ‘not at all creative’. Creativity, however, is a way of thinking, problem solving and communicating, which can take many forms and, as outlined earlier, is specified by the QAA as a quality of mind that history graduates should be able to demonstrate. In addition, the majority also strongly agreed or agreed that it enabled them to demonstrate originality, communicate ideas effectively, develop research skills, engage with historiography and primary sources and understand the module content. Although these are criteria for most history coursework assignments, students recognised that these could be met and developed using a non-traditional format. Moreover, some students struggle to demonstrate originality except in extended research projects, such as the dissertation. The absence of an essay question transformed this into a research project, which most students also strongly agreed or agreed enabled them to engage with the module, engage in deeper learning, feel empowered and connect with personal interests. This also suggests a crucial relationship between engagement and empowerment on the one hand, and effectively demonstrating knowledge, understanding and originality on the other.

For some statements, more students responded neutrally. For example, although most still strongly agreed or agreed, a few more were uncertain as to whether the assessment had enabled them to demonstrate what they knew and understood more effectively. A more even spread of responses from strongly agree to neutral was evident for being analytical and enhancing problem-solving skills. This is important, and it further demonstrates the significance of student perceptions regarding assessment. Depending on perspective, all history assessment could be deemed to be a form of problem solving. This assessment definitely involves problem solving, not least in terms of determining how to communicate complex historical research to a non-academic audience, and all students who pass this assignment have demonstrated problem-solving capabilities. There is perhaps a very subjective element to how this question was interpreted. If students already felt confident with problem solving, they may not have felt that the assessment developed this skill, or they may not have identified a strong link with it. From a practitioner perspective, this highlights a skill area worth discussing further with students, and perhaps a presumption about the extent to which students are developing problem-solving skills in other assessments (and, indeed, other areas of their lives). Students also analyse material before presenting it, as well as thinking critically about how to present it. The responses of those who were neutral or disagreed that the assessment enabled them to be analytical may be indicative of two aspects observed in practice: a tendency in a small proportion of outputs to be more descriptive, simply providing information, and the unfamiliar assessment format meaning that analysis was less transparent than in essays or source analyses. From a marking perspective, first-class outputs had undoubtedly demonstrated analytical and problem-solving skills. A couple of students disagreed with most statements, and, thus, based on these responses alone, saw little value in the assessment type. This may reflect lower grades awarded, and a preference for more traditional assessment types. It is important to acknowledge that such assessments cannot necessarily suit everyone, but that does not mean that they should not be introduced as part of a varied assessment diet.

Students were asked what the relationship was between this assessment and both the rest of the module and the rest of their degree. These were open text questions, and responses focused on links to content, approach, and the intrinsic and extrinsic value as perceived by students. Both questions were interpreted in varied ways, but that made the responses even more insightful in terms of student perceptions of skills development and alignment between assessments, modules and the degree. Moreover, the free text comments in response to these questions also drew out more detail in relation to the skills and competencies outlined above. With respect to the relationship with the module, students reported that the assessment enabled them to explore the module content in greater depth, allowing them to: ‘utilise a breadth and depth’ of knowledge from across the module; ‘explore the module content in depth’; ‘develop a deeper understanding of the module content and how each topic links’ and ‘do a “depth” study into little elements in a way you can’t as much in essays or especially exams’. Respondents valued the way in which they felt the assessment enabled them to explore the module in different ways, and to refine the focus according to their interests, making more connections across the module. This is an important element, which could be equally relevant to other assessment formats where an element of choice, enquiry or problem-based learning is facilitated, but many students saw this as being specific to this type of assessment.

Students also made connections between the content of the module and the approach necessary for the assessment. For HIST 3091, students noted ‘Clear links to art and photography … encourages creativity and the digestion of the countryside in the same original way that artists, writers and photographers viewed the land’, and ‘when learning about representations of the countryside, this type of assessment leant itself very well as they could be considered as pieces of public history’. Similarly, for HIST 2046, a student noted that ‘the project fitted with the aspect of learning how poverty and disease are represented in public history such as films and museum exhibition. It was interesting because many modules did not look at representations in public history’. Close links between learning, teaching and assessment are important, and, as these students highlighted, both modules explore historical and contemporary representations and examples of public history. Indeed, one student reflected that the ‘strong theme of representation’ meant that the assessment fitted really well, but added, ‘This might not be the case for every module’, acknowledging the alignment between the module and the assessment.

Other respondents focused on approaches to public history and communicating with external audiences. For example, one student reflected that this assessment enabled them to ‘take a different perspective on the historiography by translating it into a public history outcome’. Others reflected: ‘it was a good way of exploring and understanding how public history works in practice’, ‘It allowed me to consider in depth how the module content is typically presented to the general public, why this is and the limitations of this. I believe public history to be vitally important and so very much enjoyed this’, and ‘it brings the past to life allowing it to be more engaging’. This reinforces not only the idea that different skills are being acquired and assessed, but also the extent to which students demonstrate the importance of public history and the application of history in society; it also provides another way to demonstrate and assess historical research skills.

Not all students answered this question, some noting that they were not quite sure what the question meant, and a couple of students did not feel that there was a clear connection between the assessment type and the module. One student simply said that the relationship was ‘minimal’, with no explanation, and another noted that it was ‘the same as other assessments – you get marks and they contribute to your degree’, which perhaps represents a rather strategic or pragmatic perspective. For others, it was the perceived wider applicability of the assessment which ‘could be applied to the majority, if not all, of the history modules’, which meant that they did not feel there was ‘an independent connection between the assessment and the module’. Another student said that it was ‘A better way to engage with my learning, and could easily be incorporated into other modules too for the same effect’. This perceived value and transferability shows the potential of alternative assessments as providing a different way to assess history. Nevertheless, from a practitioner perspective, the extent to which the assessment and content were closely interlinked in the design of these modules was important, and would require careful consideration when rethinking assessment.

In terms of the relationship between this assessment and the rest of the degree (which could be single or joint honours), responses explored public history and assessment literacy, as well as what students perceived to be the intrinsic and extrinsic value. A few students commented on the value of public history, and the extent to which it was touched upon in the content of other modules. Other connections made to public history moved beyond the components of their degree (that is, other modules) to the relevance and application of history beyond the degree. For example, students commented:

History is all about being able to interpret the past and explain it in the modern day! Creating a public history piece helped me to understand how to put history across to different audiences.

Having to do an assessment centred around public history, and ways in which history is and can be consumed by the public felt really important to grounding the rest of the degree in the understanding that beyond academic history is public history, and the value of knowing how to translate the academic understanding in a way the public can consume.

It gave me the opportunity to create a history resource myself, one that could be widely used and to encourage the general public to engage interactively with history.

This type of assessment is well suited to the expression of history to the wider public, and engaging with historiography and understanding of the subject on a deeper level. This is largely what history as a whole tries to emphasise.

It was useful for understanding some of the practical ways that history can be used (post-degree).

Assessment literacy or progression through the degree was also evident in many of the responses to this question. Several responses noted the acquisition of different skills, as well as the opportunity to build on ones developed elsewhere in the degree. Occasionally, students noted increased confidence with creative history assessments, but were not sure if it was relevant to the rest of their degree. Interestingly, although students would go on to discuss the relationship between this assessment and future careers in terms of transferable skills, only a few students made a connection between the varied skills developed (and acknowledged by students in Table 1) and their application to other assessments. In these instances, reference was made primarily to research skills and dissertation topics.

There were some very powerful responses to this question relating to the value of a history degree and more varied assessment types. For example, one student said, ‘I picked an arts degree, which encourages creative ideas and thinking. This project was just a more literal interpretation of that creative thinking, which was really nice’. Others referred to emotional responses such as greater enjoyment and positivity, including ‘I feel this assessment has contributed greatly to my enjoyment of my degree and this has inspired greater focus and effort in the rest of my degree as it made my workload feel more varied and interesting’, and ‘It made me feel more positive as I found this year challenging’. Other students noted how memorable this assessment was, and how, in turn, that made the module content memorable. For example, one student said, ‘It brought a sense of uniqueness and definitely a piece of assessment I will remember when I graduate as it stands out against all the other essays’. Another reflected:

It is the assessment I remember and have something to show for. I did a mini folk album, I will have those songs and my writers [sic] notes forever. There are history essays I did in second year that I don’t even recognise, I can’t remember doing them or anything I said in them. I think that in itself is revealing enough … history is inherently out of date so it has to foster connection to the present and be applied in our everyday life where possible to really hammer home that it is an important and useful discipline. I think the creative piece really elevates the module, and degree, because it forces the application of history to the real world.

This is indicative of the value both of history and of innovative forms of assessment.

Other students responded to this question in terms of transferable skills and employability, including what they referred to as ‘the kind of skills you talk about in job interviews’, the ability to show not only understanding but also its application ‘in a real world context’, ‘practical skills that I could use and talk about in interviews’, and finally ‘It shows that history can be creative as well not just retaining knowledge (challenges the stereotype!)’. More students discussed the relationship between this assessment and employability when directly prompted. Students were asked whether they felt that this type of assessment would have links to future employment, with the option of a free-text response. Most respondents felt that there was a link, but the nature of this varied considerably. This perhaps reflects the wide range of careers that history graduates go on to pursue. For some students, there was a direct link between public history and creative assessment, and careers such as teaching and heritage. Explanations included thinking about audiences, ways to present information and make content accessible, and how history is consumed by the public or engaged with by children.

Many more respondents discussed transferable skills without reference to specific careers. Some specifically stated that they were not intending to pursue careers in public history or were not sure what they wanted to do after the degree, but that they still appreciated the range of skills being developed as relevant to future employment. These skills included research, analysis, creativity, independence, awareness of audience, visual and written communication, presentation of ideas, time management and problem solving. A couple of students mentioned developing technical skills, including presenting information in new formats and video editing. Longer responses demonstrated an understanding of the value of communicating complex research in an accessible form to a non-specialist or non-academic audience. For example, one student said:

I feel that the skills I developed in this assessment would be beneficial to show to future employers. For example, the assessment shows I have the ability to take large and detailed amounts of data or information, identify the most important or relevant aspects and distil them into easily accessible media. It also shows, in conjunction with other forms of assessment, that I can handle various different types of work or project.

This idea of demonstrating a different skill set not always reflected in other formats was raised by a few students, with one noting that they had found these skills to be in demand when applying for jobs. One student said, ‘Definitely, being creative and thinking outside the box is essential to any job’, while another noted that ‘Rather than just research and analytical aspects from essays and exams, the creative assessment promotes individuality, creativity, problem-solving, and time management. It brings uniqueness to future employment’. Finally, a student connected this to the value of history and its application in modern society, reflecting that ‘Regular people don’t read essays but they do watch The Crown on Netflix. These projects allow students to shine and go into the working world with stuff they can tangibly demonstrate their creative and analytical ability to translate history into today. So yes, it definitely does’.

Other respondents were more sceptical, often due to uncertainty about future careers or the lack of a direct link between the assessment and specific career pathways. A couple of students showed awareness that it could be relevant for specific jobs, but they did not feel that it was for them personally. For example, ‘It could depending on the job type, but I don’t think it will have direct links to my future career’, and:

Potentially. If someone were to go into widening participation, teaching, or perhaps blog-writing, etc. and used similar techniques in their public history output then it could be a good example to produce for an employer. However, if that is not someone’s career aim then it may only be helpful in linking to improvement of research skills (as most of the degree and other forms of assessment would too).

Only one student responded to this question by saying ‘Not one bit’. However, some respondents certainly only saw relevance if the actual output could be replicated in the workplace or in a career they planned to pursue. In rethinking assessment, it is evident that students may not always understand, or be able to articulate, the transferability of skills developed in different types of assessment, and their application to different contexts.

Student responses to public history and creative assessments in history modules

Students were asked a range of questions that related to how they felt about, and responded to, the inclusion of these assessments in history modules. This is, of course, important, as assessment can shape learning experiences, and, although innovative assessments are often viewed positively by students, their unfamiliarity can sometimes lead to uncertainty and anxiety. Questions related to the extent to which the assessment affected choice of modules and subsequent enjoyment thereof, as well as to perceptions of inclusion and challenge relating to this assessment.

Table 2 shows student awareness of the assessment type before module enrolment. Just over half of respondents were aware before selecting the second-year option module, whereas a much higher proportion were aware of it as part of the final year module (and several students identified as having undertaken both modules). A follow-up question asked students whether this awareness had influenced their selection of the module. As Table 3 suggests, a higher proportion of students chose the final year module partially based on assessment type, whereas this was less important for selecting the second-year module.

Table 2

Awareness of the public history assessment before selecting the module 

Year Yes No
Year 2 14 10
Year 3 14 2

Table 3

Influence of assessment in selection of module 

Year Yes Perhaps No
Year 2 8 1 4
Year 3 12 1

In providing explanations for why this assessment had influenced their choice of modules, students identified four key themes: a desire for variety, scope to be creative or imaginative, enjoyment and strategic factors (playing to strengths). Variety was noted as particularly important due to the impact of Covid-19, which some students saw as restricting the range of assessment types and multiplying the number of essays they were writing. Students also valued the opportunity to use varied skills, with one saying that it ‘was very appealing as it would provide some variety in the ways I would be working, and the skills I would be using’. Several students stated that the assessment was one of the main reasons for selecting the module, noting:

I liked the fact that assessments were not solely essays. The chance to be creative and imaginative really appealed to me.

I thought that this method of assessment would give me the chance to explore history in a creative and more public minded way.

I thought it was unique and definitely something I wanted to experience.

Several responses embraced multiple themes, such as ‘I thought it was really refreshing to see something other than an exam or essay as a component of the module, and as a creative person, felt choosing this module played to my strengths and would be something I enjoyed’. A few students made it clear that the assessment was not a deciding factor, and that they would have selected the module based on content anyway, but that they still found it an enjoyable and welcome addition.

For those respondents who said that they were not aware of the assessment, first impressions were often emotive. Students noted that they were excited or curious to try something new, but uncertain how to achieve high grades. A few students explored the importance of doing something different, with one stating that it ‘acted as a break from the usual essays but equally as academic’, and another noting that ‘it gave variety from essay writing and a chance to do something creative in a year that was very screen time heavy’. Again, students revisited the importance of having the opportunity to express ideas using different formats. The support and guidance given were cited as important in overcoming this uncertainty. Responses explored assessment literacy, albeit not explicitly, by balancing initial apprehension with enjoyment and achievement after completion. One student noted:

I was very apprehensive at the thought of a creative output replacing a written assessment in 2nd year as it wasn’t like anything else I’d done before. All I had done up until that point was essays, which to be honest I was still trying to get my head around, familiarise myself with, and was quite overwhelmed by. After I had completed this assessment in 2nd year, I really had enjoyed the creative assessment and it definitely drew me to the 3rd year module.

This highlights the importance of providing opportunities to develop confidence with assessment types throughout the degree, but also of mapping how the skills acquired through one assessment format transfer to others. Within the scope of this study, it is impossible to know whether any students did not select the modules on account of the assessment format, but at least one student reflected that ‘I was concerned about not being very creative and worried that it might affect my grade, but not enough to deter me from doing the module because I liked the content’.

Perceptions of inclusivity

Students were asked whether they thought this type of assessment promoted inclusivity. Again, this was interpreted in varied ways, with some students reflecting not only on their degree, but also on the importance of making historical research more accessible and ensuring that the discipline is inclusive. With reference to the assessment itself being inclusive, students reflected on being able to use ‘personal skill sets – wherever your strengths lie’, to ‘play to your strengths’, and to draw on ‘different skills’ and ‘different learning styles’, because it could take any format and enabled them to showcase their understanding of the module accordingly. One student summarised this as offering ‘another way to be assessed’. Further comments about the importance of providing different ways to be assessed included:

Allows people to put their thoughts and arguments forward in a way that they’re most confident with

allows people who may be very competent intellectually but perhaps not so good at writing essays to demonstrate their abilities

It’s more inclusive for those who might struggle with memory or writing essays

The creative piece might provide some respite for dyslexic people, it also allows people to flourish beyond the essay format for whatever reason

allows a wider range of skills to be showcased by students

It allows those who feel more creative to have an outlet and display their own talents within the degree

In each case, the importance of having different means by which to demonstrate historical knowledge and understanding was highlighted. With regard to inclusivity and the discipline, one student said, ‘I think this form of assessment does an excellent job of teaching university students the importance of public history and thus can make history a more inclusive discipline outside of universities.’ Several responses mentioned audience, including how to connect with varied audiences, and how this shapes the way research is communicated.

A few students recognised that although they found this type of assessment inclusive, it might equally have been daunting to others. For example:

I used to find it really difficult to get high marks on essays, so for someone creative like me, I felt really comforted by the inclusion of this project on the module. I can imagine it might scare some people though, but sessions were run so that people understood the task and it didn’t have to be creative in an artistic sense, so in some capacity, i think it would appeal to anyone.

it offers a reprieve from traditional essay writing and a chance to showcase understanding, independence and originality. Again, it is dependent on an individuals [sic] way of learning and executing a display of their knowledge. If someone is not particularly creative the assessment may have been more frustrating than say an essay. But it was a good option in amongst the other two module assessments.

I think it allows people with different skills to use them. I do have Accessibility concerns regarding this assessment being more likely to mean costs for students- poster boards, cards, printing, etc.

In many cases, a great deal of reassurance was required to support students with an assessment format that they felt less familiar with. This was provided in various formats, as outlined earlier. In addition, after one student raised concerns about potential hidden costs associated with this assignment, additional guidance was provided that ensured that students were not spending unnecessarily on their output, and that expensive resources would not equate to a higher grade. Ideas of creativity also have to be challenged to ensure they are not confined to an artistic definition.

Perceptions of challenge

Students were also asked whether they found this assessment more or less challenging than other forms of assessment. Most students reported finding this type of assessment more challenging, but qualified this by stating that this was not necessarily a bad thing. Indeed, one student described this as ‘an enjoyable challenge’. Greater challenge equated to the assessment being more time consuming, requiring a different way of thinking and working, and generating a lack of confidence stemming from unfamiliarity with the format and uncertainty about the standard of work being produced. Reassurance was particularly important with regard to many of these challenges. One student, having outlined the challenges they faced, said that the tutor ‘was very helpful in giving examples but also relevant guidance’.

The challenges were countered by how enjoyable, memorable and rewarding students found the assessment, and by the higher grades some students were awarded. Several students mentioned that the enjoyment, and even the challenges themselves, led to increased motivation to complete the assessment. With regard to the challenges they experienced, one student said, ‘Worth it though, as I could still tell you about everything I included within the project, where with essays I’ll have forgotten the details of the content quite immediately’. With reference to a board game, one student explained that it was ‘difficult to come up with a coherent plan that would follow through’, but that, having overcome that initial challenge, they found it was ‘a very pleasurable experience’. Another student, referring to the assessment taking longer to complete, said that this meant ‘I could work on it little and often’. A few students referred to being able to spread the workload: whereas they often could not successfully juggle multiple essays (preferring to complete each in turn), this assessment offered a welcome break from essays (and often from a screen) and was something they could work on alongside written assessments.

Another student said:

I found the creative history task a bit more challenging but this was a good thing! I only found it challenging because it was completely different to other types of assessment that we do at University. It forced us to be creative and think outside of the box. Thus, this task increased my creativity skills and this is something I can use on my CV.

Some students considered this assessment to be equally challenging, but in different ways. One said,

It makes you use different parts of your brain. In fact, conceptualising how a part of history could be best presented, and how the creative form could be best used or developed really makes you think. I think it is probably a fairer form of assessment by allowing true originality to be shown whilst being adequately challenging to really separate out those who can distil what they’ve learnt flexibly and insightfully.

Another said that it was the ‘same level of challenge as a conventional essay actual content wise. Larger challenge was the actualization ... But that in the end was v useful, for developing my wider skillset’. A few students said that they found it less challenging, even though they also mentioned finding it time consuming, or more difficult to judge the standard of work they were producing. One explained that it was less challenging because ‘it could be tailored to our strengths and interests more easily’, which corresponds to the earlier points about enjoyment and motivation. Students do require support and reassurance for unfamiliar tasks, and for those that they find challenging, but with these support mechanisms in place many flourish and embrace the opportunities that challenge can bring.

Discussion and conclusion

This article set out to determine whether ‘alternative’ and ‘creative’ approaches can make assessment a more effective and meaningful process within a subject-specific context, and to understand student perceptions of these assessments in a history degree. As discussed, ‘alternative’ is a relative term, indicating that an assessment differs from other forms and provides students with different ways to be assessed, and to demonstrate knowledge and understanding. In a history-specific context, ‘alternative’ usually refers to assessment that is not an extended piece of writing, examination or oral presentation. This case study defined ‘alternative’ as creative or public history outputs, which assessed many of the same skills and competencies as other assessments, albeit in a different format, together with audience awareness and the communication of historical research to non-specialist audiences. Introducing this assessment has been incredibly rewarding, with students not only enjoying the opportunity, but also often excelling as a result.

This research provides evidence that students value having different ways to be assessed, enabling them to develop or demonstrate the same core historical skills as other assessments (for example, historical knowledge and understanding, research, critical analysis, communication of ideas), together with transferable skills or attributes (for example, problem solving, time management, motivation, confidence and learner autonomy). Moreover, students highlighted how they felt that the assessment had enabled them to better understand module content, be more engaged, and demonstrate what they knew and understood more effectively, with a connection evident between this and the increased engagement and empowerment the assessment fostered. In addition, they recognised and appreciated the opportunity to use and develop different skills, including creativity, awareness of and communication with varied audiences, and understanding the wider applications of historical research. Students were clear about how this translated into an enhanced learning experience, encouraging them to think differently and produce something ‘interesting and unique’. Many students also reflected that it was more inclusive to have varied assessments, and that this assessment promoted inclusivity by valuing different skills and diverse ways of demonstrating understanding. It enabled students to construct and apply knowledge and understanding of a topic of their choice using a medium of their choice. By enabling students to put ideas and arguments forward in ways in which they felt most confident, it fostered learner autonomy and active engagement. A sense of community emerged, with one student observing that ‘the whole seminar group was enthused’. Students were motivated to select modules at least in part on the basis of the assessment type. Concerns about the assessment being ‘unfamiliar’ or ‘challenging’ were addressed with additional support and opportunities to develop assessment literacy. Indeed, although ‘more challenging’ was a recurring theme, this was not generally perceived as negative, and ultimately the assessment was judged as more engaging, enjoyable, memorable and rewarding. These are powerful sentiments when expressed in relation to assessment.

Students made connections between the assessment and both the module and the degree. Many felt that the inclusion of representations and public history in the module content helped to foster a strong connection between the learning and the assessment. One student even questioned whether this could be applied to other modules as a result. Others felt that the assessment type was applicable to all modules. The relationship between module content and assessment is certainly something that should be taken into consideration when rethinking assessment, particularly checking what is being assessed and what the rationale is for introducing an alternative assessment format. If students are asked to create a piece of public history, then an understanding of the genre and its relevance to the module can provide a firm foundation for the assessment. If students are being offered alternative ways to demonstrate knowledge and understanding, including more ‘creative’ forms of assessment, then this may be less relevant. Students also discussed assessment literacy and connections with future learning and careers. Many noted that this assessment increased their confidence and enjoyment of learning, enhanced their problem-solving skills and encouraged them to think ‘outside the box’, and improved visual, oral and ultimately written communication skills. Many associated the assessment with careers in teaching or heritage, although some students did articulate the transferability of these skills, especially the ability to take complex, specialist research and communicate it in an accessible and engaging manner, using an appropriate format, for a non-specialist audience.

The feedback largely aligns with existing research that suggests students generally perceive ‘alternative’ assessments as ‘facilitating better-quality learning’ because the focus is on understanding rather than memorising, and thus leads to greater engagement and enhanced learning experiences (Bloxham and Boyd, 2007; Elkington, 2020; Elkington and Evans, 2017; Richardson, 2019). It does, however, provide a subject-specific understanding of the relevance and impact, including student perceptions and reactions. Although the sample size was small and cannot begin to be representative of all experiences of this form of assessment, it does demonstrate the potential added value of diversifying assessment in a history degree. Moreover, it showcases the practical application of historical research and the value of history (and the humanities more broadly). Diversity in assessment and creativity are embedded in the QAA (2022) Subject Benchmark Statement for history. Academic historians increasingly communicate their research using varied modes. Many people do not engage with history via formal education. Through this assessment, students critically thought about the construction and consumption of historical knowledge and understanding in the public domain, demonstrated the value of public history, and reflected on the ‘real-life’ application of the tangible outputs they had created.

It should be acknowledged that diversifying assessment and laying the foundations for more innovative types of assessment can be very time consuming, and requires ongoing reassurance and support in response to uncertainties about the unfamiliarity of the assessment or the standard of work being produced, and other challenges perceived by students. It may be necessary to challenge misconceptions about creativity or about the perceived need to spend money on producing a high-quality output. It requires a different way of thinking and working for both tutors and students. Students often need more support and reassurance for unfamiliar tasks or those they perceive as more challenging, but, with this support in place, many students flourish and welcome the opportunities that the challenge can afford them, whether in terms of an enriched and memorable learning experience or in terms of the higher grades some students were awarded.

Rethinking and diversifying assessment requires decisions as to whether choice is embedded in assessment practices or greater variety is being provided across a degree programme. In some instances, it would not be possible to offer free choice of assessment type, because specific skills need to be assessed. ‘Alternative’ forms of assessment will not be everyone’s preferred choice, and nor should they simply replace existing assessment types to become the dominant mode of assessment. However, by providing opportunities to demonstrate historical knowledge and understanding using different formats, and be assessed accordingly, there is scope for students to gain confidence with historical research and the articulation of ideas, which, with support with academic writing, can be applied to more ‘traditional’ assessment types in due course. Indeed, not to consider ‘alternative’ and more ‘creative’ forms of assessment as part of a varied assessment diet threatens to replicate structural barriers to learning and to stifle the discipline. Historical skills can and should be assessed using varied means, and assessment should foster curiosity and a wide range of historical competencies, enabling students to become historians as well as to develop a wide range of transferable skills. Equally, if not more, important is the need for space to rethink assessment in subject-specific contexts, questioning how and why we assess history and what we are assessing, and asking whether there are different ways of doing this.

Reflecting on the relevance of this curriculum innovation beyond my own practice, two interconnected points are very important. First, implementing any curriculum development takes time, and research indicates that there is greater resistance to, or caution about, changing assessment than to most other parts of the curriculum (Bloxham and Boyd, 2007; Gibbs, 2006; O’Neill and Padden, 2021; Richardson, 2019; Sambell, 2016). Therefore, it is important not to underestimate the challenges when rethinking assessment and advocating change. Moreover, ideally this process should take place at programme level to ensure that innovation and change do not take place in isolation from wider curriculum design. This could include programme-level assessments, not attached to specific modules, or a programme-level perspective about how different assessments fit together, and how assessment literacy is being developed throughout the degree or programme of study. The student perspective is crucial, especially understanding how students experience and react to different forms of assessment, and what can be done to ensure they get the most out of them. Second, engaging colleagues (within and beyond the institution) in productive conversations about assessment is far more powerful than their simply adopting new assessment methods. Research advocates ‘wide-ranging stakeholder engagement’ and providing opportunities for colleagues to engage in ‘constructive professional dialogues about creativity and assessment’ (Elkington, 2020; Watson, 2014). This has certainly been my own experience from presenting at teaching conferences and organising assessment workshops, within and beyond the university. Moreover, the History UK assessment event (2022) provided an opportunity for colleagues from different higher education institutions (as well as some schoolteachers) to reflect on, and engage in, productive conversations about assessment and to respond to the QAA (2022) history benchmark statement, which asserts that diversity in assessment is vital; this work resulted in a national history and assessment project (HUK, 2024). Collectively, observations of my practice, feedback from students and colleagues, and discipline-specific discussions at national level have all reinforced the potential of these alternative and more innovative and creative forms of assessment to be effective and meaningful, and to facilitate transformative and authentic learning experiences within history. It has also demonstrated the importance of taking stock of assessment within a discipline-specific context, and asking what its purpose is, why we are using certain assessment tools, what alternatives there are, and how these could be effectively adopted.

Acknowledgements

The author would like to thank the anonymous reviewers for their comments and insights, and the editor for their support and guidance. In addition, thanks go to the students who took part in this study and colleagues who supported the development of these assessments.

Open data and materials availability statement

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.

Declarations and conflicts of interest

Research ethics statement

The author declares that research ethics approval for this research was provided by the University of Nottingham Faculty of Arts Ethics Board.

Consent for publication statement

The author declares that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.

Conflicts of interest statement

The author declares no conflicts of interest with this work. All efforts to sufficiently anonymise the author during peer review of this article have been made. The author declares no further conflicts with this article.

References

Adams, J; McNab, N. (2012).  ‘Understanding arts and humanities students’ experiences of assessment and feedback’.  Arts and Humanities in Higher Education 12 (1) : 36–52, DOI: http://dx.doi.org/10.1177/1474022212460743

Ajjawi, R; Tai, J; Dollinger, M; Dawson, P; Boud, D; Bearman, M. (2023).  ‘Authentic assessment to authenticity in assessment: Broadening perspectives’.  Assessment and Evaluation in Higher Education 49 (4) : 499–510, DOI: http://dx.doi.org/10.1080/02602938.2023.2271193

Anderson, V. (2023).  ‘How might higher education providers make assessment more inclusive? The potential for Universal Design for Learning’.  BERA, 13 January. Accessed 17 June 2023. https://www.bera.ac.uk/blog/how-might-higher-education-providers-make-assessment-more-inclusive-the-potential-of-universal-design-for-learning.

Bearman, M; Luckin, R. (2020).  ‘Preparing university assessment for a world with AI: Tasks for human intelligence’.  Re-imagining University Assessment in a Digital World. Bearman, M; Dawson, P; Ajjawi, R; Tai, J; Boud, D (eds.).  Berlin: Springer, pp. 49–63.

Biggs, J. (1996).  ‘Enhancing teaching through constructive alignment’.  Higher Education 32 (3) : 347–364, DOI: http://dx.doi.org/10.1007/BF00138871

Bloxham, S; Boyd, P. (2007).  Developing Effective Assessment in Higher Education: A practical guide. Maidenhead: Open University Press.

Booth, A. (2003).  Teaching History at University: Enhancing learning and understanding. London: Routledge.

Brown, G; Burnham, S. (2014).  ‘Assessment after levels’.  Teaching History 157 : 8–17.

Bryan, C; Clegg, K (eds.). (2006).  Innovative Assessment in Higher Education. London: Routledge.

Bulaitis, J. (2009).  Introducing New Types of Assessment Within the Discipline of History: The experience in a history module at the University of Essex. Warwick: The Higher Education Academy.

Burnell, I. (2019).  ‘Widening participation for non-traditional students: Can using alternative assessment methods level the playing field in higher education’.  Widening Participation and Lifelong Learning 21 (3) : 162–173, DOI: http://dx.doi.org/10.5456/WPLL.21.3.162

Burnham, S; Brown, G. (2004).  ‘Assessment without level descriptions’.  Teaching History 115 : 5–15.

Butler, S. (2004).  ‘Question: When is a comment not worth the paper it’s written on? Answer: When it’s accompanied by a level, grade or mark!’.  Teaching History 115 : 37–41.

Chapman, A; Haydn, T. (2020).  ‘Editorial: History education in changing and challenging times’.  History Education Research Journal 17 (1) : 1–3, DOI: http://dx.doi.org/10.18546/HERJ.17.1.01

Conway, R. (2011).  ‘Owning their learning: Using Assessment for Learning to help students assume responsibility for planning, (some) teaching and evaluation’.  Teaching History 144 : 51–57.

Corker, C; Holland, S. (2016).  ‘Using public engagement to enhance student engagement’.  Student Engagement in Higher Education Journal 1 (1)

Cottingham, M. (2004).  ‘Dr Black Box or how I learned to stop worrying and love assessment’.  Teaching History 115 : 16–23.

Counsell, C. (2023).  ‘Laughing Muppets, lost memories and lethal mutations: Rescuing assessment from knowledge-rich gone wrong’.  Teaching History 193

Counsell, C; Woolley, M; Chapman, A; McConnell, T. (2008).  ‘Editorial: Assessing differently’.  Teaching History 131 : 2.

Elkington, S. (2020).  Essential Frameworks for Enhancing Student Success: Transforming assessment in higher education. Heslington: Advance HE.

Elkington, S; Chesterton, P. (2023).  ‘The (dis)continuities of digital transformation in flexible assessment practice: Educator perspectives on what matters most’.  Studies in Technology Enhanced Learning 3 (2) DOI: http://dx.doi.org/10.21428/8c225f6e.ef49a881

Elkington, S; Evans, C. (2017).  Transforming Assessment in Higher Education. York: The Higher Education Academy.

Facey, J. (2011).  ‘“A is for assessment” … strategies for A-Level marking to motivate and enable students of all abilities to progress’.  Teaching History 144 : 36–43.

Gibbs, G. (2006).  ‘Why assessment is changing’.  Innovative Assessment in Higher Education. Bryan, C; Clegg, K (eds.).  London: Routledge, pp. 11–22.

Harrison, S. (2004).  ‘Rigorous, meaningful and robust: Practical ways forward for assessment’.  Teaching History 115 : 26–30.

Holland, S; Budd, A; McGuire, C; Williams, M; Peplow, S. (2023).  History UK: History, Pedagogy and EDI Project report. Accessed 17 June 2024. https://www.history-uk.ac.uk/history-uk-history-pedagogy-and-edi-project-report/.

HUK (History UK). (2024).  ‘Assessment’.  Accessed 17 June 2024. https://www.history-uk.ac.uk/assessment/.

Jackson, N. (2005).  ‘Making higher education a more creative place’.  Journal for the Enhancement of Learning and Teaching 2 (1) : 14–25.

Knight, O. (2008).  ‘Create something interesting to show that you have learned something’.  Teaching History 131 : 17–24.

Luff, I. (2003).  ‘Stretching the strait jacket of assessment: Use of role play and practical demonstration to enrich pupils’ experience of history at GCSE and beyond’.  Teaching History 113 : 26–35.

Luff, I. (2016).  ‘Cutting the Gordian Knot: Taking control of assessment’.  Teaching History 164 : 38–45.

Matthews-Jones, L. (2019).  ‘Assessing creatively, or why I’ve embraced the #unessay’.  Accessed 17 June 2024. https://lucindamatthewsjones.com/2019/09/11/assessing-creatively-or-why-ive-embraced-the-unessay/.

Matthews-Jones, L. (2022).  ‘Creative History in the Classroom Workshops, September 2022–January 2023’.  Accessed 17 June 2024. https://lucindamatthewsjones.com/2022/09/14/creative-history-in-the-classroom-workshops-september-2022-january-2023-online/.

McArthur, J. (2022).  ‘Rethinking authentic assessment: Work, well-being and society’.  Higher Education 85 : 85–101, DOI: http://dx.doi.org/10.1007/s10734-022-00822-y

McConlogue, T. (2020).  Assessment and Feedback in Higher Education: A guide for teachers. London: UCL Press.

Medland, E. (2016).  ‘Assessment in higher education: Drivers, barriers and directions for change in the UK’.  Assessment and Evaluation in Higher Education 41 (1) : 81–96, DOI: http://dx.doi.org/10.1080/02602938.2014.982072

Murphy, R. (2006).  ‘Evaluating new priorities for assessment in higher education’.  Innovative Assessment in Higher Education. Bryan, C; Clegg, K (eds.).  London: Routledge, pp. 37–47.

Nicol, J; Chapman, A; Cooper, H. (2018).  ‘What is the History Education Research Journal, HERJ?’.  History Education Research Journal 15 (2) : 177–179, DOI: http://dx.doi.org/10.18546/herj.15.2.01

Norton, L. (2009).  ‘Assessing student learning’.  A Handbook for Teaching and Learning in Higher Education: Enhancing academic practice. Fry, H; Ketteridge, S; Marshall, S (eds.).  London: Routledge, pp. 132–149.

Novak, K; Bracken, S. (2019).  ‘Introduction’.  Transforming Higher Education Through Universal Design for Learning: An international perspective. Bracken, S; Novak, K (eds.).  Abingdon: Routledge, pp. 1–8.

O’Neill, G; Padden, L. (2021).  ‘Diversifying assessment methods: Barriers, benefits and enablers’.  Innovations in Education and Teaching International 59 (4) : 398–409, DOI: http://dx.doi.org/10.1080/14703297.2021.1880462

Pereira, D; Flores, MA; Niklasson, L. (2016).  ‘Assessment revisited: A review of research in Assessment and Evaluation in Higher Education’.  Assessment and Evaluation in Higher Education 41 (7) : 1008–1032, DOI: http://dx.doi.org/10.1080/02602938.2015.1055233

QAA (Quality Assurance Agency for Higher Education). (2022).  QAA Subject Benchmark Statement: History. Accessed 17 June 2024. https://www.qaa.ac.uk/docs/qaa/sbs/sbs-history-22.pdf?sfvrsn=beaedc81_4.

Race, P. (2020).  The Lecturers’ Toolkit. London: Routledge.

Richardson, S. (2019).  ‘History’.  A Handbook for Teaching and Learning in Higher Education: Enhancing academic practice. Fry, H; Ketteridge, S; Marshall, S (eds.).  London: Routledge, pp. 319–330.

Sambell, K. (2016).  ‘Assessment and feedback in higher education: Considerable room for improvement?’.  Student Engagement in Higher Education Journal 1 (1) : 1–14.

Stanford, M. (2008).  ‘Redrawing the Renaissance: Non-verbal assessment in Year 7’.  Teaching History 130 : 4–11.

Wake, S; Pownall, M; Harris, R; Birtill, P. (2023).  ‘Balancing pedagogical innovation with psychological safety? Student perceptions of authentic assessment’.  Assessment and Evaluation in Higher Education 49 (4) : 511–522, DOI: http://dx.doi.org/10.1080/02602938.2023.2275519

Watson, JS. (2014).  ‘Assessing creative process and product in higher education’.  Practitioner Research in Higher Education 8 (1) : 89–100.

Weller, S. (2019).  Academic Practice: Developing as a professional in higher education. London: Sage.

Wiggins, G; McTighe, J. (2015).  Understanding by Design. Alexandria: ASCD.

Wrenn, A. (2004).  ‘Making learning drive assessment: Joan of Arc – saint, witch or warrior?’.  Teaching History 115 : 44–51.

Zemits, BI. (2017).  ‘Representing knowledge: Assessment of creativity in humanities’.  Arts and Humanities in Higher Education 16 (2) : 173–187, DOI: http://dx.doi.org/10.1177/1474022215601862