Research article

Delivering citizen science online and hybrid: impact of the Covid-19 pandemic on recruitment and engagement

Authors
  • Ana Margarida Sardo (Science Communication Unit and Air Quality Management Resource Centre, University of the West of England, Bristol, UK)
  • Sophie Laggan (Science Communication Unit and Air Quality Management Resource Centre, University of the West of England, Bristol, UK)

Abstract

This small-scale study explores the impacts of the Covid-19 pandemic on running and delivering a large-scale, international participatory citizen science project. In doing so, it seeks to identify the challenges posed by the pandemic for the research and practitioner community, as well as suggesting useful strategies, tools and helpful approaches for meeting similar challenges in the future. WeCount was a citizen science project aimed at empowering citizens to take a leading role in the production of data, evidence and knowledge around mobility in their own neighbourhoods. The project was originally designed around in-person engagement and interaction with citizens in five European case studies, but it was disrupted by the Covid-19 pandemic. In this pilot study, we conducted seven email interviews with internal project members. Data were analysed using content analysis, and five main themes emerged from the interviews: Recruitment (overall and of specific groups); Uncertainty; Logistics; Digital skills; and Blended approach. We report on the lessons learnt about online citizen recruitment and engagement to support future citizen science and other participatory research projects, not only during times of crisis, but also in deciding when it is and is not a good time to use online methods of recruitment and engagement.

Keywords: citizen science, engagement in times of crisis, online engagement, Covid-19, participatory research

How to Cite:

Sardo, A. & Laggan, S. (2024) “Delivering citizen science online and hybrid: impact of the Covid-19 pandemic on recruitment and engagement”, Research for All 8(1). doi: https://doi.org/10.14324/RFA.08.1.04

Rights: Copyright 2024, Ana Margarida Sardo and Sophie Laggan

Published on 16 May 2024

Peer Reviewed
Key messages
  • In-person kick-off workshops are ideal for building trust, but it is important to manage citizens’ expectations and to be transparent about what is and is not possible. Circumstances may change, and delivery partners need to be prepared for that.
  • To avoid excluding participants from low socio-economic groups and individuals who do not use online communication channels, it is important to build lasting connections with intermediaries and, if/when needed, to choose an online platform and tools that suit both the project team and the participants.
  • Upskilling researchers, practitioners and citizens in using online participatory tools is important, as online engagement is here to stay. Facilitator tools can be found online, and can recreate a feel similar to that of in-person engagement; these skills will be useful for the foreseeable future.

Introduction

Citizen science is the practice of actively involving the public in scientific research. It can serve to increase scientific knowledge on a topic of interest or concern (Conrad and Hilchey, 2011), can foster two-way dialogue on topical issues, and has the potential to empower citizens in their everyday lives to change behaviours and/or influence policy change (Metcalfe, 2022). The practice is arguably hundreds of years old (Aono and Kazui, 2008), and today citizens contribute to all manner of research activities, including crowdsourcing, contributing data for scientists’ use, interpreting data sets and finding solutions.

The integration of participatory and dialogic approaches alongside the traditional deficit model can contribute to the democratisation of science (Cohen et al., 2017; Deutsch and Ruiz-Córdova, 2015; Metcalfe, 2022; Theocharis et al., 2021). While some empowering approaches have been implemented (Cohen et al., 2017; Deutsch and Ruiz-Córdova, 2015), many citizen science projects still assign citizens a contributory role rather than a community-leadership or co-creation role, with citizens primarily serving as data gatherers, and having less involvement in leading projects or setting aims and objectives (Kullenberg and Kasperowski, 2016). A meta-analysis conducted by Kullenberg and Kasperowski (2016) found that data collection and classification are the primary focal points of citizen science, while public participation is the least frequently cited area of focus. Overall, it is important for citizen science projects to prioritise the involvement of citizens in decision making and leadership roles, in order to fully realise the potential of democratisation in science and society.

The growth in popularity of citizen science stems from technical, scientific and political changes – a rise in internet and smartphone users (Haklay, 2013), a ‘participatory turn’ in science studies regarding the involvement of non-experts in research (Jasanoff, 2003), and a more supportive funding landscape, with the European Commission, in particular, funding more citizen science projects each year (Wagenknecht et al., 2021). Despite this increase in popularity, growth has been largely uncoordinated, leading to several challenges. For example, there are conceptual challenges about what citizen science is and what its role should be (that is, deficit, dialogic, participatory or a combination of all three). Of particular importance for this article, citizen science projects often work in silos, which makes them more vulnerable to external shocks if political priorities change, funding ends or there is a socio-environmental disaster, such as a flood or a pandemic. To ‘mainstream’ citizen science, and to overcome some of these challenges, platforms have been established to improve the funding opportunities, coordination, delivery, evaluation and impact of the growing network of projects, and much of this work happened during the Covid-19 pandemic (ACTION Project, 2022; Wagenknecht et al., 2021).

The impact of the Covid-19 pandemic on citizen science projects

The pandemic impacted every aspect of our lives, from human health and behavioural patterns to the ecology of the entire planet, making scientific study during this time of particular interest to researchers. Researchers were quick to adapt to imposed restrictions, with some resorting to citizen science to fill in gaps in data collection, analysis and dissemination; some argue that this model should continue beyond the pandemic (Ammendolia and Walker, 2022). Forced indoors and online, existing online citizen science projects saw a spike in activity during the first lockdown (Dinneen, 2020), while new ones emerged, such as the ZOE app (ZOE, 2022), which rapidly crowdsourced data on Covid-19 symptoms and later risk factors. Many online citizen science projects, such as eBird and iNaturalist, thrived in the pandemic, with increases in observations made by the public as they sought relief in nature, although this varied according to location, with community science participants shifting to more urban settings (Crimmins et al., 2021; Rose et al., 2020). One project found that the motivations for joining changed during the pandemic, from more extrinsic to more intrinsic drivers (Diethei et al., 2021). Digital technologies aided this success, as they linked socially distanced professional scientists with citizen scientists. However, some citizen science projects did not see the same benefits as others from virtual events or activities. Kishimoto and Kobori (2021) found that for some projects, the number of participants and observations decreased substantially, despite the shift to virtual events. Interestingly, Kishimoto and Kobori (2021) also observed that while the most enthusiastic participants continued to contribute to these projects at similar levels to previous years, participation from less enthusiastic volunteers drastically declined. This suggests that virtual events may not have been as effective in engaging less motivated individuals, compared to more motivated ones. Rose et al. (2020) have seen an increase in participation in citizen science in South Africa, and they argue that the lockdown environment may have led to more participants wanting to take part and contribute to such projects.

Engaging citizens online and in times of crisis

Continuous engagement is the most important but most effort-intensive task for any citizen science project (Dickinson et al., 2012). For online engagement in citizen science, there needs to be flexibility in how and where the work is carried out; tools, platforms and activities should be selected for their ability to allow participants to connect, collaborate and feel ‘close’ to one another; and resources need to be made available in alternative formats (Richter et al., 2020). Fundamentally, information needs to be clear and simple, minimising the need to be online more than necessary or to go through numerous resources (Richter et al., 2020; Van Haeften et al., 2020). Furthermore, contributions can be positively influenced by making online and offline tasks enjoyable, promoting the social purposes of the project, and providing opportunities to be part of an active community after the experience (Cappa et al., 2016).

Kishimoto and Kobori’s (2021) review highlights the need for careful consideration and planning when shifting to virtual events or activities. While virtual events can offer many benefits, such as increased accessibility and convenience, they may not be equally effective in engaging all participants. Therefore, it is important to develop strategies and approaches that can effectively engage a diverse range of individuals, and ensure that the benefits of virtual events are seen by all.

This pilot study delves into the effects of the worldwide Covid-19 pandemic on the WeCount citizen science project, specifically examining the challenges that the project team faced in recruiting and engaging citizens during the initial stages of the pandemic. This study aimed to shed light on the project team’s experiences during this unprecedented time, and on how they adapted to continue delivering the project, despite the challenges. The WeCount team quickly moved to virtual approaches, adapting methods where necessary to continue engaging with citizen scientists and to collect data remotely. The team’s flexibility and creativity in response to the pandemic provide valuable lessons for other citizen science and participatory research projects, which will undoubtedly face similar challenges in the future. This article includes a set of practical guidelines, generalised to other types of crisis, especially situations that impose restrictions on physical contact and require emphasis on online contact.

Methods

The WeCount project

WeCount (Citizens Observing Urban Transport; 2019–21) was a Horizon 2020-funded Science With and for Society citizen science project that aimed to empower citizens to take a leading role in the production of data, evidence and knowledge around mobility in their neighbourhoods, encouraging them to use the evidence to raise local awareness and to influence policy (WeCount, 2021). The project was designed to engage with citizens in five European case studies: Leuven (Belgium), Madrid and Barcelona (Spain), Ljubljana (Slovenia), Dublin (Ireland) and Cardiff (UK). Originally, the project team planned to use face-to-face recruitment, engagement and interaction to involve citizen scientists in co-designing the data platform, collecting and analysing data, engaging with key stakeholders and ultimately taking action to influence local citizens and decision makers. Just as the project began recruitment and platform co-design, the Covid-19 pandemic started, presenting numerous challenges to the original plans. In particular, the face-to-face recruitment and engagement methods were no longer possible due to travel, gathering and social-distancing restrictions. As restrictions changed, WeCount found ways, when permitted, to engage citizens in open spaces.

WeCount used participatory citizen science methods to co-create solutions for traffic issues (Sardo et al., 2022). To collect data, each participant in the WeCount project mounted a low-cost, automated road-traffic counting sensor called a Telraam (Figure 1) on a window in their home with a clear view of the road. This sensor (comprising a Raspberry Pi and camera) tracked the number and speed of cars, large vehicles, cyclists and pedestrians in the neighbourhood. Throughout their involvement in WeCount, citizens provided feedback on the sensor and digital platform, leading to continuous updates aimed at improving the user experience. Participants were able to monitor street traffic and analyse mobility data through the Telraam digital platform. Alongside in-person sensor pickup and drop-off events when permitted, citizens engaged in workshops on the following themes: collective problem formulation; sensor assembly and Question & Answer (with accompanying how-to videos); data co-analysis; citizen-led dissemination of findings; and advocacy for policy change. Through collaborative efforts in workshops and on the data platform, citizens co-created solutions to address local mobility issues.

Figure 1.

A Telraam traffic sensor attached to a window (Source: WeCount project)

Interviews

Interviews are judged in the literature to be a useful evaluation method, as they directly access the observations, insights and experiences of participants (Grand and Sardo, 2017; Tong et al., 2007). For this study, email interviews were used, in an effort not to place any extra pressures on project staff during what was already a turbulent time with numerous impacts on their work and personal lives. Email interviews have been described in the literature as a suitable method in research (Hershberger and Kavanaugh, 2017; James, 2017). James (2017) argues that email interviews provide a research design that is suited to participants’ lives, allowing people to take part who might not have been able to if the interviews were taking place face-to-face, online or over the phone. Hershberger and Kavanaugh (2017) have compared the appropriateness of email interviews versus phone interviews, and have found that email interviews, when well designed and planned, provide rich and critical data.

The interview schedule was a semi-structured design, including open-ended questions that allowed participants to provide answers in their own terms (Groves et al., 2004; Tong et al., 2007). The questions explored how the WeCount consortium was adapting to a new way of interacting with citizens, as well as the challenges that they were experiencing and the mitigation strategies that had succeeded. The interview schedule comprised 11 questions, focusing on the impacts on recruiting participants and the impacts on delivering workshops and events. For each of these two main themes, the questions explored general impacts, changes in terms of the original strategies, helpful and unhelpful approaches, and challenges faced by the project team.

The WeCount evaluation team led the research, and invited team members to participate in the interviews. Interviewees were made aware of the research during an online meeting, and then individually invited via email. Seven team members were invited and interviewed about how the pandemic was impacting citizen engagement and recruitment. The interviewees represented all six case study leads (one city had two case study leads, co-sharing the role), and the WeCount Technical Coordinator.

Data analysis

The answers were read in full, and the qualitative data were analysed and coded by hand by two members of the evaluation team, using the step-by-step guide to thematic analysis developed by Braun and Clarke (2006), searching for themes that captured patterned meaning across the data: (1) familiarisation with the data; (2) generation of initial codes; (3) searching for themes; (4) reviewing themes; and (5) writing up the report. The coding of the data focused on the recruitment and delivery impacts of the pandemic. For each of these main themes, coding explored general impacts, changes in terms of the original strategies and successful/unsuccessful mitigation strategies.

All interviews were coded by the same researchers, and the codes were refined and accumulated into themes that represented the semantic meaning across the data set. Secondary analysis was performed with review by the co-author to ensure that the themes adequately represented the original data. Following thematic analysis of the interviews, five main themes were identified from the responses provided by the interviewees: Recruitment (overall and of specific groups); Uncertainty; Logistics; Digital skills; and Blended approach.

Results and discussion

WeCount directly engaged with more than 1,000 citizens and stakeholders through workshops, seminars and school activities. A total of 368 citizen scientists owned a traffic sensor and actively contributed to the project. An estimated 230,000 people were engaged indirectly through social media and the project website. Across case studies, a total of 52 events and workshops took place; most of these were online. The most common age categories were under 16 years old (29 per cent of participants) and 35–49 years old (28 per cent), and the project had a nearly equal split of males and females (51 per cent : 49 per cent), with participating citizen scientists being highly educated (82 per cent). Based on the data collected, it was estimated that 10 per cent of WeCount citizens had a low socio-economic status. The original targets were that each case study would have 200–400 participants (1,000–2,000 in total), with a total of 25 per cent of participants from low socio-economic groups across case studies (Sardo et al., 2021).

In this section, we present the results, and discuss them around the five main themes that emerged from the interviews: Recruitment (overall and of specific groups); Uncertainty; Logistics; Digital skills; and Blended approach.

Theme 1: Recruitment

Recruitment (both in general and of specific demographic groups) was one of the main themes in the interviews. Having to move all recruitment online was mentioned by several Case Leads as ‘the [only] possible strategy’, although not necessarily the most effective. Some recruitment strategies were originally already online, such as raising awareness of the project on social media. However, some case studies were relying heavily on in-person recruitment strategies, such as door-to-door campaigns, and activities and events at strategic community groups, which all had to shift to Zoom and Teams online calls:

All recruitment has moved online. However, we noticed that when we were having a chance to talk to potential participants [in person], this was way more effective [at recruitment]. (Case Lead 04)

However, this was not the case across all case studies: two Case Leads had originally planned their recruitment strategy to rely heavily on online tools, such as social media campaigns, which were unaffected by the pandemic:

Social media was always going to be the main recruitment strategy. Face-to-face events may have also helped but not having them isn’t a major deterrent. (Case Lead 06)

New forms of recruitment also emerged, such as engaging with television and radio – approaches that the team had not considered before. Interestingly, face-to-face recruitment was also challenging. Two Case Leads reported that having to organise face-to-face recruitment workshops (when this was allowed) was, in fact, the biggest challenge, as people did not want to attend in person. As the pandemic worsened, people became ‘afraid to join’ (Case Lead 01), due to concerns regarding the virus.

Reaching the target groups in an indirect way – for example, by liaising online (via Zoom calls and workshops) with community centres and schools to install the sensor, rather than liaising directly with citizens – was an effective, albeit not ideal, way to engage with the target audience. A more focused and targeted recruitment strategy proved helpful for several Case Leads: targeting specific volunteer groups, leveraging activists, using existing networks of contacts and having the project endorsed by local venues and institutions, as well as existing communities.

WeCount citizens were encouraged to become dedicated ‘local champions’ to spread awareness of the project and to support other citizens in their neighbourhood. This role had already been planned prior to the pandemic. Having local champions proved crucial in bringing people together and inspiring others (for example, by sharing their findings with neighbours, in their windows or via online workshops); however, there were fewer than hoped, as a result of not being able to meet physically. The strategies described above (focused and targeted recruitment, endorsement by local venues and institutions, and the use of local champions), all indirect ways to contact and recruit citizens, showcase how access to gatekeepers was instrumental in recruiting citizen scientists.

Recruiting participants from underrepresented or marginalised groups, such as those from low socio-economic backgrounds, can be a complex and challenging task in the best of times (Narui et al., 2015). Engaging with these participants can be even more difficult in times of crisis, requiring a significant investment of time, effort and resources to build relationships and establish trust. In WeCount, the team faced challenges in both recruiting and engaging with participants from low socio-economic groups. One of the major challenges in engaging with participants from this demographic is a lack of trust and scepticism about science and experts (Jones et al., 2006). This mistrust can be especially pronounced in online interactions, where people may be more wary of strangers, and may fear being exploited or mistreated. As Goffman (2005) points out, people tend to rely on cues such as body language, tone of voice and other non-verbal cues to assess the trustworthiness of others. In an online environment, many of these cues are not available, which can further exacerbate mistrust and reluctance to engage.

To overcome the challenges of engaging with marginalised communities, researchers can adopt a variety of strategies, such as using participatory research methods that involve community members in the research process from the outset, and working closely with community groups to build trust and partnerships and to ensure that resources are made available to these groups (for example, laptops, tools that work on mobile phones, access to wi-fi, and offline ways to contribute). As Goffman (2005) points out, the importance of face-to-face interaction in building trust cannot be overstated, but with careful planning and sensitivity, it is possible to build strong relationships with marginalised communities even in online settings.

Although online recruitment of participants, such as using social media, is an effective strategy for most target audiences or participant groups, recruiting participants with certain demographics, such as senior citizens or people living in areas of low socio-economic status, proved trickier for WeCount when done solely online, even with a combination of social media and online live events. Reaching participants from low socio-economic groups, for example, was reported as more challenging in the online world, as there are very few clear ways to access them. Aerschot and Rodousakis (2008) and, more recently, Mirazchiyski (2016) argue that people living in areas of low socio-economic status are particularly affected by the digital divide (the gap between those who are able to benefit from the digital age and those who are not). Lack of access to technology (laptops, tablets and so on), technical and/or digital skills, and/or a reliable internet connection negatively affects the ability of members of low socio-economic groups to participate (Mirazchiyski, 2016).

Originally in WeCount, local teams would go through an intermediary (such as an advocacy group, support organisation or community worker) to arrange face-to-face group meetings, for example, in a school, community centre or care home. However, all these routes were unavailable during the pandemic (for example, schools were closed). Some of the planned recruitment strategies simply had to be abandoned. Door-to-door campaigns and distribution of leaflets to community centres, youth groups and similar groups were no longer possible, as these venues had to close. Plans to recruit participants via informal spaces, such as churches, barbers or on the street, also had to be abandoned due to restrictions imposed by the Covid-19 pandemic:

We can’t leave information leaflets with community centres/youth groups or similar groups, because they are closed right now. Recruitment via church parishes isn’t possible at the moment because they are closed. (Case Lead 02)

In WeCount, the original ambition was that 25 per cent of all participants would come from low socio-economic groups. In the end, from the data gathered, it was estimated that 10 per cent of citizens had a low socio-economic status (Sardo et al., 2021), a number below the initial target.

Theme 2: Uncertainty

Dealing with constant and ongoing uncertainty was challenging in WeCount. During the pandemic, rules and restrictions on movement and meeting other people changed several times, sometimes very quickly, without much notice. Planning workshops and events is very hard when society is having to react to constantly changing policies and government advice. Case Leads described difficulties around having to define a new approach despite the uncertainties, without knowing which restrictions would be in place:

It’s difficult to recruit people if we can’t give them timelines. (Case Lead 02)

Contributing to uncertainty, and equally challenging to manage, was the sudden change in priorities for some external project partners. Several local authorities and schools had committed to be part of the project, but at the time of implementation, they were facing significant challenges in their daily activities, such as having to adapt materials for online teaching, learning new tools for online teaching or deploying staff to support the community in dealing with the pandemic. This meant that many external partners could no longer participate in the project. Not surprisingly, potential citizens targeted by the WeCount project reported a huge increase in online events and email-based communication. This was another challenge for the project, as people started getting tired of online/email engagement:

People’s feedback confirms that during this pandemic there has been an exponential increase of events online and mail-based communication. This did not help WeCount. (Case Lead 04)

Theme 3: Logistics

The Covid-19 pandemic disrupted people’s lives in numerous ways, with many experiencing stresses as a result of working from home, home schooling and caring for family members, among other factors. The WeCount team recognised this burden, and adapted their approach to accommodate the needs of participants. By being flexible with scheduling, providing various ways to contribute, acknowledging all contributions equally, and recording live sessions (so that they could be accessed after the live event for those who could not attend), the team demonstrated care and attention to participants. Supporting materials, for example, printed and video ‘how to’ guides, an online helpdesk and email contact with the team further supported participant involvement, although some of these tools were conceived prior to the pandemic. Such efforts are crucial to maintaining motivation in citizen science projects, as studies have shown (Deutsch and Ruiz-Córdova, 2015; Richter et al., 2020).

Another logistical challenge was the deployment of sensors, which was slower than anticipated, due to the restrictions on movement and socialising imposed by the pandemic. Suddenly, distributing sensor kits (originally planned to happen during sensor-assembling in-person workshops) had complex logistics attached, such as additional demands on the local teams’ workloads and covering large geographical areas. Logistical difficulties were also involved in terms of engaging in safe ways, in line with national guidelines, while wanting to make an initial face-to-face connection with citizens.

A key part of the original WeCount engagement strategy was a series of in-person and hands-on activities, such as practical workshops; local restrictions meant that such engagement activities had to be moved online – the whole strategy had to change. Moving these activities to online workshops meant changes to the allocation of resources. For example, more time, money and energy had to be invested in getting to know the online platforms (for both staff and citizens) and in developing additional resources:

We invested more time in making more posters, social media coverage, press articles and collective emails. Recruitment and workshop materials have now been prepared for online use. (Case Lead 05)

Additional resources included an installation ‘how to’ video in the local language and a welcome pack with instructions to allow participants to install their sensors without in-person support. While it was always a project goal to get to the point where citizen-led deployment and self-installation was the norm, this happened prematurely. This led to some frustration, complaints and negative experiences, which eroded trust in both the technology and the project, and limited the prospect of citizen autonomy. The project team believe that this issue might have been overcome if explained in person:

We’ve noticed that when we had a chance to talk to participants, this was way more effective [than online engagement only]. (Case Lead 04)

Here, the Case Lead was referring to how much more effective talking to people in person was at recruiting participants to take part in WeCount, compared to talking online.

Theme 4: Digital skills

An overarching challenge for WeCount during the Covid-19 pandemic was the varying levels of digital competence among participants and staff. This became apparent in workshops, where some staff were initially hesitant to explore digital tools to enhance their engagements. Facilitation was much harder for staff, as they had to manage a room of mixed abilities, for example, spending more time in the installation workshops with participants who felt less confident with using the technology. This resulted in frustration among more tech-savvy participants:

The experience with a small number of participants was challenging mainly due to the different range of people involved (e.g., those that are tech savvy had to wait for those that took more time, that in turn felt frustrated for this). (Case Lead 04)

Team members more used to engaging with citizens online, or those willing to learn, embraced the online environment as much as possible, trying to find out how it could be leveraged to benefit the project. They thought creatively about how to involve citizens online (for example, with Miro whiteboards, Padlet post-its, Mentimeter polls and Google Colaboratory for data analysis in groups) or how to make them feel valued from afar (for example, sending them cake, or sharing a beer online). Digital novices learnt from those with more experience, listening to and taking on board experiences shared in team meetings. However, some team members spent significant time learning new skills, for example, watching YouTube tutorials in their own time.

Using Eventbrite for workshop registration proved helpful in alleviating the pressure of collecting evaluation data during sessions, as it automatically collected the demographic data. During sensor-specific workshops, some project team members found it helpful to split participants into different groups/virtual rooms. This allowed them to give one-to-one support to those experiencing technical difficulties with their sensors. In the case studies that used these techniques, online engagement emulated a level of intimacy, raised fewer concerns, and forged the kinds of connections that are often hoped for in in-person workshops.

Theme 5: Blended approach

Several strategies proved successful in engaging citizen scientists. The combination of face-to-face sensor pickup moments (where restrictions allowed for this) or doorstep drop-offs, followed by online workshops to get citizens set up and/or to form local networks, worked well and was well received by the participants. For instance, the face-to-face pickups with engagement activities, held in a public space and in line with government guidelines and rules at the time, helped the team to identify citizens who could play specific roles in the project, such as potential local champions, as well as to gain the tacit knowledge that facilitates trust building (Koskinen and Vanharanta, 2002). This was made possible by being able to have longer, individual conversations with citizens, which allowed better understanding of their specific skills and willingness to be involved. Meanwhile, frequent communication with the citizens, through newsletters, troubleshooting and enjoyable, informative and participatory workshops, sustained citizens’ motivation throughout.

However, a challenge faced by the WeCount team was to find the right balance for online live engagement activities. It proved tricky to find the right number of participants for an online workshop, while still being able to reach the original engagement target:

We’ve been discussing the right size for online workshops, but at the same time keeping in mind that we will have to deliver them to ~300 participants. Finding the balance between not too many workshops, but also not too many people per workshop. (Case Lead 02)

As illustrated by this quotation, it was also difficult to determine the optimal number of workshops: too many and you put added strain on participants and the project team; too few and you run the risk of a less intimate space for participants, where not everyone can/feels able to contribute.

Reflections and conclusions

Citizen science is seeing a shift towards a more participatory model, one where citizens lead on the design, research and action for societal and environmental benefit (Strasser et al., 2019). This shift requires different logistics: more staff time, energy, targeted recruitment and face-to-face engagement; however, as the pandemic has shown, this is not always possible. The growth of digital literacy, wi-fi connectivity and screen usage has made it imperative to embrace digital tools for participatory citizen science. However, as shown in this study, face-to-face interactions should not be neglected, as they are still crucial for reaching a wider audience, as well as for engaging with specific demographics, such as those from low socio-economic groups. For effective participatory citizen science, and participatory research in general, a balanced, blended approach is needed – an approach that integrates both digital and in-person opportunities and that may facilitate opportunities to collect, analyse and make use of data during times of crisis and major disruption.

In order to build adaptive capacity for future crises, researchers have to be willing to deal with uncertainty, to develop their own adaptive response mechanisms, and to support citizens to do the same.

To conclude, we list nine broad guidelines for participatory projects that deal with major change and disruption during the lifespan of the project.

  1. Recruitment: Online-only recruitment and engagement cuts ties with low socio-economic groups and people who do not use online communication channels. Build and maintain connections with intermediaries of these groups when able to (re)connect, or, indeed, ask them to lead the way (for example, teachers, community workers).

  2. Recruitment: Allow a buffer for resource reallocation, in particular in the recruitment phase. Online engagement can be more costly and may necessitate upskilling, and changing approaches takes time.

  3. Uncertainty: Remain adaptive and open to the needs of all stakeholders (including team members and citizens), as priorities shift and change more rapidly during a disruption.

  4. Logistics: Tangible hooks will keep audiences engaged online and offline. Drop-off in online learning is common (for example, Kara et al., 2019); tangible hooks such as sensors or games will keep audiences engaged online and offline.

  5. Logistics: Bring individuals from a neighbourhood together online during the project to find common ground and work together. Grounded in social identity theory (Tajfel, 1974) and social cognitive/learning theory (Cialdini et al., 1990), norm setting and seeing ‘people like me’ in citizen science projects can motivate and inspire (Fogg-Rogers et al., 2021).

  6. Logistics: Make the live online environment intimate, for example, split participants into smaller groups/rooms during online sessions to allow everyone the chance to talk and share their thoughts; make space and time to get to know one another; break the ice.

  7. Digital skills: The type of disruption will determine the degree of online engagement. Regardless, upskill researchers and participants to be able to use online tools and platforms, as these are becoming increasingly common. From digital post-it notes to whiteboards, facilitator tools can be found online to recreate a similar feel to in-person engagement, albeit not the same.

  8. Blended approach: In-person kick-off workshops are ideal for building trust, but it is also important to manage citizens’ expectations and to be transparent about what is and is not possible during major disruption. Circumstances may change, and delivery partners need to be prepared for that.

  9. Blended approach: End the project cycle with an in-person celebration (or online, if in-person is not possible). This allows for closure as much as celebration (Atzmueller et al., 2014). These wrap-up events are also an opportunity to reflect on learnings and share information to keep citizens and stakeholders motivated long after the project ends.

While this article highlights the lengths that WeCount had to go to in adapting to the Covid-19 pandemic, the lessons learnt are widely applicable to other citizen science and participatory research projects, in particular those taking place in times of crisis: not just global pandemics such as Covid-19, but also extreme events such as natural disasters. More importantly, the findings apply to any situation where a shift between physical and online contact is required or imposed. The recommendations made here can be used by both the researcher and the practitioner community.

Acknowledgements

The authors would like to express their sincere gratitude to the team who generously contributed their time and insights during the interviews.

Authors’ contributions statement

AMS led, designed and conceptualised the research, developed the interview questions, conducted the interviews, collected the results, analysed the results and led on paper writing.

SL contributed to writing the interview questions, to the analysis of results, the literature review and paper writing, editing and graphic design. Both authors contributed to the article and approved the submitted version.

Declarations and conflicts of interest

Research ethics statement

The authors declare that research ethics approval for this article was provided by the UWE Bristol Research Ethics Committee (FET 20.02.034).

Consent for publication statement

The authors declare that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.

Conflicts of interest statement

The authors declare no conflicts of interest with this work. All efforts to sufficiently anonymise the authors during peer review of this article have been made. The authors declare no further conflicts with this article.

References

ACTION project. About ACTION, Accessed 19 March 2024 https://actionproject.eu.

Aerschot, LV; Rodousakis, N. (2008).  The link between socio-economic background and Internet use: Barriers faced by low socio-economic status groups and possible solutions.  Innovation: The European Journal of Social Science Research 21 (4) : 317–351, DOI: http://dx.doi.org/10.1080/13511610802576927

Ammendolia, J; Walker, TR. (2022).  Citizen science: A way forward in tackling the plastic pollution crisis during and beyond the COVID-19 pandemic.  Science of the Total Environment 805 149957 DOI: http://dx.doi.org/10.1016/j.scitotenv.2021.149957

Aono, Y; Kazui, K. (2008).  Phenological data series of cherry tree flowering in Kyoto, Japan, and its application to reconstruction of springtime temperatures since the 9th century.  International Journal of Climatology 28 (7) : 905–914, DOI: http://dx.doi.org/10.1002/joc.1594

Atzmueller, M; Ernst, A; Krebs, F; Scholz, C; Stumme, G. (2014). On the evolution of social groups during coffee breaks In:  Proceedings of the 23rd International Conference on World Wide Web. : 631–636, DOI: http://dx.doi.org/10.1145/2567948.2579234

Braun, V; Clarke, V. (2006).  Using thematic analysis in psychology.  Qualitative Research in Psychology 3 : 77–101, DOI: http://dx.doi.org/10.1191/1478088706qp063oa

Cappa, F; Laut, J; Nov, O; Giustiniano, L; Porfiri, M. (2016).  Activating social strategies: Face-to-face interaction in technology-mediated citizen science.  Journal of Environmental Management 182 : 374–384, DOI: http://dx.doi.org/10.1016/j.jenvman.2016.07.092

Cialdini, RB; Reno, RR; Kallgren, CA. (1990).  A focus theory of normative conduct: Recycling the concept of norms to reduce littering in public places.  Journal of Personality and Social Psychology 58 (6) : 1015, DOI: http://dx.doi.org/10.1037/0022-3514.58.6.1015

Cohen, S; Herbert, A; Evans, N; Samzelius, T. (2017). From poverty to life chances: Framing co-produced research in the productive margins programme In:  Ersoy, A (ed.),   The Impact of Co-production: From community engagement to social justice. Bristol: Bristol University Press, pp. 61–84, DOI: http://dx.doi.org/10.2307/j.ctt22p7k63.11

Conrad, CC; Hilchey, KG. (2011).  A review of citizen science and community-based environmental monitoring: Issues and opportunities.  Environmental Monitoring and Assessment 176 (1) : 273–291, DOI: http://dx.doi.org/10.1007/s10661-010-1582-5

Crimmins, TM; Posthumus, E; Schaffer, S; Prudic, KL. (2021).  COVID-19 impacts on participation in large scale biodiversity-themed community science projects in the United States.  Biological Conservation 256 109017 DOI: http://dx.doi.org/10.1016/j.biocon.2021.109017

Deutsch, WG; Ruiz-Córdova, SS. (2015).  Trends, challenges, and responses of a 20-year, volunteer water monitoring program in Alabama.  Ecology and Society 20 (3) : 14. DOI: http://dx.doi.org/10.5751/ES-07578-200314

Dickinson, JL; Shirk, J; Bonter, D; Bonney, R; Crain, RL; Martin, J; Phillips, T; Purcell, K. (2012).  The current state of citizen science as a tool for ecological research and public engagement.  Frontiers in Ecology and the Environment 10 (6) : 291–297, DOI: http://dx.doi.org/10.1890/110236

Diethei, D; Niess, J; Stellmacher, C; Stefanidi, E; Schöning, J. (2021). Sharing heartbeats: Motivations of citizen scientists in times of crises In:  Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. : 1–15, DOI: http://dx.doi.org/10.1145/3411764.3445665

Dinneen, J. (2020).  Covid-19 can’t stop citizen science.  Undark, April 17 2020 Accessed 19 March 2024 https://undark.org/2020/04/17/covid-19-citizen-science/.

Fogg-Rogers, L; Hayes, E; Vanherle, K; Pápics, PI; Chatterton, T; Barnes, J; Slingerland, S; Boushel, C; Laggan, S; Longhurst, J. (2021).  Applying social learning to climate communications – visualising “people like me” in air pollution and climate change data.  Sustainability 13 (6) 3406 DOI: http://dx.doi.org/10.3390/su13063406

Goffman, E. (2005).  Interaction Ritual: Essays in face-to-face behavior. Aldine Transaction. New York: Routledge, DOI: http://dx.doi.org/10.4324/9780203788387

Grand, A; Sardo, AM. (2017).  What works in the field? Evaluating informal science events.  Frontiers in Communication 2 (22) DOI: http://dx.doi.org/10.3389/fcomm.2017.00022

Groves, RM; Fowler, FJ; Couper, MP; Lepkowski, JM; Singer, E; Tourangeau, R. (2004).  Survey Methodology. Chichester: Wiley-Interscience.

Haklay, M. (2013). Citizen science and volunteered geographic information: Overview and typology of participation In:  Sui, D; Elwood, S; Goodchild, M (eds.),   Crowdsourcing Geographic Knowledge. Dordrecht: Springer, pp. 105–122, DOI: http://dx.doi.org/10.1007/978-94-007-4587-2_7

Hershberger, PE; Kavanaugh, K. (2017).  Comparing appropriateness and equivalence of email interviews to phone interviews in qualitative research on reproductive decisions.  Applied Nursing Research 37 : 50–54, DOI: http://dx.doi.org/10.1016/j.apnr.2017.07.005

James, N. (2017).  You’ve got mail! Using email interviews to gather academics’ narratives of their working lives.  International Journal of Research & Method in Education 40 (1) : 6–18, DOI: http://dx.doi.org/10.1080/1743727X.2015.1056136

Jasanoff, S. (2003).  Technologies of humility: Citizen participation in governing science.  Minerva 41 (3) : 223–244, DOI: http://dx.doi.org/10.1023/A:1025557512320

Jones, RT; Hadder, JM; Carvajal, F; Chapman, S; Alexander, A. (2006). Conducting research in diverse, minority, and marginalized communities In:  Norris, FH; Galea, S; Friedman, MJ; Watson, PJ (eds.),   Methods for Disaster Mental Health Research. New York: Guilford Press, pp. 265–277.

Kara, M; Erdogdu, F; Kokoç, M; Cagiltay, K. (2019).  Challenges faced by adult learners in online distance education: A literature review.  Open Praxis 11 (1) : 5–22, DOI: http://dx.doi.org/10.5944/openpraxis.11.1.929

Kishimoto, K; Kobori, H. (2021).  COVID-19 pandemic drives changes in participation in citizen science project “City Nature Challenge” in Tokyo.  Biological Conservation 255 109001 DOI: http://dx.doi.org/10.1016/j.biocon.2021.109001

Koskinen, KU; Vanharanta, H. (2002).  The role of tacit knowledge in innovation processes of small technology companies.  International Journal of Production Economics 80 (1) : 57–64, DOI: http://dx.doi.org/10.1016/S0925-5273(02)00243-8

Kullenberg, C; Kasperowski, D. (2016).  What is citizen science? – A scientometric meta-analysis.  PLoS One 11 (1) e0147152 DOI: http://dx.doi.org/10.1371/journal.pone.0147152

Metcalfe, J. (2022).  Comparing science communication theory with participatory practice: Case study of the Australian Climate Champion Program.  Journal of Science Communication 21 (2) : A04. DOI: http://dx.doi.org/10.22323/2.21020204

Mirazchiyski, P. (2016).  The digital divide: The role of socioeconomic status across countries.  Šolsko Polje 3–4 : 23–52.

Narui, M; Truong, KA; McMickens, TL. (2015).  Independent study: How three doctoral students tackled issues recruiting participants and collecting data with historically underrepresented populations.  Journal of Critical Thought and Praxis 4 (1) : 5. DOI: http://dx.doi.org/10.31274/jctp-180810-39

Richter, CF; Lortie, CJ; Kelly, TL; Filazzola, A; Nunes, KA; Sarkar, R. (2020).  Online but not remote: Adapting field-based ecology laboratories for online learning.  Ecology and Evolution 11 (8) : 3616–3624, DOI: http://dx.doi.org/10.1002/ece3.7008

Rose, S; Suri, J; Brooks, M; Ryan, P. (2020).  COVID-19 and citizen science: Lessons learned from southern Africa.  Ostrich 91 (2) : 188–191, DOI: http://dx.doi.org/10.2989/00306525.2020.1783589

Sardo, M; Laggan, S; Fogg Rogers, L; Franchois, E; Bracke, A. (2021).  Deliverable 5.4: Part A – Final Summative Monitoring & Evaluation Project Report, Accessed 19 March 2024 https://zenodo.org/records/6337258.

Sardo, AM; Laggan, S; Franchois, E; Fogg-Rogers, L. (2022).  Reflecting on deepening participation in recruitment and evaluation in citizen science – lessons from the WeCount project.  fteval Journal for Research and Technology Policy Evaluation 53 : 20–32, DOI: http://dx.doi.org/10.22163/fteval.2022.568

Strasser, B; Baudry, J; Mahr, D; Sanchez, G; Tancoigne, E. (2019).  Citizen science? Rethinking science and public participation.  Science & Technology Studies 32 : 52–76, DOI: http://dx.doi.org/10.23987/sts.60425

Tajfel, H. (1974).  Social identity and intergroup behaviour.  Social Science Information 13 (2) : 65–93, DOI: http://dx.doi.org/10.1177/053901847401300204

Theocharis, Y; de Moor, J; Van Deth, JW. (2021).  Digitally networked participation and lifestyle politics as new modes of political participation.  Policy & Internet 13 (1) : 30–53, DOI: http://dx.doi.org/10.1002/poi3.231

Tong, A; Sainsbury, P; Craig, J. (2007).  Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups.  International Journal of Quality in Health Care 19 (6) : 349–357, DOI: http://dx.doi.org/10.1093/intqhc/mzm042

Van Haeften, S; Milic, A; Addison-Smith, B; Butcher, C; Davies, JM. (2020).  Grass gazers: Using citizen science as a tool to facilitate practical and online science learning for secondary school students during the COVID-19 lockdown.  Ecology and Evolution 11 (8) : 3488–3500, DOI: http://dx.doi.org/10.1002/ece3.6948

Wagenknecht, K; Woods, T; Sanz, FG; Gold, M; Bowser, A; Rüfenacht, S; Ceccaroni, L; Piera, J. (2021).  EU-citizen science: A platform for mainstreaming citizen science and open science in Europe.  Data Intelligence 3 (1) : 136–149, DOI: http://dx.doi.org/10.1162/dint_a_00085

WeCount. Live traffic counting by citizens, Accessed 19 March 2024 https://we-count.net.

ZOE. The power of community science tackling global health issues, Accessed 19 March 2024 https://covid.joinzoe.com.