Research article

Enabling interdisciplinary research capacity for sustainable development: Self-evaluation of the Blue Communities project in the UK and Southeast Asia

Authors
  • Fiona Culhane (University of Plymouth, Plymouth, UK)
  • Victoria Cheung (School of Biological and Marine Science, University of Plymouth, Plymouth, UK)
  • Melanie Austen (School of Biological and Marine Science, University of Plymouth, Plymouth, UK)

This is version 2 of this article; the published version can be found at: https://doi.org/10.14324/111.444/ucloe.1970

Abstract

Global challenges such as climate change, food security and human health and well-being disproportionately impact people from low-income countries. These challenges are complex and require an international and transdisciplinary approach to research, with research skills and expertise from different disciplines, sectors and regions. In addressing this, a key goal of the research project Blue Communities was to create and expand the mutual interdisciplinary capacity of both United Kingdom and Southeast Asian partners. An existing questionnaire on research capacity was uniquely adapted to include interdisciplinary and international aspects and distributed for the first time as an online survey to the participants of the Blue Communities project, comprising researchers across all career stages. Participants were asked about their perceptions of the research capacity and culture of their organisation, team and self, and whether they believed any aspects had changed since their involvement with the project. The greatest improvement was seen at the self-level, where results indicated a positive relationship between an individual’s current success or skill and their improvement over the course of the research project across 18 out of 22 aspects of research capacity for Southeast Asian respondents, and two out of 22 for UK respondents. The conflict between achieving research aims, building research capacity and making societal impact was evident. Institutional support is required to value these core aspects of interdisciplinary research.

Keywords: interdisciplinary, transdisciplinary, marine and coastal ecosystems, research culture, environmental sustainability


Published on 04 Jul 2024
Peer Reviewed

 Open peer review from Amartya Nandi

Review

Review information

DOI: 10.14293/S2199-1006.1.SOR-SOCSCI.AJGR3Y.v1.RQQKHX
License:
This work has been published open access under Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use and publishing policy can be found at www.scienceopen.com.

ScienceOpen disciplines: Education, Earth & Environmental sciences
Keywords: environmental sustainability, Environmental science, research culture, interdisciplinary, marine and coastal ecosystems, Sustainability, transdisciplinary

Review text

The work is commendable; best of luck.

Reach me at amartya.res@gmail.com for collaboration.



Note:
This review refers to round 2 of peer review.

 Open peer review from Keisuke Okamura

Review

Review information

DOI: 10.14293/S2199-1006.1.SOR-SOCSCI.AMJY73.v1.RVRCNM
License:
This work has been published open access under Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use and publishing policy can be found at www.scienceopen.com.

ScienceOpen disciplines: Education, Earth & Environmental sciences
Keywords: environmental sustainability, Environmental science, research culture, interdisciplinary, marine and coastal ecosystems, Sustainability, transdisciplinary

Review text

This review focuses on the preprint entitled ‘Enabling interdisciplinary research capacity for sustainable development: Self-evaluation of the Blue Communities project in the UK and Southeast Asia’, authored by Fiona Culhane, Victoria Cheung and Melanie Austen [1].

Firstly, I commend the authors for meticulously reviewing my feedback [2] and the other reviewer’s [3]. They have earnestly engaged with points they deem useful and necessary, significantly enhancing the quality of research dissemination by appropriately revising the paper. Notably, the analysis section shows a noticeable improvement from the previous version, rendering it even more valuable.

Below, I present my thoughts on the revised version [1]. I conducted the review while adhering to the ‘General Factors Ratings’ outlined in the journal’s peer-review guidelines. I took note of each aspect: (a) Level of importance, (b) Level of validity, (c) Level of completeness and (d) Level of comprehensibility. To illustrate the relationship between each point raised and its respective aspects, I have indicated the corresponding aspect (a–d) at the end of each comment. I have not classified these into ‘Major’ or ‘Minor’ categories.

I acknowledge that there may be points in this review with which the authors do not agree. If any arise from my misunderstanding or misinterpretation, there is no need to consider them. The decision on whether the authors should address my suggestions (and, if addressed, whether it is done appropriately) will be left to the overseeing editor unless specifically requested otherwise. Even if the overseeing editor deems the current version acceptable for publication without further revision, I have no objection.

A) Comments on the study’s validity and ‘Discussion’

  1. It is noted that responses were received from 56 out of approximately 115 individuals (line 165). It would be beneficial to understand the profile of the non-respondents, including information such as their country/region and sector involvement (academic, NGO, governmental, etc.). As this survey targets project participants, this information should be reasonably ascertainable. This clarification is crucial for discussing the validity of conclusions drawn from the responses of these 56 individuals. For instance, if the other approximately 55 non-respondents hold significantly different perceptions or inclinations, it could influence the assessment of the BC project, consequently affecting broader implications and discussions in line with those changes. Non-response is inevitable and typical in this type of survey research, but if there are aspects of sampling bias that can be discussed from the profile of non-respondents, they should be addressed. [a, b, c]
  2. Regarding the discussion on the ‘limitations’ in Section 4.3, further elaboration on the potential impacts or biases arising from these limitations on the outcomes and analyses of this study would be beneficial. [a, b, c]
  3. Regarding lines 613–614, the authors have not specifically detailed the ‘key lessons’ they have gleaned and how they intend to ‘modify’ or ‘enhance’ their approach. This point is crucial in this kind of survey research, and a more specific discussion would be beneficial. A structured subsection like ‘Lessons learned and implications for future projects’ could aid clarity. Related discussions are also found around lines 545–546. [a, c, d]
  4. There needs to be more clarity regarding the relationship between the ‘Discussion’ (Section 4) and the ‘Conclusion’ (Section 4.4). For instance, it is not entirely evident whether the content discussed towards the end of the ‘Conclusion’ section, highlighting issues within the current academic system, derives directly, with persuasive evidence, from the survey conducted, or whether it is better regarded as part of the ‘Discussion’. It would benefit from further consideration. [b, c, d]
  5. In line with the previous point and echoing comments [2] on the Version 1 preprint, it is essential to consistently acknowledge the limited scope of insights derived from the responses of these 56 individuals. The scope of conclusions drawn from this data is inherently limited, whether deductively or inductively. Caution must be exercised in drawing conclusions regarding the success or evaluation of the BC project, let alone in making definitive assertions about broader aspects like the nature of ‘interdisciplinary research’ or the ‘academic system’. Exhaustive discussion on internal and external validity is crucial to elevate the paper’s quality. Ensuring a robust discussion and adequately substantiating these aspects within the paper is essential to presenting a compelling argument. [b]
  6. As demonstrated in the current manuscript, combining quantitative and qualitative approaches is commendable and recommended. These two approaches should complement each other judiciously to address their respective limitations. While Section 4 shows instances where this combination is aptly done, there still seems to be an inclination towards an episode-driven discussion overall. Presumably, the authors aim to introduce discourse heard from participants and prior studies as corroboration or support after establishing quantitative evidence from the survey. At times, this intent might not be entirely clear. In this regard, two points are noted:
    • The frequent use of ‘most’ (respondents/markers/levels/parameters), echoing previous review comments [2], could benefit from specifying figures (‘how many out of how many’ and/or ‘%’). Alternatively, referencing a corresponding table might aid in maintaining the link between the quantitative analysis and qualitative discussion more appropriately. Especially with ‘most respondents’, it is crucial to scrutinise whether it genuinely applies to the population under consideration for deriving various conclusions and implications. [c, d]
    • Additionally, there appear to be instances where individual anecdotes or specific participant remarks more or less determine the overall project evaluation (success) or situation, not necessarily aligning effectively with the quantitative analysis results (typical instances being lines 473–479). Reassessing the appropriateness of such descriptions is advisable. Distinguishing between success experiences or lessons learned at an individual respondent level (obtained from personal responses) and those pertinent to the project as a whole (derived or inferred from statistical approaches) while effectively integrating them in discussions would be advantageous. [b, c]

B) Comments on the presentation of analysis results

  1. The arrangement of the bar graphs in Figures 1 to 5 needs more clarity regarding their sequence from top to bottom; being clear about the order of the bars is fundamental. Usually, this information is discernible from the figures, but it is unclear in the current manuscript. [d]
  2. A minor point regarding Figures 3 (b) and 4 (b) is that reconsidering the horizontal bar graph domain might be advisable. If my understanding of this bar graph is correct, the graph scale extends to a theoretically unattainable upper limit, which might not be scientifically reasonable. [d]
  3. Figures 7 to 9 offer a clearer and significantly enhanced visual representation, marking the most noticeable improvement compared to the Version 1 preprint. To enhance the clarity of these diagrams, the following point should be noted. Line 157 mentions the ‘trend line’, referring to two line segments in each panel of Figures 7 to 9. However, it is not evident what the slope and intercept of these lines signify. An explanation of this point would help readers to understand how these relate to the displayed ‘R’ values. (Note that the ‘R’ value does not represent the slope of the ‘trend line’; a short numerical sketch after this list illustrates the distinction.) [d]
  4. Although unnecessary to include in the paper, as previously suggested in the review [2] of the Version 1 preprint, overlaid histograms for Southeast Asia and the UK corresponding to Figures 7 to 9 for each variable could provide useful insights. If not already attempted, it is worth exploring, as it would reveal additional insights not visible from scatterplots.
  5. The authors employ Fisher’s exact test for hypothesis testing. Note that Fisher’s exact test is valid when both row and column totals are fixed by design. However, considering the conclusions and discussions the authors aim to draw, it appears they seek to generalise beyond the 56 respondents’ group. Providing clarification on the intended ‘population’ for drawing conclusions and assessing whether the use of Fisher’s exact test remains sufficiently appropriate would be beneficial. [b, c]
  6. Lines 311–313, 347–349 and 383–386 conclude that ‘[…] improved from a lower success or skill level to achieve the same success or skill level that UK respondents/teams/institutions started the project with’. However, this assessment might not be entirely suitable. From the questionnaire, it appears respondents from Southeast Asia evaluated changes in their success or skill levels primarily concerning their own past, not necessarily making direct comparisons with the success or skill levels of UK respondents. [b]
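
To make the distinction in point 3 concrete, here is a short numerical sketch (using synthetic data, not the authors’ survey responses) showing that the displayed ‘R’ value and the slope of a fitted trend line are related but distinct quantities: the least-squares slope equals R multiplied by the ratio of the standard deviations of the two variables, so the two coincide only when those standard deviations are equal.

```python
# Illustration with synthetic data (not the authors' survey data): Pearson's R
# and the slope of a least-squares trend line are related but distinct.
import numpy as np

rng = np.random.default_rng(1)
x = rng.integers(1, 10, size=50).astype(float)   # e.g. skill ratings on a 1-9 scale
y = 0.3 * x + rng.normal(0.0, 1.0, size=50)      # e.g. a perceived-change score

r = np.corrcoef(x, y)[0, 1]                      # correlation coefficient ('R')
slope, intercept = np.polyfit(x, y, 1)           # trend-line slope and intercept

# The OLS slope equals r * (std of y / std of x), so slope == r only when the
# two standard deviations happen to be equal.
print(f"R = {r:.2f}, slope = {slope:.2f}, "
      f"r * sy/sx = {r * y.std(ddof=1) / x.std(ddof=1):.2f}")
```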

C) Miscellaneous

  1. The data and its metadata on the open repository appear to remain unchanged. It would be advisable to ensure their update coincides with the publication of the latest version. [c, d]
  2. Accruing experience in critically analysing research from diverse perspectives is crucial in academia and society. This belief aligns with the asserted importance of interdisciplinary capacity building highlighted in the authors’ manuscript. I hope that an open platform like UCL Open: Environment, where diverse opinions are exchanged and various perspectives are considered, fosters a culture that promotes interdisciplinary collaboration. I wish for the authors’ paper to be published appropriately and to contribute to the advancement of social sciences and better academia-society relationships as envisioned by the journal.

Keisuke Okamura

Washington D.C., USA

17th December 2023

References

[1] Culhane, Fiona E., Cheung, Victoria and Austen, Melanie (2023). Enabling interdisciplinary research capacity for sustainable development: Self-evaluation of the Blue Communities project in the UK and Southeast Asia, 2017-2021. https://doi.org/10.14324/111.444/000189.v2

[2] Okamura, Keisuke (2023). Review of ‘Growing interdisciplinary research capacity for sustainable development: Self-reported evaluation’. https://doi.org/10.14293/S2199-1006.1.SOR-SOCSCI.APE1TG.v1.RRZRYX

[3] Washbourne, Carla-Leanne (2023). Review of ‘Growing interdisciplinary research capacity for sustainable development: Self-reported evaluation’. https://doi.org/10.14293/S2199-1006.1.SOR-SOCSCI.AHPMPZ.v1.RMKJUG



Note:
This review refers to round 2 of peer review.

 Open peer review from Keisuke Okamura

Review

Review information

DOI: 10.14293/S2199-1006.1.SOR-SOCSCI.APE1TG.v1.RRZRYX
License:
This work has been published open access under Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use and publishing policy can be found at www.scienceopen.com.

ScienceOpen disciplines: Education, Earth & Environmental sciences
Keywords: environmental sustainability, Environmental science, research culture, interdisciplinary, marine and coastal ecosystems, Sustainability, transdisciplinary

Review text

This review focuses on the preprint titled ‘Growing interdisciplinary research capacity for sustainable development: Self-reported evaluation’, authored by Fiona Culhane, Victoria Cheung, and Melanie Austen [1, 2].

(1) General comments

The objective of the authors’ manuscript is to quantitatively and/or qualitatively validate the effectiveness of the interdisciplinary collaborative approach, specifically the ‘learning-by-doing’ method, implemented in the Blue Communities (BC) project. This validation is accomplished through surveys and questionnaires administered to participants. The ultimate goal is to gain insights and draw lessons from the research findings, with the aim of enhancing and advancing capacity building through interdisciplinary collaboration in various other present and future cases.

However, the current manuscript needs improvement in several aspects from both internal and external validity perspectives. I hope that the authors will find my comments in this review helpful and incorporate them to enhance the next version of their work. If there are any comments that may be deemed unacceptable due to my misunderstanding or lack of comprehension, the authors are free to disregard them. Moreover, since some comments regarding data presentation and other aspects have already been provided in another review [3], I will refrain from repeating them here. In fact, I concur with many of the points raised in Ref. [3]. I recommend referring to it as well and utilising its suggestions to improve the manuscript.

(2) Internal validity perspective

The authors’ decision to publicly share the collected data [2], along with the questionnaire used, is commendable in terms of promoting transparency and reproducibility. This practice is highly recommended as it enhances the value of open peer review.

Regarding the adequacy of the questionnaire’s development, I will not address that issue in this review. However, it appears that the authors may not be fully utilising the obtained data. Each respondent answered multiple questions, encompassing individual-level, team-level, and organisation-level aspects, as well as questions regarding attributes and demographics like gender, country of affiliation, and research career stage. As a result, numerous cross-tabulations or regression analyses could be conducted among these question responses.

For instance, attributing a specific perception exhibited by early career researchers in response to a particular question solely to the brevity of their career can be misleading. In reality, it could be influenced by the research environment they are situated in or even their predisposition. Additionally, exploring the correlation between responses to individual-level questions and those related to team-level or organisation-level inquiries would be beneficial. By further harnessing the response data in this manner, it is worth considering methodologies that can yield more dependable and conclusive analytical outcomes.

Regression analysis is a useful tool that can provide valuable insights beyond simple cross-tabulation. However, it is essential to ensure that the regression model satisfies various conditions necessary for causal inference. While acknowledging the limitations, conducting regression analysis can yield helpful insights. Using the CSV data publicly available in Ref. [2], I present below an example of regression analysis that I have attempted and found suggestive. I hope you find it beneficial.

In this example, I will outline a method for capturing the factors or drivers of perception formation regarding the effectiveness of the BC project at the individual level using regression analysis. This is likely a significant concern for the authors as well. The authors have collected response data for various aspects (e.g. ‘Finding relevant literature’, ‘Critically reviewing the literature’, etc.) in terms of ‘personal current success or skill level’ (rated on a scale of 1–9, hereafter denoted as ‘x’) and the perception of how each aspect changed as a result of the BC project (rated on a 5-point scale from ‘much worse’ to ‘much better’, hereafter denoted as ‘y’). Therefore, a simple regression model can be formulated with ‘y’ as the response variable and ‘x’ along with other variables such as gender, country of affiliation, employment type, and research career stage as explanatory variables. If the variable ‘y’ is coded as ordered integers, such as 1–5 or –2 to +2, an ordered logit model would be suitable for the regression analysis.

Considering that the number of valid observations (respondents) is relatively small (around 50), it is not feasible to include a large number of explanatory variables. Thus, it becomes crucial to carefully select the most important explanatory variables. With regard to the research career variable, it would be preferable to treat it as individual dummy variables based on their respective values rather than as a single ordered variable. However, this approach may result in an excessive number of explanatory variables compared to the number of observations, making it impractical for this study. Therefore, it is important to strike a balance between the number of explanatory variables and the available sample size in order to ensure the reliability and validity of the regression analysis.
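
As a minimal sketch of this specification (assuming, purely for illustration, a CSV with hypothetical column names for one aspect’s skill rating and perceived change plus country and gender variables; the actual data in Ref. [2] uses its own labels and file name), the ordered logit model described above could be fitted with statsmodels as follows:

```python
# Minimal sketch of the ordered logit regression described above.
# Column names and the file name are illustrative placeholders, not those of
# the actual Blue Communities dataset.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("blue_communities_responses.csv")  # hypothetical file name

# y: perceived change for one aspect, coded as ordered categories from -2
# ("much worse") to +2 ("much better")
y = df["change_finding_literature"].astype(
    pd.CategoricalDtype(categories=[-2, -1, 0, 1, 2], ordered=True)
)

# x plus a small set of categorical covariates, dummy-coded; drop_first keeps
# the number of explanatory variables modest given only ~50 observations
X = pd.get_dummies(
    df[["skill_finding_literature", "country_group", "gender"]],
    columns=["country_group", "gender"],
    drop_first=True,
).astype(float)

model = OrderedModel(y, X, distr="logit")  # ordered logit (proportional odds)
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```

The particular covariates and the use of drop_first here are only one way of keeping the variable count manageable for roughly 50 observations, in line with the trade-off discussed above.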

When actually conducting the regression analysis at the individual level, it becomes evident that, in the case of most y-variables, the corresponding x-variables or country of affiliation (either individually or in combination) demonstrate statistical significance, while other variables do not. This process enables a more reliable comprehension of the factors that influence changes in each y-variable. By meticulously interpreting the outcomes from both qualitative and empirical perspectives, deeper insights into the effectiveness of the BC project can be obtained. Consequently, this would further augment the value of the paper by providing a more comprehensive understanding of the project’s impact.

If conducting regression analysis poses challenges, I suggest conducting more detailed descriptive analyses as a minimum. To gain further insights, it would be beneficial to create histograms for different attributes such as gender and country of affiliation, and overlay them for comparison, for both the x and y variables defined earlier. Additionally, creating scatterplots with corresponding x and y pairs as axes would provide valuable information. To account for the discrete nature of the responses, consider introducing jittering. Varying the marker styles in the scatterplot based on attributes like gender and country of affiliation and overlaying them for comparison is also recommended. These visual representations will enhance the understanding of the data and facilitate comparisons across different attributes.

Performing these analyses, including overlaid histograms and scatterplots, will indeed offer a clearer understanding of the distributions of perceptions (x and y) across different attributes, as well as the associations between x and y within various attributes. These insights can yield valuable findings and contribute to a more comprehensive understanding of the data. Furthermore, conducting these analyses can serve as valuable preparation for the previously mentioned regression analysis. I highly encourage the authors to explore these visualisations if feasible, as they can provide valuable insights and enhance the overall analysis of the data.
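
A minimal sketch of these descriptive visuals is shown below; the column names, group labels, file name and jitter width are hypothetical choices for illustration, not taken from the published dataset.

```python
# Minimal sketch of the suggested overlaid histograms and jittered scatterplot.
# Column names, group labels and the file name are illustrative placeholders.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("blue_communities_responses.csv")  # hypothetical file name
rng = np.random.default_rng(0)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Overlaid histograms of the x variable (skill level, 1-9) by country group
for group, sub in df.groupby("country_group"):
    ax1.hist(sub["skill_finding_literature"], bins=range(1, 11),
             alpha=0.5, label=str(group))
ax1.set_xlabel("Current success or skill level (1-9)")
ax1.set_ylabel("Number of respondents")
ax1.legend()

# Scatterplot of x against y with jitter, marker style varying by group
markers = {"Southeast Asia": "o", "UK and other European": "s"}
for group, sub in df.groupby("country_group"):
    jx = sub["skill_finding_literature"] + rng.uniform(-0.2, 0.2, len(sub))
    jy = sub["change_finding_literature"] + rng.uniform(-0.2, 0.2, len(sub))
    ax2.scatter(jx, jy, marker=markers.get(group, "x"), alpha=0.7,
                label=str(group))
ax2.set_xlabel("Current success or skill level (1-9)")
ax2.set_ylabel("Perceived change (-2 to +2)")
ax2.legend()

plt.tight_layout()
plt.show()
```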

(3) External validity perspective

Overall, I get the impression that the authors’ manuscript resembles more of a report on the organisational activities or activity records specifically of the BC project, rather than a research paper that contributes to the academic knowledge base or provides lessons for other (future) cases.

In the Discussion and Conclusion sections, there are instances where the authors attempt to extrapolate their analysis and interpretations to general theories related to career development, the academic environment, or interdisciplinary collaboration, in an effort to draw significant implications. However, there seem to be logical leaps in many parts of the manuscript. For example, generalising specific comments from certain individuals in the open-ended responses and using them to justify the overall evaluation of the BC project or attempting to derive universal conclusions lacks sufficient credibility.

It is important to ensure that the conclusions drawn in the manuscript are supported by robust evidence and rigorous analysis. Additionally, generalisations should be made cautiously, considering the limitations of the study and the specific context of the BC project. Providing a clear rationale and using appropriate references or theoretical frameworks can strengthen the credibility and reliability of the manuscript’s conclusions.

From that perspective, I suggest revisiting the Discussion and Conclusion sections and carefully examining the descriptions regarding the level of external validity. If the goal is to produce impactful content that is relevant to a broad audience, as advocated by UCL Open: Environment, it is desirable to uncover valuable insights beyond the specific BC project. By doing so, this paper will become even more valuable as a publication in the journal, as it will offer insights and lessons that can be applied to a wider range of contexts and contribute to the broader academic knowledge base.

(4) Miscellaneous

  1. It is recommended to include the actual number of respondents alongside the response rate in Tables 1 and 2. This will provide readers with a better understanding of the sample size and the proportion of participants who responded to the survey.
  2. In my view, the numbers indicated with ‘%’ next to the bar graphs in Figures 1–5 should be removed. These numbers can be misleading and confusing since they do not correspond to the length of the bars.
  3. When stating phrases like ‘Most respondents felt...’, it is advisable to quantify the extent of ‘most’ using the format like ‘X out of Y respondents (Z%)’. This will provide a quantitative representation and enhance the clarity of the statement.
  4. In Section 2.2 (Questionnaire), it is necessary to provide more specific and detailed explanations about the methods of questionnaire development, distribution, and collection, including the survey duration.
  5. In the Supplementary Material and the file named ‘Survey_Questions.pdf’, each question item should be labelled with a number or symbol for individual identification, and it is strongly recommended to ensure a one-to-one correspondence between each question and the responses in the data file.
  6. In the response data (CSV file), it seems that responses related to age groups and sectors of affiliation have been removed. If there is a deliberate reason for omitting these responses, it should be mentioned in the document to avoid any confusion or misinterpretation.
  7. It would be advantageous to provide a more compelling rationale for the relative superiority of the ‘learning-by-doing’ approach compared to other approaches. While it is generally expected that any approach implemented in a project could yield positive outcomes, the key lies in demonstrating how the benefits derived from the ‘learning-by-doing’ approach outweigh those that would have been obtained through alternative approaches. This will strengthen the argument and provide a clearer understanding of why the ‘learning-by-doing’ approach is recommended.

(5) Overall impression

This manuscript has the potential to significantly enhance its academic and societal value by conducting a more comprehensive analysis considering both internal and external validity. To achieve this, it would be beneficial to provide meticulous descriptions of the approach employed to draw conclusions, ensuring transparency and clarity. Additionally, improving the methods of data visualisation, presentation, and delivery will contribute to a more effective communication of the research findings. By implementing these enhancements, the overall quality and impact of the paper can be greatly improved, leading to a more valuable contribution to the academic and societal discourse.

References

[1] Culhane, Fiona E., Cheung, Victoria and Austen, Melanie (2022). Self-reported Change in Research Capacity Following Participation in an Interdisciplinary Research Project, 2017-2021. https://doi.org/10.14324/111.444/000189.v1

[2] Culhane, Fiona E., Cheung, Victoria and Austen, Melanie (2022). Self-reported Change in Research Capacity Following Participation in an Interdisciplinary Research Project, 2017-2021. [Data Collection]. Colchester, Essex: UK Data Service. https://doi.org/10.5255/UKDA-SN-856101

[3] Washbourne, Carla-Leanne (2023). Review of ‘Growing interdisciplinary research capacity for sustainable development: Self-reported evaluation’. https://doi.org/10.14293/S2199-1006.1.SOR-SOCSCI.AHPMPZ.v1.RMKJUG

Keisuke Okamura

Washington D.C., USA

15th July 2023



Note:
This review refers to round 1 of peer review and may pertain to an earlier version of the document.

 Open peer review from Carla-Leanne Washbourne

Review

Review information

DOI: 10.14293/S2199-1006.1.SOR-SOCSCI.AHPMPZ.v1.RMKJUG
License:
This work has been published open access under Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use and publishing policy can be found at www.scienceopen.com.

ScienceOpen disciplines: Education, Earth & Environmental sciences
Keywords: environmental sustainability, Environmental science, research culture, interdisciplinary, marine and coastal ecosystems, Sustainability, transdisciplinary

Review text

Many thanks for the opportunity to review this interesting article, based on a survey of experiences from participants in a large-scale, interdisciplinary, cross-country research and capacity building project. The article fits within the scope of UCL Open: Environment and provides novel information and insights likely to be of interest to the wider community. I have provided some general and detailed comments and feedback below. Overall, my sense is that the article would be strengthened by relatively minor revisions, mostly focussed on data analysis and presentation.

Comments:

Title: I am not convinced that the title fully does justice to the scope and nature of the article. I would be tempted to switch out ‘Growing’ for a different term that sounds more purposeful/intentional, like ‘Enabling’ or ‘Supporting’. While I realise it makes it a lot longer, the sub-title could be rephrased to better capture 1) the nature of the evaluation, 2) the link with the project and 3) maybe the countries involved, e.g.: self-evaluation of the ‘Blue Communities’ project [in the UK and Southeast Asia].

Abstract:

  • ODA funding might need further explanation; alternatively, do not include it in the abstract, only in the body text
  • Line 50: ‘Results were mainly positive’ is too vague here. What do you mean by ‘positive’?
  • Is it possible to include any of the more detailed insights / findings?

Body text

  • Line 63: Why was the UoL 2026 strategy highlighted here? It is a good example, but feels a little arbitrary as so many institutions have similar strategies. Was this one particularly ground-breaking?
  • Line 71 / 72: Consistency with the terms ‘capacity strengthening’ and ‘capacity building’. Make it clear if there is an intended difference between the two uses here and throughout the paper.
  • Line 94 / 95: It would be good to have a footnote with a bit more detail about the nature of GCRF. Especially if the reference to ODA funding is retained in the abstract.
  • Line 106: As above re: GCRF. It would be good to be a bit more explicit about the way in which the nature of this funding influenced the scope and approach of the project
  • Line 114: Were these funding calls based on the redistribution of funds already won for the project, or was this something in addition?
  • Line 140: More detail about the survey is needed here. Who developed the survey (Author FC is stated in the author contributions; were there other contributors)? Who distributed the survey and how (via email to the project members, during meetings, through newsletters, via social media, etc.)? How long was the survey open for / when did you decide to close the survey?
  • Line 143: Explain more here how you defined the different career stages (shown in Section 1 of the survey). You do come back to this later in the paper, but it would be helpful to have a brief sense here of what the categories were and how you chose to define these stages.
  • Line 184: What kinds of ‘other’ institutions were represented?
  • Table 2: Check final formatting of table, as the separation between the definitions in the final row could be made clearer
  • Results: Did you use any more detailed statistical approaches to explore the correlations and differences between responses? This strikes me as being especially useful in the case of the data currently presented in Figures 6-9. These are really nicely illustrative, but a further exploration of the correlation would be very instructive.
  • Figure 1, 2, 3a, 4a, 5a: Reconsider the colour scheme used here. The ‘UK and other European’ category would benefit from being recoloured in something more contrasting
  • Figure 3b, 4b, 5b: Reconsider the colour scheme used here. It would be better to use a scheme more clearly distinct from the accompanying charts to the left.  Perhaps it would be better onscreen to use a gradient based on one colour for the categories in (a) and a different colour for (b).
  • Figures 6-9: Why are the categories in reverse order top to bottom? (i.e. ending with A rather than starting)
  • Line 251: See also comments below re: creating a separate section for ‘Limitations of the study’
  • Figures 7-9: For quick visual communication of the results, I would advise keeping the x-axis consistent on all charts, even where there is no data (e.g. some of the (b) charts showing the difference in response)
  • Line 429-430: By ‘this study’ you are referring to the project overall, rather than this manuscript?
  • Line 491-492: I think it is fine to be more definite here. There is plenty of evidence that the value placed on different aspects of the University differs from place to place.
  • Line 503-504: Not absolutely necessary to include, but notable that there have been some attempts to change this through the links between the REF and the Knowledge Exchange Framework (KEF)
  • Line 564-565: As above re: explaining a little more about GCRF and ODA link
  • Line 595-603: I would personally relocate this from the conclusion to a separate section on limitations of the study. This could go right at the end of the methods or discussion, depending on the preferences of the author. Add in the point about the survey only being available in English.


Note:
This review refers to round 1 of peer review and may pertain to an earlier version of the document.